Mar 08 00:19:16.110609 master-0 systemd[1]: Starting Kubernetes Kubelet...
Mar 08 00:19:16.686834 master-0 kubenswrapper[4059]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 08 00:19:16.686834 master-0 kubenswrapper[4059]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 08 00:19:16.686834 master-0 kubenswrapper[4059]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 08 00:19:16.686834 master-0 kubenswrapper[4059]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 08 00:19:16.686834 master-0 kubenswrapper[4059]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 08 00:19:16.686834 master-0 kubenswrapper[4059]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 08 00:19:16.688810 master-0 kubenswrapper[4059]: I0308 00:19:16.687776    4059 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 08 00:19:16.691146 master-0 kubenswrapper[4059]: W0308 00:19:16.691106    4059 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 08 00:19:16.691146 master-0 kubenswrapper[4059]: W0308 00:19:16.691124    4059 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 08 00:19:16.691146 master-0 kubenswrapper[4059]: W0308 00:19:16.691129    4059 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 08 00:19:16.691146 master-0 kubenswrapper[4059]: W0308 00:19:16.691133    4059 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 08 00:19:16.691146 master-0 kubenswrapper[4059]: W0308 00:19:16.691137    4059 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 08 00:19:16.691146 master-0 kubenswrapper[4059]: W0308 00:19:16.691141    4059 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 08 00:19:16.691146 master-0 kubenswrapper[4059]: W0308 00:19:16.691145    4059 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 08 00:19:16.691146 master-0 kubenswrapper[4059]: W0308 00:19:16.691149    4059 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 08 00:19:16.691146 master-0 kubenswrapper[4059]: W0308 00:19:16.691153    4059 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 08 00:19:16.691146 master-0 kubenswrapper[4059]: W0308 00:19:16.691157    4059 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 08 00:19:16.691146 master-0 kubenswrapper[4059]: W0308 00:19:16.691161    4059 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 08 00:19:16.691146 master-0 kubenswrapper[4059]: W0308 00:19:16.691166    4059 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 08 00:19:16.691146 master-0 kubenswrapper[4059]: W0308 00:19:16.691170    4059 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 08 00:19:16.691146 master-0 kubenswrapper[4059]: W0308 00:19:16.691175    4059 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 08 00:19:16.691146 master-0 kubenswrapper[4059]: W0308 00:19:16.691179    4059 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 08 00:19:16.691927 master-0 kubenswrapper[4059]: W0308 00:19:16.691183    4059 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 08 00:19:16.691927 master-0 kubenswrapper[4059]: W0308 00:19:16.691188    4059 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 08 00:19:16.691927 master-0 kubenswrapper[4059]: W0308 00:19:16.691192    4059 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 08 00:19:16.691927 master-0 kubenswrapper[4059]: W0308 00:19:16.691195    4059 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 08 00:19:16.691927 master-0 kubenswrapper[4059]: W0308 00:19:16.691212    4059 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 08 00:19:16.691927 master-0 kubenswrapper[4059]: W0308 00:19:16.691217    4059 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 08 00:19:16.691927 master-0 kubenswrapper[4059]: W0308 00:19:16.691222    4059 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 08 00:19:16.691927 master-0 kubenswrapper[4059]: W0308 00:19:16.691226    4059 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 08 00:19:16.691927 master-0 kubenswrapper[4059]: W0308 00:19:16.691231    4059 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 08 00:19:16.691927 master-0 kubenswrapper[4059]: W0308 00:19:16.691235    4059 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 08 00:19:16.691927 master-0 kubenswrapper[4059]: W0308 00:19:16.691239    4059 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 08 00:19:16.691927 master-0 kubenswrapper[4059]: W0308 00:19:16.691249    4059 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 08 00:19:16.691927 master-0 kubenswrapper[4059]: W0308 00:19:16.691253    4059 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 08 00:19:16.691927 master-0 kubenswrapper[4059]: W0308 00:19:16.691257    4059 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 08 00:19:16.691927 master-0 kubenswrapper[4059]: W0308 00:19:16.691260    4059 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 08 00:19:16.691927 master-0 kubenswrapper[4059]: W0308 00:19:16.691264    4059 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 08 00:19:16.691927 master-0 kubenswrapper[4059]: W0308 00:19:16.691268    4059 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 08 00:19:16.691927 master-0 kubenswrapper[4059]: W0308 00:19:16.691271    4059 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 08 00:19:16.691927 master-0 kubenswrapper[4059]: W0308 00:19:16.691275    4059 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 08 00:19:16.691927 master-0 kubenswrapper[4059]: W0308 00:19:16.691278    4059 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 08 00:19:16.692953 master-0 kubenswrapper[4059]: W0308 00:19:16.691282    4059 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 08 00:19:16.692953 master-0 kubenswrapper[4059]: W0308 00:19:16.691286    4059 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 08 00:19:16.692953 master-0 kubenswrapper[4059]: W0308 00:19:16.691289    4059 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 08 00:19:16.692953 master-0 kubenswrapper[4059]: W0308 00:19:16.691293    4059 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 08 00:19:16.692953 master-0 kubenswrapper[4059]: W0308 00:19:16.691296    4059 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 08 00:19:16.692953 master-0 kubenswrapper[4059]: W0308 00:19:16.691300    4059 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 08 00:19:16.692953 master-0 kubenswrapper[4059]: W0308 00:19:16.691305    4059 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 08 00:19:16.692953 master-0 kubenswrapper[4059]: W0308 00:19:16.691310    4059 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 08 00:19:16.692953 master-0 kubenswrapper[4059]: W0308 00:19:16.691314    4059 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 08 00:19:16.692953 master-0 kubenswrapper[4059]: W0308 00:19:16.691318    4059 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 08 00:19:16.692953 master-0 kubenswrapper[4059]: W0308 00:19:16.691323    4059 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 08 00:19:16.692953 master-0 kubenswrapper[4059]: W0308 00:19:16.691327    4059 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 08 00:19:16.692953 master-0 kubenswrapper[4059]: W0308 00:19:16.691331    4059 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 08 00:19:16.692953 master-0 kubenswrapper[4059]: W0308 00:19:16.691336    4059 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 08 00:19:16.692953 master-0 kubenswrapper[4059]: W0308 00:19:16.691340    4059 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 08 00:19:16.692953 master-0 kubenswrapper[4059]: W0308 00:19:16.691344    4059 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 08 00:19:16.692953 master-0 kubenswrapper[4059]: W0308 00:19:16.691348    4059 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 08 00:19:16.692953 master-0 kubenswrapper[4059]: W0308 00:19:16.691351    4059 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 08 00:19:16.692953 master-0 kubenswrapper[4059]: W0308 00:19:16.691354    4059 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 08 00:19:16.693909 master-0 kubenswrapper[4059]: W0308 00:19:16.691358    4059 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 08 00:19:16.693909 master-0 kubenswrapper[4059]: W0308 00:19:16.691361    4059 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 08 00:19:16.693909 master-0 kubenswrapper[4059]: W0308 00:19:16.691365    4059 feature_gate.go:330] unrecognized feature gate: Example
Mar 08 00:19:16.693909 master-0 kubenswrapper[4059]: W0308 00:19:16.691368    4059 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 08 00:19:16.693909 master-0 kubenswrapper[4059]: W0308 00:19:16.691371    4059 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 08 00:19:16.693909 master-0 kubenswrapper[4059]: W0308 00:19:16.691375    4059 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 08 00:19:16.693909 master-0 kubenswrapper[4059]: W0308 00:19:16.691378    4059 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 08 00:19:16.693909 master-0 kubenswrapper[4059]: W0308 00:19:16.691382    4059 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 08 00:19:16.693909 master-0 kubenswrapper[4059]: W0308 00:19:16.691385    4059 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 08 00:19:16.693909 master-0 kubenswrapper[4059]: W0308 00:19:16.691389    4059 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 08 00:19:16.693909 master-0 kubenswrapper[4059]: W0308 00:19:16.691394    4059 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 08 00:19:16.693909 master-0 kubenswrapper[4059]: W0308 00:19:16.691399    4059 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 08 00:19:16.693909 master-0 kubenswrapper[4059]: W0308 00:19:16.691403    4059 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 08 00:19:16.693909 master-0 kubenswrapper[4059]: W0308 00:19:16.691407    4059 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 08 00:19:16.693909 master-0 kubenswrapper[4059]: W0308 00:19:16.691412    4059 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 08 00:19:16.693909 master-0 kubenswrapper[4059]: W0308 00:19:16.691416    4059 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 08 00:19:16.693909 master-0 kubenswrapper[4059]: W0308 00:19:16.691420    4059 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 08 00:19:16.693909 master-0 kubenswrapper[4059]: W0308 00:19:16.691426    4059 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 08 00:19:16.693909 master-0 kubenswrapper[4059]: I0308 00:19:16.692096    4059 flags.go:64] FLAG: --address="0.0.0.0"
Mar 08 00:19:16.693909 master-0 kubenswrapper[4059]: I0308 00:19:16.692111    4059 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 08 00:19:16.694950 master-0 kubenswrapper[4059]: I0308 00:19:16.692120    4059 flags.go:64] FLAG: --anonymous-auth="true"
Mar 08 00:19:16.694950 master-0 kubenswrapper[4059]: I0308 00:19:16.692129    4059 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 08 00:19:16.694950 master-0 kubenswrapper[4059]: I0308 00:19:16.692135    4059 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 08 00:19:16.694950 master-0 kubenswrapper[4059]: I0308 00:19:16.692139    4059 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 08 00:19:16.694950 master-0 kubenswrapper[4059]: I0308 00:19:16.692145    4059 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 08 00:19:16.694950 master-0 kubenswrapper[4059]: I0308 00:19:16.692150    4059 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 08 00:19:16.694950 master-0 kubenswrapper[4059]: I0308 00:19:16.692155    4059 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 08 00:19:16.694950 master-0 kubenswrapper[4059]: I0308 00:19:16.692159    4059 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 08 00:19:16.694950 master-0 kubenswrapper[4059]: I0308 00:19:16.692164    4059 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 08 00:19:16.694950 master-0 kubenswrapper[4059]: I0308 00:19:16.692168    4059 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 08 00:19:16.694950 master-0 kubenswrapper[4059]: I0308 00:19:16.692173    4059 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 08 00:19:16.694950 master-0 kubenswrapper[4059]: I0308 00:19:16.692177    4059 flags.go:64] FLAG: --cgroup-root=""
Mar 08 00:19:16.694950 master-0 kubenswrapper[4059]: I0308 00:19:16.692181    4059 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 08 00:19:16.694950 master-0 kubenswrapper[4059]: I0308 00:19:16.692185    4059 flags.go:64] FLAG: --client-ca-file=""
Mar 08 00:19:16.694950 master-0 kubenswrapper[4059]: I0308 00:19:16.692189    4059 flags.go:64] FLAG: --cloud-config=""
Mar 08 00:19:16.694950 master-0 kubenswrapper[4059]: I0308 00:19:16.692193    4059 flags.go:64] FLAG: --cloud-provider=""
Mar 08 00:19:16.694950 master-0 kubenswrapper[4059]: I0308 00:19:16.692209    4059 flags.go:64] FLAG: --cluster-dns="[]"
Mar 08 00:19:16.694950 master-0 kubenswrapper[4059]: I0308 00:19:16.692215    4059 flags.go:64] FLAG: --cluster-domain=""
Mar 08 00:19:16.694950 master-0 kubenswrapper[4059]: I0308 00:19:16.692219    4059 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 08 00:19:16.694950 master-0 kubenswrapper[4059]: I0308 00:19:16.692223    4059 flags.go:64] FLAG: --config-dir=""
Mar 08 00:19:16.694950 master-0 kubenswrapper[4059]: I0308 00:19:16.692227    4059 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 08 00:19:16.694950 master-0 kubenswrapper[4059]: I0308 00:19:16.692231    4059 flags.go:64] FLAG: --container-log-max-files="5"
Mar 08 00:19:16.694950 master-0 kubenswrapper[4059]: I0308 00:19:16.692236    4059 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 08 00:19:16.694950 master-0 kubenswrapper[4059]: I0308 00:19:16.692241    4059 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 08 00:19:16.696495 master-0 kubenswrapper[4059]: I0308 00:19:16.692245    4059 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 08 00:19:16.696495 master-0 kubenswrapper[4059]: I0308 00:19:16.692249    4059 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 08 00:19:16.696495 master-0 kubenswrapper[4059]: I0308 00:19:16.692253    4059 flags.go:64] FLAG: --contention-profiling="false"
Mar 08 00:19:16.696495 master-0 kubenswrapper[4059]: I0308 00:19:16.692257    4059 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 08 00:19:16.696495 master-0 kubenswrapper[4059]: I0308 00:19:16.692261    4059 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 08 00:19:16.696495 master-0 kubenswrapper[4059]: I0308 00:19:16.692267    4059 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 08 00:19:16.696495 master-0 kubenswrapper[4059]: I0308 00:19:16.692271    4059 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 08 00:19:16.696495 master-0 kubenswrapper[4059]: I0308 00:19:16.692276    4059 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 08 00:19:16.696495 master-0 kubenswrapper[4059]: I0308 00:19:16.692281    4059 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 08 00:19:16.696495 master-0 kubenswrapper[4059]: I0308 00:19:16.692285    4059 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 08 00:19:16.696495 master-0 kubenswrapper[4059]: I0308 00:19:16.692289    4059 flags.go:64] FLAG: --enable-load-reader="false"
Mar 08 00:19:16.696495 master-0 kubenswrapper[4059]: I0308 00:19:16.692293    4059 flags.go:64] FLAG: --enable-server="true"
Mar 08 00:19:16.696495 master-0 kubenswrapper[4059]: I0308 00:19:16.692297    4059 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 08 00:19:16.696495 master-0 kubenswrapper[4059]: I0308 00:19:16.692304    4059 flags.go:64] FLAG: --event-burst="100"
Mar 08 00:19:16.696495 master-0 kubenswrapper[4059]: I0308 00:19:16.692308    4059 flags.go:64] FLAG: --event-qps="50"
Mar 08 00:19:16.696495 master-0 kubenswrapper[4059]: I0308 00:19:16.692312    4059 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 08 00:19:16.696495 master-0 kubenswrapper[4059]: I0308 00:19:16.692317    4059 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 08 00:19:16.696495 master-0 kubenswrapper[4059]: I0308 00:19:16.692321    4059 flags.go:64] FLAG: --eviction-hard=""
Mar 08 00:19:16.696495 master-0 kubenswrapper[4059]: I0308 00:19:16.692326    4059 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 08 00:19:16.696495 master-0 kubenswrapper[4059]: I0308 00:19:16.692330    4059 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 08 00:19:16.696495 master-0 kubenswrapper[4059]: I0308 00:19:16.692334    4059 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 08 00:19:16.696495 master-0 kubenswrapper[4059]: I0308 00:19:16.692338    4059 flags.go:64] FLAG: --eviction-soft=""
Mar 08 00:19:16.696495 master-0 kubenswrapper[4059]: I0308 00:19:16.692342    4059 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 08 00:19:16.696495 master-0 kubenswrapper[4059]: I0308 00:19:16.692347    4059 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 08 00:19:16.696495 master-0 kubenswrapper[4059]: I0308 00:19:16.692351    4059 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 08 00:19:16.697844 master-0 kubenswrapper[4059]: I0308 00:19:16.692355    4059 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 08 00:19:16.697844 master-0 kubenswrapper[4059]: I0308 00:19:16.692358    4059 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 08 00:19:16.697844 master-0 kubenswrapper[4059]: I0308 00:19:16.692362    4059 flags.go:64] FLAG: --fail-swap-on="true"
Mar 08 00:19:16.697844 master-0 kubenswrapper[4059]: I0308 00:19:16.692367    4059 flags.go:64] FLAG: --feature-gates=""
Mar 08 00:19:16.697844 master-0 kubenswrapper[4059]: I0308 00:19:16.692378    4059 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 08 00:19:16.697844 master-0 kubenswrapper[4059]: I0308 00:19:16.692382    4059 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 08 00:19:16.697844 master-0 kubenswrapper[4059]: I0308 00:19:16.692386    4059 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 08 00:19:16.697844 master-0 kubenswrapper[4059]: I0308 00:19:16.692390    4059 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 08 00:19:16.697844 master-0 kubenswrapper[4059]: I0308 00:19:16.692395    4059 flags.go:64] FLAG: --healthz-port="10248"
Mar 08 00:19:16.697844 master-0 kubenswrapper[4059]: I0308 00:19:16.692400    4059 flags.go:64] FLAG: --help="false"
Mar 08 00:19:16.697844 master-0 kubenswrapper[4059]: I0308 00:19:16.692404    4059 flags.go:64] FLAG: --hostname-override=""
Mar 08 00:19:16.697844 master-0 kubenswrapper[4059]: I0308 00:19:16.692408    4059 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 08 00:19:16.697844 master-0 kubenswrapper[4059]: I0308 00:19:16.692412    4059 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 08 00:19:16.697844 master-0 kubenswrapper[4059]: I0308 00:19:16.692416    4059 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 08 00:19:16.697844 master-0 kubenswrapper[4059]: I0308 00:19:16.692420    4059 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 08 00:19:16.697844 master-0 kubenswrapper[4059]: I0308 00:19:16.692424    4059 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 08 00:19:16.697844 master-0 kubenswrapper[4059]: I0308 00:19:16.692428    4059 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 08 00:19:16.697844 master-0 kubenswrapper[4059]: I0308 00:19:16.692432    4059 flags.go:64] FLAG: --image-service-endpoint=""
Mar 08 00:19:16.697844 master-0 kubenswrapper[4059]: I0308 00:19:16.692436    4059 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 08 00:19:16.697844 master-0 kubenswrapper[4059]: I0308 00:19:16.692440    4059 flags.go:64] FLAG: --kube-api-burst="100"
Mar 08 00:19:16.697844 master-0 kubenswrapper[4059]: I0308 00:19:16.692444    4059 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 08 00:19:16.697844 master-0 kubenswrapper[4059]: I0308 00:19:16.692449    4059 flags.go:64] FLAG: --kube-api-qps="50"
Mar 08 00:19:16.697844 master-0 kubenswrapper[4059]: I0308 00:19:16.692452    4059 flags.go:64] FLAG: --kube-reserved=""
Mar 08 00:19:16.697844 master-0 kubenswrapper[4059]: I0308 00:19:16.692457    4059 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 08 00:19:16.697844 master-0 kubenswrapper[4059]: I0308 00:19:16.692460    4059 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 08 00:19:16.697844 master-0 kubenswrapper[4059]: I0308 00:19:16.692465    4059 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 08 00:19:16.699003 master-0 kubenswrapper[4059]: I0308 00:19:16.692469    4059 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 08 00:19:16.699003 master-0 kubenswrapper[4059]: I0308 00:19:16.692474    4059 flags.go:64] FLAG: --lock-file=""
Mar 08 00:19:16.699003 master-0 kubenswrapper[4059]: I0308 00:19:16.692478    4059 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 08 00:19:16.699003 master-0 kubenswrapper[4059]: I0308 00:19:16.692482    4059 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 08 00:19:16.699003 master-0 kubenswrapper[4059]: I0308 00:19:16.692486    4059 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 08 00:19:16.699003 master-0 kubenswrapper[4059]: I0308 00:19:16.692492    4059 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 08 00:19:16.699003 master-0 kubenswrapper[4059]: I0308 00:19:16.692496    4059 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 08 00:19:16.699003 master-0 kubenswrapper[4059]: I0308 00:19:16.692500    4059 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 08 00:19:16.699003 master-0 kubenswrapper[4059]: I0308 00:19:16.692504    4059 flags.go:64] FLAG: --logging-format="text"
Mar 08 00:19:16.699003 master-0 kubenswrapper[4059]: I0308 00:19:16.692508    4059 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 08 00:19:16.699003 master-0 kubenswrapper[4059]: I0308 00:19:16.692512    4059 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 08 00:19:16.699003 master-0 kubenswrapper[4059]: I0308 00:19:16.692516    4059 flags.go:64] FLAG: --manifest-url=""
Mar 08 00:19:16.699003 master-0 kubenswrapper[4059]: I0308 00:19:16.692520    4059 flags.go:64] FLAG: --manifest-url-header=""
Mar 08 00:19:16.699003 master-0 kubenswrapper[4059]: I0308 00:19:16.692526    4059 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 08 00:19:16.699003 master-0 kubenswrapper[4059]: I0308 00:19:16.692530    4059 flags.go:64] FLAG: --max-open-files="1000000"
Mar 08 00:19:16.699003 master-0 kubenswrapper[4059]: I0308 00:19:16.692536    4059 flags.go:64] FLAG: --max-pods="110"
Mar 08 00:19:16.699003 master-0 kubenswrapper[4059]: I0308 00:19:16.692542    4059 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 08 00:19:16.699003 master-0 kubenswrapper[4059]: I0308 00:19:16.692547    4059 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 08 00:19:16.699003 master-0 kubenswrapper[4059]: I0308 00:19:16.692552    4059 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 08 00:19:16.699003 master-0 kubenswrapper[4059]: I0308 00:19:16.692557    4059 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 08 00:19:16.699003 master-0 kubenswrapper[4059]: I0308 00:19:16.692562    4059 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 08 00:19:16.699003 master-0 kubenswrapper[4059]: I0308 00:19:16.692566    4059 flags.go:64] FLAG: --node-ip="192.168.32.10"
Mar 08 00:19:16.699003 master-0 kubenswrapper[4059]: I0308 00:19:16.692570    4059 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 08 00:19:16.699003 master-0 kubenswrapper[4059]: I0308 00:19:16.692580    4059 flags.go:64] FLAG: --node-status-max-images="50"
Mar 08 00:19:16.700127 master-0 kubenswrapper[4059]: I0308 00:19:16.692584    4059 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 08 00:19:16.700127 master-0 kubenswrapper[4059]: I0308 00:19:16.692588    4059 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 08 00:19:16.700127 master-0 kubenswrapper[4059]: I0308 00:19:16.692592    4059 flags.go:64] FLAG: --pod-cidr=""
Mar 08 00:19:16.700127 master-0 kubenswrapper[4059]: I0308 00:19:16.692602    4059 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1d605384f31a8085f78a96145c2c3dc51afe22721144196140a2699b7c07ebe3"
Mar 08 00:19:16.700127 master-0 kubenswrapper[4059]: I0308 00:19:16.692610    4059 flags.go:64] FLAG: --pod-manifest-path=""
Mar 08 00:19:16.700127 master-0 kubenswrapper[4059]: I0308 00:19:16.692614    4059 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 08 00:19:16.700127 master-0 kubenswrapper[4059]: I0308 00:19:16.692618    4059 flags.go:64] FLAG: --pods-per-core="0"
Mar 08 00:19:16.700127 master-0 kubenswrapper[4059]: I0308 00:19:16.692622    4059 flags.go:64] FLAG: --port="10250"
Mar 08 00:19:16.700127 master-0 kubenswrapper[4059]: I0308 00:19:16.692626    4059 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 08 00:19:16.700127 master-0 kubenswrapper[4059]: I0308 00:19:16.692630    4059 flags.go:64] FLAG: --provider-id=""
Mar 08 00:19:16.700127 master-0 kubenswrapper[4059]: I0308 00:19:16.692635    4059 flags.go:64] FLAG: --qos-reserved=""
Mar 08 00:19:16.700127 master-0 kubenswrapper[4059]: I0308 00:19:16.692639    4059 flags.go:64] FLAG: --read-only-port="10255"
Mar 08 00:19:16.700127 master-0 kubenswrapper[4059]: I0308 00:19:16.692643    4059 flags.go:64] FLAG: --register-node="true"
Mar 08 00:19:16.700127 master-0 kubenswrapper[4059]: I0308 00:19:16.692648    4059 flags.go:64] FLAG: --register-schedulable="true"
Mar 08 00:19:16.700127 master-0 kubenswrapper[4059]: I0308 00:19:16.692652    4059 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 08 00:19:16.700127 master-0 kubenswrapper[4059]: I0308 00:19:16.692711    4059 flags.go:64] FLAG: --registry-burst="10"
Mar 08 00:19:16.700127 master-0 kubenswrapper[4059]: I0308 00:19:16.692716    4059 flags.go:64] FLAG: --registry-qps="5"
Mar 08 00:19:16.700127 master-0 kubenswrapper[4059]: I0308 00:19:16.692720    4059 flags.go:64] FLAG: --reserved-cpus=""
Mar 08 00:19:16.700127 master-0 kubenswrapper[4059]: I0308 00:19:16.692724    4059 flags.go:64] FLAG: --reserved-memory=""
Mar 08 00:19:16.700127 master-0 kubenswrapper[4059]: I0308 00:19:16.692729    4059 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 08 00:19:16.700127 master-0 kubenswrapper[4059]: I0308 00:19:16.692733    4059 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 08 00:19:16.700127 master-0 kubenswrapper[4059]: I0308 00:19:16.692737    4059 flags.go:64] FLAG: --rotate-certificates="false"
Mar 08 00:19:16.700127 master-0 kubenswrapper[4059]: I0308 00:19:16.692742    4059 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 08 00:19:16.700127 master-0 kubenswrapper[4059]: I0308 00:19:16.692745    4059 flags.go:64] FLAG: --runonce="false"
Mar 08 00:19:16.700127 master-0 kubenswrapper[4059]: I0308 00:19:16.692749    4059 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 08 00:19:16.701348 master-0 kubenswrapper[4059]: I0308 00:19:16.692754    4059 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 08 00:19:16.701348 master-0 kubenswrapper[4059]: I0308 00:19:16.692758    4059 flags.go:64] FLAG: --seccomp-default="false"
Mar 08 00:19:16.701348 master-0 kubenswrapper[4059]: I0308 00:19:16.692762    4059 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 08 00:19:16.701348 master-0 kubenswrapper[4059]: I0308 00:19:16.692766    4059 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 08 00:19:16.701348 master-0 kubenswrapper[4059]: I0308 00:19:16.692771    4059 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 08 00:19:16.701348 master-0 kubenswrapper[4059]: I0308 00:19:16.692775    4059 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 08 00:19:16.701348 master-0 kubenswrapper[4059]: I0308 00:19:16.692779    4059 flags.go:64] FLAG: --storage-driver-password="root"
Mar 08 00:19:16.701348 master-0 kubenswrapper[4059]: I0308 00:19:16.692784    4059 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 08 00:19:16.701348 master-0 kubenswrapper[4059]: I0308 00:19:16.692788    4059 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 08 00:19:16.701348 master-0 kubenswrapper[4059]: I0308 00:19:16.692792    4059 flags.go:64] FLAG: --storage-driver-user="root"
Mar 08 00:19:16.701348 master-0 kubenswrapper[4059]: I0308 00:19:16.692797    4059 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 08 00:19:16.701348 master-0 kubenswrapper[4059]: I0308 00:19:16.692803    4059 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 08 00:19:16.701348 master-0 kubenswrapper[4059]: I0308 00:19:16.692807    4059 flags.go:64] FLAG: --system-cgroups=""
Mar 08 00:19:16.701348 master-0 kubenswrapper[4059]: I0308 00:19:16.692811    4059 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Mar 08 00:19:16.701348 master-0 kubenswrapper[4059]: I0308 00:19:16.692819    4059 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 08 00:19:16.701348 master-0 kubenswrapper[4059]: I0308 00:19:16.692823    4059 flags.go:64] FLAG: --tls-cert-file=""
Mar 08 00:19:16.701348 master-0 kubenswrapper[4059]: I0308 00:19:16.692827    4059 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 08 00:19:16.701348 master-0 kubenswrapper[4059]: I0308 00:19:16.692832    4059 flags.go:64] FLAG: --tls-min-version=""
Mar 08 00:19:16.701348 master-0 kubenswrapper[4059]: I0308 00:19:16.692836    4059 flags.go:64] FLAG: --tls-private-key-file=""
Mar 08 00:19:16.701348 master-0 kubenswrapper[4059]: I0308 00:19:16.692840    4059 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 08 00:19:16.701348 master-0 kubenswrapper[4059]: I0308 00:19:16.692844    4059 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 08 00:19:16.701348 master-0 kubenswrapper[4059]: I0308 00:19:16.692849    4059 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 08 00:19:16.701348 master-0 kubenswrapper[4059]: I0308 00:19:16.692853    4059 flags.go:64] FLAG: --v="2"
Mar 08 00:19:16.701348 master-0 kubenswrapper[4059]: I0308 00:19:16.692858    4059 flags.go:64] FLAG: --version="false"
Mar 08 00:19:16.701348 master-0 kubenswrapper[4059]: I0308 00:19:16.692864    4059 flags.go:64] FLAG: --vmodule=""
Mar 08 00:19:16.702459 master-0 kubenswrapper[4059]: I0308 00:19:16.692868    4059 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 08 00:19:16.702459 master-0 kubenswrapper[4059]: I0308 00:19:16.692873    4059 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 08 00:19:16.702459 master-0 kubenswrapper[4059]: W0308 00:19:16.692975    4059 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 08 00:19:16.702459 master-0 kubenswrapper[4059]: W0308 00:19:16.692981    4059 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 08 00:19:16.702459 master-0 kubenswrapper[4059]: W0308 00:19:16.692985    4059 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 08 00:19:16.702459 master-0 kubenswrapper[4059]: W0308 00:19:16.692989    4059 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 08 00:19:16.702459 master-0 kubenswrapper[4059]: W0308 00:19:16.692994    4059 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 08 00:19:16.702459 master-0 kubenswrapper[4059]: W0308 00:19:16.692999    4059 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 08 00:19:16.702459 master-0 kubenswrapper[4059]: W0308 00:19:16.693003    4059 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 08 00:19:16.702459 master-0 kubenswrapper[4059]: W0308 00:19:16.693007    4059 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 08 00:19:16.702459 master-0 kubenswrapper[4059]: W0308 00:19:16.693011    4059 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 08 00:19:16.702459 master-0 kubenswrapper[4059]: W0308 00:19:16.693015    4059 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 08 00:19:16.702459 master-0 kubenswrapper[4059]: W0308 00:19:16.693019    4059 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 08 00:19:16.702459 master-0 kubenswrapper[4059]: W0308 00:19:16.693023    4059 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 08 00:19:16.702459 master-0 kubenswrapper[4059]: W0308 00:19:16.693026    4059 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 08 00:19:16.702459 master-0 kubenswrapper[4059]: W0308 00:19:16.693030    4059 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 08 00:19:16.702459 master-0 kubenswrapper[4059]: W0308 00:19:16.693034    4059 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 08 00:19:16.702459 master-0 kubenswrapper[4059]: W0308 00:19:16.693038    4059 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 08 00:19:16.702459 master-0 kubenswrapper[4059]: W0308 00:19:16.693043    4059 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 08 00:19:16.702459 master-0 kubenswrapper[4059]: W0308 00:19:16.693047    4059 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 08 00:19:16.703579 master-0 kubenswrapper[4059]: W0308 00:19:16.693050    4059 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 08 00:19:16.703579 master-0 kubenswrapper[4059]: W0308 00:19:16.693054    4059 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 08 00:19:16.703579 master-0 kubenswrapper[4059]: W0308 00:19:16.693057    4059 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 08 00:19:16.703579 master-0 kubenswrapper[4059]: W0308 00:19:16.693061    4059 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 08 00:19:16.703579 master-0 kubenswrapper[4059]: W0308 00:19:16.693064    4059 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 08 00:19:16.703579 master-0 kubenswrapper[4059]: W0308 00:19:16.693068    4059 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 08 00:19:16.703579 master-0 kubenswrapper[4059]: W0308 00:19:16.693072    4059 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 08 00:19:16.703579 master-0 kubenswrapper[4059]: W0308 00:19:16.693075    4059 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 08 00:19:16.703579 master-0 kubenswrapper[4059]: W0308 00:19:16.693079    4059 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 08 00:19:16.703579 master-0 kubenswrapper[4059]: W0308 00:19:16.693084    4059 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 08 00:19:16.703579 master-0 kubenswrapper[4059]: W0308 00:19:16.693087    4059 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 08 00:19:16.703579 master-0 kubenswrapper[4059]: W0308 00:19:16.693091    4059 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 08 00:19:16.703579 master-0 kubenswrapper[4059]: W0308 00:19:16.693095    4059 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 08 00:19:16.703579 master-0 kubenswrapper[4059]: W0308 00:19:16.693098    4059 feature_gate.go:330] unrecognized feature gate:
IngressControllerDynamicConfigurationManager Mar 08 00:19:16.703579 master-0 kubenswrapper[4059]: W0308 00:19:16.693102 4059 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 08 00:19:16.703579 master-0 kubenswrapper[4059]: W0308 00:19:16.693106 4059 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 08 00:19:16.703579 master-0 kubenswrapper[4059]: W0308 00:19:16.693109 4059 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 08 00:19:16.703579 master-0 kubenswrapper[4059]: W0308 00:19:16.693113 4059 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 08 00:19:16.703579 master-0 kubenswrapper[4059]: W0308 00:19:16.693116 4059 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 08 00:19:16.703579 master-0 kubenswrapper[4059]: W0308 00:19:16.693120 4059 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 08 00:19:16.704601 master-0 kubenswrapper[4059]: W0308 00:19:16.693123 4059 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 08 00:19:16.704601 master-0 kubenswrapper[4059]: W0308 00:19:16.693127 4059 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 08 00:19:16.704601 master-0 kubenswrapper[4059]: W0308 00:19:16.693130 4059 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 08 00:19:16.704601 master-0 kubenswrapper[4059]: W0308 00:19:16.693134 4059 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 08 00:19:16.704601 master-0 kubenswrapper[4059]: W0308 00:19:16.693137 4059 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 08 00:19:16.704601 master-0 kubenswrapper[4059]: W0308 00:19:16.693141 4059 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 08 00:19:16.704601 master-0 kubenswrapper[4059]: W0308 00:19:16.693144 4059 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 08 00:19:16.704601 
master-0 kubenswrapper[4059]: W0308 00:19:16.693148 4059 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 08 00:19:16.704601 master-0 kubenswrapper[4059]: W0308 00:19:16.693152 4059 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 08 00:19:16.704601 master-0 kubenswrapper[4059]: W0308 00:19:16.693157 4059 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 08 00:19:16.704601 master-0 kubenswrapper[4059]: W0308 00:19:16.693162 4059 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 08 00:19:16.704601 master-0 kubenswrapper[4059]: W0308 00:19:16.693166 4059 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 08 00:19:16.704601 master-0 kubenswrapper[4059]: W0308 00:19:16.693170 4059 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 08 00:19:16.704601 master-0 kubenswrapper[4059]: W0308 00:19:16.693174 4059 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 08 00:19:16.704601 master-0 kubenswrapper[4059]: W0308 00:19:16.693177 4059 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 08 00:19:16.704601 master-0 kubenswrapper[4059]: W0308 00:19:16.693181 4059 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 08 00:19:16.704601 master-0 kubenswrapper[4059]: W0308 00:19:16.693185 4059 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 08 00:19:16.704601 master-0 kubenswrapper[4059]: W0308 00:19:16.693189 4059 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 08 00:19:16.704601 master-0 kubenswrapper[4059]: W0308 00:19:16.693192 4059 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 08 00:19:16.704601 master-0 kubenswrapper[4059]: W0308 00:19:16.693196 4059 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 08 00:19:16.705616 master-0 kubenswrapper[4059]: W0308 
00:19:16.693212 4059 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 08 00:19:16.705616 master-0 kubenswrapper[4059]: W0308 00:19:16.693218 4059 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 08 00:19:16.705616 master-0 kubenswrapper[4059]: W0308 00:19:16.693221 4059 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 08 00:19:16.705616 master-0 kubenswrapper[4059]: W0308 00:19:16.693226 4059 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 08 00:19:16.705616 master-0 kubenswrapper[4059]: W0308 00:19:16.693231 4059 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 08 00:19:16.705616 master-0 kubenswrapper[4059]: W0308 00:19:16.693235 4059 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 08 00:19:16.705616 master-0 kubenswrapper[4059]: W0308 00:19:16.693239 4059 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 08 00:19:16.705616 master-0 kubenswrapper[4059]: W0308 00:19:16.693242 4059 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 08 00:19:16.705616 master-0 kubenswrapper[4059]: W0308 00:19:16.693246 4059 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 08 00:19:16.705616 master-0 kubenswrapper[4059]: W0308 00:19:16.693251 4059 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 08 00:19:16.705616 master-0 kubenswrapper[4059]: W0308 00:19:16.693255 4059 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 08 00:19:16.705616 master-0 kubenswrapper[4059]: W0308 00:19:16.693259 4059 feature_gate.go:330] unrecognized feature gate: Example
Mar 08 00:19:16.705616 master-0 kubenswrapper[4059]: W0308 00:19:16.693263 4059 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 08 00:19:16.705616 master-0 kubenswrapper[4059]: W0308 00:19:16.693267 4059 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 08 00:19:16.706374 master-0 kubenswrapper[4059]: I0308 00:19:16.693281 4059 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 08 00:19:16.706374 master-0 kubenswrapper[4059]: I0308 00:19:16.703195 4059 server.go:491] "Kubelet version" kubeletVersion="v1.31.14"
Mar 08 00:19:16.706374 master-0 kubenswrapper[4059]: I0308 00:19:16.703283 4059 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 08 00:19:16.706374 master-0 kubenswrapper[4059]: W0308 00:19:16.703416 4059 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 08 00:19:16.706374 master-0 kubenswrapper[4059]: W0308 00:19:16.703429 4059 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 08 00:19:16.706374 master-0 kubenswrapper[4059]: W0308 00:19:16.703440 4059 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 08 00:19:16.706374 master-0 kubenswrapper[4059]: W0308 00:19:16.703450 4059 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 08 00:19:16.706374 master-0 kubenswrapper[4059]: W0308 00:19:16.703458 4059 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 08 00:19:16.706374 master-0 kubenswrapper[4059]: W0308 00:19:16.703469 4059 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 08 00:19:16.706374 master-0 kubenswrapper[4059]: W0308 00:19:16.703484 4059 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 08 00:19:16.706374 master-0 kubenswrapper[4059]: W0308 00:19:16.703494 4059 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 08 00:19:16.706374 master-0 kubenswrapper[4059]: W0308 00:19:16.703504 4059 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 08 00:19:16.706374 master-0 kubenswrapper[4059]: W0308 00:19:16.703513 4059 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 08 00:19:16.706374 master-0 kubenswrapper[4059]: W0308 00:19:16.703521 4059 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 08 00:19:16.706374 master-0 kubenswrapper[4059]: W0308 00:19:16.703530 4059 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 08 00:19:16.707051 master-0 kubenswrapper[4059]: W0308 00:19:16.703538 4059 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 08 00:19:16.707051 master-0 kubenswrapper[4059]: W0308 00:19:16.703548 4059 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 08 00:19:16.707051 master-0 kubenswrapper[4059]: W0308 00:19:16.703555 4059 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 08 00:19:16.707051 master-0 kubenswrapper[4059]: W0308 00:19:16.703566 4059 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 08 00:19:16.707051 master-0 kubenswrapper[4059]: W0308 00:19:16.703577 4059 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 08 00:19:16.707051 master-0 kubenswrapper[4059]: W0308 00:19:16.703589 4059 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 08 00:19:16.707051 master-0 kubenswrapper[4059]: W0308 00:19:16.703599 4059 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 08 00:19:16.707051 master-0 kubenswrapper[4059]: W0308 00:19:16.703609 4059 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 08 00:19:16.707051 master-0 kubenswrapper[4059]: W0308 00:19:16.703620 4059 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 08 00:19:16.707051 master-0 kubenswrapper[4059]: W0308 00:19:16.703630 4059 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 08 00:19:16.707051 master-0 kubenswrapper[4059]: W0308 00:19:16.703639 4059 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 08 00:19:16.707051 master-0 kubenswrapper[4059]: W0308 00:19:16.703647 4059 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 08 00:19:16.707051 master-0 kubenswrapper[4059]: W0308 00:19:16.703656 4059 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 08 00:19:16.707051 master-0 kubenswrapper[4059]: W0308 00:19:16.703664 4059 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 08 00:19:16.707051 master-0 kubenswrapper[4059]: W0308 00:19:16.703675 4059 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 08 00:19:16.707051 master-0 kubenswrapper[4059]: W0308 00:19:16.703685 4059 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 08 00:19:16.707051 master-0 kubenswrapper[4059]: W0308 00:19:16.703694 4059 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 08 00:19:16.707051 master-0 kubenswrapper[4059]: W0308 00:19:16.703704 4059 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 08 00:19:16.707051 master-0 kubenswrapper[4059]: W0308 00:19:16.703714 4059 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 08 00:19:16.708275 master-0 kubenswrapper[4059]: W0308 00:19:16.703722 4059 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 08 00:19:16.708275 master-0 kubenswrapper[4059]: W0308 00:19:16.703731 4059 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 08 00:19:16.708275 master-0 kubenswrapper[4059]: W0308 00:19:16.703738 4059 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 08 00:19:16.708275 master-0 kubenswrapper[4059]: W0308 00:19:16.703749 4059 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 08 00:19:16.708275 master-0 kubenswrapper[4059]: W0308 00:19:16.703757 4059 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 08 00:19:16.708275 master-0 kubenswrapper[4059]: W0308 00:19:16.703765 4059 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 08 00:19:16.708275 master-0 kubenswrapper[4059]: W0308 00:19:16.703773 4059 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 08 00:19:16.708275 master-0 kubenswrapper[4059]: W0308 00:19:16.703781 4059 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 08 00:19:16.708275 master-0 kubenswrapper[4059]: W0308 00:19:16.703789 4059 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 08 00:19:16.708275 master-0 kubenswrapper[4059]: W0308 00:19:16.703797 4059 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 08 00:19:16.708275 master-0 kubenswrapper[4059]: W0308 00:19:16.703805 4059 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 08 00:19:16.708275 master-0 kubenswrapper[4059]: W0308 00:19:16.703812 4059 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 08 00:19:16.708275 master-0 kubenswrapper[4059]: W0308 00:19:16.703820 4059 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 08 00:19:16.708275 master-0 kubenswrapper[4059]: W0308 00:19:16.703828 4059 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 08 00:19:16.708275 master-0 kubenswrapper[4059]: W0308 00:19:16.703836 4059 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 08 00:19:16.708275 master-0 kubenswrapper[4059]: W0308 00:19:16.703844 4059 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 08 00:19:16.708275 master-0 kubenswrapper[4059]: W0308 00:19:16.703853 4059 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 08 00:19:16.708275 master-0 kubenswrapper[4059]: W0308 00:19:16.703861 4059 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 08 00:19:16.708275 master-0 kubenswrapper[4059]: W0308 00:19:16.703870 4059 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 08 00:19:16.708275 master-0 kubenswrapper[4059]: W0308 00:19:16.703878 4059 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 08 00:19:16.709778 master-0 kubenswrapper[4059]: W0308 00:19:16.703886 4059 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 08 00:19:16.709778 master-0 kubenswrapper[4059]: W0308 00:19:16.703895 4059 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 08 00:19:16.709778 master-0 kubenswrapper[4059]: W0308 00:19:16.703904 4059 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 08 00:19:16.709778 master-0 kubenswrapper[4059]: W0308 00:19:16.703912 4059 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 08 00:19:16.709778 master-0 kubenswrapper[4059]: W0308 00:19:16.703920 4059 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 08 00:19:16.709778 master-0 kubenswrapper[4059]: W0308 00:19:16.703928 4059 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 08 00:19:16.709778 master-0 kubenswrapper[4059]: W0308 00:19:16.703936 4059 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 08 00:19:16.709778 master-0 kubenswrapper[4059]: W0308 00:19:16.703943 4059 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 08 00:19:16.709778 master-0 kubenswrapper[4059]: W0308 00:19:16.703951 4059 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 08 00:19:16.709778 master-0 kubenswrapper[4059]: W0308 00:19:16.703960 4059 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 08 00:19:16.709778 master-0 kubenswrapper[4059]: W0308 00:19:16.703967 4059 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 08 00:19:16.709778 master-0 kubenswrapper[4059]: W0308 00:19:16.703978 4059 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 08 00:19:16.709778 master-0 kubenswrapper[4059]: W0308 00:19:16.703986 4059 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 08 00:19:16.709778 master-0 kubenswrapper[4059]: W0308 00:19:16.703994 4059 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 08 00:19:16.709778 master-0 kubenswrapper[4059]: W0308 00:19:16.704002 4059 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 08 00:19:16.709778 master-0 kubenswrapper[4059]: W0308 00:19:16.704010 4059 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 08 00:19:16.709778 master-0 kubenswrapper[4059]: W0308 00:19:16.704017 4059 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 08 00:19:16.709778 master-0 kubenswrapper[4059]: W0308 00:19:16.704025 4059 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 08 00:19:16.709778 master-0 kubenswrapper[4059]: W0308 00:19:16.704033 4059 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 08 00:19:16.709778 master-0 kubenswrapper[4059]: W0308 00:19:16.704042 4059 feature_gate.go:330] unrecognized feature gate: Example
Mar 08 00:19:16.711050 master-0 kubenswrapper[4059]: W0308 00:19:16.704050 4059 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 08 00:19:16.711050 master-0 kubenswrapper[4059]: I0308 00:19:16.704063 4059 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 08 00:19:16.711050 master-0 kubenswrapper[4059]: W0308 00:19:16.704405 4059 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 08 00:19:16.711050 master-0 kubenswrapper[4059]: W0308 00:19:16.704426 4059 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 08 00:19:16.711050 master-0 kubenswrapper[4059]: W0308 00:19:16.704440 4059 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 08 00:19:16.711050 master-0 kubenswrapper[4059]: W0308 00:19:16.704448 4059 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 08 00:19:16.711050 master-0 kubenswrapper[4059]: W0308 00:19:16.704458 4059 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 08 00:19:16.711050 master-0 kubenswrapper[4059]: W0308 00:19:16.704467 4059 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 08 00:19:16.711050 master-0 kubenswrapper[4059]: W0308 00:19:16.704476 4059 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 08 00:19:16.711050 master-0 kubenswrapper[4059]: W0308 00:19:16.704485 4059 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 08 00:19:16.711050 master-0 kubenswrapper[4059]: W0308 00:19:16.704496 4059 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 08 00:19:16.711050 master-0 kubenswrapper[4059]: W0308 00:19:16.704506 4059 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 08 00:19:16.711050 master-0 kubenswrapper[4059]: W0308 00:19:16.704515 4059 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 08 00:19:16.711050 master-0 kubenswrapper[4059]: W0308 00:19:16.704523 4059 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 08 00:19:16.711985 master-0 kubenswrapper[4059]: W0308 00:19:16.704531 4059 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 08 00:19:16.711985 master-0 kubenswrapper[4059]: W0308 00:19:16.704540 4059 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 08 00:19:16.711985 master-0 kubenswrapper[4059]: W0308 00:19:16.704548 4059 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 08 00:19:16.711985 master-0 kubenswrapper[4059]: W0308 00:19:16.704556 4059 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 08 00:19:16.711985 master-0 kubenswrapper[4059]: W0308 00:19:16.704565 4059 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 08 00:19:16.711985 master-0 kubenswrapper[4059]: W0308 00:19:16.704574 4059 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 08 00:19:16.711985 master-0 kubenswrapper[4059]: W0308 00:19:16.704582 4059 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 08 00:19:16.711985 master-0 kubenswrapper[4059]: W0308 00:19:16.704590 4059 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 08 00:19:16.711985 master-0 kubenswrapper[4059]: W0308 00:19:16.704598 4059 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 08 00:19:16.711985 master-0 kubenswrapper[4059]: W0308 00:19:16.704606 4059 feature_gate.go:330] unrecognized feature gate: Example
Mar 08 00:19:16.711985 master-0 kubenswrapper[4059]: W0308 00:19:16.704615 4059 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 08 00:19:16.711985 master-0 kubenswrapper[4059]: W0308 00:19:16.704624 4059 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 08 00:19:16.711985 master-0 kubenswrapper[4059]: W0308 00:19:16.704632 4059 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 08 00:19:16.711985 master-0 kubenswrapper[4059]: W0308 00:19:16.704642 4059 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 08 00:19:16.711985 master-0 kubenswrapper[4059]: W0308 00:19:16.704652 4059 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 08 00:19:16.711985 master-0 kubenswrapper[4059]: W0308 00:19:16.704660 4059 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 08 00:19:16.711985 master-0 kubenswrapper[4059]: W0308 00:19:16.704692 4059 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 08 00:19:16.711985 master-0 kubenswrapper[4059]: W0308 00:19:16.704701 4059 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 08 00:19:16.711985 master-0 kubenswrapper[4059]: W0308 00:19:16.704709 4059 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 08 00:19:16.712991 master-0 kubenswrapper[4059]: W0308 00:19:16.704720 4059 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 08 00:19:16.712991 master-0 kubenswrapper[4059]: W0308 00:19:16.704730 4059 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 08 00:19:16.712991 master-0 kubenswrapper[4059]: W0308 00:19:16.704740 4059 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 08 00:19:16.712991 master-0 kubenswrapper[4059]: W0308 00:19:16.704751 4059 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 08 00:19:16.712991 master-0 kubenswrapper[4059]: W0308 00:19:16.704760 4059 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 08 00:19:16.712991 master-0 kubenswrapper[4059]: W0308 00:19:16.704768 4059 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 08 00:19:16.712991 master-0 kubenswrapper[4059]: W0308 00:19:16.704777 4059 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 08 00:19:16.712991 master-0 kubenswrapper[4059]: W0308 00:19:16.704785 4059 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 08 00:19:16.712991 master-0 kubenswrapper[4059]: W0308 00:19:16.704794 4059 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 08 00:19:16.712991 master-0 kubenswrapper[4059]: W0308 00:19:16.704803 4059 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 08 00:19:16.712991 master-0 kubenswrapper[4059]: W0308 00:19:16.704811 4059 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 08 00:19:16.712991 master-0 kubenswrapper[4059]: W0308 00:19:16.704819 4059 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 08 00:19:16.712991 master-0 kubenswrapper[4059]: W0308 00:19:16.704827 4059 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 08 00:19:16.712991 master-0 kubenswrapper[4059]: W0308 00:19:16.704836 4059 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 08 00:19:16.712991 master-0 kubenswrapper[4059]: W0308 00:19:16.704843 4059 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 08 00:19:16.712991 master-0 kubenswrapper[4059]: W0308 00:19:16.704851 4059 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 08 00:19:16.712991 master-0 kubenswrapper[4059]: W0308 00:19:16.704859 4059 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 08 00:19:16.712991 master-0 kubenswrapper[4059]: W0308 00:19:16.704867 4059 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 08 00:19:16.712991 master-0 kubenswrapper[4059]: W0308 00:19:16.704875 4059 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 08 00:19:16.712991 master-0 kubenswrapper[4059]: W0308 00:19:16.704883 4059 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 08 00:19:16.713939 master-0 kubenswrapper[4059]: W0308 00:19:16.704891 4059 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 08 00:19:16.713939 master-0 kubenswrapper[4059]: W0308 00:19:16.704898 4059 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 08 00:19:16.713939 master-0 kubenswrapper[4059]: W0308 00:19:16.704907 4059 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 08 00:19:16.713939 master-0 kubenswrapper[4059]: W0308 00:19:16.704915 4059 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 08 00:19:16.713939 master-0 kubenswrapper[4059]: W0308 00:19:16.704924 4059 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 08 00:19:16.713939 master-0 kubenswrapper[4059]: W0308 00:19:16.704931 4059 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 08 00:19:16.713939 master-0 kubenswrapper[4059]: W0308 00:19:16.704940 4059 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 08 00:19:16.713939 master-0 kubenswrapper[4059]: W0308 00:19:16.704947 4059 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 08 00:19:16.713939 master-0 kubenswrapper[4059]: W0308 00:19:16.704955 4059 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 08 00:19:16.713939 master-0 kubenswrapper[4059]: W0308 00:19:16.704964 4059 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 08 00:19:16.713939 master-0 kubenswrapper[4059]: W0308 00:19:16.704972 4059 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 08 00:19:16.713939 master-0 kubenswrapper[4059]: W0308 00:19:16.704979 4059 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 08 00:19:16.713939 master-0 kubenswrapper[4059]: W0308 00:19:16.704987 4059 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 08 00:19:16.713939 master-0 kubenswrapper[4059]: W0308 00:19:16.704995 4059 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 08 00:19:16.713939 master-0 kubenswrapper[4059]: W0308 00:19:16.705003 4059 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 08 00:19:16.713939 master-0 kubenswrapper[4059]: W0308 00:19:16.705011 4059 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 08 00:19:16.713939 master-0 kubenswrapper[4059]: W0308 00:19:16.705019 4059 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 08 00:19:16.713939 master-0 kubenswrapper[4059]: W0308 00:19:16.705026 4059 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 08 00:19:16.713939 master-0 kubenswrapper[4059]: W0308 00:19:16.705036 4059 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 08 00:19:16.713939 master-0 kubenswrapper[4059]: W0308 00:19:16.705044 4059 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 08 00:19:16.714913 master-0 kubenswrapper[4059]: W0308 00:19:16.705061 4059 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 08 00:19:16.714913 master-0 kubenswrapper[4059]: I0308 00:19:16.705076 4059 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 08 00:19:16.714913 master-0 kubenswrapper[4059]: I0308 00:19:16.706331 4059 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 08 00:19:16.714913 master-0 kubenswrapper[4059]: I0308 00:19:16.710704 4059 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 08 00:19:16.714913 master-0 kubenswrapper[4059]: I0308 00:19:16.712014 4059 server.go:997] "Starting client certificate rotation"
Mar 08 00:19:16.714913 master-0 kubenswrapper[4059]: I0308 00:19:16.712051 4059 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 08 00:19:16.714913 master-0 kubenswrapper[4059]: I0308 00:19:16.712382 4059 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 08 00:19:16.741192 master-0 kubenswrapper[4059]: I0308 00:19:16.741104 4059 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 08 00:19:16.744842 master-0 kubenswrapper[4059]: I0308 00:19:16.744753 4059 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 08 00:19:16.919407 master-0 kubenswrapper[4059]: E0308 00:19:16.919329 4059 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 00:19:16.937554 master-0 kubenswrapper[4059]: I0308 00:19:16.937268 4059 log.go:25] "Validated CRI v1 runtime API"
Mar 08 00:19:16.942070 master-0 kubenswrapper[4059]: I0308 00:19:16.941997 4059 log.go:25] "Validated CRI v1 image API"
Mar 08 00:19:16.946035 master-0 kubenswrapper[4059]: I0308 00:19:16.946010 4059 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 08 00:19:16.950047 master-0 kubenswrapper[4059]: I0308 00:19:16.949989 4059 fs.go:135] Filesystem UUIDs: map[39fc8acc-7a4c-4a2a-a305-ed25849d8805:/dev/vda3 7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4]
Mar 08 00:19:16.950047 master-0 kubenswrapper[4059]: I0308 00:19:16.950034 4059 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0}]
Mar 08 00:19:16.964471 master-0 kubenswrapper[4059]: I0308 00:19:16.964179 4059 manager.go:217] Machine: {Timestamp:2026-03-08 00:19:16.962407439 +0000 UTC m=+0.674006981 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}]
MachineID:3fb2a1568fb24853b5e4190e9ed87031 SystemUUID:3fb2a156-8fb2-4853-b5e4-190e9ed87031 BootID:ae637101-d6c8-4837-b1bb-2909ed5c1c9d Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:0f:fb:26 Speed:-1 Mtu:9000} {Name:eth2 MacAddress:fa:16:3e:73:5d:56 Speed:-1 Mtu:9000} {Name:ovs-system MacAddress:ee:64:ec:05:bf:ed Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 
Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified 
Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 08 00:19:16.964471 master-0 kubenswrapper[4059]: I0308 00:19:16.964436 4059 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Mar 08 00:19:16.964654 master-0 kubenswrapper[4059]: I0308 00:19:16.964557 4059 manager.go:233] Version: {KernelVersion:5.14.0-427.111.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202602172219-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 08 00:19:16.965498 master-0 kubenswrapper[4059]: I0308 00:19:16.965475 4059 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 08 00:19:16.965673 master-0 kubenswrapper[4059]: I0308 00:19:16.965640 4059 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 08 00:19:16.965901 master-0 kubenswrapper[4059]: I0308 00:19:16.965670 4059 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 08 00:19:16.965958 master-0 kubenswrapper[4059]: I0308 00:19:16.965914 4059 topology_manager.go:138] "Creating topology manager with none policy" Mar 08 00:19:16.965958 master-0 kubenswrapper[4059]: I0308 00:19:16.965924 4059 container_manager_linux.go:303] "Creating device plugin manager" Mar 08 00:19:16.966005 master-0 kubenswrapper[4059]: I0308 00:19:16.965990 4059 manager.go:142] 
"Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 08 00:19:16.966028 master-0 kubenswrapper[4059]: I0308 00:19:16.966012 4059 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 08 00:19:16.966138 master-0 kubenswrapper[4059]: I0308 00:19:16.966119 4059 state_mem.go:36] "Initialized new in-memory state store" Mar 08 00:19:16.966299 master-0 kubenswrapper[4059]: I0308 00:19:16.966282 4059 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 08 00:19:16.970282 master-0 kubenswrapper[4059]: I0308 00:19:16.970261 4059 kubelet.go:418] "Attempting to sync node with API server" Mar 08 00:19:16.970282 master-0 kubenswrapper[4059]: I0308 00:19:16.970278 4059 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 08 00:19:16.970355 master-0 kubenswrapper[4059]: I0308 00:19:16.970293 4059 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 08 00:19:16.970355 master-0 kubenswrapper[4059]: I0308 00:19:16.970303 4059 kubelet.go:324] "Adding apiserver pod source" Mar 08 00:19:16.970355 master-0 kubenswrapper[4059]: I0308 00:19:16.970313 4059 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 08 00:19:16.975587 master-0 kubenswrapper[4059]: I0308 00:19:16.975559 4059 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-8.rhaos4.18.gitd78977c.el9" apiVersion="v1" Mar 08 00:19:16.976340 master-0 kubenswrapper[4059]: W0308 00:19:16.976296 4059 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 00:19:16.976449 master-0 kubenswrapper[4059]: E0308 00:19:16.976432 4059 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 00:19:16.976520 master-0 kubenswrapper[4059]: W0308 00:19:16.976302 4059 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 00:19:16.976554 master-0 kubenswrapper[4059]: E0308 00:19:16.976529 4059 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 00:19:16.977594 master-0 kubenswrapper[4059]: I0308 00:19:16.977567 4059 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 08 00:19:16.977791 master-0 kubenswrapper[4059]: I0308 00:19:16.977771 4059 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 08 00:19:16.977791 master-0 kubenswrapper[4059]: I0308 00:19:16.977792 4059 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 08 00:19:16.977880 master-0 kubenswrapper[4059]: I0308 00:19:16.977800 4059 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 08 00:19:16.977880 master-0 kubenswrapper[4059]: I0308 00:19:16.977807 4059 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 08 00:19:16.977880 master-0 kubenswrapper[4059]: I0308 00:19:16.977813 4059 plugins.go:603] 
"Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 08 00:19:16.977880 master-0 kubenswrapper[4059]: I0308 00:19:16.977822 4059 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 08 00:19:16.977880 master-0 kubenswrapper[4059]: I0308 00:19:16.977829 4059 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 08 00:19:16.977880 master-0 kubenswrapper[4059]: I0308 00:19:16.977835 4059 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 08 00:19:16.977880 master-0 kubenswrapper[4059]: I0308 00:19:16.977843 4059 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 08 00:19:16.977880 master-0 kubenswrapper[4059]: I0308 00:19:16.977849 4059 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 08 00:19:16.977880 master-0 kubenswrapper[4059]: I0308 00:19:16.977875 4059 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 08 00:19:16.978124 master-0 kubenswrapper[4059]: I0308 00:19:16.978073 4059 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 08 00:19:16.978962 master-0 kubenswrapper[4059]: I0308 00:19:16.978948 4059 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 08 00:19:16.979433 master-0 kubenswrapper[4059]: I0308 00:19:16.979409 4059 server.go:1280] "Started kubelet" Mar 08 00:19:16.982785 master-0 systemd[1]: Started Kubernetes Kubelet. 
Mar 08 00:19:16.983858 master-0 kubenswrapper[4059]: I0308 00:19:16.983814 4059 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 08 00:19:16.983984 master-0 kubenswrapper[4059]: I0308 00:19:16.983859 4059 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 08 00:19:16.983984 master-0 kubenswrapper[4059]: I0308 00:19:16.983942 4059 server_v1.go:47] "podresources" method="list" useActivePods=true Mar 08 00:19:16.984343 master-0 kubenswrapper[4059]: I0308 00:19:16.984320 4059 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 08 00:19:16.985063 master-0 kubenswrapper[4059]: I0308 00:19:16.985026 4059 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 08 00:19:16.985063 master-0 kubenswrapper[4059]: I0308 00:19:16.985059 4059 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 08 00:19:16.985352 master-0 kubenswrapper[4059]: I0308 00:19:16.985312 4059 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 08 00:19:16.985352 master-0 kubenswrapper[4059]: I0308 00:19:16.985341 4059 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 08 00:19:16.985499 master-0 kubenswrapper[4059]: I0308 00:19:16.985479 4059 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Mar 08 00:19:16.985612 master-0 kubenswrapper[4059]: I0308 00:19:16.985594 4059 reconstruct.go:97] "Volume reconstruction finished" Mar 08 00:19:16.985612 master-0 kubenswrapper[4059]: I0308 00:19:16.985607 4059 reconciler.go:26] "Reconciler: start to sync state" Mar 08 00:19:16.985725 master-0 kubenswrapper[4059]: I0308 00:19:16.985678 4059 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: 
connection refused Mar 08 00:19:16.985876 master-0 kubenswrapper[4059]: E0308 00:19:16.985770 4059 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 00:19:16.986015 master-0 kubenswrapper[4059]: E0308 00:19:16.985972 4059 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms" Mar 08 00:19:16.992031 master-0 kubenswrapper[4059]: I0308 00:19:16.991546 4059 factory.go:55] Registering systemd factory Mar 08 00:19:16.992031 master-0 kubenswrapper[4059]: I0308 00:19:16.991910 4059 factory.go:221] Registration of the systemd container factory successfully Mar 08 00:19:16.992031 master-0 kubenswrapper[4059]: W0308 00:19:16.991801 4059 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 00:19:16.992031 master-0 kubenswrapper[4059]: E0308 00:19:16.991992 4059 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 00:19:16.992322 master-0 kubenswrapper[4059]: I0308 00:19:16.992193 4059 server.go:449] "Adding debug handlers to kubelet server" Mar 08 00:19:16.993486 master-0 kubenswrapper[4059]: I0308 00:19:16.993463 4059 factory.go:153] Registering CRI-O factory Mar 08 00:19:16.993486 master-0 kubenswrapper[4059]: I0308 00:19:16.993487 4059 factory.go:221] Registration of the crio container factory 
successfully Mar 08 00:19:16.993559 master-0 kubenswrapper[4059]: I0308 00:19:16.993549 4059 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 08 00:19:16.993587 master-0 kubenswrapper[4059]: I0308 00:19:16.993571 4059 factory.go:103] Registering Raw factory Mar 08 00:19:16.993612 master-0 kubenswrapper[4059]: I0308 00:19:16.993595 4059 manager.go:1196] Started watching for new ooms in manager Mar 08 00:19:16.994169 master-0 kubenswrapper[4059]: I0308 00:19:16.994151 4059 manager.go:319] Starting recovery of all containers Mar 08 00:19:16.997812 master-0 kubenswrapper[4059]: E0308 00:19:16.994195 4059 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189ab5acc5b04e37 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:16.979379767 +0000 UTC m=+0.690979289,LastTimestamp:2026-03-08 00:19:16.979379767 +0000 UTC m=+0.690979289,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:19:17.001295 master-0 kubenswrapper[4059]: E0308 00:19:17.001249 4059 kubelet.go:1495] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Mar 08 00:19:17.012944 master-0 kubenswrapper[4059]: I0308 00:19:17.012886 4059 manager.go:324] Recovery completed Mar 08 00:19:17.027451 master-0 kubenswrapper[4059]: I0308 00:19:17.027404 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:19:17.029916 master-0 kubenswrapper[4059]: I0308 00:19:17.029890 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 00:19:17.029973 master-0 kubenswrapper[4059]: I0308 00:19:17.029930 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 00:19:17.029973 master-0 kubenswrapper[4059]: I0308 00:19:17.029944 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 00:19:17.031252 master-0 kubenswrapper[4059]: I0308 00:19:17.031230 4059 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 08 00:19:17.031252 master-0 kubenswrapper[4059]: I0308 00:19:17.031246 4059 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 08 00:19:17.031357 master-0 kubenswrapper[4059]: I0308 00:19:17.031264 4059 state_mem.go:36] "Initialized new in-memory state store" Mar 08 00:19:17.037074 master-0 kubenswrapper[4059]: I0308 00:19:17.037041 4059 policy_none.go:49] "None policy: Start" Mar 08 00:19:17.037906 master-0 kubenswrapper[4059]: I0308 00:19:17.037868 4059 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 08 00:19:17.037906 master-0 kubenswrapper[4059]: I0308 00:19:17.037894 4059 state_mem.go:35] "Initializing new in-memory state store" Mar 08 00:19:17.086462 master-0 kubenswrapper[4059]: E0308 00:19:17.086434 4059 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 00:19:17.146336 
master-0 kubenswrapper[4059]: I0308 00:19:17.104042 4059 manager.go:334] "Starting Device Plugin manager" Mar 08 00:19:17.146336 master-0 kubenswrapper[4059]: I0308 00:19:17.104128 4059 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 08 00:19:17.146336 master-0 kubenswrapper[4059]: I0308 00:19:17.104141 4059 server.go:79] "Starting device plugin registration server" Mar 08 00:19:17.146336 master-0 kubenswrapper[4059]: I0308 00:19:17.104634 4059 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 08 00:19:17.146336 master-0 kubenswrapper[4059]: I0308 00:19:17.104671 4059 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 08 00:19:17.146336 master-0 kubenswrapper[4059]: I0308 00:19:17.104854 4059 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 08 00:19:17.146336 master-0 kubenswrapper[4059]: I0308 00:19:17.105069 4059 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 08 00:19:17.146336 master-0 kubenswrapper[4059]: I0308 00:19:17.105077 4059 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 08 00:19:17.146336 master-0 kubenswrapper[4059]: E0308 00:19:17.106249 4059 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 08 00:19:17.146336 master-0 kubenswrapper[4059]: I0308 00:19:17.131317 4059 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 08 00:19:17.146336 master-0 kubenswrapper[4059]: I0308 00:19:17.133045 4059 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 08 00:19:17.146336 master-0 kubenswrapper[4059]: I0308 00:19:17.133129 4059 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 08 00:19:17.146336 master-0 kubenswrapper[4059]: I0308 00:19:17.133170 4059 kubelet.go:2335] "Starting kubelet main sync loop" Mar 08 00:19:17.146336 master-0 kubenswrapper[4059]: E0308 00:19:17.133300 4059 kubelet.go:2359] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Mar 08 00:19:17.146336 master-0 kubenswrapper[4059]: W0308 00:19:17.134052 4059 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 00:19:17.146336 master-0 kubenswrapper[4059]: E0308 00:19:17.134126 4059 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 00:19:17.187824 master-0 kubenswrapper[4059]: E0308 00:19:17.187655 4059 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms" Mar 08 00:19:17.205820 master-0 kubenswrapper[4059]: I0308 00:19:17.205768 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:19:17.208085 master-0 kubenswrapper[4059]: I0308 00:19:17.208053 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 00:19:17.208142 master-0 
kubenswrapper[4059]: I0308 00:19:17.208088 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 00:19:17.208142 master-0 kubenswrapper[4059]: I0308 00:19:17.208099 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 00:19:17.208142 master-0 kubenswrapper[4059]: I0308 00:19:17.208127 4059 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 08 00:19:17.208732 master-0 kubenswrapper[4059]: E0308 00:19:17.208690 4059 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 08 00:19:17.233968 master-0 kubenswrapper[4059]: I0308 00:19:17.233900 4059 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0-master-0","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","kube-system/bootstrap-kube-controller-manager-master-0","kube-system/bootstrap-kube-scheduler-master-0"]
Mar 08 00:19:17.234134 master-0 kubenswrapper[4059]: I0308 00:19:17.233994 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 00:19:17.235183 master-0 kubenswrapper[4059]: I0308 00:19:17.235145 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 00:19:17.235234 master-0 kubenswrapper[4059]: I0308 00:19:17.235192 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 00:19:17.235234 master-0 kubenswrapper[4059]: I0308 00:19:17.235221 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 00:19:17.235364 master-0 kubenswrapper[4059]: I0308 00:19:17.235338 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 00:19:17.235595 master-0 kubenswrapper[4059]: I0308 00:19:17.235572 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 08 00:19:17.235626 master-0 kubenswrapper[4059]: I0308 00:19:17.235602 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 00:19:17.236370 master-0 kubenswrapper[4059]: I0308 00:19:17.236334 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 00:19:17.236418 master-0 kubenswrapper[4059]: I0308 00:19:17.236373 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 00:19:17.236418 master-0 kubenswrapper[4059]: I0308 00:19:17.236382 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 00:19:17.236492 master-0 kubenswrapper[4059]: I0308 00:19:17.236343 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 00:19:17.236492 master-0 kubenswrapper[4059]: I0308 00:19:17.236448 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 00:19:17.236492 master-0 kubenswrapper[4059]: I0308 00:19:17.236458 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 00:19:17.236567 master-0 kubenswrapper[4059]: I0308 00:19:17.236503 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 00:19:17.236783 master-0 kubenswrapper[4059]: I0308 00:19:17.236747 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0"
Mar 08 00:19:17.236826 master-0 kubenswrapper[4059]: I0308 00:19:17.236787 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 00:19:17.237283 master-0 kubenswrapper[4059]: I0308 00:19:17.237265 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 00:19:17.237337 master-0 kubenswrapper[4059]: I0308 00:19:17.237291 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 00:19:17.237337 master-0 kubenswrapper[4059]: I0308 00:19:17.237273 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 00:19:17.237337 master-0 kubenswrapper[4059]: I0308 00:19:17.237301 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 00:19:17.237337 master-0 kubenswrapper[4059]: I0308 00:19:17.237313 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 00:19:17.237337 master-0 kubenswrapper[4059]: I0308 00:19:17.237322 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 00:19:17.237462 master-0 kubenswrapper[4059]: I0308 00:19:17.237376 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 00:19:17.237520 master-0 kubenswrapper[4059]: I0308 00:19:17.237500 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 00:19:17.237550 master-0 kubenswrapper[4059]: I0308 00:19:17.237531 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 00:19:17.237808 master-0 kubenswrapper[4059]: I0308 00:19:17.237782 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 00:19:17.237808 master-0 kubenswrapper[4059]: I0308 00:19:17.237804 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 00:19:17.237876 master-0 kubenswrapper[4059]: I0308 00:19:17.237812 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 00:19:17.237902 master-0 kubenswrapper[4059]: I0308 00:19:17.237884 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 00:19:17.238016 master-0 kubenswrapper[4059]: I0308 00:19:17.237995 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 00:19:17.238049 master-0 kubenswrapper[4059]: I0308 00:19:17.238023 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 00:19:17.238079 master-0 kubenswrapper[4059]: I0308 00:19:17.238054 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 00:19:17.238079 master-0 kubenswrapper[4059]: I0308 00:19:17.238075 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 00:19:17.238128 master-0 kubenswrapper[4059]: I0308 00:19:17.238084 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 00:19:17.238357 master-0 kubenswrapper[4059]: I0308 00:19:17.238313 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 00:19:17.238357 master-0 kubenswrapper[4059]: I0308 00:19:17.238333 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 00:19:17.238357 master-0 kubenswrapper[4059]: I0308 00:19:17.238340 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 00:19:17.238454 master-0 kubenswrapper[4059]: I0308 00:19:17.238417 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 00:19:17.238454 master-0 kubenswrapper[4059]: I0308 00:19:17.238434 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 00:19:17.238578 master-0 kubenswrapper[4059]: I0308 00:19:17.238530 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 00:19:17.238612 master-0 kubenswrapper[4059]: I0308 00:19:17.238581 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 00:19:17.238612 master-0 kubenswrapper[4059]: I0308 00:19:17.238593 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 00:19:17.238978 master-0 kubenswrapper[4059]: I0308 00:19:17.238879 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 00:19:17.238978 master-0 kubenswrapper[4059]: I0308 00:19:17.238902 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 00:19:17.238978 master-0 kubenswrapper[4059]: I0308 00:19:17.238912 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 00:19:17.287524 master-0 kubenswrapper[4059]: I0308 00:19:17.287484 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 00:19:17.287524 master-0 kubenswrapper[4059]: I0308 00:19:17.287515 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 00:19:17.287653 master-0 kubenswrapper[4059]: I0308 00:19:17.287539 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 08 00:19:17.287653 master-0 kubenswrapper[4059]: I0308 00:19:17.287557 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 00:19:17.287653 master-0 kubenswrapper[4059]: I0308 00:19:17.287573 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 00:19:17.287653 master-0 kubenswrapper[4059]: I0308 00:19:17.287589 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 08 00:19:17.287653 master-0 kubenswrapper[4059]: I0308 00:19:17.287606 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 00:19:17.287899 master-0 kubenswrapper[4059]: I0308 00:19:17.287655 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 00:19:17.287899 master-0 kubenswrapper[4059]: I0308 00:19:17.287687 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 08 00:19:17.287899 master-0 kubenswrapper[4059]: I0308 00:19:17.287708 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 00:19:17.287899 master-0 kubenswrapper[4059]: I0308 00:19:17.287726 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 00:19:17.287899 master-0 kubenswrapper[4059]: I0308 00:19:17.287775 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 00:19:17.287899 master-0 kubenswrapper[4059]: I0308 00:19:17.287819 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 00:19:17.287899 master-0 kubenswrapper[4059]: I0308 00:19:17.287842 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 00:19:17.287899 master-0 kubenswrapper[4059]: I0308 00:19:17.287861 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 08 00:19:17.287899 master-0 kubenswrapper[4059]: I0308 00:19:17.287879 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 00:19:17.287899 master-0 kubenswrapper[4059]: I0308 00:19:17.287906 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 00:19:17.388252 master-0 kubenswrapper[4059]: I0308 00:19:17.388216 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 08 00:19:17.388344 master-0 kubenswrapper[4059]: I0308 00:19:17.388254 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 00:19:17.388344 master-0 kubenswrapper[4059]: I0308 00:19:17.388274 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 00:19:17.388344 master-0 kubenswrapper[4059]: I0308 00:19:17.388291 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 08 00:19:17.388344 master-0 kubenswrapper[4059]: I0308 00:19:17.388306 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 00:19:17.388489 master-0 kubenswrapper[4059]: I0308 00:19:17.388427 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 00:19:17.388489 master-0 kubenswrapper[4059]: I0308 00:19:17.388475 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 00:19:17.388571 master-0 kubenswrapper[4059]: I0308 00:19:17.388508 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 00:19:17.388571 master-0 kubenswrapper[4059]: I0308 00:19:17.388515 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 00:19:17.388571 master-0 kubenswrapper[4059]: I0308 00:19:17.388490 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 00:19:17.388571 master-0 kubenswrapper[4059]: I0308 00:19:17.388555 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 00:19:17.388702 master-0 kubenswrapper[4059]: I0308 00:19:17.388584 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 08 00:19:17.388702 master-0 kubenswrapper[4059]: I0308 00:19:17.388592 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 08 00:19:17.388702 master-0 kubenswrapper[4059]: I0308 00:19:17.388607 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 08 00:19:17.388702 master-0 kubenswrapper[4059]: I0308 00:19:17.388622 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 00:19:17.388702 master-0 kubenswrapper[4059]: I0308 00:19:17.388624 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 00:19:17.388702 master-0 kubenswrapper[4059]: I0308 00:19:17.388641 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 00:19:17.388702 master-0 kubenswrapper[4059]: I0308 00:19:17.388665 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 08 00:19:17.388937 master-0 kubenswrapper[4059]: I0308 00:19:17.388702 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 00:19:17.388937 master-0 kubenswrapper[4059]: I0308 00:19:17.388731 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 00:19:17.388937 master-0 kubenswrapper[4059]: I0308 00:19:17.388744 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 00:19:17.388937 master-0 kubenswrapper[4059]: I0308 00:19:17.388759 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 00:19:17.388937 master-0 kubenswrapper[4059]: I0308 00:19:17.388773 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 00:19:17.388937 master-0 kubenswrapper[4059]: I0308 00:19:17.388783 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 08 00:19:17.388937 master-0 kubenswrapper[4059]: I0308 00:19:17.388801 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 00:19:17.388937 master-0 kubenswrapper[4059]: I0308 00:19:17.388807 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 00:19:17.388937 master-0 kubenswrapper[4059]: I0308 00:19:17.388823 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 08 00:19:17.388937 master-0 kubenswrapper[4059]: I0308 00:19:17.388839 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 00:19:17.388937 master-0 kubenswrapper[4059]: I0308 00:19:17.388840 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 00:19:17.388937 master-0 kubenswrapper[4059]: I0308 00:19:17.388865 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 00:19:17.388937 master-0 kubenswrapper[4059]: I0308 00:19:17.388889 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 00:19:17.388937 master-0 kubenswrapper[4059]: I0308 00:19:17.388909 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 00:19:17.388937 master-0 kubenswrapper[4059]: I0308 00:19:17.388923 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 00:19:17.388937 master-0 kubenswrapper[4059]: I0308 00:19:17.388933 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 00:19:17.409346 master-0 kubenswrapper[4059]: I0308 00:19:17.409297 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 00:19:17.410484 master-0 kubenswrapper[4059]: I0308 00:19:17.410459 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 00:19:17.410536 master-0 kubenswrapper[4059]: I0308 00:19:17.410486 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 00:19:17.410536 master-0 kubenswrapper[4059]: I0308 00:19:17.410495 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 00:19:17.410536 master-0 kubenswrapper[4059]: I0308 00:19:17.410534 4059 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 08 00:19:17.411347 master-0 kubenswrapper[4059]: E0308 00:19:17.411315 4059 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 08 00:19:17.571193 master-0 kubenswrapper[4059]: I0308 00:19:17.571130 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 08 00:19:17.776581 master-0 kubenswrapper[4059]: I0308 00:19:17.585535 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0"
Mar 08 00:19:17.776581 master-0 kubenswrapper[4059]: E0308 00:19:17.588884 4059 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms"
Mar 08 00:19:17.776581 master-0 kubenswrapper[4059]: I0308 00:19:17.627281 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 00:19:17.776581 master-0 kubenswrapper[4059]: I0308 00:19:17.671351 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 00:19:17.776581 master-0 kubenswrapper[4059]: I0308 00:19:17.679274 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 00:19:17.812409 master-0 kubenswrapper[4059]: I0308 00:19:17.812268 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 00:19:17.813491 master-0 kubenswrapper[4059]: I0308 00:19:17.813433 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 00:19:17.813559 master-0 kubenswrapper[4059]: I0308 00:19:17.813494 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 00:19:17.813559 master-0 kubenswrapper[4059]: I0308 00:19:17.813510 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 00:19:17.813633 master-0 kubenswrapper[4059]: I0308 00:19:17.813572 4059 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 08 00:19:17.814547 master-0 kubenswrapper[4059]: E0308 00:19:17.814494 4059 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 08 00:19:17.827800 master-0 kubenswrapper[4059]: W0308 00:19:17.827656 4059 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 00:19:17.827800 master-0 kubenswrapper[4059]: E0308 00:19:17.827741 4059 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 00:19:17.914871 master-0 kubenswrapper[4059]: W0308 00:19:17.914722 4059 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 00:19:17.914871 master-0 kubenswrapper[4059]: E0308 00:19:17.914812 4059 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 00:19:17.986844 master-0 kubenswrapper[4059]: I0308 00:19:17.986771 4059 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 00:19:18.232353 master-0 kubenswrapper[4059]: W0308 00:19:18.232194 4059 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 00:19:18.232353 master-0 kubenswrapper[4059]: E0308 00:19:18.232312 4059 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 00:19:18.267106 master-0 kubenswrapper[4059]: W0308 00:19:18.266986 4059 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 00:19:18.267106 master-0 kubenswrapper[4059]: E0308 00:19:18.267065 4059 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 00:19:18.390577 master-0 kubenswrapper[4059]: E0308 00:19:18.390429 4059 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 
192.168.32.10:6443: connect: connection refused" interval="1.6s" Mar 08 00:19:18.426719 master-0 kubenswrapper[4059]: W0308 00:19:18.426639 4059 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9add8df47182fc2eaf8cd78016ebe72.slice/crio-c4b3dad7b177ddc417477ab1f0d5f78969f5ec394aa11addfa7a3ce44aa14aed WatchSource:0}: Error finding container c4b3dad7b177ddc417477ab1f0d5f78969f5ec394aa11addfa7a3ce44aa14aed: Status 404 returned error can't find the container with id c4b3dad7b177ddc417477ab1f0d5f78969f5ec394aa11addfa7a3ce44aa14aed Mar 08 00:19:18.434815 master-0 kubenswrapper[4059]: I0308 00:19:18.434775 4059 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 00:19:18.456727 master-0 kubenswrapper[4059]: W0308 00:19:18.456257 4059 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f77c8e18b751d90bc0dfe2d4e304050.slice/crio-4da316e5c8941b4baace90ce20646816051133ec406a841a63f02453e48ca25a WatchSource:0}: Error finding container 4da316e5c8941b4baace90ce20646816051133ec406a841a63f02453e48ca25a: Status 404 returned error can't find the container with id 4da316e5c8941b4baace90ce20646816051133ec406a841a63f02453e48ca25a Mar 08 00:19:18.479966 master-0 kubenswrapper[4059]: W0308 00:19:18.479906 4059 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod354f29997baa583b6238f7de9108ee10.slice/crio-d2f5b57940c224986a9226bf1c006a72c2663c4293ddb4cdc327ea534c8cbcb7 WatchSource:0}: Error finding container d2f5b57940c224986a9226bf1c006a72c2663c4293ddb4cdc327ea534c8cbcb7: Status 404 returned error can't find the container with id d2f5b57940c224986a9226bf1c006a72c2663c4293ddb4cdc327ea534c8cbcb7 Mar 08 00:19:18.502906 master-0 kubenswrapper[4059]: W0308 00:19:18.502859 4059 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf78c05e1499b533b83f091333d61f045.slice/crio-0bdf70a6acef734c900a623db8a8cd37b2a2e6c50fe84f9293c0fc0c5705c71d WatchSource:0}: Error finding container 0bdf70a6acef734c900a623db8a8cd37b2a2e6c50fe84f9293c0fc0c5705c71d: Status 404 returned error can't find the container with id 0bdf70a6acef734c900a623db8a8cd37b2a2e6c50fe84f9293c0fc0c5705c71d Mar 08 00:19:18.615010 master-0 kubenswrapper[4059]: I0308 00:19:18.614930 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:19:18.616549 master-0 kubenswrapper[4059]: I0308 00:19:18.616521 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 00:19:18.616617 master-0 kubenswrapper[4059]: I0308 00:19:18.616562 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 00:19:18.616617 master-0 kubenswrapper[4059]: I0308 00:19:18.616579 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 00:19:18.616674 master-0 kubenswrapper[4059]: I0308 00:19:18.616626 4059 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 00:19:18.617573 master-0 kubenswrapper[4059]: E0308 00:19:18.617526 4059 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 08 00:19:18.859643 master-0 kubenswrapper[4059]: E0308 00:19:18.859487 4059 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189ab5acc5b04e37 default 0 0001-01-01 00:00:00 +0000 
UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:16.979379767 +0000 UTC m=+0.690979289,LastTimestamp:2026-03-08 00:19:16.979379767 +0000 UTC m=+0.690979289,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:19:18.987194 master-0 kubenswrapper[4059]: I0308 00:19:18.987109 4059 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 00:19:19.118451 master-0 kubenswrapper[4059]: I0308 00:19:19.118345 4059 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 08 00:19:19.119624 master-0 kubenswrapper[4059]: E0308 00:19:19.119597 4059 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 00:19:19.139585 master-0 kubenswrapper[4059]: I0308 00:19:19.139455 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"c4b3dad7b177ddc417477ab1f0d5f78969f5ec394aa11addfa7a3ce44aa14aed"} Mar 08 00:19:19.140559 master-0 kubenswrapper[4059]: I0308 00:19:19.140528 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"0bdf70a6acef734c900a623db8a8cd37b2a2e6c50fe84f9293c0fc0c5705c71d"} Mar 08 00:19:19.141510 master-0 kubenswrapper[4059]: I0308 00:19:19.141477 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"d2f5b57940c224986a9226bf1c006a72c2663c4293ddb4cdc327ea534c8cbcb7"} Mar 08 00:19:19.142611 master-0 kubenswrapper[4059]: I0308 00:19:19.142578 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerStarted","Data":"4da316e5c8941b4baace90ce20646816051133ec406a841a63f02453e48ca25a"} Mar 08 00:19:19.143742 master-0 kubenswrapper[4059]: I0308 00:19:19.143711 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"233074eccbbd3406930dc094592b256b0710cbbbba4d96b37f6401353d1f1651"} Mar 08 00:19:19.987217 master-0 kubenswrapper[4059]: I0308 00:19:19.987108 4059 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 00:19:19.991988 master-0 kubenswrapper[4059]: E0308 00:19:19.991945 4059 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s" Mar 08 00:19:20.218005 master-0 kubenswrapper[4059]: I0308 00:19:20.217932 4059 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Mar 08 00:19:20.219053 master-0 kubenswrapper[4059]: I0308 00:19:20.219030 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 00:19:20.219114 master-0 kubenswrapper[4059]: I0308 00:19:20.219057 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 00:19:20.219114 master-0 kubenswrapper[4059]: I0308 00:19:20.219067 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 00:19:20.219114 master-0 kubenswrapper[4059]: I0308 00:19:20.219109 4059 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 00:19:20.220514 master-0 kubenswrapper[4059]: E0308 00:19:20.220479 4059 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 08 00:19:20.940242 master-0 kubenswrapper[4059]: W0308 00:19:20.884858 4059 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 00:19:20.940242 master-0 kubenswrapper[4059]: E0308 00:19:20.884917 4059 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 00:19:20.940242 master-0 kubenswrapper[4059]: W0308 00:19:20.885194 4059 reflector.go:561] k8s.io/client-go/informers/factory.go:160: 
failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 00:19:20.940242 master-0 kubenswrapper[4059]: E0308 00:19:20.885237 4059 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 00:19:20.947472 master-0 kubenswrapper[4059]: W0308 00:19:20.947426 4059 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 00:19:20.947907 master-0 kubenswrapper[4059]: E0308 00:19:20.947483 4059 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 00:19:20.987338 master-0 kubenswrapper[4059]: I0308 00:19:20.987294 4059 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 00:19:21.121902 master-0 kubenswrapper[4059]: W0308 00:19:21.121820 4059 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 00:19:21.121902 master-0 kubenswrapper[4059]: E0308 00:19:21.121868 4059 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 00:19:21.152238 master-0 kubenswrapper[4059]: I0308 00:19:21.152146 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"013b718ae531bd264f0d08436f90a352773f432fb8153c8f5baaf771bc43f460"} Mar 08 00:19:21.152238 master-0 kubenswrapper[4059]: I0308 00:19:21.152227 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:19:21.152959 master-0 kubenswrapper[4059]: I0308 00:19:21.152918 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 00:19:21.152959 master-0 kubenswrapper[4059]: I0308 00:19:21.152956 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 00:19:21.153050 master-0 kubenswrapper[4059]: I0308 00:19:21.152966 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 00:19:21.987725 master-0 kubenswrapper[4059]: I0308 00:19:21.987665 4059 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 
192.168.32.10:6443: connect: connection refused Mar 08 00:19:22.155715 master-0 kubenswrapper[4059]: I0308 00:19:22.155654 4059 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="013b718ae531bd264f0d08436f90a352773f432fb8153c8f5baaf771bc43f460" exitCode=0 Mar 08 00:19:22.155900 master-0 kubenswrapper[4059]: I0308 00:19:22.155705 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"013b718ae531bd264f0d08436f90a352773f432fb8153c8f5baaf771bc43f460"} Mar 08 00:19:22.155900 master-0 kubenswrapper[4059]: I0308 00:19:22.155885 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:19:22.156619 master-0 kubenswrapper[4059]: I0308 00:19:22.156594 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 00:19:22.156619 master-0 kubenswrapper[4059]: I0308 00:19:22.156616 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 00:19:22.156721 master-0 kubenswrapper[4059]: I0308 00:19:22.156624 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 00:19:23.157976 master-0 kubenswrapper[4059]: I0308 00:19:23.157926 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"da60beba23659d143e9020dc0409825d88a4d10b35b445c12b13ae8fc1310bdf"} Mar 08 00:19:23.329503 master-0 kubenswrapper[4059]: I0308 00:19:23.329433 4059 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial 
tcp 192.168.32.10:6443: connect: connection refused Mar 08 00:19:23.329661 master-0 kubenswrapper[4059]: E0308 00:19:23.329598 4059 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s" Mar 08 00:19:23.374101 master-0 kubenswrapper[4059]: I0308 00:19:23.374058 4059 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 08 00:19:23.375239 master-0 kubenswrapper[4059]: E0308 00:19:23.375185 4059 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 00:19:23.421181 master-0 kubenswrapper[4059]: I0308 00:19:23.421106 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:19:23.421904 master-0 kubenswrapper[4059]: I0308 00:19:23.421886 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 00:19:23.421962 master-0 kubenswrapper[4059]: I0308 00:19:23.421911 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 00:19:23.421962 master-0 kubenswrapper[4059]: I0308 00:19:23.421919 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 00:19:23.421962 master-0 kubenswrapper[4059]: I0308 00:19:23.421962 4059 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 00:19:23.422609 master-0 
kubenswrapper[4059]: E0308 00:19:23.422575 4059 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 08 00:19:23.987246 master-0 kubenswrapper[4059]: I0308 00:19:23.987163 4059 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 00:19:24.161697 master-0 kubenswrapper[4059]: I0308 00:19:24.161650 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"058ee36ca49c13759baef5f6c082b6670cbaa179a3750909f129254d20e8ecf0"} Mar 08 00:19:24.162174 master-0 kubenswrapper[4059]: I0308 00:19:24.161792 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:19:24.162652 master-0 kubenswrapper[4059]: I0308 00:19:24.162631 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 00:19:24.162713 master-0 kubenswrapper[4059]: I0308 00:19:24.162662 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 00:19:24.162713 master-0 kubenswrapper[4059]: I0308 00:19:24.162674 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 00:19:24.857636 master-0 kubenswrapper[4059]: W0308 00:19:24.857535 4059 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 
192.168.32.10:6443: connect: connection refused Mar 08 00:19:24.857636 master-0 kubenswrapper[4059]: E0308 00:19:24.857607 4059 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 00:19:24.987632 master-0 kubenswrapper[4059]: I0308 00:19:24.987483 4059 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 00:19:25.167246 master-0 kubenswrapper[4059]: I0308 00:19:25.167173 4059 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/0.log" Mar 08 00:19:25.167824 master-0 kubenswrapper[4059]: I0308 00:19:25.167781 4059 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="058ee36ca49c13759baef5f6c082b6670cbaa179a3750909f129254d20e8ecf0" exitCode=1 Mar 08 00:19:25.167876 master-0 kubenswrapper[4059]: I0308 00:19:25.167845 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"058ee36ca49c13759baef5f6c082b6670cbaa179a3750909f129254d20e8ecf0"} Mar 08 00:19:25.167943 master-0 kubenswrapper[4059]: I0308 00:19:25.167926 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:19:25.168570 master-0 kubenswrapper[4059]: I0308 00:19:25.168537 4059 kubelet_node_status.go:724] "Recording event message for 
node" node="master-0" event="NodeHasSufficientMemory" Mar 08 00:19:25.168570 master-0 kubenswrapper[4059]: I0308 00:19:25.168560 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 00:19:25.168570 master-0 kubenswrapper[4059]: I0308 00:19:25.168570 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 00:19:25.168903 master-0 kubenswrapper[4059]: I0308 00:19:25.168801 4059 scope.go:117] "RemoveContainer" containerID="058ee36ca49c13759baef5f6c082b6670cbaa179a3750909f129254d20e8ecf0" Mar 08 00:19:25.170303 master-0 kubenswrapper[4059]: I0308 00:19:25.170261 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"b999c6f84ef35141ea9d9157df896d14bb08340f5b7476591f3ed6362f2a6196"} Mar 08 00:19:25.170385 master-0 kubenswrapper[4059]: I0308 00:19:25.170362 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:19:25.170967 master-0 kubenswrapper[4059]: I0308 00:19:25.170916 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 00:19:25.170967 master-0 kubenswrapper[4059]: I0308 00:19:25.170940 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 00:19:25.170967 master-0 kubenswrapper[4059]: I0308 00:19:25.170949 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 00:19:25.643569 master-0 kubenswrapper[4059]: W0308 00:19:25.642742 4059 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: 
connect: connection refused Mar 08 00:19:25.643569 master-0 kubenswrapper[4059]: E0308 00:19:25.642862 4059 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 00:19:25.987413 master-0 kubenswrapper[4059]: I0308 00:19:25.987304 4059 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 00:19:26.061324 master-0 kubenswrapper[4059]: W0308 00:19:26.061267 4059 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 00:19:26.061484 master-0 kubenswrapper[4059]: E0308 00:19:26.061334 4059 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 00:19:26.174033 master-0 kubenswrapper[4059]: I0308 00:19:26.173947 4059 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/0.log" Mar 08 00:19:26.174473 master-0 kubenswrapper[4059]: I0308 00:19:26.174416 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 
00:19:26.174912 master-0 kubenswrapper[4059]: I0308 00:19:26.174596 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:19:26.174912 master-0 kubenswrapper[4059]: I0308 00:19:26.174595 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"e5f3d72ec10226c7ab1167503198c66d4a22d49dd4bc12c569f0612c2ff69e2d"} Mar 08 00:19:26.174996 master-0 kubenswrapper[4059]: I0308 00:19:26.174959 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 00:19:26.174996 master-0 kubenswrapper[4059]: I0308 00:19:26.174987 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 00:19:26.175049 master-0 kubenswrapper[4059]: I0308 00:19:26.175000 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 00:19:26.175173 master-0 kubenswrapper[4059]: I0308 00:19:26.175151 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 00:19:26.175221 master-0 kubenswrapper[4059]: I0308 00:19:26.175181 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 00:19:26.175221 master-0 kubenswrapper[4059]: I0308 00:19:26.175192 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 00:19:26.576279 master-0 kubenswrapper[4059]: W0308 00:19:26.576150 4059 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 
00:19:26.576279 master-0 kubenswrapper[4059]: E0308 00:19:26.576244 4059 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 00:19:26.986755 master-0 kubenswrapper[4059]: I0308 00:19:26.986630 4059 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 00:19:27.106403 master-0 kubenswrapper[4059]: E0308 00:19:27.106363 4059 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 08 00:19:27.175954 master-0 kubenswrapper[4059]: I0308 00:19:27.175923 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 00:19:27.176856 master-0 kubenswrapper[4059]: I0308 00:19:27.176832 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 00:19:27.176926 master-0 kubenswrapper[4059]: I0308 00:19:27.176875 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 00:19:27.176926 master-0 kubenswrapper[4059]: I0308 00:19:27.176890 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 00:19:27.987381 master-0 kubenswrapper[4059]: I0308 00:19:27.987338 4059 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 00:19:28.180524 master-0 kubenswrapper[4059]: I0308 00:19:28.179987 4059 generic.go:334] "Generic (PLEG): container finished" podID="5f77c8e18b751d90bc0dfe2d4e304050" containerID="876b4d78a3cb9c09c79646fc0feaa904c1b8712b38b4870f4f9e07763c94bfe0" exitCode=0
Mar 08 00:19:28.180923 master-0 kubenswrapper[4059]: I0308 00:19:28.180078 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 00:19:28.180923 master-0 kubenswrapper[4059]: I0308 00:19:28.180098 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerDied","Data":"876b4d78a3cb9c09c79646fc0feaa904c1b8712b38b4870f4f9e07763c94bfe0"}
Mar 08 00:19:28.181405 master-0 kubenswrapper[4059]: I0308 00:19:28.181373 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 00:19:28.181405 master-0 kubenswrapper[4059]: I0308 00:19:28.181403 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 00:19:28.181484 master-0 kubenswrapper[4059]: I0308 00:19:28.181413 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 00:19:28.182446 master-0 kubenswrapper[4059]: I0308 00:19:28.182413 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"88fd43c8fda6129c4f06b24e2a215771ea123f05c39828ad062d2af5324239c2"}
Mar 08 00:19:28.182485 master-0 kubenswrapper[4059]: I0308 00:19:28.182471 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 00:19:28.183114 master-0 kubenswrapper[4059]: I0308 00:19:28.183083 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 00:19:28.183114 master-0 kubenswrapper[4059]: I0308 00:19:28.183112 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 00:19:28.183181 master-0 kubenswrapper[4059]: I0308 00:19:28.183126 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 00:19:28.183885 master-0 kubenswrapper[4059]: I0308 00:19:28.183770 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 00:19:28.184322 master-0 kubenswrapper[4059]: I0308 00:19:28.184278 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 00:19:28.184322 master-0 kubenswrapper[4059]: I0308 00:19:28.184298 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 00:19:28.184322 master-0 kubenswrapper[4059]: I0308 00:19:28.184306 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 00:19:28.184547 master-0 kubenswrapper[4059]: I0308 00:19:28.184525 4059 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/1.log"
Mar 08 00:19:28.185154 master-0 kubenswrapper[4059]: I0308 00:19:28.185114 4059 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/0.log"
Mar 08 00:19:28.185540 master-0 kubenswrapper[4059]: I0308 00:19:28.185510 4059 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="e5f3d72ec10226c7ab1167503198c66d4a22d49dd4bc12c569f0612c2ff69e2d" exitCode=1
Mar 08 00:19:28.185586 master-0 kubenswrapper[4059]: I0308 00:19:28.185566 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"e5f3d72ec10226c7ab1167503198c66d4a22d49dd4bc12c569f0612c2ff69e2d"}
Mar 08 00:19:28.185626 master-0 kubenswrapper[4059]: I0308 00:19:28.185616 4059 scope.go:117] "RemoveContainer" containerID="058ee36ca49c13759baef5f6c082b6670cbaa179a3750909f129254d20e8ecf0"
Mar 08 00:19:28.185708 master-0 kubenswrapper[4059]: I0308 00:19:28.185678 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 00:19:28.188661 master-0 kubenswrapper[4059]: I0308 00:19:28.188618 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 00:19:28.188705 master-0 kubenswrapper[4059]: I0308 00:19:28.188685 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 00:19:28.188733 master-0 kubenswrapper[4059]: I0308 00:19:28.188711 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 00:19:28.189048 master-0 kubenswrapper[4059]: I0308 00:19:28.189009 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"65f78e69463513d95a1d7e0bffe5e5d1bf7a6e5e4e7e1d096d77f2d24eb8e8b4"}
Mar 08 00:19:28.189277 master-0 kubenswrapper[4059]: I0308 00:19:28.189252 4059 scope.go:117] "RemoveContainer" containerID="e5f3d72ec10226c7ab1167503198c66d4a22d49dd4bc12c569f0612c2ff69e2d"
Mar 08 00:19:28.189523 master-0 kubenswrapper[4059]: E0308 00:19:28.189484 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="e9add8df47182fc2eaf8cd78016ebe72"
Mar 08 00:19:29.193672 master-0 kubenswrapper[4059]: I0308 00:19:29.193632 4059 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/1.log"
Mar 08 00:19:29.195531 master-0 kubenswrapper[4059]: I0308 00:19:29.195508 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 00:19:29.195811 master-0 kubenswrapper[4059]: I0308 00:19:29.195790 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerStarted","Data":"a58a50d55f092d1761d8dfb057eba161b2adfc3672c9c7a2e15f19538478c7ef"}
Mar 08 00:19:29.196081 master-0 kubenswrapper[4059]: I0308 00:19:29.196066 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 00:19:29.196125 master-0 kubenswrapper[4059]: I0308 00:19:29.196087 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 00:19:29.196125 master-0 kubenswrapper[4059]: I0308 00:19:29.196095 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 00:19:29.781254 master-0 kubenswrapper[4059]: I0308 00:19:29.779314 4059 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 00:19:29.781254 master-0 kubenswrapper[4059]: E0308 00:19:29.779315 4059 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ab5acc5b04e37 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:16.979379767 +0000 UTC m=+0.690979289,LastTimestamp:2026-03-08 00:19:16.979379767 +0000 UTC m=+0.690979289,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 00:19:29.781254 master-0 kubenswrapper[4059]: E0308 00:19:29.780267 4059 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found]" interval="7s"
Mar 08 00:19:29.791337 master-0 kubenswrapper[4059]: E0308 00:19:29.791223 4059 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ab5acc8b37ae6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:17.029919462 +0000 UTC m=+0.741518974,LastTimestamp:2026-03-08 00:19:17.029919462 +0000 UTC m=+0.741518974,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 00:19:29.798569 master-0 kubenswrapper[4059]: E0308 00:19:29.798449 4059 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ab5acc8b3c6f4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:17.029938932 +0000 UTC m=+0.741538454,LastTimestamp:2026-03-08 00:19:17.029938932 +0000 UTC m=+0.741538454,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 00:19:29.809479 master-0 kubenswrapper[4059]: E0308 00:19:29.809350 4059 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ab5acc8b3ee9b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:17.029949083 +0000 UTC m=+0.741548615,LastTimestamp:2026-03-08 00:19:17.029949083 +0000 UTC m=+0.741548615,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 00:19:29.813889 master-0 kubenswrapper[4059]: E0308 00:19:29.813760 4059 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ab5acce84b320 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:17.12751696 +0000 UTC m=+0.839116482,LastTimestamp:2026-03-08 00:19:17.12751696 +0000 UTC m=+0.839116482,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 00:19:29.819031 master-0 kubenswrapper[4059]: E0308 00:19:29.818936 4059 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189ab5acc8b37ae6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ab5acc8b37ae6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:17.029919462 +0000 UTC m=+0.741518974,LastTimestamp:2026-03-08 00:19:17.208075919 +0000 UTC m=+0.919675441,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 00:19:29.823053 master-0 kubenswrapper[4059]: I0308 00:19:29.823020 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 00:19:29.823605 master-0 kubenswrapper[4059]: E0308 00:19:29.823310 4059 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189ab5acc8b3c6f4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ab5acc8b3c6f4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:17.029938932 +0000 UTC m=+0.741538454,LastTimestamp:2026-03-08 00:19:17.208095464 +0000 UTC m=+0.919694986,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 00:19:29.823809 master-0 kubenswrapper[4059]: I0308 00:19:29.823792 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 00:19:29.823857 master-0 kubenswrapper[4059]: I0308 00:19:29.823815 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 00:19:29.823857 master-0 kubenswrapper[4059]: I0308 00:19:29.823823 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 00:19:29.823914 master-0 kubenswrapper[4059]: I0308 00:19:29.823861 4059 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 08 00:19:29.829114 master-0 kubenswrapper[4059]: E0308 00:19:29.829081 4059 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0"
Mar 08 00:19:29.829281 master-0 kubenswrapper[4059]: E0308 00:19:29.829179 4059 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189ab5acc8b3ee9b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ab5acc8b3ee9b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:17.029949083 +0000 UTC m=+0.741548615,LastTimestamp:2026-03-08 00:19:17.20810375 +0000 UTC m=+0.919703282,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 00:19:29.835222 master-0 kubenswrapper[4059]: E0308 00:19:29.834325 4059 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189ab5acc8b37ae6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ab5acc8b37ae6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:17.029919462 +0000 UTC m=+0.741518974,LastTimestamp:2026-03-08 00:19:17.235174789 +0000 UTC m=+0.946774321,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 00:19:29.839234 master-0 kubenswrapper[4059]: E0308 00:19:29.838110 4059 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189ab5acc8b3c6f4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ab5acc8b3c6f4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:17.029938932 +0000 UTC m=+0.741538454,LastTimestamp:2026-03-08 00:19:17.235216792 +0000 UTC m=+0.946816324,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 00:19:29.843760 master-0 kubenswrapper[4059]: E0308 00:19:29.843670 4059 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189ab5acc8b3ee9b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ab5acc8b3ee9b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:17.029949083 +0000 UTC m=+0.741548615,LastTimestamp:2026-03-08 00:19:17.23522723 +0000 UTC m=+0.946826762,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 00:19:29.848193 master-0 kubenswrapper[4059]: E0308 00:19:29.848102 4059 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189ab5acc8b37ae6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ab5acc8b37ae6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:17.029919462 +0000 UTC m=+0.741518974,LastTimestamp:2026-03-08 00:19:17.236365112 +0000 UTC m=+0.947964634,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 00:19:29.852826 master-0 kubenswrapper[4059]: E0308 00:19:29.852751 4059 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189ab5acc8b3c6f4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ab5acc8b3c6f4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:17.029938932 +0000 UTC m=+0.741538454,LastTimestamp:2026-03-08 00:19:17.236379723 +0000 UTC m=+0.947979245,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 00:19:29.858178 master-0 kubenswrapper[4059]: E0308 00:19:29.857985 4059 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189ab5acc8b3ee9b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ab5acc8b3ee9b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:17.029949083 +0000 UTC m=+0.741548615,LastTimestamp:2026-03-08 00:19:17.23638867 +0000 UTC m=+0.947988192,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 00:19:29.863219 master-0 kubenswrapper[4059]: E0308 00:19:29.863079 4059 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189ab5acc8b37ae6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ab5acc8b37ae6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:17.029919462 +0000 UTC m=+0.741518974,LastTimestamp:2026-03-08 00:19:17.236442071 +0000 UTC m=+0.948041583,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 00:19:29.868619 master-0 kubenswrapper[4059]: E0308 00:19:29.868508 4059 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189ab5acc8b3c6f4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ab5acc8b3c6f4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:17.029938932 +0000 UTC m=+0.741538454,LastTimestamp:2026-03-08 00:19:17.236455601 +0000 UTC m=+0.948055124,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 00:19:29.891158 master-0 kubenswrapper[4059]: E0308 00:19:29.890614 4059 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189ab5acc8b3ee9b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ab5acc8b3ee9b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:17.029949083 +0000 UTC m=+0.741548615,LastTimestamp:2026-03-08 00:19:17.236463227 +0000 UTC m=+0.948062739,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 00:19:29.896787 master-0 kubenswrapper[4059]: E0308 00:19:29.896255 4059 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189ab5acc8b37ae6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ab5acc8b37ae6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:17.029919462 +0000 UTC m=+0.741518974,LastTimestamp:2026-03-08 00:19:17.237286609 +0000 UTC m=+0.948886131,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 00:19:29.901728 master-0 kubenswrapper[4059]: E0308 00:19:29.901612 4059 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189ab5acc8b3c6f4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ab5acc8b3c6f4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:17.029938932 +0000 UTC m=+0.741538454,LastTimestamp:2026-03-08 00:19:17.237297517 +0000 UTC m=+0.948897039,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 00:19:29.906676 master-0 kubenswrapper[4059]: E0308 00:19:29.906568 4059 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189ab5acc8b37ae6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ab5acc8b37ae6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:17.029919462 +0000 UTC m=+0.741518974,LastTimestamp:2026-03-08 00:19:17.237304322 +0000 UTC m=+0.948903834,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 00:19:29.910980 master-0 kubenswrapper[4059]: E0308 00:19:29.910904 4059 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189ab5acc8b3ee9b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ab5acc8b3ee9b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:17.029949083 +0000 UTC m=+0.741548615,LastTimestamp:2026-03-08 00:19:17.237306394 +0000 UTC m=+0.948905916,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 00:19:29.914950 master-0 kubenswrapper[4059]: E0308 00:19:29.914789 4059 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189ab5acc8b3c6f4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ab5acc8b3c6f4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:17.029938932 +0000 UTC m=+0.741538454,LastTimestamp:2026-03-08 00:19:17.237318833 +0000 UTC m=+0.948918355,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 00:19:29.920638 master-0 kubenswrapper[4059]: E0308 00:19:29.920501 4059 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189ab5acc8b3ee9b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ab5acc8b3ee9b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:17.029949083 +0000 UTC m=+0.741548615,LastTimestamp:2026-03-08 00:19:17.237326599 +0000 UTC m=+0.948926121,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 00:19:29.926624 master-0 kubenswrapper[4059]: E0308 00:19:29.926516 4059 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189ab5acc8b37ae6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ab5acc8b37ae6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:17.029919462 +0000 UTC m=+0.741518974,LastTimestamp:2026-03-08 00:19:17.237796199 +0000 UTC m=+0.949395722,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 00:19:29.942093 master-0 kubenswrapper[4059]: E0308 00:19:29.941300 4059 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189ab5acc8b3c6f4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ab5acc8b3c6f4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:17.029938932 +0000 UTC m=+0.741538454,LastTimestamp:2026-03-08 00:19:17.23780946 +0000 UTC m=+0.949408982,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 00:19:29.965214 master-0 kubenswrapper[4059]: E0308 00:19:29.964998 4059 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189ab5ad1c6f08c9 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:18.434719945 +0000 UTC m=+2.146319468,LastTimestamp:2026-03-08 00:19:18.434719945 +0000 UTC m=+2.146319468,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 00:19:29.978292 master-0 kubenswrapper[4059]: E0308 00:19:29.978090 4059 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189ab5ad1db4e9bf kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:a1a56802af72ce1aac6b5077f1695ac0,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:18.456076735 +0000 UTC m=+2.167676268,LastTimestamp:2026-03-08 00:19:18.456076735 +0000 UTC m=+2.167676268,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 00:19:29.993763 master-0 kubenswrapper[4059]: E0308 00:19:29.989187 4059 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189ab5ad1dd8c25c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:18.458425948 +0000 UTC m=+2.170025470,LastTimestamp:2026-03-08 00:19:18.458425948 +0000 UTC m=+2.170025470,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 00:19:29.994742 master-0 kubenswrapper[4059]: I0308 00:19:29.994710 4059 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 00:19:29.994930 master-0 kubenswrapper[4059]: E0308 00:19:29.994821 4059 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189ab5ad1f53d6dc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:18.48326934 +0000 UTC m=+2.194868872,LastTimestamp:2026-03-08 00:19:18.48326934 +0000 UTC m=+2.194868872,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 00:19:30.010733 master-0 kubenswrapper[4059]: E0308 00:19:30.010576 4059 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189ab5ad20ac3109 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:18.505836809 +0000
UTC m=+2.217436331,LastTimestamp:2026-03-08 00:19:18.505836809 +0000 UTC m=+2.217436331,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:19:30.015622 master-0 kubenswrapper[4059]: E0308 00:19:30.015497 4059 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189ab5ad79421985 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\" in 1.557s (1.557s including waiting). 
Image size: 465086330 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:19.992056197 +0000 UTC m=+3.703655719,LastTimestamp:2026-03-08 00:19:19.992056197 +0000 UTC m=+3.703655719,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:19:30.019702 master-0 kubenswrapper[4059]: E0308 00:19:30.019554 4059 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189ab5ad83647b7d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:20.162081661 +0000 UTC m=+3.873681183,LastTimestamp:2026-03-08 00:19:20.162081661 +0000 UTC m=+3.873681183,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:19:30.023779 master-0 kubenswrapper[4059]: E0308 00:19:30.023641 4059 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189ab5ad84124338 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:20.17347052 +0000 UTC m=+3.885070042,LastTimestamp:2026-03-08 00:19:20.17347052 +0000 UTC m=+3.885070042,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:19:30.030257 master-0 kubenswrapper[4059]: E0308 00:19:30.030050 4059 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189ab5ae06a93a93 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:22.364402323 +0000 UTC m=+6.076001845,LastTimestamp:2026-03-08 00:19:22.364402323 +0000 UTC m=+6.076001845,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:19:30.034627 master-0 kubenswrapper[4059]: E0308 00:19:30.034458 4059 event.go:359] "Server rejected event (will not retry!)" err="events is 
forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189ab5ae07636295 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b\" in 3.893s (3.893s including waiting). Image size: 529324693 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:22.376602261 +0000 UTC m=+6.088201783,LastTimestamp:2026-03-08 00:19:22.376602261 +0000 UTC m=+6.088201783,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:19:30.040312 master-0 kubenswrapper[4059]: E0308 00:19:30.040211 4059 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189ab5ae1bc8e490 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container: etcdctl,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:22.718798992 +0000 UTC m=+6.430398514,LastTimestamp:2026-03-08 00:19:22.718798992 +0000 UTC m=+6.430398514,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:19:30.045429 master-0 kubenswrapper[4059]: E0308 00:19:30.045316 4059 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189ab5ae406214ba openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:23.332818106 +0000 UTC m=+7.044417618,LastTimestamp:2026-03-08 00:19:23.332818106 +0000 UTC m=+7.044417618,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:19:30.052650 master-0 kubenswrapper[4059]: E0308 00:19:30.052449 4059 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189ab5ae40794086 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:23.334336646 +0000 UTC m=+7.045936168,LastTimestamp:2026-03-08 
00:19:23.334336646 +0000 UTC m=+7.045936168,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:19:30.073372 master-0 kubenswrapper[4059]: E0308 00:19:30.058273 4059 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189ab5ae40983ecb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:23.336367819 +0000 UTC m=+7.047967341,LastTimestamp:2026-03-08 00:19:23.336367819 +0000 UTC m=+7.047967341,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:19:30.073372 master-0 kubenswrapper[4059]: E0308 00:19:30.068625 4059 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189ab5ae43c80992 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:23.38983157 +0000 UTC m=+7.101431092,LastTimestamp:2026-03-08 00:19:23.38983157 +0000 UTC m=+7.101431092,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:19:30.076285 master-0 kubenswrapper[4059]: E0308 00:19:30.075026 4059 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189ab5ae8e8e0983 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container: etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:24.644321667 +0000 UTC m=+8.355921189,LastTimestamp:2026-03-08 00:19:24.644321667 +0000 UTC m=+8.355921189,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:19:30.081541 master-0 kubenswrapper[4059]: E0308 00:19:30.081415 4059 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189ab5ae9154e053 openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:24.690907219 +0000 UTC m=+8.402506741,LastTimestamp:2026-03-08 00:19:24.690907219 +0000 UTC m=+8.402506741,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:19:30.086235 master-0 kubenswrapper[4059]: E0308 00:19:30.086075 4059 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189ab5ae06a93a93\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189ab5ae06a93a93 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:22.364402323 +0000 UTC m=+6.076001845,LastTimestamp:2026-03-08 00:19:25.170938011 +0000 UTC m=+8.882537552,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:19:30.091166 master-0 kubenswrapper[4059]: E0308 00:19:30.091034 4059 
event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189ab5ae406214ba\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189ab5ae406214ba openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:23.332818106 +0000 UTC m=+7.044417618,LastTimestamp:2026-03-08 00:19:25.593708601 +0000 UTC m=+9.305308123,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:19:30.102739 master-0 kubenswrapper[4059]: E0308 00:19:30.097760 4059 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189ab5ae43c80992\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189ab5ae43c80992 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:23.38983157 +0000 UTC 
m=+7.101431092,LastTimestamp:2026-03-08 00:19:25.83034655 +0000 UTC m=+9.541946062,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:19:30.102739 master-0 kubenswrapper[4059]: E0308 00:19:30.101957 4059 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189ab5af530f8df0 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:a1a56802af72ce1aac6b5077f1695ac0,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\" in 9.485s (9.485s including waiting). 
Image size: 943837171 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:27.941144048 +0000 UTC m=+11.652743570,LastTimestamp:2026-03-08 00:19:27.941144048 +0000 UTC m=+11.652743570,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:19:30.106095 master-0 kubenswrapper[4059]: E0308 00:19:30.106033 4059 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189ab5af5512c458 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\" in 9.516s (9.516s including waiting). 
Image size: 943837171 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:27.974909016 +0000 UTC m=+11.686508538,LastTimestamp:2026-03-08 00:19:27.974909016 +0000 UTC m=+11.686508538,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:19:30.109400 master-0 kubenswrapper[4059]: E0308 00:19:30.109309 4059 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189ab5af56b200c1 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\" in 9.496s (9.496s including waiting). 
Image size: 943837171 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:28.002121921 +0000 UTC m=+11.713721443,LastTimestamp:2026-03-08 00:19:28.002121921 +0000 UTC m=+11.713721443,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:19:30.112701 master-0 kubenswrapper[4059]: E0308 00:19:30.112624 4059 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189ab5af5dc405fa kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:a1a56802af72ce1aac6b5077f1695ac0,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container: kube-scheduler,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:28.120743418 +0000 UTC m=+11.832342940,LastTimestamp:2026-03-08 00:19:28.120743418 +0000 UTC m=+11.832342940,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:19:30.116816 master-0 kubenswrapper[4059]: E0308 00:19:30.116744 4059 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189ab5af5e4aafa0 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:a1a56802af72ce1aac6b5077f1695ac0,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:28.129568672 +0000 UTC m=+11.841168194,LastTimestamp:2026-03-08 00:19:28.129568672 +0000 UTC m=+11.841168194,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:19:30.120959 master-0 kubenswrapper[4059]: E0308 00:19:30.120874 4059 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189ab5af5ef2e010 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:28.14059112 +0000 UTC m=+11.852190642,LastTimestamp:2026-03-08 00:19:28.14059112 +0000 UTC m=+11.852190642,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:19:30.124661 master-0 kubenswrapper[4059]: E0308 00:19:30.124546 4059 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" 
event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189ab5af5f021cf4 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container: kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:28.141589748 +0000 UTC m=+11.853189270,LastTimestamp:2026-03-08 00:19:28.141589748 +0000 UTC m=+11.853189270,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:19:30.129303 master-0 kubenswrapper[4059]: E0308 00:19:30.129225 4059 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189ab5af5f7fff4e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:28.149839694 +0000 UTC m=+11.861439216,LastTimestamp:2026-03-08 00:19:28.149839694 +0000 UTC m=+11.861439216,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:19:30.132996 master-0 kubenswrapper[4059]: E0308 00:19:30.132902 4059 event.go:359] "Server rejected event (will not retry!)" err="events is 
forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189ab5af5f9262f1 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:28.151044849 +0000 UTC m=+11.862644371,LastTimestamp:2026-03-08 00:19:28.151044849 +0000 UTC m=+11.862644371,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:19:30.136492 master-0 kubenswrapper[4059]: E0308 00:19:30.136409 4059 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189ab5af5faaee21 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a324f47cf789c0480fa4bcb0812152abc3cd844318bab193108fe4349eed609\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:28.152653345 +0000 UTC m=+11.864252857,LastTimestamp:2026-03-08 00:19:28.152653345 +0000 UTC m=+11.864252857,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:19:30.141292 master-0 kubenswrapper[4059]: E0308 00:19:30.141141 4059 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189ab5af6184f6ec openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:28.18371966 +0000 UTC m=+11.895319182,LastTimestamp:2026-03-08 00:19:28.18371966 +0000 UTC m=+11.895319182,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:19:30.145284 master-0 kubenswrapper[4059]: E0308 00:19:30.145188 4059 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189ab5af61dc2c4f openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:28.189434959 +0000 UTC m=+11.901034501,LastTimestamp:2026-03-08 00:19:28.189434959 +0000 UTC m=+11.901034501,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:19:30.149285 master-0 kubenswrapper[4059]: E0308 00:19:30.149213 4059 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189ab5af6d504bec openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container: kube-apiserver,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:28.381594604 +0000 UTC m=+12.093194126,LastTimestamp:2026-03-08 00:19:28.381594604 +0000 UTC m=+12.093194126,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:19:30.152843 master-0 kubenswrapper[4059]: E0308 00:19:30.152776 4059 event.go:359] "Server rejected event (will not retry!)" err="events is 
forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189ab5af6dd9e3e8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:28.390611944 +0000 UTC m=+12.102211466,LastTimestamp:2026-03-08 00:19:28.390611944 +0000 UTC m=+12.102211466,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:19:30.156912 master-0 kubenswrapper[4059]: E0308 00:19:30.156771 4059 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189ab5af6de82b2e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:28.391547694 +0000 UTC m=+12.103147216,LastTimestamp:2026-03-08 00:19:28.391547694 +0000 UTC m=+12.103147216,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:19:30.991249 master-0 kubenswrapper[4059]: I0308 00:19:30.991186 4059 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 00:19:31.165689 master-0 kubenswrapper[4059]: E0308 00:19:31.165493 4059 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189ab5b012ec5f13 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a324f47cf789c0480fa4bcb0812152abc3cd844318bab193108fe4349eed609\" in 3.007s (3.007s including waiting). 
Image size: 505242594 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:31.160063763 +0000 UTC m=+14.871663285,LastTimestamp:2026-03-08 00:19:31.160063763 +0000 UTC m=+14.871663285,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:19:31.177867 master-0 kubenswrapper[4059]: E0308 00:19:31.177718 4059 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189ab5b013b85f53 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\" in 2.781s (2.781s including waiting). 
Image size: 514980169 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:31.173433171 +0000 UTC m=+14.885032723,LastTimestamp:2026-03-08 00:19:31.173433171 +0000 UTC m=+14.885032723,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:19:31.346236 master-0 kubenswrapper[4059]: E0308 00:19:31.346019 4059 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189ab5b01dc0997d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container: kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:31.341744509 +0000 UTC m=+15.053344031,LastTimestamp:2026-03-08 00:19:31.341744509 +0000 UTC m=+15.053344031,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:19:31.351664 master-0 kubenswrapper[4059]: E0308 00:19:31.351556 4059 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189ab5b01e00b3ec kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container: cluster-policy-controller,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:31.34594558 +0000 UTC m=+15.057545102,LastTimestamp:2026-03-08 00:19:31.34594558 +0000 UTC m=+15.057545102,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:19:31.359243 master-0 kubenswrapper[4059]: E0308 00:19:31.358978 4059 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189ab5b01e7d747a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:31.354121338 +0000 UTC m=+15.065720860,LastTimestamp:2026-03-08 00:19:31.354121338 +0000 UTC m=+15.065720860,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:19:31.364059 master-0 kubenswrapper[4059]: E0308 00:19:31.363923 4059 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189ab5b01e9b6a46 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:31.356084806 +0000 UTC m=+15.067684328,LastTimestamp:2026-03-08 00:19:31.356084806 +0000 UTC m=+15.067684328,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:19:31.514038 master-0 kubenswrapper[4059]: W0308 00:19:31.513984 4059 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 08 00:19:31.514289 master-0 kubenswrapper[4059]: E0308 00:19:31.514056 4059 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 08 00:19:31.991699 master-0 kubenswrapper[4059]: I0308 00:19:31.991609 4059 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 00:19:32.090468 master-0 kubenswrapper[4059]: I0308 00:19:32.090347 4059 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 08 00:19:32.109412 
master-0 kubenswrapper[4059]: I0308 00:19:32.109346 4059 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 08 00:19:32.206490 master-0 kubenswrapper[4059]: I0308 00:19:32.206403 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"09882f77899e1a73f2e7f7b1d393cad387349597cd777096a1f2accf4684e1d0"} Mar 08 00:19:32.206490 master-0 kubenswrapper[4059]: I0308 00:19:32.206462 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:19:32.207636 master-0 kubenswrapper[4059]: I0308 00:19:32.207605 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 00:19:32.207636 master-0 kubenswrapper[4059]: I0308 00:19:32.207636 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 00:19:32.207853 master-0 kubenswrapper[4059]: I0308 00:19:32.207646 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 00:19:32.210072 master-0 kubenswrapper[4059]: I0308 00:19:32.210021 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:19:32.210176 master-0 kubenswrapper[4059]: I0308 00:19:32.209996 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerStarted","Data":"2e9133d4477bb44d83a396e80738171a7ba17de22760faabb67c1d5a203fddcc"} Mar 08 00:19:32.210757 master-0 kubenswrapper[4059]: I0308 00:19:32.210716 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 00:19:32.210757 
master-0 kubenswrapper[4059]: I0308 00:19:32.210741 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 00:19:32.210757 master-0 kubenswrapper[4059]: I0308 00:19:32.210754 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 00:19:32.993391 master-0 kubenswrapper[4059]: I0308 00:19:32.993317 4059 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 00:19:33.212059 master-0 kubenswrapper[4059]: I0308 00:19:33.212033 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:19:33.213043 master-0 kubenswrapper[4059]: I0308 00:19:33.213028 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:19:33.213801 master-0 kubenswrapper[4059]: I0308 00:19:33.213770 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 00:19:33.213849 master-0 kubenswrapper[4059]: I0308 00:19:33.213814 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 00:19:33.213849 master-0 kubenswrapper[4059]: I0308 00:19:33.213826 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 00:19:33.213963 master-0 kubenswrapper[4059]: I0308 00:19:33.213948 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 00:19:33.214029 master-0 kubenswrapper[4059]: I0308 00:19:33.214020 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 00:19:33.214093 
master-0 kubenswrapper[4059]: I0308 00:19:33.214083 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 00:19:33.449654 master-0 kubenswrapper[4059]: I0308 00:19:33.449601 4059 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:19:33.562794 master-0 kubenswrapper[4059]: I0308 00:19:33.562732 4059 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:19:33.843605 master-0 kubenswrapper[4059]: I0308 00:19:33.843490 4059 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 00:19:33.994060 master-0 kubenswrapper[4059]: I0308 00:19:33.993996 4059 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 00:19:34.215416 master-0 kubenswrapper[4059]: I0308 00:19:34.215187 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:19:34.215416 master-0 kubenswrapper[4059]: I0308 00:19:34.215238 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:19:34.216730 master-0 kubenswrapper[4059]: I0308 00:19:34.216674 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 00:19:34.216730 master-0 kubenswrapper[4059]: I0308 00:19:34.216721 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 00:19:34.216730 master-0 kubenswrapper[4059]: I0308 00:19:34.216742 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" 
event="NodeHasNoDiskPressure" Mar 08 00:19:34.217010 master-0 kubenswrapper[4059]: I0308 00:19:34.216757 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 00:19:34.217010 master-0 kubenswrapper[4059]: I0308 00:19:34.216766 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 00:19:34.217010 master-0 kubenswrapper[4059]: I0308 00:19:34.216780 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 00:19:34.671094 master-0 kubenswrapper[4059]: W0308 00:19:34.670983 4059 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 08 00:19:34.671472 master-0 kubenswrapper[4059]: E0308 00:19:34.671095 4059 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 08 00:19:34.991433 master-0 kubenswrapper[4059]: I0308 00:19:34.991223 4059 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 00:19:35.217685 master-0 kubenswrapper[4059]: I0308 00:19:35.217620 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:19:35.218511 master-0 kubenswrapper[4059]: I0308 00:19:35.218479 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" 
event="NodeHasSufficientMemory" Mar 08 00:19:35.218566 master-0 kubenswrapper[4059]: I0308 00:19:35.218518 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 00:19:35.218566 master-0 kubenswrapper[4059]: I0308 00:19:35.218531 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 00:19:35.420011 master-0 kubenswrapper[4059]: W0308 00:19:35.419953 4059 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "master-0" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 08 00:19:35.420011 master-0 kubenswrapper[4059]: E0308 00:19:35.420020 4059 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"master-0\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 08 00:19:35.577570 master-0 kubenswrapper[4059]: I0308 00:19:35.577459 4059 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:19:35.581474 master-0 kubenswrapper[4059]: I0308 00:19:35.581435 4059 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:19:35.990465 master-0 kubenswrapper[4059]: I0308 00:19:35.990351 4059 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 00:19:36.219429 master-0 kubenswrapper[4059]: I0308 00:19:36.219326 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 
00:19:36.220298 master-0 kubenswrapper[4059]: I0308 00:19:36.220276 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 00:19:36.220376 master-0 kubenswrapper[4059]: I0308 00:19:36.220310 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 00:19:36.220376 master-0 kubenswrapper[4059]: I0308 00:19:36.220318 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 00:19:36.786573 master-0 kubenswrapper[4059]: E0308 00:19:36.786521 4059 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 08 00:19:36.815338 master-0 kubenswrapper[4059]: I0308 00:19:36.813745 4059 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 00:19:36.815338 master-0 kubenswrapper[4059]: I0308 00:19:36.813947 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:19:36.815760 master-0 kubenswrapper[4059]: I0308 00:19:36.815740 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 00:19:36.815760 master-0 kubenswrapper[4059]: I0308 00:19:36.815764 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 00:19:36.815917 master-0 kubenswrapper[4059]: I0308 00:19:36.815775 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 00:19:36.822033 master-0 kubenswrapper[4059]: I0308 00:19:36.821968 4059 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 00:19:36.830695 master-0 kubenswrapper[4059]: I0308 00:19:36.829974 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:19:36.831624 master-0 kubenswrapper[4059]: I0308 00:19:36.831535 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 00:19:36.831920 master-0 kubenswrapper[4059]: I0308 00:19:36.831581 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 00:19:36.832631 master-0 kubenswrapper[4059]: I0308 00:19:36.831866 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 00:19:36.832631 master-0 kubenswrapper[4059]: I0308 00:19:36.832181 4059 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 00:19:36.838129 master-0 kubenswrapper[4059]: E0308 00:19:36.838089 4059 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0" Mar 08 00:19:36.991155 master-0 kubenswrapper[4059]: I0308 00:19:36.991047 4059 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 00:19:37.107009 master-0 kubenswrapper[4059]: E0308 00:19:37.106890 4059 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 08 00:19:37.146576 master-0 kubenswrapper[4059]: I0308 00:19:37.146510 4059 csr.go:261] certificate signing request csr-s42bz is approved, waiting to be issued Mar 08 
00:19:37.221110 master-0 kubenswrapper[4059]: I0308 00:19:37.221040 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:19:37.221577 master-0 kubenswrapper[4059]: I0308 00:19:37.221131 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:19:37.222362 master-0 kubenswrapper[4059]: I0308 00:19:37.222312 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 00:19:37.222362 master-0 kubenswrapper[4059]: I0308 00:19:37.222349 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 00:19:37.222439 master-0 kubenswrapper[4059]: I0308 00:19:37.222375 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 00:19:37.222439 master-0 kubenswrapper[4059]: I0308 00:19:37.222388 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 00:19:37.222439 master-0 kubenswrapper[4059]: I0308 00:19:37.222375 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 00:19:37.222522 master-0 kubenswrapper[4059]: I0308 00:19:37.222445 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 00:19:37.226616 master-0 kubenswrapper[4059]: I0308 00:19:37.226575 4059 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 00:19:37.991869 master-0 kubenswrapper[4059]: I0308 00:19:37.991826 4059 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at 
the cluster scope
Mar 08 00:19:38.223751 master-0 kubenswrapper[4059]: I0308 00:19:38.223723 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 00:19:38.225238 master-0 kubenswrapper[4059]: I0308 00:19:38.225184 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 00:19:38.225324 master-0 kubenswrapper[4059]: I0308 00:19:38.225260 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 00:19:38.225324 master-0 kubenswrapper[4059]: I0308 00:19:38.225278 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 00:19:38.769706 master-0 kubenswrapper[4059]: W0308 00:19:38.769610 4059 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope
Mar 08 00:19:38.769950 master-0 kubenswrapper[4059]: E0308 00:19:38.769715 4059 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 08 00:19:38.990141 master-0 kubenswrapper[4059]: I0308 00:19:38.990072 4059 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 00:19:39.991440 master-0 kubenswrapper[4059]: I0308 00:19:39.991368 4059 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 00:19:40.134546 master-0 kubenswrapper[4059]: I0308 00:19:40.134474 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 00:19:40.135635 master-0 kubenswrapper[4059]: I0308 00:19:40.135590 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 00:19:40.135885 master-0 kubenswrapper[4059]: I0308 00:19:40.135860 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 00:19:40.136041 master-0 kubenswrapper[4059]: I0308 00:19:40.136019 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 00:19:40.136914 master-0 kubenswrapper[4059]: I0308 00:19:40.136881 4059 scope.go:117] "RemoveContainer" containerID="e5f3d72ec10226c7ab1167503198c66d4a22d49dd4bc12c569f0612c2ff69e2d"
Mar 08 00:19:40.147090 master-0 kubenswrapper[4059]: E0308 00:19:40.146790 4059 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189ab5ae06a93a93\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189ab5ae06a93a93 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:22.364402323 +0000 UTC m=+6.076001845,LastTimestamp:2026-03-08 00:19:40.141504126 +0000 UTC m=+23.853103678,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 00:19:40.387534 master-0 kubenswrapper[4059]: E0308 00:19:40.387113 4059 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189ab5ae406214ba\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189ab5ae406214ba openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:23.332818106 +0000 UTC m=+7.044417618,LastTimestamp:2026-03-08 00:19:40.380392255 +0000 UTC m=+24.091991777,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 00:19:40.396033 master-0 kubenswrapper[4059]: E0308 00:19:40.395911 4059 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189ab5ae43c80992\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189ab5ae43c80992 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:23.38983157 +0000 UTC m=+7.101431092,LastTimestamp:2026-03-08 00:19:40.389221119 +0000 UTC m=+24.100820641,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 00:19:40.993437 master-0 kubenswrapper[4059]: I0308 00:19:40.993352 4059 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 00:19:41.233379 master-0 kubenswrapper[4059]: I0308 00:19:41.233332 4059 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/2.log"
Mar 08 00:19:41.234059 master-0 kubenswrapper[4059]: I0308 00:19:41.234025 4059 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/1.log"
Mar 08 00:19:41.235437 master-0 kubenswrapper[4059]: I0308 00:19:41.235140 4059 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="3c9cc0ea8b8c8c3c9346819b130170a92470b9a87fb7c1462d7680ef7197ef47" exitCode=1
Mar 08 00:19:41.235437 master-0 kubenswrapper[4059]: I0308 00:19:41.235198 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"3c9cc0ea8b8c8c3c9346819b130170a92470b9a87fb7c1462d7680ef7197ef47"}
Mar 08 00:19:41.235437 master-0 kubenswrapper[4059]: I0308 00:19:41.235284 4059 scope.go:117] "RemoveContainer" containerID="e5f3d72ec10226c7ab1167503198c66d4a22d49dd4bc12c569f0612c2ff69e2d"
Mar 08 00:19:41.235437 master-0 kubenswrapper[4059]: I0308 00:19:41.235405 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 00:19:41.237037 master-0 kubenswrapper[4059]: I0308 00:19:41.237015 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 00:19:41.237125 master-0 kubenswrapper[4059]: I0308 00:19:41.237054 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 00:19:41.237125 master-0 kubenswrapper[4059]: I0308 00:19:41.237076 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 00:19:41.237704 master-0 kubenswrapper[4059]: I0308 00:19:41.237628 4059 scope.go:117] "RemoveContainer" containerID="3c9cc0ea8b8c8c3c9346819b130170a92470b9a87fb7c1462d7680ef7197ef47"
Mar 08 00:19:41.238074 master-0 kubenswrapper[4059]: E0308 00:19:41.238015 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="e9add8df47182fc2eaf8cd78016ebe72"
Mar 08 00:19:41.245193 master-0 kubenswrapper[4059]: E0308 00:19:41.245043 4059 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189ab5af61dc2c4f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189ab5af61dc2c4f openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:19:28.189434959 +0000 UTC m=+11.901034501,LastTimestamp:2026-03-08 00:19:41.237956009 +0000 UTC m=+24.949555571,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 00:19:41.496498 master-0 kubenswrapper[4059]: I0308 00:19:41.496243 4059 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 00:19:41.496498 master-0 kubenswrapper[4059]: I0308 00:19:41.496504 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 00:19:41.497877 master-0 kubenswrapper[4059]: I0308 00:19:41.497823 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 00:19:41.497956 master-0 kubenswrapper[4059]: I0308 00:19:41.497892 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 00:19:41.497956 master-0 kubenswrapper[4059]: I0308 00:19:41.497913 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 00:19:41.502154 master-0 kubenswrapper[4059]: I0308 00:19:41.501953 4059 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 00:19:41.502154 master-0 kubenswrapper[4059]: I0308 00:19:41.502112 4059 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 00:19:41.990825 master-0 kubenswrapper[4059]: I0308 00:19:41.990717 4059 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 00:19:42.239866 master-0 kubenswrapper[4059]: I0308 00:19:42.239789 4059 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/2.log"
Mar 08 00:19:42.240567 master-0 kubenswrapper[4059]: I0308 00:19:42.240386 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 00:19:42.241165 master-0 kubenswrapper[4059]: I0308 00:19:42.241072 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 00:19:42.241165 master-0 kubenswrapper[4059]: I0308 00:19:42.241103 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 00:19:42.241165 master-0 kubenswrapper[4059]: I0308 00:19:42.241115 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 00:19:42.245914 master-0 kubenswrapper[4059]: I0308 00:19:42.245860 4059 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 00:19:42.994300 master-0 kubenswrapper[4059]: I0308 00:19:42.994140 4059 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 00:19:43.242786 master-0 kubenswrapper[4059]: I0308 00:19:43.242708 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 00:19:43.244018 master-0 kubenswrapper[4059]: I0308 00:19:43.243887 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 00:19:43.244018 master-0 kubenswrapper[4059]: I0308 00:19:43.243941 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 00:19:43.244018 master-0 kubenswrapper[4059]: I0308 00:19:43.243965 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 00:19:43.795151 master-0 kubenswrapper[4059]: E0308 00:19:43.795031 4059 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 08 00:19:43.839260 master-0 kubenswrapper[4059]: I0308 00:19:43.839138 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 00:19:43.840805 master-0 kubenswrapper[4059]: I0308 00:19:43.840731 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 00:19:43.840805 master-0 kubenswrapper[4059]: I0308 00:19:43.840811 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 00:19:43.841062 master-0 kubenswrapper[4059]: I0308 00:19:43.840834 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 00:19:43.841062 master-0 kubenswrapper[4059]: I0308 00:19:43.840949 4059 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 08 00:19:43.847267 master-0 kubenswrapper[4059]: E0308 00:19:43.847091 4059 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0"
Mar 08 00:19:43.992497 master-0 kubenswrapper[4059]: I0308 00:19:43.992412 4059 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 00:19:44.245417 master-0 kubenswrapper[4059]: I0308 00:19:44.245378 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 00:19:44.247650 master-0 kubenswrapper[4059]: I0308 00:19:44.247608 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 00:19:44.247746 master-0 kubenswrapper[4059]: I0308 00:19:44.247658 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 00:19:44.247746 master-0 kubenswrapper[4059]: I0308 00:19:44.247674 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 00:19:44.992176 master-0 kubenswrapper[4059]: I0308 00:19:44.992103 4059 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 00:19:45.994557 master-0 kubenswrapper[4059]: I0308 00:19:45.994512 4059 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 00:19:46.819183 master-0 kubenswrapper[4059]: I0308 00:19:46.819136 4059 csr.go:257] certificate signing request csr-s42bz is issued
Mar 08 00:19:46.995176 master-0 kubenswrapper[4059]: I0308 00:19:46.995121 4059 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 08 00:19:47.010807 master-0 kubenswrapper[4059]: I0308 00:19:47.010746 4059 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 08 00:19:47.068969 master-0 kubenswrapper[4059]: I0308 00:19:47.068913 4059 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 08 00:19:47.107446 master-0 kubenswrapper[4059]: E0308 00:19:47.107343 4059 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 08 00:19:47.336214 master-0 kubenswrapper[4059]: I0308 00:19:47.336154 4059 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 08 00:19:47.336214 master-0 kubenswrapper[4059]: E0308 00:19:47.336192 4059 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found
Mar 08 00:19:47.355710 master-0 kubenswrapper[4059]: I0308 00:19:47.355674 4059 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 08 00:19:47.371350 master-0 kubenswrapper[4059]: I0308 00:19:47.371260 4059 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 08 00:19:47.430516 master-0 kubenswrapper[4059]: I0308 00:19:47.430459 4059 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 08 00:19:47.694220 master-0 kubenswrapper[4059]: I0308 00:19:47.694096 4059 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 08 00:19:47.694220 master-0 kubenswrapper[4059]: E0308 00:19:47.694137 4059 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found
Mar 08 00:19:47.712353 master-0 kubenswrapper[4059]: I0308 00:19:47.712261 4059 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Mar 08 00:19:47.803173 master-0 kubenswrapper[4059]: I0308 00:19:47.803118 4059 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 08 00:19:47.817928 master-0 kubenswrapper[4059]: I0308 00:19:47.817899 4059 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 08 00:19:47.821081 master-0 kubenswrapper[4059]: I0308 00:19:47.821028 4059 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-03-09 00:11:49 +0000 UTC, rotation deadline is 2026-03-08 18:38:06.211422704 +0000 UTC
Mar 08 00:19:47.821081 master-0 kubenswrapper[4059]: I0308 00:19:47.821064 4059 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 18h18m18.390361641s for next certificate rotation
Mar 08 00:19:47.875345 master-0 kubenswrapper[4059]: I0308 00:19:47.875315 4059 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 08 00:19:48.145067 master-0 kubenswrapper[4059]: I0308 00:19:48.145016 4059 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 08 00:19:48.145067 master-0 kubenswrapper[4059]: E0308 00:19:48.145046 4059 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found
Mar 08 00:19:48.717360 master-0 kubenswrapper[4059]: I0308 00:19:48.717310 4059 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 08 00:19:48.731840 master-0 kubenswrapper[4059]: I0308 00:19:48.731808 4059 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 08 00:19:48.791925 master-0 kubenswrapper[4059]: I0308 00:19:48.791882 4059 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 08 00:19:49.071330 master-0 kubenswrapper[4059]: I0308 00:19:49.071292 4059 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 08 00:19:49.071588 master-0 kubenswrapper[4059]: E0308 00:19:49.071569 4059 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found
Mar 08 00:19:50.802587 master-0 kubenswrapper[4059]: E0308 00:19:50.802511 4059 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"master-0\" not found" node="master-0"
Mar 08 00:19:50.848006 master-0 kubenswrapper[4059]: I0308 00:19:50.847907 4059 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 00:19:50.849514 master-0 kubenswrapper[4059]: I0308 00:19:50.849451 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 00:19:50.849514 master-0 kubenswrapper[4059]: I0308 00:19:50.849502 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 00:19:50.849514 master-0 kubenswrapper[4059]: I0308 00:19:50.849519 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 00:19:50.849792 master-0 kubenswrapper[4059]: I0308 00:19:50.849594 4059 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 08 00:19:50.862841 master-0 kubenswrapper[4059]: I0308 00:19:50.862770 4059 kubelet_node_status.go:79] "Successfully registered node" node="master-0"
Mar 08 00:19:50.862841 master-0 kubenswrapper[4059]: E0308 00:19:50.862828 4059 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": node \"master-0\" not found"
Mar 08 00:19:50.876036 master-0 kubenswrapper[4059]: E0308 00:19:50.875949 4059 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 08 00:19:50.977023 master-0 kubenswrapper[4059]: E0308 00:19:50.976928 4059 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 08 00:19:51.003702 master-0 kubenswrapper[4059]: I0308 00:19:51.003614 4059 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates
Mar 08 00:19:51.018693 master-0 kubenswrapper[4059]: I0308 00:19:51.018610 4059 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 08 00:19:51.077878 master-0 kubenswrapper[4059]: E0308 00:19:51.077716 4059 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 08 00:19:51.178892 master-0 kubenswrapper[4059]: E0308 00:19:51.178792 4059 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 08 00:19:51.280070 master-0 kubenswrapper[4059]: E0308 00:19:51.279967 4059 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 08 00:19:51.381064 master-0 kubenswrapper[4059]: E0308 00:19:51.380914 4059 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 08 00:19:51.481917 master-0 kubenswrapper[4059]: E0308 00:19:51.481860 4059 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 08 00:19:51.582812 master-0 kubenswrapper[4059]: E0308 00:19:51.582750 4059 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 08 00:19:51.683687 master-0 kubenswrapper[4059]: E0308 00:19:51.683579 4059 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 08 00:19:51.784065 master-0 kubenswrapper[4059]: E0308 00:19:51.783981 4059 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 08 00:19:51.827407 master-0 kubenswrapper[4059]: I0308 00:19:51.827353 4059 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 08 00:19:52.348193 master-0 kubenswrapper[4059]: I0308 00:19:52.347914 4059 apiserver.go:52] "Watching apiserver"
Mar 08 00:19:52.352229 master-0 kubenswrapper[4059]: I0308 00:19:52.352098 4059 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 08 00:19:52.352344 master-0 kubenswrapper[4059]: I0308 00:19:52.352312 4059 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-7c649bf6d4-st2sr"]
Mar 08 00:19:52.352561 master-0 kubenswrapper[4059]: I0308 00:19:52.352533 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-7c649bf6d4-st2sr"
Mar 08 00:19:52.353562 master-0 kubenswrapper[4059]: I0308 00:19:52.353534 4059 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 08 00:19:52.354587 master-0 kubenswrapper[4059]: I0308 00:19:52.354562 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 08 00:19:52.355768 master-0 kubenswrapper[4059]: I0308 00:19:52.355749 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 08 00:19:52.386328 master-0 kubenswrapper[4059]: I0308 00:19:52.386270 4059 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world"
Mar 08 00:19:52.451861 master-0 kubenswrapper[4059]: I0308 00:19:52.451816 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ec2d22f2-c260-42a6-a9da-ee0f44f42303-metrics-tls\") pod \"network-operator-7c649bf6d4-st2sr\" (UID: \"ec2d22f2-c260-42a6-a9da-ee0f44f42303\") " pod="openshift-network-operator/network-operator-7c649bf6d4-st2sr"
Mar 08 00:19:52.451861 master-0 kubenswrapper[4059]: I0308 00:19:52.451864 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlzcz\" (UniqueName: \"kubernetes.io/projected/ec2d22f2-c260-42a6-a9da-ee0f44f42303-kube-api-access-xlzcz\") pod \"network-operator-7c649bf6d4-st2sr\" (UID: \"ec2d22f2-c260-42a6-a9da-ee0f44f42303\") " pod="openshift-network-operator/network-operator-7c649bf6d4-st2sr"
Mar 08 00:19:52.451861 master-0 kubenswrapper[4059]: I0308 00:19:52.451882 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/ec2d22f2-c260-42a6-a9da-ee0f44f42303-host-etc-kube\") pod \"network-operator-7c649bf6d4-st2sr\" (UID: \"ec2d22f2-c260-42a6-a9da-ee0f44f42303\") " pod="openshift-network-operator/network-operator-7c649bf6d4-st2sr"
Mar 08 00:19:52.552733 master-0 kubenswrapper[4059]: I0308 00:19:52.552677 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ec2d22f2-c260-42a6-a9da-ee0f44f42303-metrics-tls\") pod \"network-operator-7c649bf6d4-st2sr\" (UID: \"ec2d22f2-c260-42a6-a9da-ee0f44f42303\") " pod="openshift-network-operator/network-operator-7c649bf6d4-st2sr"
Mar 08 00:19:52.552733 master-0 kubenswrapper[4059]: I0308 00:19:52.552729 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlzcz\" (UniqueName: \"kubernetes.io/projected/ec2d22f2-c260-42a6-a9da-ee0f44f42303-kube-api-access-xlzcz\") pod \"network-operator-7c649bf6d4-st2sr\" (UID: \"ec2d22f2-c260-42a6-a9da-ee0f44f42303\") " pod="openshift-network-operator/network-operator-7c649bf6d4-st2sr"
Mar 08 00:19:52.552997 master-0 kubenswrapper[4059]: I0308 00:19:52.552755 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/ec2d22f2-c260-42a6-a9da-ee0f44f42303-host-etc-kube\") pod \"network-operator-7c649bf6d4-st2sr\" (UID: \"ec2d22f2-c260-42a6-a9da-ee0f44f42303\") " pod="openshift-network-operator/network-operator-7c649bf6d4-st2sr"
Mar 08 00:19:52.552997 master-0 kubenswrapper[4059]: I0308 00:19:52.552834 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/ec2d22f2-c260-42a6-a9da-ee0f44f42303-host-etc-kube\") pod \"network-operator-7c649bf6d4-st2sr\" (UID: \"ec2d22f2-c260-42a6-a9da-ee0f44f42303\") " pod="openshift-network-operator/network-operator-7c649bf6d4-st2sr"
Mar 08 00:19:52.554770 master-0 kubenswrapper[4059]: I0308 00:19:52.554734 4059 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Mar 08 00:19:52.562488 master-0 kubenswrapper[4059]: I0308 00:19:52.562454 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ec2d22f2-c260-42a6-a9da-ee0f44f42303-metrics-tls\") pod \"network-operator-7c649bf6d4-st2sr\" (UID: \"ec2d22f2-c260-42a6-a9da-ee0f44f42303\") " pod="openshift-network-operator/network-operator-7c649bf6d4-st2sr"
Mar 08 00:19:52.569583 master-0 kubenswrapper[4059]: I0308 00:19:52.569539 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlzcz\" (UniqueName: \"kubernetes.io/projected/ec2d22f2-c260-42a6-a9da-ee0f44f42303-kube-api-access-xlzcz\") pod \"network-operator-7c649bf6d4-st2sr\" (UID: \"ec2d22f2-c260-42a6-a9da-ee0f44f42303\") " pod="openshift-network-operator/network-operator-7c649bf6d4-st2sr"
Mar 08 00:19:52.666182 master-0 kubenswrapper[4059]: I0308 00:19:52.666059 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-7c649bf6d4-st2sr"
Mar 08 00:19:52.678086 master-0 kubenswrapper[4059]: W0308 00:19:52.678046 4059 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec2d22f2_c260_42a6_a9da_ee0f44f42303.slice/crio-b24cb8b6e833d760382f41e5306d191f11027b327de5f975b19e63833c3ea28b WatchSource:0}: Error finding container b24cb8b6e833d760382f41e5306d191f11027b327de5f975b19e63833c3ea28b: Status 404 returned error can't find the container with id b24cb8b6e833d760382f41e5306d191f11027b327de5f975b19e63833c3ea28b
Mar 08 00:19:52.920746 master-0 kubenswrapper[4059]: I0308 00:19:52.920605 4059 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq"]
Mar 08 00:19:52.921701 master-0 kubenswrapper[4059]: I0308 00:19:52.921010 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq"
Mar 08 00:19:52.923749 master-0 kubenswrapper[4059]: I0308 00:19:52.923689 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 08 00:19:52.924316 master-0 kubenswrapper[4059]: I0308 00:19:52.924262 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 08 00:19:52.925698 master-0 kubenswrapper[4059]: I0308 00:19:52.925626 4059 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 08 00:19:53.055171 master-0 kubenswrapper[4059]: I0308 00:19:53.055105 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/32c19760-2cb2-4690-be8e-cba3c517c60e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-dcbvq\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq"
Mar 08 00:19:53.055377 master-0 kubenswrapper[4059]: I0308 00:19:53.055183 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/32c19760-2cb2-4690-be8e-cba3c517c60e-kube-api-access\") pod \"cluster-version-operator-745944c6b7-dcbvq\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq"
Mar 08 00:19:53.055377 master-0 kubenswrapper[4059]: I0308 00:19:53.055247 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/32c19760-2cb2-4690-be8e-cba3c517c60e-service-ca\") pod \"cluster-version-operator-745944c6b7-dcbvq\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq"
Mar 08 00:19:53.055377 master-0 kubenswrapper[4059]: I0308 00:19:53.055281 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert\") pod \"cluster-version-operator-745944c6b7-dcbvq\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq"
Mar 08 00:19:53.055377 master-0 kubenswrapper[4059]: I0308 00:19:53.055316 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/32c19760-2cb2-4690-be8e-cba3c517c60e-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-dcbvq\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq"
Mar 08 00:19:53.156547 master-0 kubenswrapper[4059]: I0308 00:19:53.156454 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert\") pod \"cluster-version-operator-745944c6b7-dcbvq\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq"
Mar 08 00:19:53.156547 master-0 kubenswrapper[4059]: I0308 00:19:53.156516 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/32c19760-2cb2-4690-be8e-cba3c517c60e-service-ca\") pod \"cluster-version-operator-745944c6b7-dcbvq\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq"
Mar 08 00:19:53.156547 master-0 kubenswrapper[4059]: I0308 00:19:53.156559 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/32c19760-2cb2-4690-be8e-cba3c517c60e-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-dcbvq\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq"
Mar 08 00:19:53.156911 master-0 kubenswrapper[4059]: I0308 00:19:53.156595 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/32c19760-2cb2-4690-be8e-cba3c517c60e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-dcbvq\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq"
Mar 08 00:19:53.156911 master-0 kubenswrapper[4059]: I0308 00:19:53.156630 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/32c19760-2cb2-4690-be8e-cba3c517c60e-kube-api-access\") pod \"cluster-version-operator-745944c6b7-dcbvq\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq"
Mar 08 00:19:53.157494 master-0 kubenswrapper[4059]: I0308 00:19:53.157399 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/32c19760-2cb2-4690-be8e-cba3c517c60e-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-dcbvq\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq"
Mar 08 00:19:53.157588 master-0 kubenswrapper[4059]: I0308 00:19:53.157486 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/32c19760-2cb2-4690-be8e-cba3c517c60e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-dcbvq\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq"
Mar 08 00:19:53.157654 master-0 kubenswrapper[4059]: E0308 00:19:53.157606 4059 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 08 00:19:53.157825 master-0 kubenswrapper[4059]: E0308 00:19:53.157787 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert podName:32c19760-2cb2-4690-be8e-cba3c517c60e nodeName:}" failed. No retries permitted until 2026-03-08 00:19:53.657670351 +0000 UTC m=+37.369269913 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert") pod "cluster-version-operator-745944c6b7-dcbvq" (UID: "32c19760-2cb2-4690-be8e-cba3c517c60e") : secret "cluster-version-operator-serving-cert" not found Mar 08 00:19:53.159405 master-0 kubenswrapper[4059]: I0308 00:19:53.159354 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/32c19760-2cb2-4690-be8e-cba3c517c60e-service-ca\") pod \"cluster-version-operator-745944c6b7-dcbvq\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq" Mar 08 00:19:53.185519 master-0 kubenswrapper[4059]: I0308 00:19:53.185432 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/32c19760-2cb2-4690-be8e-cba3c517c60e-kube-api-access\") pod \"cluster-version-operator-745944c6b7-dcbvq\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq" Mar 08 00:19:53.272370 master-0 kubenswrapper[4059]: I0308 00:19:53.272268 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-st2sr" event={"ID":"ec2d22f2-c260-42a6-a9da-ee0f44f42303","Type":"ContainerStarted","Data":"b24cb8b6e833d760382f41e5306d191f11027b327de5f975b19e63833c3ea28b"} Mar 08 00:19:53.636708 master-0 kubenswrapper[4059]: I0308 00:19:53.636654 4059 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 08 00:19:53.660296 master-0 kubenswrapper[4059]: I0308 00:19:53.659727 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert\") pod 
\"cluster-version-operator-745944c6b7-dcbvq\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq" Mar 08 00:19:53.660296 master-0 kubenswrapper[4059]: E0308 00:19:53.659894 4059 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 08 00:19:53.660296 master-0 kubenswrapper[4059]: E0308 00:19:53.659968 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert podName:32c19760-2cb2-4690-be8e-cba3c517c60e nodeName:}" failed. No retries permitted until 2026-03-08 00:19:54.659944542 +0000 UTC m=+38.371544104 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert") pod "cluster-version-operator-745944c6b7-dcbvq" (UID: "32c19760-2cb2-4690-be8e-cba3c517c60e") : secret "cluster-version-operator-serving-cert" not found Mar 08 00:19:54.666085 master-0 kubenswrapper[4059]: I0308 00:19:54.666039 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert\") pod \"cluster-version-operator-745944c6b7-dcbvq\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq" Mar 08 00:19:54.666941 master-0 kubenswrapper[4059]: E0308 00:19:54.666169 4059 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 08 00:19:54.666941 master-0 kubenswrapper[4059]: E0308 00:19:54.666240 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert 
podName:32c19760-2cb2-4690-be8e-cba3c517c60e nodeName:}" failed. No retries permitted until 2026-03-08 00:19:56.666221507 +0000 UTC m=+40.377821029 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert") pod "cluster-version-operator-745944c6b7-dcbvq" (UID: "32c19760-2cb2-4690-be8e-cba3c517c60e") : secret "cluster-version-operator-serving-cert" not found Mar 08 00:19:55.328664 master-0 kubenswrapper[4059]: I0308 00:19:55.328490 4059 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["assisted-installer/assisted-installer-controller-v949k"] Mar 08 00:19:55.328922 master-0 kubenswrapper[4059]: I0308 00:19:55.328754 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-v949k" Mar 08 00:19:55.332514 master-0 kubenswrapper[4059]: I0308 00:19:55.331730 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"openshift-service-ca.crt" Mar 08 00:19:55.332514 master-0 kubenswrapper[4059]: I0308 00:19:55.331773 4059 reflector.go:368] Caches populated for *v1.Secret from object-"assisted-installer"/"assisted-installer-controller-secret" Mar 08 00:19:55.332514 master-0 kubenswrapper[4059]: I0308 00:19:55.331741 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"assisted-installer-controller-config" Mar 08 00:19:55.334468 master-0 kubenswrapper[4059]: I0308 00:19:55.334449 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"kube-root-ca.crt" Mar 08 00:19:55.471618 master-0 kubenswrapper[4059]: I0308 00:19:55.471581 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/c4cab26a-fe31-4cf2-a938-b280f1934d99-host-resolv-conf\") pod \"assisted-installer-controller-v949k\" (UID: 
\"c4cab26a-fe31-4cf2-a938-b280f1934d99\") " pod="assisted-installer/assisted-installer-controller-v949k" Mar 08 00:19:55.471823 master-0 kubenswrapper[4059]: I0308 00:19:55.471675 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/c4cab26a-fe31-4cf2-a938-b280f1934d99-host-ca-bundle\") pod \"assisted-installer-controller-v949k\" (UID: \"c4cab26a-fe31-4cf2-a938-b280f1934d99\") " pod="assisted-installer/assisted-installer-controller-v949k" Mar 08 00:19:55.471823 master-0 kubenswrapper[4059]: I0308 00:19:55.471725 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/c4cab26a-fe31-4cf2-a938-b280f1934d99-sno-bootstrap-files\") pod \"assisted-installer-controller-v949k\" (UID: \"c4cab26a-fe31-4cf2-a938-b280f1934d99\") " pod="assisted-installer/assisted-installer-controller-v949k" Mar 08 00:19:55.471823 master-0 kubenswrapper[4059]: I0308 00:19:55.471753 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/c4cab26a-fe31-4cf2-a938-b280f1934d99-host-var-run-resolv-conf\") pod \"assisted-installer-controller-v949k\" (UID: \"c4cab26a-fe31-4cf2-a938-b280f1934d99\") " pod="assisted-installer/assisted-installer-controller-v949k" Mar 08 00:19:55.471823 master-0 kubenswrapper[4059]: I0308 00:19:55.471785 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f45pc\" (UniqueName: \"kubernetes.io/projected/c4cab26a-fe31-4cf2-a938-b280f1934d99-kube-api-access-f45pc\") pod \"assisted-installer-controller-v949k\" (UID: \"c4cab26a-fe31-4cf2-a938-b280f1934d99\") " pod="assisted-installer/assisted-installer-controller-v949k" Mar 08 00:19:55.572978 master-0 kubenswrapper[4059]: I0308 00:19:55.572943 
4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/c4cab26a-fe31-4cf2-a938-b280f1934d99-host-resolv-conf\") pod \"assisted-installer-controller-v949k\" (UID: \"c4cab26a-fe31-4cf2-a938-b280f1934d99\") " pod="assisted-installer/assisted-installer-controller-v949k" Mar 08 00:19:55.572978 master-0 kubenswrapper[4059]: I0308 00:19:55.572982 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/c4cab26a-fe31-4cf2-a938-b280f1934d99-host-ca-bundle\") pod \"assisted-installer-controller-v949k\" (UID: \"c4cab26a-fe31-4cf2-a938-b280f1934d99\") " pod="assisted-installer/assisted-installer-controller-v949k" Mar 08 00:19:55.573170 master-0 kubenswrapper[4059]: I0308 00:19:55.572998 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/c4cab26a-fe31-4cf2-a938-b280f1934d99-sno-bootstrap-files\") pod \"assisted-installer-controller-v949k\" (UID: \"c4cab26a-fe31-4cf2-a938-b280f1934d99\") " pod="assisted-installer/assisted-installer-controller-v949k" Mar 08 00:19:55.573170 master-0 kubenswrapper[4059]: I0308 00:19:55.573044 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f45pc\" (UniqueName: \"kubernetes.io/projected/c4cab26a-fe31-4cf2-a938-b280f1934d99-kube-api-access-f45pc\") pod \"assisted-installer-controller-v949k\" (UID: \"c4cab26a-fe31-4cf2-a938-b280f1934d99\") " pod="assisted-installer/assisted-installer-controller-v949k" Mar 08 00:19:55.573170 master-0 kubenswrapper[4059]: I0308 00:19:55.573063 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/c4cab26a-fe31-4cf2-a938-b280f1934d99-host-var-run-resolv-conf\") pod \"assisted-installer-controller-v949k\" (UID: 
\"c4cab26a-fe31-4cf2-a938-b280f1934d99\") " pod="assisted-installer/assisted-installer-controller-v949k" Mar 08 00:19:55.573170 master-0 kubenswrapper[4059]: I0308 00:19:55.573108 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/c4cab26a-fe31-4cf2-a938-b280f1934d99-host-var-run-resolv-conf\") pod \"assisted-installer-controller-v949k\" (UID: \"c4cab26a-fe31-4cf2-a938-b280f1934d99\") " pod="assisted-installer/assisted-installer-controller-v949k" Mar 08 00:19:55.573170 master-0 kubenswrapper[4059]: I0308 00:19:55.573140 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/c4cab26a-fe31-4cf2-a938-b280f1934d99-host-resolv-conf\") pod \"assisted-installer-controller-v949k\" (UID: \"c4cab26a-fe31-4cf2-a938-b280f1934d99\") " pod="assisted-installer/assisted-installer-controller-v949k" Mar 08 00:19:55.573170 master-0 kubenswrapper[4059]: I0308 00:19:55.573160 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/c4cab26a-fe31-4cf2-a938-b280f1934d99-host-ca-bundle\") pod \"assisted-installer-controller-v949k\" (UID: \"c4cab26a-fe31-4cf2-a938-b280f1934d99\") " pod="assisted-installer/assisted-installer-controller-v949k" Mar 08 00:19:55.573352 master-0 kubenswrapper[4059]: I0308 00:19:55.573180 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/c4cab26a-fe31-4cf2-a938-b280f1934d99-sno-bootstrap-files\") pod \"assisted-installer-controller-v949k\" (UID: \"c4cab26a-fe31-4cf2-a938-b280f1934d99\") " pod="assisted-installer/assisted-installer-controller-v949k" Mar 08 00:19:55.591429 master-0 kubenswrapper[4059]: I0308 00:19:55.591145 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f45pc\" (UniqueName: 
\"kubernetes.io/projected/c4cab26a-fe31-4cf2-a938-b280f1934d99-kube-api-access-f45pc\") pod \"assisted-installer-controller-v949k\" (UID: \"c4cab26a-fe31-4cf2-a938-b280f1934d99\") " pod="assisted-installer/assisted-installer-controller-v949k" Mar 08 00:19:55.658137 master-0 kubenswrapper[4059]: I0308 00:19:55.657645 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-v949k" Mar 08 00:19:56.033119 master-0 kubenswrapper[4059]: I0308 00:19:56.032916 4059 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 08 00:19:56.039072 master-0 kubenswrapper[4059]: W0308 00:19:56.039044 4059 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4cab26a_fe31_4cf2_a938_b280f1934d99.slice/crio-48589610dea61d404b3894a555948d67264374c9f204d16a7ec77740894d856e WatchSource:0}: Error finding container 48589610dea61d404b3894a555948d67264374c9f204d16a7ec77740894d856e: Status 404 returned error can't find the container with id 48589610dea61d404b3894a555948d67264374c9f204d16a7ec77740894d856e Mar 08 00:19:56.146831 master-0 kubenswrapper[4059]: I0308 00:19:56.146420 4059 scope.go:117] "RemoveContainer" containerID="3c9cc0ea8b8c8c3c9346819b130170a92470b9a87fb7c1462d7680ef7197ef47" Mar 08 00:19:56.146831 master-0 kubenswrapper[4059]: E0308 00:19:56.146590 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="e9add8df47182fc2eaf8cd78016ebe72" Mar 08 00:19:56.146831 master-0 kubenswrapper[4059]: I0308 00:19:56.146715 4059 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"] Mar 08 00:19:56.276672 master-0 kubenswrapper[4059]: I0308 00:19:56.276635 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-v949k" event={"ID":"c4cab26a-fe31-4cf2-a938-b280f1934d99","Type":"ContainerStarted","Data":"48589610dea61d404b3894a555948d67264374c9f204d16a7ec77740894d856e"} Mar 08 00:19:56.276975 master-0 kubenswrapper[4059]: I0308 00:19:56.276956 4059 scope.go:117] "RemoveContainer" containerID="3c9cc0ea8b8c8c3c9346819b130170a92470b9a87fb7c1462d7680ef7197ef47" Mar 08 00:19:56.277130 master-0 kubenswrapper[4059]: E0308 00:19:56.277103 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="e9add8df47182fc2eaf8cd78016ebe72" Mar 08 00:19:56.681530 master-0 kubenswrapper[4059]: I0308 00:19:56.681369 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert\") pod \"cluster-version-operator-745944c6b7-dcbvq\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq" Mar 08 00:19:56.681786 master-0 kubenswrapper[4059]: E0308 00:19:56.681537 4059 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 08 00:19:56.681786 master-0 kubenswrapper[4059]: E0308 00:19:56.681651 4059 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert podName:32c19760-2cb2-4690-be8e-cba3c517c60e nodeName:}" failed. No retries permitted until 2026-03-08 00:20:00.681631598 +0000 UTC m=+44.393231120 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert") pod "cluster-version-operator-745944c6b7-dcbvq" (UID: "32c19760-2cb2-4690-be8e-cba3c517c60e") : secret "cluster-version-operator-serving-cert" not found Mar 08 00:19:57.279994 master-0 kubenswrapper[4059]: I0308 00:19:57.279652 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-st2sr" event={"ID":"ec2d22f2-c260-42a6-a9da-ee0f44f42303","Type":"ContainerStarted","Data":"06038340b4e3f2befb44d9c767edb4dd565cb0800261ba9f5e36429d3a7bf10d"} Mar 08 00:19:57.469374 master-0 kubenswrapper[4059]: I0308 00:19:57.469030 4059 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/network-operator-7c649bf6d4-st2sr" podStartSLOduration=3.038664131 podStartE2EDuration="6.468991184s" podCreationTimestamp="2026-03-08 00:19:51 +0000 UTC" firstStartedPulling="2026-03-08 00:19:52.680391802 +0000 UTC m=+36.391991324" lastFinishedPulling="2026-03-08 00:19:56.110718855 +0000 UTC m=+39.822318377" observedRunningTime="2026-03-08 00:19:57.468619841 +0000 UTC m=+41.180219373" watchObservedRunningTime="2026-03-08 00:19:57.468991184 +0000 UTC m=+41.180590706" Mar 08 00:19:58.013465 master-0 kubenswrapper[4059]: I0308 00:19:58.013400 4059 csr.go:261] certificate signing request csr-9g744 is approved, waiting to be issued Mar 08 00:19:58.019624 master-0 kubenswrapper[4059]: I0308 00:19:58.019521 4059 csr.go:257] certificate signing request csr-9g744 is issued Mar 08 00:19:58.878356 master-0 kubenswrapper[4059]: I0308 00:19:58.878312 4059 reflector.go:368] Caches populated for *v1.RuntimeClass 
from k8s.io/client-go/informers/factory.go:160 Mar 08 00:19:59.021230 master-0 kubenswrapper[4059]: I0308 00:19:59.021134 4059 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-09 00:11:49 +0000 UTC, rotation deadline is 2026-03-08 19:43:55.632923367 +0000 UTC Mar 08 00:19:59.021230 master-0 kubenswrapper[4059]: I0308 00:19:59.021165 4059 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 19h23m56.611760828s for next certificate rotation Mar 08 00:19:59.374931 master-0 kubenswrapper[4059]: I0308 00:19:59.374892 4059 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/mtu-prober-sbmgv"] Mar 08 00:19:59.375148 master-0 kubenswrapper[4059]: I0308 00:19:59.375103 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/mtu-prober-sbmgv" Mar 08 00:19:59.514980 master-0 kubenswrapper[4059]: I0308 00:19:59.514909 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phw9l\" (UniqueName: \"kubernetes.io/projected/ecfff260-be5c-421c-9158-dfd8fa382e4a-kube-api-access-phw9l\") pod \"mtu-prober-sbmgv\" (UID: \"ecfff260-be5c-421c-9158-dfd8fa382e4a\") " pod="openshift-network-operator/mtu-prober-sbmgv" Mar 08 00:19:59.615560 master-0 kubenswrapper[4059]: I0308 00:19:59.615475 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phw9l\" (UniqueName: \"kubernetes.io/projected/ecfff260-be5c-421c-9158-dfd8fa382e4a-kube-api-access-phw9l\") pod \"mtu-prober-sbmgv\" (UID: \"ecfff260-be5c-421c-9158-dfd8fa382e4a\") " pod="openshift-network-operator/mtu-prober-sbmgv" Mar 08 00:19:59.641070 master-0 kubenswrapper[4059]: I0308 00:19:59.640955 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phw9l\" (UniqueName: \"kubernetes.io/projected/ecfff260-be5c-421c-9158-dfd8fa382e4a-kube-api-access-phw9l\") pod 
\"mtu-prober-sbmgv\" (UID: \"ecfff260-be5c-421c-9158-dfd8fa382e4a\") " pod="openshift-network-operator/mtu-prober-sbmgv" Mar 08 00:19:59.690471 master-0 kubenswrapper[4059]: I0308 00:19:59.690399 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/mtu-prober-sbmgv" Mar 08 00:20:00.735874 master-0 kubenswrapper[4059]: I0308 00:20:00.735789 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert\") pod \"cluster-version-operator-745944c6b7-dcbvq\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq" Mar 08 00:20:00.737012 master-0 kubenswrapper[4059]: E0308 00:20:00.735905 4059 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 08 00:20:00.737012 master-0 kubenswrapper[4059]: E0308 00:20:00.735989 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert podName:32c19760-2cb2-4690-be8e-cba3c517c60e nodeName:}" failed. No retries permitted until 2026-03-08 00:20:08.735973814 +0000 UTC m=+52.447573336 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert") pod "cluster-version-operator-745944c6b7-dcbvq" (UID: "32c19760-2cb2-4690-be8e-cba3c517c60e") : secret "cluster-version-operator-serving-cert" not found Mar 08 00:20:00.795818 master-0 kubenswrapper[4059]: W0308 00:20:00.795767 4059 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecfff260_be5c_421c_9158_dfd8fa382e4a.slice/crio-858860976eccd2c9ae8be3e9bcc229880ee4eb3f7d6a26e66c0b63208465cc57 WatchSource:0}: Error finding container 858860976eccd2c9ae8be3e9bcc229880ee4eb3f7d6a26e66c0b63208465cc57: Status 404 returned error can't find the container with id 858860976eccd2c9ae8be3e9bcc229880ee4eb3f7d6a26e66c0b63208465cc57 Mar 08 00:20:01.292254 master-0 kubenswrapper[4059]: I0308 00:20:01.291998 4059 generic.go:334] "Generic (PLEG): container finished" podID="c4cab26a-fe31-4cf2-a938-b280f1934d99" containerID="d6af0d3578bc6ae0d4e0f5d4dbddc52dc70217cef15e030aab47b2704363ffe2" exitCode=0 Mar 08 00:20:01.292254 master-0 kubenswrapper[4059]: I0308 00:20:01.292093 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-v949k" event={"ID":"c4cab26a-fe31-4cf2-a938-b280f1934d99","Type":"ContainerDied","Data":"d6af0d3578bc6ae0d4e0f5d4dbddc52dc70217cef15e030aab47b2704363ffe2"} Mar 08 00:20:01.294735 master-0 kubenswrapper[4059]: I0308 00:20:01.294644 4059 generic.go:334] "Generic (PLEG): container finished" podID="ecfff260-be5c-421c-9158-dfd8fa382e4a" containerID="79807bacb8255c5e003178362fd0a6e9b3e5481074aa31458cc27f40ce6114ac" exitCode=0 Mar 08 00:20:01.294735 master-0 kubenswrapper[4059]: I0308 00:20:01.294714 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-sbmgv" 
event={"ID":"ecfff260-be5c-421c-9158-dfd8fa382e4a","Type":"ContainerDied","Data":"79807bacb8255c5e003178362fd0a6e9b3e5481074aa31458cc27f40ce6114ac"} Mar 08 00:20:01.294952 master-0 kubenswrapper[4059]: I0308 00:20:01.294761 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-sbmgv" event={"ID":"ecfff260-be5c-421c-9158-dfd8fa382e4a","Type":"ContainerStarted","Data":"858860976eccd2c9ae8be3e9bcc229880ee4eb3f7d6a26e66c0b63208465cc57"} Mar 08 00:20:02.328008 master-0 kubenswrapper[4059]: I0308 00:20:02.327030 4059 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/mtu-prober-sbmgv" Mar 08 00:20:02.332899 master-0 kubenswrapper[4059]: I0308 00:20:02.332851 4059 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-v949k" Mar 08 00:20:02.449716 master-0 kubenswrapper[4059]: I0308 00:20:02.449632 4059 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/c4cab26a-fe31-4cf2-a938-b280f1934d99-host-var-run-resolv-conf\") pod \"c4cab26a-fe31-4cf2-a938-b280f1934d99\" (UID: \"c4cab26a-fe31-4cf2-a938-b280f1934d99\") " Mar 08 00:20:02.449895 master-0 kubenswrapper[4059]: I0308 00:20:02.449745 4059 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f45pc\" (UniqueName: \"kubernetes.io/projected/c4cab26a-fe31-4cf2-a938-b280f1934d99-kube-api-access-f45pc\") pod \"c4cab26a-fe31-4cf2-a938-b280f1934d99\" (UID: \"c4cab26a-fe31-4cf2-a938-b280f1934d99\") " Mar 08 00:20:02.449895 master-0 kubenswrapper[4059]: I0308 00:20:02.449807 4059 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/c4cab26a-fe31-4cf2-a938-b280f1934d99-host-ca-bundle\") pod \"c4cab26a-fe31-4cf2-a938-b280f1934d99\" (UID: 
\"c4cab26a-fe31-4cf2-a938-b280f1934d99\") "
Mar 08 00:20:02.449895 master-0 kubenswrapper[4059]: I0308 00:20:02.449866 4059 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phw9l\" (UniqueName: \"kubernetes.io/projected/ecfff260-be5c-421c-9158-dfd8fa382e4a-kube-api-access-phw9l\") pod \"ecfff260-be5c-421c-9158-dfd8fa382e4a\" (UID: \"ecfff260-be5c-421c-9158-dfd8fa382e4a\") "
Mar 08 00:20:02.449991 master-0 kubenswrapper[4059]: I0308 00:20:02.449790 4059 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4cab26a-fe31-4cf2-a938-b280f1934d99-host-var-run-resolv-conf" (OuterVolumeSpecName: "host-var-run-resolv-conf") pod "c4cab26a-fe31-4cf2-a938-b280f1934d99" (UID: "c4cab26a-fe31-4cf2-a938-b280f1934d99"). InnerVolumeSpecName "host-var-run-resolv-conf". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:20:02.449991 master-0 kubenswrapper[4059]: I0308 00:20:02.449919 4059 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/c4cab26a-fe31-4cf2-a938-b280f1934d99-host-resolv-conf\") pod \"c4cab26a-fe31-4cf2-a938-b280f1934d99\" (UID: \"c4cab26a-fe31-4cf2-a938-b280f1934d99\") "
Mar 08 00:20:02.449991 master-0 kubenswrapper[4059]: I0308 00:20:02.449966 4059 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/c4cab26a-fe31-4cf2-a938-b280f1934d99-sno-bootstrap-files\") pod \"c4cab26a-fe31-4cf2-a938-b280f1934d99\" (UID: \"c4cab26a-fe31-4cf2-a938-b280f1934d99\") "
Mar 08 00:20:02.450066 master-0 kubenswrapper[4059]: I0308 00:20:02.449983 4059 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4cab26a-fe31-4cf2-a938-b280f1934d99-host-resolv-conf" (OuterVolumeSpecName: "host-resolv-conf") pod "c4cab26a-fe31-4cf2-a938-b280f1934d99" (UID: "c4cab26a-fe31-4cf2-a938-b280f1934d99"). InnerVolumeSpecName "host-resolv-conf". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:20:02.450095 master-0 kubenswrapper[4059]: I0308 00:20:02.450053 4059 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4cab26a-fe31-4cf2-a938-b280f1934d99-sno-bootstrap-files" (OuterVolumeSpecName: "sno-bootstrap-files") pod "c4cab26a-fe31-4cf2-a938-b280f1934d99" (UID: "c4cab26a-fe31-4cf2-a938-b280f1934d99"). InnerVolumeSpecName "sno-bootstrap-files". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:20:02.450128 master-0 kubenswrapper[4059]: I0308 00:20:02.450104 4059 reconciler_common.go:293] "Volume detached for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/c4cab26a-fe31-4cf2-a938-b280f1934d99-host-resolv-conf\") on node \"master-0\" DevicePath \"\""
Mar 08 00:20:02.450164 master-0 kubenswrapper[4059]: I0308 00:20:02.450068 4059 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4cab26a-fe31-4cf2-a938-b280f1934d99-host-ca-bundle" (OuterVolumeSpecName: "host-ca-bundle") pod "c4cab26a-fe31-4cf2-a938-b280f1934d99" (UID: "c4cab26a-fe31-4cf2-a938-b280f1934d99"). InnerVolumeSpecName "host-ca-bundle". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:20:02.450223 master-0 kubenswrapper[4059]: I0308 00:20:02.450133 4059 reconciler_common.go:293] "Volume detached for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/c4cab26a-fe31-4cf2-a938-b280f1934d99-host-var-run-resolv-conf\") on node \"master-0\" DevicePath \"\""
Mar 08 00:20:02.455054 master-0 kubenswrapper[4059]: I0308 00:20:02.454862 4059 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4cab26a-fe31-4cf2-a938-b280f1934d99-kube-api-access-f45pc" (OuterVolumeSpecName: "kube-api-access-f45pc") pod "c4cab26a-fe31-4cf2-a938-b280f1934d99" (UID: "c4cab26a-fe31-4cf2-a938-b280f1934d99"). InnerVolumeSpecName "kube-api-access-f45pc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:20:02.455879 master-0 kubenswrapper[4059]: I0308 00:20:02.455807 4059 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecfff260-be5c-421c-9158-dfd8fa382e4a-kube-api-access-phw9l" (OuterVolumeSpecName: "kube-api-access-phw9l") pod "ecfff260-be5c-421c-9158-dfd8fa382e4a" (UID: "ecfff260-be5c-421c-9158-dfd8fa382e4a"). InnerVolumeSpecName "kube-api-access-phw9l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:20:02.551069 master-0 kubenswrapper[4059]: I0308 00:20:02.550875 4059 reconciler_common.go:293] "Volume detached for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/c4cab26a-fe31-4cf2-a938-b280f1934d99-host-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 00:20:02.551069 master-0 kubenswrapper[4059]: I0308 00:20:02.550919 4059 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phw9l\" (UniqueName: \"kubernetes.io/projected/ecfff260-be5c-421c-9158-dfd8fa382e4a-kube-api-access-phw9l\") on node \"master-0\" DevicePath \"\""
Mar 08 00:20:02.551069 master-0 kubenswrapper[4059]: I0308 00:20:02.550936 4059 reconciler_common.go:293] "Volume detached for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/c4cab26a-fe31-4cf2-a938-b280f1934d99-sno-bootstrap-files\") on node \"master-0\" DevicePath \"\""
Mar 08 00:20:02.551069 master-0 kubenswrapper[4059]: I0308 00:20:02.550948 4059 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f45pc\" (UniqueName: \"kubernetes.io/projected/c4cab26a-fe31-4cf2-a938-b280f1934d99-kube-api-access-f45pc\") on node \"master-0\" DevicePath \"\""
Mar 08 00:20:03.301624 master-0 kubenswrapper[4059]: I0308 00:20:03.301399 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-v949k" event={"ID":"c4cab26a-fe31-4cf2-a938-b280f1934d99","Type":"ContainerDied","Data":"48589610dea61d404b3894a555948d67264374c9f204d16a7ec77740894d856e"}
Mar 08 00:20:03.301624 master-0 kubenswrapper[4059]: I0308 00:20:03.301480 4059 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48589610dea61d404b3894a555948d67264374c9f204d16a7ec77740894d856e"
Mar 08 00:20:03.301624 master-0 kubenswrapper[4059]: I0308 00:20:03.301526 4059 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-v949k"
Mar 08 00:20:03.304050 master-0 kubenswrapper[4059]: I0308 00:20:03.303954 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-sbmgv" event={"ID":"ecfff260-be5c-421c-9158-dfd8fa382e4a","Type":"ContainerDied","Data":"858860976eccd2c9ae8be3e9bcc229880ee4eb3f7d6a26e66c0b63208465cc57"}
Mar 08 00:20:03.304050 master-0 kubenswrapper[4059]: I0308 00:20:03.304003 4059 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="858860976eccd2c9ae8be3e9bcc229880ee4eb3f7d6a26e66c0b63208465cc57"
Mar 08 00:20:03.304050 master-0 kubenswrapper[4059]: I0308 00:20:03.304022 4059 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/mtu-prober-sbmgv"
Mar 08 00:20:04.375287 master-0 kubenswrapper[4059]: I0308 00:20:04.375181 4059 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-network-operator/mtu-prober-sbmgv"]
Mar 08 00:20:04.380425 master-0 kubenswrapper[4059]: I0308 00:20:04.380373 4059 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-network-operator/mtu-prober-sbmgv"]
Mar 08 00:20:05.138450 master-0 kubenswrapper[4059]: I0308 00:20:05.138384 4059 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecfff260-be5c-421c-9158-dfd8fa382e4a" path="/var/lib/kubelet/pods/ecfff260-be5c-421c-9158-dfd8fa382e4a/volumes"
Mar 08 00:20:07.134487 master-0 kubenswrapper[4059]: I0308 00:20:07.134444 4059 scope.go:117] "RemoveContainer" containerID="3c9cc0ea8b8c8c3c9346819b130170a92470b9a87fb7c1462d7680ef7197ef47"
Mar 08 00:20:08.318289 master-0 kubenswrapper[4059]: I0308 00:20:08.318226 4059 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/2.log"
Mar 08 00:20:08.318950 master-0 kubenswrapper[4059]: I0308 00:20:08.318563 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"aa5ad4a36fb34e3b8448dce44870bd90294e9dfdbc77705a2449657049d35017"}
Mar 08 00:20:08.331500 master-0 kubenswrapper[4059]: I0308 00:20:08.331419 4059 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podStartSLOduration=12.331398736 podStartE2EDuration="12.331398736s" podCreationTimestamp="2026-03-08 00:19:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:20:08.331369156 +0000 UTC m=+52.042968678" watchObservedRunningTime="2026-03-08 00:20:08.331398736 +0000 UTC m=+52.042998268"
Mar 08 00:20:08.800190 master-0 kubenswrapper[4059]: I0308 00:20:08.800098 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert\") pod \"cluster-version-operator-745944c6b7-dcbvq\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq"
Mar 08 00:20:08.800507 master-0 kubenswrapper[4059]: E0308 00:20:08.800255 4059 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 08 00:20:08.800507 master-0 kubenswrapper[4059]: E0308 00:20:08.800324 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert podName:32c19760-2cb2-4690-be8e-cba3c517c60e nodeName:}" failed. No retries permitted until 2026-03-08 00:20:24.80030492 +0000 UTC m=+68.511904442 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert") pod "cluster-version-operator-745944c6b7-dcbvq" (UID: "32c19760-2cb2-4690-be8e-cba3c517c60e") : secret "cluster-version-operator-serving-cert" not found
Mar 08 00:20:09.258802 master-0 kubenswrapper[4059]: I0308 00:20:09.254179 4059 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-dllkj"]
Mar 08 00:20:09.258802 master-0 kubenswrapper[4059]: E0308 00:20:09.254280 4059 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecfff260-be5c-421c-9158-dfd8fa382e4a" containerName="prober"
Mar 08 00:20:09.258802 master-0 kubenswrapper[4059]: I0308 00:20:09.254295 4059 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecfff260-be5c-421c-9158-dfd8fa382e4a" containerName="prober"
Mar 08 00:20:09.258802 master-0 kubenswrapper[4059]: E0308 00:20:09.254315 4059 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4cab26a-fe31-4cf2-a938-b280f1934d99" containerName="assisted-installer-controller"
Mar 08 00:20:09.258802 master-0 kubenswrapper[4059]: I0308 00:20:09.254324 4059 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4cab26a-fe31-4cf2-a938-b280f1934d99" containerName="assisted-installer-controller"
Mar 08 00:20:09.258802 master-0 kubenswrapper[4059]: I0308 00:20:09.254349 4059 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecfff260-be5c-421c-9158-dfd8fa382e4a" containerName="prober"
Mar 08 00:20:09.258802 master-0 kubenswrapper[4059]: I0308 00:20:09.254357 4059 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4cab26a-fe31-4cf2-a938-b280f1934d99" containerName="assisted-installer-controller"
Mar 08 00:20:09.258802 master-0 kubenswrapper[4059]: I0308 00:20:09.254533 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.258802 master-0 kubenswrapper[4059]: I0308 00:20:09.256785 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 08 00:20:09.258802 master-0 kubenswrapper[4059]: I0308 00:20:09.258250 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 08 00:20:09.258802 master-0 kubenswrapper[4059]: I0308 00:20:09.258574 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 08 00:20:09.258802 master-0 kubenswrapper[4059]: I0308 00:20:09.258639 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 08 00:20:09.403957 master-0 kubenswrapper[4059]: I0308 00:20:09.403899 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-multus-conf-dir\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.403957 master-0 kubenswrapper[4059]: I0308 00:20:09.403943 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7da68e85-9170-499d-8050-139ecfac4600-multus-daemon-config\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.403957 master-0 kubenswrapper[4059]: I0308 00:20:09.403963 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg5d9\" (UniqueName: \"kubernetes.io/projected/7da68e85-9170-499d-8050-139ecfac4600-kube-api-access-bg5d9\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.403957 master-0 kubenswrapper[4059]: I0308 00:20:09.403980 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-var-lib-kubelet\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.404964 master-0 kubenswrapper[4059]: I0308 00:20:09.403997 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-run-multus-certs\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.404964 master-0 kubenswrapper[4059]: I0308 00:20:09.404013 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-etc-kubernetes\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.404964 master-0 kubenswrapper[4059]: I0308 00:20:09.404055 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-multus-cni-dir\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.404964 master-0 kubenswrapper[4059]: I0308 00:20:09.404082 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-run-k8s-cni-cncf-io\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.404964 master-0 kubenswrapper[4059]: I0308 00:20:09.404104 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-multus-socket-dir-parent\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.404964 master-0 kubenswrapper[4059]: I0308 00:20:09.404154 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-cnibin\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.404964 master-0 kubenswrapper[4059]: I0308 00:20:09.404264 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7da68e85-9170-499d-8050-139ecfac4600-cni-binary-copy\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.404964 master-0 kubenswrapper[4059]: I0308 00:20:09.404328 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-hostroot\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.404964 master-0 kubenswrapper[4059]: I0308 00:20:09.404390 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-system-cni-dir\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.404964 master-0 kubenswrapper[4059]: I0308 00:20:09.404425 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-run-netns\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.404964 master-0 kubenswrapper[4059]: I0308 00:20:09.404491 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-os-release\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.404964 master-0 kubenswrapper[4059]: I0308 00:20:09.404542 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-var-lib-cni-multus\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.404964 master-0 kubenswrapper[4059]: I0308 00:20:09.404583 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-var-lib-cni-bin\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.434015 master-0 kubenswrapper[4059]: I0308 00:20:09.433950 4059 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-d5jxb"]
Mar 08 00:20:09.434642 master-0 kubenswrapper[4059]: I0308 00:20:09.434610 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-d5jxb"
Mar 08 00:20:09.436994 master-0 kubenswrapper[4059]: I0308 00:20:09.436944 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 08 00:20:09.437627 master-0 kubenswrapper[4059]: I0308 00:20:09.437587 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config"
Mar 08 00:20:09.505869 master-0 kubenswrapper[4059]: I0308 00:20:09.505806 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7da68e85-9170-499d-8050-139ecfac4600-multus-daemon-config\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.506124 master-0 kubenswrapper[4059]: I0308 00:20:09.506063 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg5d9\" (UniqueName: \"kubernetes.io/projected/7da68e85-9170-499d-8050-139ecfac4600-kube-api-access-bg5d9\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.506234 master-0 kubenswrapper[4059]: I0308 00:20:09.506122 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-multus-cni-dir\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.506234 master-0 kubenswrapper[4059]: I0308 00:20:09.506160 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-run-k8s-cni-cncf-io\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.506234 master-0 kubenswrapper[4059]: I0308 00:20:09.506190 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-var-lib-kubelet\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.506447 master-0 kubenswrapper[4059]: I0308 00:20:09.506325 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-run-k8s-cni-cncf-io\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.506447 master-0 kubenswrapper[4059]: I0308 00:20:09.506409 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-var-lib-kubelet\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.506554 master-0 kubenswrapper[4059]: I0308 00:20:09.506453 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-run-multus-certs\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.506554 master-0 kubenswrapper[4059]: I0308 00:20:09.506487 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-etc-kubernetes\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.506554 master-0 kubenswrapper[4059]: I0308 00:20:09.506521 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-cnibin\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.506709 master-0 kubenswrapper[4059]: I0308 00:20:09.506552 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7da68e85-9170-499d-8050-139ecfac4600-cni-binary-copy\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.506709 master-0 kubenswrapper[4059]: I0308 00:20:09.506602 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-multus-cni-dir\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.506709 master-0 kubenswrapper[4059]: I0308 00:20:09.506627 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-cnibin\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.506871 master-0 kubenswrapper[4059]: I0308 00:20:09.506723 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-etc-kubernetes\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.506871 master-0 kubenswrapper[4059]: I0308 00:20:09.506766 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-run-multus-certs\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.506988 master-0 kubenswrapper[4059]: I0308 00:20:09.506927 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-multus-socket-dir-parent\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.507050 master-0 kubenswrapper[4059]: I0308 00:20:09.506993 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-system-cni-dir\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.507050 master-0 kubenswrapper[4059]: I0308 00:20:09.507023 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-run-netns\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.507159 master-0 kubenswrapper[4059]: I0308 00:20:09.507054 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-hostroot\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.507159 master-0 kubenswrapper[4059]: I0308 00:20:09.507089 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-os-release\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.507159 master-0 kubenswrapper[4059]: I0308 00:20:09.507123 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-var-lib-cni-bin\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.507159 master-0 kubenswrapper[4059]: I0308 00:20:09.507151 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-var-lib-cni-multus\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.507420 master-0 kubenswrapper[4059]: I0308 00:20:09.507182 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-multus-conf-dir\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.507420 master-0 kubenswrapper[4059]: I0308 00:20:09.507281 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-multus-conf-dir\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.507420 master-0 kubenswrapper[4059]: I0308 00:20:09.507354 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-multus-socket-dir-parent\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.507579 master-0 kubenswrapper[4059]: I0308 00:20:09.507455 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-system-cni-dir\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.507579 master-0 kubenswrapper[4059]: I0308 00:20:09.507500 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-run-netns\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.507579 master-0 kubenswrapper[4059]: I0308 00:20:09.507572 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-hostroot\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.507929 master-0 kubenswrapper[4059]: I0308 00:20:09.507640 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-os-release\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.507929 master-0 kubenswrapper[4059]: I0308 00:20:09.507682 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-var-lib-cni-bin\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.507929 master-0 kubenswrapper[4059]: I0308 00:20:09.507722 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-var-lib-cni-multus\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.507929 master-0 kubenswrapper[4059]: I0308 00:20:09.507823 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7da68e85-9170-499d-8050-139ecfac4600-cni-binary-copy\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.508364 master-0 kubenswrapper[4059]: I0308 00:20:09.507976 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7da68e85-9170-499d-8050-139ecfac4600-multus-daemon-config\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.529175 master-0 kubenswrapper[4059]: I0308 00:20:09.528945 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg5d9\" (UniqueName: \"kubernetes.io/projected/7da68e85-9170-499d-8050-139ecfac4600-kube-api-access-bg5d9\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.571644 master-0 kubenswrapper[4059]: I0308 00:20:09.571558 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dllkj"
Mar 08 00:20:09.582618 master-0 kubenswrapper[4059]: W0308 00:20:09.582582 4059 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7da68e85_9170_499d_8050_139ecfac4600.slice/crio-99e304c6af03a3e08278f5797ee6f99e79aaff1289a963780c8099a86643591f WatchSource:0}: Error finding container 99e304c6af03a3e08278f5797ee6f99e79aaff1289a963780c8099a86643591f: Status 404 returned error can't find the container with id 99e304c6af03a3e08278f5797ee6f99e79aaff1289a963780c8099a86643591f
Mar 08 00:20:09.608667 master-0 kubenswrapper[4059]: I0308 00:20:09.608593 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-cni-binary-copy\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb"
Mar 08 00:20:09.608667 master-0 kubenswrapper[4059]: I0308 00:20:09.608658 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb"
Mar 08 00:20:09.608855 master-0 kubenswrapper[4059]: I0308 00:20:09.608695 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-whereabouts-configmap\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb"
Mar 08 00:20:09.608855 master-0 kubenswrapper[4059]: I0308 00:20:09.608732 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-system-cni-dir\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb"
Mar 08 00:20:09.608855 master-0 kubenswrapper[4059]: I0308 00:20:09.608766 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-cnibin\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb"
Mar 08 00:20:09.608855 master-0 kubenswrapper[4059]: I0308 00:20:09.608801 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb"
Mar 08 00:20:09.609004 master-0 kubenswrapper[4059]: I0308 00:20:09.608886 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxk5x\" (UniqueName: \"kubernetes.io/projected/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-kube-api-access-bxk5x\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb"
Mar 08 00:20:09.609004 master-0 kubenswrapper[4059]: I0308 00:20:09.608942 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-os-release\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb"
Mar 08 00:20:09.710051 master-0 kubenswrapper[4059]: I0308 00:20:09.709981 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-cni-binary-copy\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb"
Mar 08 00:20:09.710261 master-0 kubenswrapper[4059]: I0308 00:20:09.710227 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb"
Mar 08 00:20:09.710318 master-0 kubenswrapper[4059]: I0308 00:20:09.710288 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-whereabouts-configmap\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb"
Mar 08 00:20:09.710360 master-0 kubenswrapper[4059]: I0308 00:20:09.710337 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-system-cni-dir\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb"
Mar 08 00:20:09.710452 master-0 kubenswrapper[4059]:
I0308 00:20:09.710421 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-system-cni-dir\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:20:09.710592 master-0 kubenswrapper[4059]: I0308 00:20:09.710558 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-cnibin\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:20:09.710634 master-0 kubenswrapper[4059]: I0308 00:20:09.710614 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:20:09.710667 master-0 kubenswrapper[4059]: I0308 00:20:09.710649 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-os-release\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:20:09.710698 master-0 kubenswrapper[4059]: I0308 00:20:09.710672 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxk5x\" (UniqueName: \"kubernetes.io/projected/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-kube-api-access-bxk5x\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " 
pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:20:09.710698 master-0 kubenswrapper[4059]: I0308 00:20:09.710683 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-cnibin\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:20:09.711104 master-0 kubenswrapper[4059]: I0308 00:20:09.711079 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-os-release\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:20:09.711136 master-0 kubenswrapper[4059]: I0308 00:20:09.711094 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:20:09.711443 master-0 kubenswrapper[4059]: I0308 00:20:09.711418 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-cni-binary-copy\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:20:09.712011 master-0 kubenswrapper[4059]: I0308 00:20:09.711970 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-whereabouts-configmap\") pod 
\"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:20:09.713316 master-0 kubenswrapper[4059]: I0308 00:20:09.713277 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:20:09.730906 master-0 kubenswrapper[4059]: I0308 00:20:09.730541 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxk5x\" (UniqueName: \"kubernetes.io/projected/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-kube-api-access-bxk5x\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:20:09.745951 master-0 kubenswrapper[4059]: I0308 00:20:09.745771 4059 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:20:09.753842 master-0 kubenswrapper[4059]: W0308 00:20:09.753792 4059 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ad8b9ea_ba1c_4507_9b70_ce2da170d480.slice/crio-7a5857552aa1339fd1907b2666246b77b57ec97f6cccfaf339c644659664d85c WatchSource:0}: Error finding container 7a5857552aa1339fd1907b2666246b77b57ec97f6cccfaf339c644659664d85c: Status 404 returned error can't find the container with id 7a5857552aa1339fd1907b2666246b77b57ec97f6cccfaf339c644659664d85c Mar 08 00:20:10.237625 master-0 kubenswrapper[4059]: I0308 00:20:10.237534 4059 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-krv7c"] Mar 08 00:20:10.238122 master-0 kubenswrapper[4059]: I0308 00:20:10.238077 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-krv7c" Mar 08 00:20:10.238291 master-0 kubenswrapper[4059]: E0308 00:20:10.238177 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-krv7c" podUID="815fd565-0609-4d8f-ac05-8656f198b008" Mar 08 00:20:10.324649 master-0 kubenswrapper[4059]: I0308 00:20:10.324598 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d5jxb" event={"ID":"7ad8b9ea-ba1c-4507-9b70-ce2da170d480","Type":"ContainerStarted","Data":"7a5857552aa1339fd1907b2666246b77b57ec97f6cccfaf339c644659664d85c"} Mar 08 00:20:10.326043 master-0 kubenswrapper[4059]: I0308 00:20:10.326017 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dllkj" event={"ID":"7da68e85-9170-499d-8050-139ecfac4600","Type":"ContainerStarted","Data":"99e304c6af03a3e08278f5797ee6f99e79aaff1289a963780c8099a86643591f"} Mar 08 00:20:10.416418 master-0 kubenswrapper[4059]: I0308 00:20:10.416316 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh6nz\" (UniqueName: \"kubernetes.io/projected/815fd565-0609-4d8f-ac05-8656f198b008-kube-api-access-sh6nz\") pod \"network-metrics-daemon-krv7c\" (UID: \"815fd565-0609-4d8f-ac05-8656f198b008\") " pod="openshift-multus/network-metrics-daemon-krv7c" Mar 08 00:20:10.416418 master-0 kubenswrapper[4059]: I0308 00:20:10.416429 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs\") pod \"network-metrics-daemon-krv7c\" (UID: \"815fd565-0609-4d8f-ac05-8656f198b008\") " pod="openshift-multus/network-metrics-daemon-krv7c" Mar 08 00:20:10.517819 master-0 kubenswrapper[4059]: I0308 00:20:10.517665 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh6nz\" (UniqueName: \"kubernetes.io/projected/815fd565-0609-4d8f-ac05-8656f198b008-kube-api-access-sh6nz\") pod \"network-metrics-daemon-krv7c\" (UID: \"815fd565-0609-4d8f-ac05-8656f198b008\") " 
pod="openshift-multus/network-metrics-daemon-krv7c" Mar 08 00:20:10.517819 master-0 kubenswrapper[4059]: I0308 00:20:10.517790 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs\") pod \"network-metrics-daemon-krv7c\" (UID: \"815fd565-0609-4d8f-ac05-8656f198b008\") " pod="openshift-multus/network-metrics-daemon-krv7c" Mar 08 00:20:10.518057 master-0 kubenswrapper[4059]: E0308 00:20:10.517956 4059 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:20:10.518057 master-0 kubenswrapper[4059]: E0308 00:20:10.518028 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs podName:815fd565-0609-4d8f-ac05-8656f198b008 nodeName:}" failed. No retries permitted until 2026-03-08 00:20:11.018005404 +0000 UTC m=+54.729604966 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs") pod "network-metrics-daemon-krv7c" (UID: "815fd565-0609-4d8f-ac05-8656f198b008") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:20:10.533981 master-0 kubenswrapper[4059]: I0308 00:20:10.533866 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh6nz\" (UniqueName: \"kubernetes.io/projected/815fd565-0609-4d8f-ac05-8656f198b008-kube-api-access-sh6nz\") pod \"network-metrics-daemon-krv7c\" (UID: \"815fd565-0609-4d8f-ac05-8656f198b008\") " pod="openshift-multus/network-metrics-daemon-krv7c" Mar 08 00:20:11.099181 master-0 kubenswrapper[4059]: I0308 00:20:11.020513 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs\") pod \"network-metrics-daemon-krv7c\" (UID: \"815fd565-0609-4d8f-ac05-8656f198b008\") " pod="openshift-multus/network-metrics-daemon-krv7c" Mar 08 00:20:11.099181 master-0 kubenswrapper[4059]: E0308 00:20:11.020688 4059 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:20:11.099181 master-0 kubenswrapper[4059]: E0308 00:20:11.020755 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs podName:815fd565-0609-4d8f-ac05-8656f198b008 nodeName:}" failed. No retries permitted until 2026-03-08 00:20:12.020734015 +0000 UTC m=+55.732333567 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs") pod "network-metrics-daemon-krv7c" (UID: "815fd565-0609-4d8f-ac05-8656f198b008") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:20:12.106371 master-0 kubenswrapper[4059]: I0308 00:20:12.106304 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs\") pod \"network-metrics-daemon-krv7c\" (UID: \"815fd565-0609-4d8f-ac05-8656f198b008\") " pod="openshift-multus/network-metrics-daemon-krv7c" Mar 08 00:20:12.106864 master-0 kubenswrapper[4059]: E0308 00:20:12.106445 4059 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:20:12.106864 master-0 kubenswrapper[4059]: E0308 00:20:12.106507 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs podName:815fd565-0609-4d8f-ac05-8656f198b008 nodeName:}" failed. No retries permitted until 2026-03-08 00:20:14.106489403 +0000 UTC m=+57.818088925 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs") pod "network-metrics-daemon-krv7c" (UID: "815fd565-0609-4d8f-ac05-8656f198b008") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:20:12.133618 master-0 kubenswrapper[4059]: I0308 00:20:12.133537 4059 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-krv7c" Mar 08 00:20:12.133820 master-0 kubenswrapper[4059]: E0308 00:20:12.133660 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-krv7c" podUID="815fd565-0609-4d8f-ac05-8656f198b008" Mar 08 00:20:14.122692 master-0 kubenswrapper[4059]: I0308 00:20:14.122506 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs\") pod \"network-metrics-daemon-krv7c\" (UID: \"815fd565-0609-4d8f-ac05-8656f198b008\") " pod="openshift-multus/network-metrics-daemon-krv7c" Mar 08 00:20:14.122692 master-0 kubenswrapper[4059]: E0308 00:20:14.122679 4059 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:20:14.123407 master-0 kubenswrapper[4059]: E0308 00:20:14.122734 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs podName:815fd565-0609-4d8f-ac05-8656f198b008 nodeName:}" failed. No retries permitted until 2026-03-08 00:20:18.122719387 +0000 UTC m=+61.834318909 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs") pod "network-metrics-daemon-krv7c" (UID: "815fd565-0609-4d8f-ac05-8656f198b008") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:20:14.134407 master-0 kubenswrapper[4059]: I0308 00:20:14.134274 4059 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-krv7c" Mar 08 00:20:14.134407 master-0 kubenswrapper[4059]: E0308 00:20:14.134399 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-krv7c" podUID="815fd565-0609-4d8f-ac05-8656f198b008" Mar 08 00:20:16.134490 master-0 kubenswrapper[4059]: I0308 00:20:16.134453 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-krv7c" Mar 08 00:20:16.135015 master-0 kubenswrapper[4059]: E0308 00:20:16.134561 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-krv7c" podUID="815fd565-0609-4d8f-ac05-8656f198b008" Mar 08 00:20:16.341908 master-0 kubenswrapper[4059]: I0308 00:20:16.341853 4059 generic.go:334] "Generic (PLEG): container finished" podID="7ad8b9ea-ba1c-4507-9b70-ce2da170d480" containerID="f8e210245fcf5757a0858988b80936bb56e15ab6a7c3881f301f7f4cb8a8f550" exitCode=0 Mar 08 00:20:16.342073 master-0 kubenswrapper[4059]: I0308 00:20:16.341934 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d5jxb" event={"ID":"7ad8b9ea-ba1c-4507-9b70-ce2da170d480","Type":"ContainerDied","Data":"f8e210245fcf5757a0858988b80936bb56e15ab6a7c3881f301f7f4cb8a8f550"} Mar 08 00:20:18.134468 master-0 kubenswrapper[4059]: I0308 00:20:18.134414 4059 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-krv7c" Mar 08 00:20:18.135009 master-0 kubenswrapper[4059]: E0308 00:20:18.134597 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-krv7c" podUID="815fd565-0609-4d8f-ac05-8656f198b008" Mar 08 00:20:18.147608 master-0 kubenswrapper[4059]: I0308 00:20:18.147557 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs\") pod \"network-metrics-daemon-krv7c\" (UID: \"815fd565-0609-4d8f-ac05-8656f198b008\") " pod="openshift-multus/network-metrics-daemon-krv7c" Mar 08 00:20:18.147716 master-0 kubenswrapper[4059]: E0308 00:20:18.147690 4059 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:20:18.147767 master-0 kubenswrapper[4059]: E0308 00:20:18.147757 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs podName:815fd565-0609-4d8f-ac05-8656f198b008 nodeName:}" failed. No retries permitted until 2026-03-08 00:20:26.147737031 +0000 UTC m=+69.859336553 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs") pod "network-metrics-daemon-krv7c" (UID: "815fd565-0609-4d8f-ac05-8656f198b008") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:20:20.133448 master-0 kubenswrapper[4059]: I0308 00:20:20.133382 4059 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-krv7c" Mar 08 00:20:20.134038 master-0 kubenswrapper[4059]: E0308 00:20:20.133540 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-krv7c" podUID="815fd565-0609-4d8f-ac05-8656f198b008" Mar 08 00:20:21.629292 master-0 kubenswrapper[4059]: I0308 00:20:21.629226 4059 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2"] Mar 08 00:20:21.629731 master-0 kubenswrapper[4059]: I0308 00:20:21.629639 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2" Mar 08 00:20:21.632586 master-0 kubenswrapper[4059]: I0308 00:20:21.632548 4059 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 08 00:20:21.632759 master-0 kubenswrapper[4059]: I0308 00:20:21.632735 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 08 00:20:21.632759 master-0 kubenswrapper[4059]: I0308 00:20:21.632741 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 08 00:20:21.632921 master-0 kubenswrapper[4059]: I0308 00:20:21.632848 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 08 00:20:21.632960 master-0 kubenswrapper[4059]: I0308 00:20:21.632927 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 08 00:20:21.793693 master-0 kubenswrapper[4059]: I0308 
00:20:21.793625 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3fee96d7-75a7-46e4-9707-7bd292f10b84-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-m77x2\" (UID: \"3fee96d7-75a7-46e4-9707-7bd292f10b84\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2" Mar 08 00:20:21.793879 master-0 kubenswrapper[4059]: I0308 00:20:21.793717 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3fee96d7-75a7-46e4-9707-7bd292f10b84-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-m77x2\" (UID: \"3fee96d7-75a7-46e4-9707-7bd292f10b84\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2" Mar 08 00:20:21.793879 master-0 kubenswrapper[4059]: I0308 00:20:21.793739 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3fee96d7-75a7-46e4-9707-7bd292f10b84-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-m77x2\" (UID: \"3fee96d7-75a7-46e4-9707-7bd292f10b84\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2" Mar 08 00:20:21.793879 master-0 kubenswrapper[4059]: I0308 00:20:21.793756 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntks9\" (UniqueName: \"kubernetes.io/projected/3fee96d7-75a7-46e4-9707-7bd292f10b84-kube-api-access-ntks9\") pod \"ovnkube-control-plane-66b55d57d-m77x2\" (UID: \"3fee96d7-75a7-46e4-9707-7bd292f10b84\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2" Mar 08 00:20:21.894196 master-0 kubenswrapper[4059]: I0308 00:20:21.894064 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/3fee96d7-75a7-46e4-9707-7bd292f10b84-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-m77x2\" (UID: \"3fee96d7-75a7-46e4-9707-7bd292f10b84\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2" Mar 08 00:20:21.894196 master-0 kubenswrapper[4059]: I0308 00:20:21.894100 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3fee96d7-75a7-46e4-9707-7bd292f10b84-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-m77x2\" (UID: \"3fee96d7-75a7-46e4-9707-7bd292f10b84\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2" Mar 08 00:20:21.894196 master-0 kubenswrapper[4059]: I0308 00:20:21.894119 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntks9\" (UniqueName: \"kubernetes.io/projected/3fee96d7-75a7-46e4-9707-7bd292f10b84-kube-api-access-ntks9\") pod \"ovnkube-control-plane-66b55d57d-m77x2\" (UID: \"3fee96d7-75a7-46e4-9707-7bd292f10b84\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2" Mar 08 00:20:21.894196 master-0 kubenswrapper[4059]: I0308 00:20:21.894153 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3fee96d7-75a7-46e4-9707-7bd292f10b84-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-m77x2\" (UID: \"3fee96d7-75a7-46e4-9707-7bd292f10b84\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2" Mar 08 00:20:21.894932 master-0 kubenswrapper[4059]: I0308 00:20:21.894849 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3fee96d7-75a7-46e4-9707-7bd292f10b84-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-m77x2\" (UID: \"3fee96d7-75a7-46e4-9707-7bd292f10b84\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2" Mar 08 00:20:21.894932 master-0 kubenswrapper[4059]: I0308 00:20:21.894886 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3fee96d7-75a7-46e4-9707-7bd292f10b84-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-m77x2\" (UID: \"3fee96d7-75a7-46e4-9707-7bd292f10b84\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2" Mar 08 00:20:21.898917 master-0 kubenswrapper[4059]: I0308 00:20:21.898824 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3fee96d7-75a7-46e4-9707-7bd292f10b84-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-m77x2\" (UID: \"3fee96d7-75a7-46e4-9707-7bd292f10b84\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2" Mar 08 00:20:21.910078 master-0 kubenswrapper[4059]: I0308 00:20:21.910010 4059 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-tf5qg"] Mar 08 00:20:21.910762 master-0 kubenswrapper[4059]: I0308 00:20:21.910728 4059 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:21.912616 master-0 kubenswrapper[4059]: I0308 00:20:21.912572 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntks9\" (UniqueName: \"kubernetes.io/projected/3fee96d7-75a7-46e4-9707-7bd292f10b84-kube-api-access-ntks9\") pod \"ovnkube-control-plane-66b55d57d-m77x2\" (UID: \"3fee96d7-75a7-46e4-9707-7bd292f10b84\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2" Mar 08 00:20:21.912616 master-0 kubenswrapper[4059]: I0308 00:20:21.912595 4059 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 08 00:20:21.914314 master-0 kubenswrapper[4059]: I0308 00:20:21.914275 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 08 00:20:21.943229 master-0 kubenswrapper[4059]: I0308 00:20:21.943163 4059 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2" Mar 08 00:20:21.995267 master-0 kubenswrapper[4059]: I0308 00:20:21.995178 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-run-openvswitch\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:21.995267 master-0 kubenswrapper[4059]: I0308 00:20:21.995246 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-cni-bin\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:21.995267 master-0 kubenswrapper[4059]: I0308 00:20:21.995272 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-kubelet\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:21.995535 master-0 kubenswrapper[4059]: I0308 00:20:21.995294 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-var-lib-openvswitch\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:21.995535 master-0 kubenswrapper[4059]: I0308 00:20:21.995362 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-run-ovn\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:21.995607 master-0 kubenswrapper[4059]: I0308 00:20:21.995544 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/17aa8235-749b-49da-9fcd-cb4bd948f0a5-ovnkube-script-lib\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:21.995644 master-0 kubenswrapper[4059]: I0308 00:20:21.995606 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-log-socket\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:21.995644 master-0 kubenswrapper[4059]: I0308 00:20:21.995629 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-run-netns\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:21.995715 master-0 kubenswrapper[4059]: I0308 00:20:21.995652 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:21.995715 master-0 kubenswrapper[4059]: I0308 00:20:21.995676 4059 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/17aa8235-749b-49da-9fcd-cb4bd948f0a5-ovnkube-config\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:21.995715 master-0 kubenswrapper[4059]: I0308 00:20:21.995700 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-systemd-units\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:21.995814 master-0 kubenswrapper[4059]: I0308 00:20:21.995720 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-etc-openvswitch\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:21.995814 master-0 kubenswrapper[4059]: I0308 00:20:21.995740 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-node-log\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:21.995814 master-0 kubenswrapper[4059]: I0308 00:20:21.995759 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/17aa8235-749b-49da-9fcd-cb4bd948f0a5-env-overrides\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 
00:20:21.995814 master-0 kubenswrapper[4059]: I0308 00:20:21.995779 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-slash\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:21.995814 master-0 kubenswrapper[4059]: I0308 00:20:21.995810 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-run-ovn-kubernetes\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:21.996160 master-0 kubenswrapper[4059]: I0308 00:20:21.995830 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6n4s\" (UniqueName: \"kubernetes.io/projected/17aa8235-749b-49da-9fcd-cb4bd948f0a5-kube-api-access-w6n4s\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:21.996160 master-0 kubenswrapper[4059]: I0308 00:20:21.995850 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/17aa8235-749b-49da-9fcd-cb4bd948f0a5-ovn-node-metrics-cert\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:21.996160 master-0 kubenswrapper[4059]: I0308 00:20:21.995872 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-run-systemd\") pod \"ovnkube-node-tf5qg\" 
(UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:21.996160 master-0 kubenswrapper[4059]: I0308 00:20:21.995891 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-cni-netd\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:22.096263 master-0 kubenswrapper[4059]: I0308 00:20:22.096182 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-cni-netd\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:22.096459 master-0 kubenswrapper[4059]: I0308 00:20:22.096291 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-cni-netd\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:22.096459 master-0 kubenswrapper[4059]: I0308 00:20:22.096332 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-run-openvswitch\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:22.096459 master-0 kubenswrapper[4059]: I0308 00:20:22.096360 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-cni-bin\") pod \"ovnkube-node-tf5qg\" (UID: 
\"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:22.096459 master-0 kubenswrapper[4059]: I0308 00:20:22.096386 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-kubelet\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:22.096459 master-0 kubenswrapper[4059]: I0308 00:20:22.096406 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-var-lib-openvswitch\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:22.096459 master-0 kubenswrapper[4059]: I0308 00:20:22.096425 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-run-ovn\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:22.096459 master-0 kubenswrapper[4059]: I0308 00:20:22.096358 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-run-openvswitch\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:22.096459 master-0 kubenswrapper[4059]: I0308 00:20:22.096451 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-cni-bin\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:22.096686 master-0 kubenswrapper[4059]: I0308 00:20:22.096495 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-run-ovn\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:22.096686 master-0 kubenswrapper[4059]: I0308 00:20:22.096502 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-kubelet\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:22.096686 master-0 kubenswrapper[4059]: I0308 00:20:22.096521 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/17aa8235-749b-49da-9fcd-cb4bd948f0a5-ovnkube-script-lib\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:22.096686 master-0 kubenswrapper[4059]: I0308 00:20:22.096536 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-var-lib-openvswitch\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:22.096686 master-0 kubenswrapper[4059]: I0308 00:20:22.096565 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-log-socket\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" 
Mar 08 00:20:22.096686 master-0 kubenswrapper[4059]: I0308 00:20:22.096581 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-run-netns\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:22.096686 master-0 kubenswrapper[4059]: I0308 00:20:22.096597 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:22.096686 master-0 kubenswrapper[4059]: I0308 00:20:22.096612 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/17aa8235-749b-49da-9fcd-cb4bd948f0a5-ovnkube-config\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:22.096686 master-0 kubenswrapper[4059]: I0308 00:20:22.096630 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-systemd-units\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:22.096686 master-0 kubenswrapper[4059]: I0308 00:20:22.096644 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-etc-openvswitch\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:22.096686 master-0 kubenswrapper[4059]: I0308 00:20:22.096662 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-slash\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:22.096686 master-0 kubenswrapper[4059]: I0308 00:20:22.096678 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-node-log\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:22.096686 master-0 kubenswrapper[4059]: I0308 00:20:22.096693 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/17aa8235-749b-49da-9fcd-cb4bd948f0a5-env-overrides\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:22.097242 master-0 kubenswrapper[4059]: I0308 00:20:22.096718 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-run-ovn-kubernetes\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:22.097242 master-0 kubenswrapper[4059]: I0308 00:20:22.096732 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6n4s\" (UniqueName: \"kubernetes.io/projected/17aa8235-749b-49da-9fcd-cb4bd948f0a5-kube-api-access-w6n4s\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:22.097242 master-0 kubenswrapper[4059]: I0308 00:20:22.096752 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/17aa8235-749b-49da-9fcd-cb4bd948f0a5-ovn-node-metrics-cert\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:22.097242 master-0 kubenswrapper[4059]: I0308 00:20:22.096768 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-run-systemd\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:22.097242 master-0 kubenswrapper[4059]: I0308 00:20:22.096812 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-run-systemd\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:22.097242 master-0 kubenswrapper[4059]: I0308 00:20:22.096833 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-log-socket\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:22.097242 master-0 kubenswrapper[4059]: I0308 00:20:22.096853 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-run-netns\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" 
Mar 08 00:20:22.097242 master-0 kubenswrapper[4059]: I0308 00:20:22.097067 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:22.097242 master-0 kubenswrapper[4059]: I0308 00:20:22.097106 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-node-log\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:22.097242 master-0 kubenswrapper[4059]: I0308 00:20:22.097128 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-run-ovn-kubernetes\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:22.097513 master-0 kubenswrapper[4059]: I0308 00:20:22.097369 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/17aa8235-749b-49da-9fcd-cb4bd948f0a5-ovnkube-script-lib\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:22.097513 master-0 kubenswrapper[4059]: I0308 00:20:22.097417 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-systemd-units\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:22.097513 master-0 kubenswrapper[4059]: I0308 00:20:22.097443 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-etc-openvswitch\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:22.097513 master-0 kubenswrapper[4059]: I0308 00:20:22.097468 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-slash\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:22.097911 master-0 kubenswrapper[4059]: I0308 00:20:22.097879 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/17aa8235-749b-49da-9fcd-cb4bd948f0a5-ovnkube-config\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:22.097949 master-0 kubenswrapper[4059]: I0308 00:20:22.097924 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/17aa8235-749b-49da-9fcd-cb4bd948f0a5-env-overrides\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:22.103796 master-0 kubenswrapper[4059]: I0308 00:20:22.103759 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/17aa8235-749b-49da-9fcd-cb4bd948f0a5-ovn-node-metrics-cert\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 
00:20:22.111491 master-0 kubenswrapper[4059]: I0308 00:20:22.111455 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6n4s\" (UniqueName: \"kubernetes.io/projected/17aa8235-749b-49da-9fcd-cb4bd948f0a5-kube-api-access-w6n4s\") pod \"ovnkube-node-tf5qg\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:22.133759 master-0 kubenswrapper[4059]: I0308 00:20:22.133722 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-krv7c" Mar 08 00:20:22.133882 master-0 kubenswrapper[4059]: E0308 00:20:22.133849 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-krv7c" podUID="815fd565-0609-4d8f-ac05-8656f198b008" Mar 08 00:20:22.230717 master-0 kubenswrapper[4059]: I0308 00:20:22.230590 4059 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:23.339076 master-0 kubenswrapper[4059]: W0308 00:20:23.339013 4059 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17aa8235_749b_49da_9fcd_cb4bd948f0a5.slice/crio-754fcfaa5e2efe3eedf9b613fba862c812cd8de913348e84d01b6035a79a3ae6 WatchSource:0}: Error finding container 754fcfaa5e2efe3eedf9b613fba862c812cd8de913348e84d01b6035a79a3ae6: Status 404 returned error can't find the container with id 754fcfaa5e2efe3eedf9b613fba862c812cd8de913348e84d01b6035a79a3ae6 Mar 08 00:20:23.340257 master-0 kubenswrapper[4059]: W0308 00:20:23.340185 4059 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fee96d7_75a7_46e4_9707_7bd292f10b84.slice/crio-96c247b918e2a9450964a3ea1162342c6ccc7c2330777e8d76f1128e74c9ecae WatchSource:0}: Error finding container 96c247b918e2a9450964a3ea1162342c6ccc7c2330777e8d76f1128e74c9ecae: Status 404 returned error can't find the container with id 96c247b918e2a9450964a3ea1162342c6ccc7c2330777e8d76f1128e74c9ecae Mar 08 00:20:23.358477 master-0 kubenswrapper[4059]: I0308 00:20:23.358403 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" event={"ID":"17aa8235-749b-49da-9fcd-cb4bd948f0a5","Type":"ContainerStarted","Data":"754fcfaa5e2efe3eedf9b613fba862c812cd8de913348e84d01b6035a79a3ae6"} Mar 08 00:20:23.359885 master-0 kubenswrapper[4059]: I0308 00:20:23.359832 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2" event={"ID":"3fee96d7-75a7-46e4-9707-7bd292f10b84","Type":"ContainerStarted","Data":"96c247b918e2a9450964a3ea1162342c6ccc7c2330777e8d76f1128e74c9ecae"} Mar 08 00:20:24.134225 master-0 kubenswrapper[4059]: I0308 00:20:24.134144 4059 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-krv7c" Mar 08 00:20:24.134585 master-0 kubenswrapper[4059]: E0308 00:20:24.134280 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-krv7c" podUID="815fd565-0609-4d8f-ac05-8656f198b008" Mar 08 00:20:24.364129 master-0 kubenswrapper[4059]: I0308 00:20:24.364083 4059 generic.go:334] "Generic (PLEG): container finished" podID="7ad8b9ea-ba1c-4507-9b70-ce2da170d480" containerID="48c6a8c71ab87bd002a24ce7589e179bd20778d506e7cd037500b0c5771c655a" exitCode=0 Mar 08 00:20:24.364732 master-0 kubenswrapper[4059]: I0308 00:20:24.364140 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d5jxb" event={"ID":"7ad8b9ea-ba1c-4507-9b70-ce2da170d480","Type":"ContainerDied","Data":"48c6a8c71ab87bd002a24ce7589e179bd20778d506e7cd037500b0c5771c655a"} Mar 08 00:20:24.365557 master-0 kubenswrapper[4059]: I0308 00:20:24.365238 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2" event={"ID":"3fee96d7-75a7-46e4-9707-7bd292f10b84","Type":"ContainerStarted","Data":"53e5b4e15707abe8f63034abbcefc6b4a23fa99d2992c497080fcbc4818458e4"} Mar 08 00:20:24.367354 master-0 kubenswrapper[4059]: I0308 00:20:24.366771 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dllkj" event={"ID":"7da68e85-9170-499d-8050-139ecfac4600","Type":"ContainerStarted","Data":"8c4dfda663d3108e0d4d75d8ea37376292f3986c7575fe504d33fabc4e8a91ef"} Mar 08 00:20:24.399961 master-0 kubenswrapper[4059]: I0308 00:20:24.399776 4059 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-dllkj" podStartSLOduration=1.596763101 podStartE2EDuration="15.399757411s" podCreationTimestamp="2026-03-08 00:20:09 +0000 UTC" firstStartedPulling="2026-03-08 00:20:09.583867615 +0000 UTC m=+53.295467137" lastFinishedPulling="2026-03-08 00:20:23.386861915 +0000 UTC m=+67.098461447" observedRunningTime="2026-03-08 00:20:24.39935977 +0000 UTC m=+68.110959292" watchObservedRunningTime="2026-03-08 00:20:24.399757411 +0000 UTC m=+68.111356943" Mar 08 00:20:24.820698 master-0 kubenswrapper[4059]: I0308 00:20:24.820620 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert\") pod \"cluster-version-operator-745944c6b7-dcbvq\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq" Mar 08 00:20:24.821163 master-0 kubenswrapper[4059]: E0308 00:20:24.820795 4059 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 08 00:20:24.821163 master-0 kubenswrapper[4059]: E0308 00:20:24.820842 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert podName:32c19760-2cb2-4690-be8e-cba3c517c60e nodeName:}" failed. No retries permitted until 2026-03-08 00:20:56.820828196 +0000 UTC m=+100.532427718 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert") pod "cluster-version-operator-745944c6b7-dcbvq" (UID: "32c19760-2cb2-4690-be8e-cba3c517c60e") : secret "cluster-version-operator-serving-cert" not found Mar 08 00:20:24.828714 master-0 kubenswrapper[4059]: I0308 00:20:24.828652 4059 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-w5fjg"] Mar 08 00:20:24.828965 master-0 kubenswrapper[4059]: I0308 00:20:24.828945 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w5fjg" Mar 08 00:20:24.829183 master-0 kubenswrapper[4059]: E0308 00:20:24.829077 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-w5fjg" podUID="1f63cb2f-779f-4fde-bf92-cf0414844a77"
Mar 08 00:20:24.921503 master-0 kubenswrapper[4059]: I0308 00:20:24.921452 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh9cz\" (UniqueName: \"kubernetes.io/projected/1f63cb2f-779f-4fde-bf92-cf0414844a77-kube-api-access-wh9cz\") pod \"network-check-target-w5fjg\" (UID: \"1f63cb2f-779f-4fde-bf92-cf0414844a77\") " pod="openshift-network-diagnostics/network-check-target-w5fjg"
Mar 08 00:20:25.022855 master-0 kubenswrapper[4059]: I0308 00:20:25.022548 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh9cz\" (UniqueName: \"kubernetes.io/projected/1f63cb2f-779f-4fde-bf92-cf0414844a77-kube-api-access-wh9cz\") pod \"network-check-target-w5fjg\" (UID: \"1f63cb2f-779f-4fde-bf92-cf0414844a77\") " pod="openshift-network-diagnostics/network-check-target-w5fjg"
Mar 08 00:20:25.133302 master-0 kubenswrapper[4059]: E0308 00:20:25.132021 4059 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 08 00:20:25.133302 master-0 kubenswrapper[4059]: E0308 00:20:25.132073 4059 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 08 00:20:25.133302 master-0 kubenswrapper[4059]: E0308 00:20:25.132088 4059 projected.go:194] Error preparing data for projected volume kube-api-access-wh9cz for pod openshift-network-diagnostics/network-check-target-w5fjg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 00:20:25.133302 master-0 kubenswrapper[4059]: E0308 00:20:25.132160 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1f63cb2f-779f-4fde-bf92-cf0414844a77-kube-api-access-wh9cz podName:1f63cb2f-779f-4fde-bf92-cf0414844a77 nodeName:}" failed. No retries permitted until 2026-03-08 00:20:25.632140687 +0000 UTC m=+69.343740299 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-wh9cz" (UniqueName: "kubernetes.io/projected/1f63cb2f-779f-4fde-bf92-cf0414844a77-kube-api-access-wh9cz") pod "network-check-target-w5fjg" (UID: "1f63cb2f-779f-4fde-bf92-cf0414844a77") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 00:20:25.726605 master-0 kubenswrapper[4059]: I0308 00:20:25.726566 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh9cz\" (UniqueName: \"kubernetes.io/projected/1f63cb2f-779f-4fde-bf92-cf0414844a77-kube-api-access-wh9cz\") pod \"network-check-target-w5fjg\" (UID: \"1f63cb2f-779f-4fde-bf92-cf0414844a77\") " pod="openshift-network-diagnostics/network-check-target-w5fjg"
Mar 08 00:20:25.727584 master-0 kubenswrapper[4059]: E0308 00:20:25.726698 4059 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 08 00:20:25.727584 master-0 kubenswrapper[4059]: E0308 00:20:25.726713 4059 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 08 00:20:25.727584 master-0 kubenswrapper[4059]: E0308 00:20:25.726723 4059 projected.go:194] Error preparing data for projected volume kube-api-access-wh9cz for pod openshift-network-diagnostics/network-check-target-w5fjg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 00:20:25.727584 master-0 kubenswrapper[4059]: E0308 00:20:25.726764 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1f63cb2f-779f-4fde-bf92-cf0414844a77-kube-api-access-wh9cz podName:1f63cb2f-779f-4fde-bf92-cf0414844a77 nodeName:}" failed. No retries permitted until 2026-03-08 00:20:26.72675173 +0000 UTC m=+70.438351252 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-wh9cz" (UniqueName: "kubernetes.io/projected/1f63cb2f-779f-4fde-bf92-cf0414844a77-kube-api-access-wh9cz") pod "network-check-target-w5fjg" (UID: "1f63cb2f-779f-4fde-bf92-cf0414844a77") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 00:20:26.133595 master-0 kubenswrapper[4059]: I0308 00:20:26.133554 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-krv7c"
Mar 08 00:20:26.133783 master-0 kubenswrapper[4059]: E0308 00:20:26.133671 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-krv7c" podUID="815fd565-0609-4d8f-ac05-8656f198b008"
Mar 08 00:20:26.230852 master-0 kubenswrapper[4059]: I0308 00:20:26.230821 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs\") pod \"network-metrics-daemon-krv7c\" (UID: \"815fd565-0609-4d8f-ac05-8656f198b008\") " pod="openshift-multus/network-metrics-daemon-krv7c"
Mar 08 00:20:26.230984 master-0 kubenswrapper[4059]: E0308 00:20:26.230961 4059 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 08 00:20:26.231032 master-0 kubenswrapper[4059]: E0308 00:20:26.231018 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs podName:815fd565-0609-4d8f-ac05-8656f198b008 nodeName:}" failed. No retries permitted until 2026-03-08 00:20:42.231004179 +0000 UTC m=+85.942603701 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs") pod "network-metrics-daemon-krv7c" (UID: "815fd565-0609-4d8f-ac05-8656f198b008") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 08 00:20:26.735104 master-0 kubenswrapper[4059]: I0308 00:20:26.734998 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh9cz\" (UniqueName: \"kubernetes.io/projected/1f63cb2f-779f-4fde-bf92-cf0414844a77-kube-api-access-wh9cz\") pod \"network-check-target-w5fjg\" (UID: \"1f63cb2f-779f-4fde-bf92-cf0414844a77\") " pod="openshift-network-diagnostics/network-check-target-w5fjg"
Mar 08 00:20:26.735566 master-0 kubenswrapper[4059]: E0308 00:20:26.735184 4059 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 08 00:20:26.735566 master-0 kubenswrapper[4059]: E0308 00:20:26.735216 4059 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 08 00:20:26.735566 master-0 kubenswrapper[4059]: E0308 00:20:26.735227 4059 projected.go:194] Error preparing data for projected volume kube-api-access-wh9cz for pod openshift-network-diagnostics/network-check-target-w5fjg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 00:20:26.735566 master-0 kubenswrapper[4059]: E0308 00:20:26.735274 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1f63cb2f-779f-4fde-bf92-cf0414844a77-kube-api-access-wh9cz podName:1f63cb2f-779f-4fde-bf92-cf0414844a77 nodeName:}" failed. No retries permitted until 2026-03-08 00:20:28.735260257 +0000 UTC m=+72.446859779 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-wh9cz" (UniqueName: "kubernetes.io/projected/1f63cb2f-779f-4fde-bf92-cf0414844a77-kube-api-access-wh9cz") pod "network-check-target-w5fjg" (UID: "1f63cb2f-779f-4fde-bf92-cf0414844a77") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 00:20:27.133939 master-0 kubenswrapper[4059]: I0308 00:20:27.133906 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w5fjg"
Mar 08 00:20:27.134659 master-0 kubenswrapper[4059]: E0308 00:20:27.134611 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-w5fjg" podUID="1f63cb2f-779f-4fde-bf92-cf0414844a77"
Mar 08 00:20:27.378637 master-0 kubenswrapper[4059]: I0308 00:20:27.378575 4059 generic.go:334] "Generic (PLEG): container finished" podID="7ad8b9ea-ba1c-4507-9b70-ce2da170d480" containerID="7264af89c3bcf80c9a189b3bddcd203436764c691f9c5c52533e7f598dddfac4" exitCode=0
Mar 08 00:20:27.378637 master-0 kubenswrapper[4059]: I0308 00:20:27.378639 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d5jxb" event={"ID":"7ad8b9ea-ba1c-4507-9b70-ce2da170d480","Type":"ContainerDied","Data":"7264af89c3bcf80c9a189b3bddcd203436764c691f9c5c52533e7f598dddfac4"}
Mar 08 00:20:27.436337 master-0 kubenswrapper[4059]: I0308 00:20:27.436283 4059 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-m7549"]
Mar 08 00:20:27.436669 master-0 kubenswrapper[4059]: I0308 00:20:27.436638 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-m7549"
Mar 08 00:20:27.438916 master-0 kubenswrapper[4059]: I0308 00:20:27.438878 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 08 00:20:27.439277 master-0 kubenswrapper[4059]: I0308 00:20:27.439239 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 08 00:20:27.439862 master-0 kubenswrapper[4059]: I0308 00:20:27.439842 4059 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 08 00:20:27.439920 master-0 kubenswrapper[4059]: I0308 00:20:27.439893 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 08 00:20:27.439978 master-0 kubenswrapper[4059]: I0308 00:20:27.439968 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 08 00:20:27.547111 master-0 kubenswrapper[4059]: I0308 00:20:27.540395 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/af391724-079a-4bac-a89e-978ffd471763-env-overrides\") pod \"network-node-identity-m7549\" (UID: \"af391724-079a-4bac-a89e-978ffd471763\") " pod="openshift-network-node-identity/network-node-identity-m7549"
Mar 08 00:20:27.547111 master-0 kubenswrapper[4059]: I0308 00:20:27.540468 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/af391724-079a-4bac-a89e-978ffd471763-ovnkube-identity-cm\") pod \"network-node-identity-m7549\" (UID: \"af391724-079a-4bac-a89e-978ffd471763\") " pod="openshift-network-node-identity/network-node-identity-m7549"
Mar 08 00:20:27.547111 master-0 kubenswrapper[4059]: I0308 00:20:27.540508 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/af391724-079a-4bac-a89e-978ffd471763-webhook-cert\") pod \"network-node-identity-m7549\" (UID: \"af391724-079a-4bac-a89e-978ffd471763\") " pod="openshift-network-node-identity/network-node-identity-m7549"
Mar 08 00:20:27.547111 master-0 kubenswrapper[4059]: I0308 00:20:27.540529 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkl4m\" (UniqueName: \"kubernetes.io/projected/af391724-079a-4bac-a89e-978ffd471763-kube-api-access-gkl4m\") pod \"network-node-identity-m7549\" (UID: \"af391724-079a-4bac-a89e-978ffd471763\") " pod="openshift-network-node-identity/network-node-identity-m7549"
Mar 08 00:20:27.641857 master-0 kubenswrapper[4059]: I0308 00:20:27.641747 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/af391724-079a-4bac-a89e-978ffd471763-webhook-cert\") pod \"network-node-identity-m7549\" (UID: \"af391724-079a-4bac-a89e-978ffd471763\") " pod="openshift-network-node-identity/network-node-identity-m7549"
Mar 08 00:20:27.641857 master-0 kubenswrapper[4059]: I0308 00:20:27.641794 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkl4m\" (UniqueName: \"kubernetes.io/projected/af391724-079a-4bac-a89e-978ffd471763-kube-api-access-gkl4m\") pod \"network-node-identity-m7549\" (UID: \"af391724-079a-4bac-a89e-978ffd471763\") " pod="openshift-network-node-identity/network-node-identity-m7549"
Mar 08 00:20:27.642050 master-0 kubenswrapper[4059]: E0308 00:20:27.641915 4059 secret.go:189] Couldn't get secret openshift-network-node-identity/network-node-identity-cert: secret "network-node-identity-cert" not found
Mar 08 00:20:27.642099 master-0
kubenswrapper[4059]: I0308 00:20:27.642050 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/af391724-079a-4bac-a89e-978ffd471763-env-overrides\") pod \"network-node-identity-m7549\" (UID: \"af391724-079a-4bac-a89e-978ffd471763\") " pod="openshift-network-node-identity/network-node-identity-m7549"
Mar 08 00:20:27.642149 master-0 kubenswrapper[4059]: E0308 00:20:27.642073 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af391724-079a-4bac-a89e-978ffd471763-webhook-cert podName:af391724-079a-4bac-a89e-978ffd471763 nodeName:}" failed. No retries permitted until 2026-03-08 00:20:28.142050602 +0000 UTC m=+71.853650124 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/af391724-079a-4bac-a89e-978ffd471763-webhook-cert") pod "network-node-identity-m7549" (UID: "af391724-079a-4bac-a89e-978ffd471763") : secret "network-node-identity-cert" not found
Mar 08 00:20:27.642149 master-0 kubenswrapper[4059]: I0308 00:20:27.642128 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/af391724-079a-4bac-a89e-978ffd471763-ovnkube-identity-cm\") pod \"network-node-identity-m7549\" (UID: \"af391724-079a-4bac-a89e-978ffd471763\") " pod="openshift-network-node-identity/network-node-identity-m7549"
Mar 08 00:20:27.642816 master-0 kubenswrapper[4059]: I0308 00:20:27.642778 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/af391724-079a-4bac-a89e-978ffd471763-env-overrides\") pod \"network-node-identity-m7549\" (UID: \"af391724-079a-4bac-a89e-978ffd471763\") " pod="openshift-network-node-identity/network-node-identity-m7549"
Mar 08 00:20:27.643449 master-0 kubenswrapper[4059]: I0308 00:20:27.643422 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/af391724-079a-4bac-a89e-978ffd471763-ovnkube-identity-cm\") pod \"network-node-identity-m7549\" (UID: \"af391724-079a-4bac-a89e-978ffd471763\") " pod="openshift-network-node-identity/network-node-identity-m7549"
Mar 08 00:20:27.657065 master-0 kubenswrapper[4059]: I0308 00:20:27.657025 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkl4m\" (UniqueName: \"kubernetes.io/projected/af391724-079a-4bac-a89e-978ffd471763-kube-api-access-gkl4m\") pod \"network-node-identity-m7549\" (UID: \"af391724-079a-4bac-a89e-978ffd471763\") " pod="openshift-network-node-identity/network-node-identity-m7549"
Mar 08 00:20:28.134222 master-0 kubenswrapper[4059]: I0308 00:20:28.134135 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-krv7c"
Mar 08 00:20:28.134871 master-0 kubenswrapper[4059]: E0308 00:20:28.134430 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-krv7c" podUID="815fd565-0609-4d8f-ac05-8656f198b008"
Mar 08 00:20:28.145120 master-0 kubenswrapper[4059]: I0308 00:20:28.145074 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/af391724-079a-4bac-a89e-978ffd471763-webhook-cert\") pod \"network-node-identity-m7549\" (UID: \"af391724-079a-4bac-a89e-978ffd471763\") " pod="openshift-network-node-identity/network-node-identity-m7549"
Mar 08 00:20:28.149142 master-0 kubenswrapper[4059]: I0308 00:20:28.149119 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/af391724-079a-4bac-a89e-978ffd471763-webhook-cert\") pod \"network-node-identity-m7549\" (UID: \"af391724-079a-4bac-a89e-978ffd471763\") " pod="openshift-network-node-identity/network-node-identity-m7549"
Mar 08 00:20:28.358188 master-0 kubenswrapper[4059]: I0308 00:20:28.358135 4059 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-m7549"
Mar 08 00:20:28.372866 master-0 kubenswrapper[4059]: W0308 00:20:28.372820 4059 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf391724_079a_4bac_a89e_978ffd471763.slice/crio-c70a49bdf7ce76b550fe89e6bb326288e459f3c83c699e27a995807b0355a90e WatchSource:0}: Error finding container c70a49bdf7ce76b550fe89e6bb326288e459f3c83c699e27a995807b0355a90e: Status 404 returned error can't find the container with id c70a49bdf7ce76b550fe89e6bb326288e459f3c83c699e27a995807b0355a90e
Mar 08 00:20:28.382422 master-0 kubenswrapper[4059]: I0308 00:20:28.382385 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-m7549" event={"ID":"af391724-079a-4bac-a89e-978ffd471763","Type":"ContainerStarted","Data":"c70a49bdf7ce76b550fe89e6bb326288e459f3c83c699e27a995807b0355a90e"}
Mar 08 00:20:28.750939 master-0 kubenswrapper[4059]: I0308 00:20:28.750849 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh9cz\" (UniqueName: \"kubernetes.io/projected/1f63cb2f-779f-4fde-bf92-cf0414844a77-kube-api-access-wh9cz\") pod \"network-check-target-w5fjg\" (UID: \"1f63cb2f-779f-4fde-bf92-cf0414844a77\") " pod="openshift-network-diagnostics/network-check-target-w5fjg"
Mar 08 00:20:28.751273 master-0 kubenswrapper[4059]: E0308 00:20:28.750993 4059 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 08 00:20:28.751273 master-0 kubenswrapper[4059]: E0308 00:20:28.751010 4059 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 08 00:20:28.751273 master-0 kubenswrapper[4059]: E0308 00:20:28.751020 4059 projected.go:194] Error preparing data for projected volume kube-api-access-wh9cz for pod openshift-network-diagnostics/network-check-target-w5fjg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 00:20:28.751273 master-0 kubenswrapper[4059]: E0308 00:20:28.751069 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1f63cb2f-779f-4fde-bf92-cf0414844a77-kube-api-access-wh9cz podName:1f63cb2f-779f-4fde-bf92-cf0414844a77 nodeName:}" failed. No retries permitted until 2026-03-08 00:20:32.751055781 +0000 UTC m=+76.462655303 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-wh9cz" (UniqueName: "kubernetes.io/projected/1f63cb2f-779f-4fde-bf92-cf0414844a77-kube-api-access-wh9cz") pod "network-check-target-w5fjg" (UID: "1f63cb2f-779f-4fde-bf92-cf0414844a77") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 00:20:29.138100 master-0 kubenswrapper[4059]: I0308 00:20:29.134249 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w5fjg"
Mar 08 00:20:29.138100 master-0 kubenswrapper[4059]: E0308 00:20:29.134390 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w5fjg" podUID="1f63cb2f-779f-4fde-bf92-cf0414844a77"
Mar 08 00:20:29.387804 master-0 kubenswrapper[4059]: I0308 00:20:29.387763 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d5jxb" event={"ID":"7ad8b9ea-ba1c-4507-9b70-ce2da170d480","Type":"ContainerStarted","Data":"d4bd6afbd87673cd3e0a5753c92817e5f63b4859d724983c90d010a8db1fe80e"}
Mar 08 00:20:30.133872 master-0 kubenswrapper[4059]: I0308 00:20:30.133824 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-krv7c"
Mar 08 00:20:30.134179 master-0 kubenswrapper[4059]: E0308 00:20:30.133960 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-krv7c" podUID="815fd565-0609-4d8f-ac05-8656f198b008"
Mar 08 00:20:30.393410 master-0 kubenswrapper[4059]: I0308 00:20:30.393308 4059 generic.go:334] "Generic (PLEG): container finished" podID="7ad8b9ea-ba1c-4507-9b70-ce2da170d480" containerID="d4bd6afbd87673cd3e0a5753c92817e5f63b4859d724983c90d010a8db1fe80e" exitCode=0
Mar 08 00:20:30.393410 master-0 kubenswrapper[4059]: I0308 00:20:30.393346 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d5jxb" event={"ID":"7ad8b9ea-ba1c-4507-9b70-ce2da170d480","Type":"ContainerDied","Data":"d4bd6afbd87673cd3e0a5753c92817e5f63b4859d724983c90d010a8db1fe80e"}
Mar 08 00:20:31.134371 master-0 kubenswrapper[4059]: I0308 00:20:31.134188 4059 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w5fjg"
Mar 08 00:20:31.134705 master-0 kubenswrapper[4059]: E0308 00:20:31.134383 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w5fjg" podUID="1f63cb2f-779f-4fde-bf92-cf0414844a77"
Mar 08 00:20:32.281072 master-0 kubenswrapper[4059]: I0308 00:20:32.280595 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-krv7c"
Mar 08 00:20:32.281072 master-0 kubenswrapper[4059]: E0308 00:20:32.280733 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-krv7c" podUID="815fd565-0609-4d8f-ac05-8656f198b008"
Mar 08 00:20:32.962784 master-0 kubenswrapper[4059]: I0308 00:20:32.962727 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh9cz\" (UniqueName: \"kubernetes.io/projected/1f63cb2f-779f-4fde-bf92-cf0414844a77-kube-api-access-wh9cz\") pod \"network-check-target-w5fjg\" (UID: \"1f63cb2f-779f-4fde-bf92-cf0414844a77\") " pod="openshift-network-diagnostics/network-check-target-w5fjg"
Mar 08 00:20:32.963248 master-0 kubenswrapper[4059]: E0308 00:20:32.963222 4059 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 08 00:20:32.963313 master-0 kubenswrapper[4059]: E0308 00:20:32.963251 4059 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 08 00:20:32.963379 master-0 kubenswrapper[4059]: E0308 00:20:32.963263 4059 projected.go:194] Error preparing data for projected volume kube-api-access-wh9cz for pod openshift-network-diagnostics/network-check-target-w5fjg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 00:20:32.963637 master-0 kubenswrapper[4059]: E0308 00:20:32.963619 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1f63cb2f-779f-4fde-bf92-cf0414844a77-kube-api-access-wh9cz podName:1f63cb2f-779f-4fde-bf92-cf0414844a77 nodeName:}" failed. No retries permitted until 2026-03-08 00:20:40.963400924 +0000 UTC m=+84.675000446 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-wh9cz" (UniqueName: "kubernetes.io/projected/1f63cb2f-779f-4fde-bf92-cf0414844a77-kube-api-access-wh9cz") pod "network-check-target-w5fjg" (UID: "1f63cb2f-779f-4fde-bf92-cf0414844a77") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 00:20:33.133834 master-0 kubenswrapper[4059]: I0308 00:20:33.133788 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w5fjg"
Mar 08 00:20:33.134022 master-0 kubenswrapper[4059]: E0308 00:20:33.133926 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w5fjg" podUID="1f63cb2f-779f-4fde-bf92-cf0414844a77"
Mar 08 00:20:34.134031 master-0 kubenswrapper[4059]: I0308 00:20:34.133982 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-krv7c"
Mar 08 00:20:34.134751 master-0 kubenswrapper[4059]: E0308 00:20:34.134128 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-krv7c" podUID="815fd565-0609-4d8f-ac05-8656f198b008"
Mar 08 00:20:36.313428 master-0 kubenswrapper[4059]: I0308 00:20:36.312308 4059 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w5fjg"
Mar 08 00:20:36.313428 master-0 kubenswrapper[4059]: E0308 00:20:36.312434 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w5fjg" podUID="1f63cb2f-779f-4fde-bf92-cf0414844a77"
Mar 08 00:20:36.313428 master-0 kubenswrapper[4059]: I0308 00:20:36.313026 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-krv7c"
Mar 08 00:20:36.313428 master-0 kubenswrapper[4059]: E0308 00:20:36.313085 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-krv7c" podUID="815fd565-0609-4d8f-ac05-8656f198b008"
Mar 08 00:20:37.503486 master-0 kubenswrapper[4059]: I0308 00:20:37.503440 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w5fjg"
Mar 08 00:20:37.504324 master-0 kubenswrapper[4059]: E0308 00:20:37.503556 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w5fjg" podUID="1f63cb2f-779f-4fde-bf92-cf0414844a77"
Mar 08 00:20:38.133379 master-0 kubenswrapper[4059]: I0308 00:20:38.133321 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-krv7c"
Mar 08 00:20:38.133590 master-0 kubenswrapper[4059]: E0308 00:20:38.133440 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-krv7c" podUID="815fd565-0609-4d8f-ac05-8656f198b008"
Mar 08 00:20:39.192007 master-0 kubenswrapper[4059]: I0308 00:20:39.191509 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w5fjg"
Mar 08 00:20:39.192007 master-0 kubenswrapper[4059]: E0308 00:20:39.191639 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w5fjg" podUID="1f63cb2f-779f-4fde-bf92-cf0414844a77"
Mar 08 00:20:40.134284 master-0 kubenswrapper[4059]: I0308 00:20:40.134189 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-krv7c"
Mar 08 00:20:40.134454 master-0 kubenswrapper[4059]: E0308 00:20:40.134363 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-krv7c" podUID="815fd565-0609-4d8f-ac05-8656f198b008"
Mar 08 00:20:41.005168 master-0 kubenswrapper[4059]: I0308 00:20:41.005121 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh9cz\" (UniqueName: \"kubernetes.io/projected/1f63cb2f-779f-4fde-bf92-cf0414844a77-kube-api-access-wh9cz\") pod \"network-check-target-w5fjg\" (UID: \"1f63cb2f-779f-4fde-bf92-cf0414844a77\") " pod="openshift-network-diagnostics/network-check-target-w5fjg"
Mar 08 00:20:41.005611 master-0 kubenswrapper[4059]: E0308 00:20:41.005273 4059 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 08 00:20:41.005611 master-0 kubenswrapper[4059]: E0308 00:20:41.005288 4059 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 08 00:20:41.005611 master-0 kubenswrapper[4059]: E0308 00:20:41.005298 4059 projected.go:194] Error preparing data for projected volume kube-api-access-wh9cz for pod openshift-network-diagnostics/network-check-target-w5fjg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 00:20:41.005611 master-0 kubenswrapper[4059]: E0308 00:20:41.005347 4059 nestedpendingoperations.go:348] Operation for
"{volumeName:kubernetes.io/projected/1f63cb2f-779f-4fde-bf92-cf0414844a77-kube-api-access-wh9cz podName:1f63cb2f-779f-4fde-bf92-cf0414844a77 nodeName:}" failed. No retries permitted until 2026-03-08 00:20:57.005334668 +0000 UTC m=+100.716934190 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-wh9cz" (UniqueName: "kubernetes.io/projected/1f63cb2f-779f-4fde-bf92-cf0414844a77-kube-api-access-wh9cz") pod "network-check-target-w5fjg" (UID: "1f63cb2f-779f-4fde-bf92-cf0414844a77") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 00:20:41.134661 master-0 kubenswrapper[4059]: I0308 00:20:41.134575 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w5fjg"
Mar 08 00:20:41.134846 master-0 kubenswrapper[4059]: E0308 00:20:41.134785 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w5fjg" podUID="1f63cb2f-779f-4fde-bf92-cf0414844a77"
Mar 08 00:20:42.133790 master-0 kubenswrapper[4059]: I0308 00:20:42.133696 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-krv7c"
Mar 08 00:20:42.134698 master-0 kubenswrapper[4059]: E0308 00:20:42.133846 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-krv7c" podUID="815fd565-0609-4d8f-ac05-8656f198b008"
Mar 08 00:20:42.316835 master-0 kubenswrapper[4059]: I0308 00:20:42.316558 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs\") pod \"network-metrics-daemon-krv7c\" (UID: \"815fd565-0609-4d8f-ac05-8656f198b008\") " pod="openshift-multus/network-metrics-daemon-krv7c"
Mar 08 00:20:42.316835 master-0 kubenswrapper[4059]: E0308 00:20:42.316820 4059 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 08 00:20:42.317091 master-0 kubenswrapper[4059]: E0308 00:20:42.316911 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs podName:815fd565-0609-4d8f-ac05-8656f198b008 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:14.31688964 +0000 UTC m=+118.028489162 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs") pod "network-metrics-daemon-krv7c" (UID: "815fd565-0609-4d8f-ac05-8656f198b008") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 08 00:20:43.134467 master-0 kubenswrapper[4059]: I0308 00:20:43.134418 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w5fjg"
Mar 08 00:20:43.134960 master-0 kubenswrapper[4059]: E0308 00:20:43.134544 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w5fjg" podUID="1f63cb2f-779f-4fde-bf92-cf0414844a77"
Mar 08 00:20:44.133686 master-0 kubenswrapper[4059]: I0308 00:20:44.133645 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-krv7c"
Mar 08 00:20:44.133922 master-0 kubenswrapper[4059]: E0308 00:20:44.133753 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-krv7c" podUID="815fd565-0609-4d8f-ac05-8656f198b008"
Mar 08 00:20:45.133801 master-0 kubenswrapper[4059]: I0308 00:20:45.133723 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w5fjg"
Mar 08 00:20:45.134569 master-0 kubenswrapper[4059]: E0308 00:20:45.133848 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w5fjg" podUID="1f63cb2f-779f-4fde-bf92-cf0414844a77"
Mar 08 00:20:46.134054 master-0 kubenswrapper[4059]: I0308 00:20:46.133922 4059 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-krv7c" Mar 08 00:20:46.135150 master-0 kubenswrapper[4059]: E0308 00:20:46.134060 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-krv7c" podUID="815fd565-0609-4d8f-ac05-8656f198b008" Mar 08 00:20:47.133839 master-0 kubenswrapper[4059]: I0308 00:20:47.133726 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w5fjg" Mar 08 00:20:47.135823 master-0 kubenswrapper[4059]: E0308 00:20:47.135018 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w5fjg" podUID="1f63cb2f-779f-4fde-bf92-cf0414844a77" Mar 08 00:20:47.861772 master-0 kubenswrapper[4059]: I0308 00:20:47.861577 4059 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Mar 08 00:20:48.133455 master-0 kubenswrapper[4059]: I0308 00:20:48.133416 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-krv7c" Mar 08 00:20:48.133622 master-0 kubenswrapper[4059]: E0308 00:20:48.133556 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-krv7c" podUID="815fd565-0609-4d8f-ac05-8656f198b008" Mar 08 00:20:48.528551 master-0 kubenswrapper[4059]: I0308 00:20:48.528508 4059 generic.go:334] "Generic (PLEG): container finished" podID="7ad8b9ea-ba1c-4507-9b70-ce2da170d480" containerID="c7031bd4261187339ddcdbbf17642c8a944a5d40ae330e696f51959987e70da4" exitCode=0 Mar 08 00:20:48.529180 master-0 kubenswrapper[4059]: I0308 00:20:48.528574 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d5jxb" event={"ID":"7ad8b9ea-ba1c-4507-9b70-ce2da170d480","Type":"ContainerDied","Data":"c7031bd4261187339ddcdbbf17642c8a944a5d40ae330e696f51959987e70da4"} Mar 08 00:20:48.530126 master-0 kubenswrapper[4059]: I0308 00:20:48.530093 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2" event={"ID":"3fee96d7-75a7-46e4-9707-7bd292f10b84","Type":"ContainerStarted","Data":"52998e126ba781dde5afc9f3fdb3cf64a817b4497f29c74abbb0c4aa09aa4379"} Mar 08 00:20:48.531995 master-0 kubenswrapper[4059]: I0308 00:20:48.531968 4059 generic.go:334] "Generic (PLEG): container finished" podID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerID="4434fcf7523899c16bdc9ff2514392013087b11cd9c0ebc889187563c92c108a" exitCode=0 Mar 08 00:20:48.532083 master-0 kubenswrapper[4059]: I0308 00:20:48.532060 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" event={"ID":"17aa8235-749b-49da-9fcd-cb4bd948f0a5","Type":"ContainerDied","Data":"4434fcf7523899c16bdc9ff2514392013087b11cd9c0ebc889187563c92c108a"} Mar 08 00:20:48.533726 master-0 kubenswrapper[4059]: I0308 00:20:48.533702 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-m7549" 
event={"ID":"af391724-079a-4bac-a89e-978ffd471763","Type":"ContainerStarted","Data":"c9e6fa5d3ccf4015c27e14ffdb2578ad6435947b5bdd16e602ffdf86284246dc"} Mar 08 00:20:48.533780 master-0 kubenswrapper[4059]: I0308 00:20:48.533730 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-m7549" event={"ID":"af391724-079a-4bac-a89e-978ffd471763","Type":"ContainerStarted","Data":"cd375a476d29bf57c7b9e43c8cd23f02bf2bb9a153d14c3da6003737a55dbb0d"} Mar 08 00:20:48.565264 master-0 kubenswrapper[4059]: I0308 00:20:48.565185 4059 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/bootstrap-kube-controller-manager-master-0" podStartSLOduration=1.565154207 podStartE2EDuration="1.565154207s" podCreationTimestamp="2026-03-08 00:20:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:20:48.563333885 +0000 UTC m=+92.274933427" watchObservedRunningTime="2026-03-08 00:20:48.565154207 +0000 UTC m=+92.276753729" Mar 08 00:20:48.599388 master-0 kubenswrapper[4059]: I0308 00:20:48.599167 4059 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-node-identity/network-node-identity-m7549" podStartSLOduration=1.757243351 podStartE2EDuration="21.599144556s" podCreationTimestamp="2026-03-08 00:20:27 +0000 UTC" firstStartedPulling="2026-03-08 00:20:28.374045856 +0000 UTC m=+72.085645378" lastFinishedPulling="2026-03-08 00:20:48.215947061 +0000 UTC m=+91.927546583" observedRunningTime="2026-03-08 00:20:48.59578143 +0000 UTC m=+92.307381002" watchObservedRunningTime="2026-03-08 00:20:48.599144556 +0000 UTC m=+92.310744088" Mar 08 00:20:48.612951 master-0 kubenswrapper[4059]: I0308 00:20:48.612768 4059 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2" podStartSLOduration=3.063249213 
podStartE2EDuration="27.61274038s" podCreationTimestamp="2026-03-08 00:20:21 +0000 UTC" firstStartedPulling="2026-03-08 00:20:23.524168418 +0000 UTC m=+67.235767940" lastFinishedPulling="2026-03-08 00:20:48.073659585 +0000 UTC m=+91.785259107" observedRunningTime="2026-03-08 00:20:48.612631925 +0000 UTC m=+92.324231457" watchObservedRunningTime="2026-03-08 00:20:48.61274038 +0000 UTC m=+92.324339942" Mar 08 00:20:49.133848 master-0 kubenswrapper[4059]: I0308 00:20:49.133468 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w5fjg" Mar 08 00:20:49.134004 master-0 kubenswrapper[4059]: E0308 00:20:49.133968 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w5fjg" podUID="1f63cb2f-779f-4fde-bf92-cf0414844a77" Mar 08 00:20:49.542672 master-0 kubenswrapper[4059]: I0308 00:20:49.542594 4059 generic.go:334] "Generic (PLEG): container finished" podID="7ad8b9ea-ba1c-4507-9b70-ce2da170d480" containerID="ee1bfab2130a9c72df8adc63c3382589fac2b085c9ce4752d92d10429ef61f76" exitCode=0 Mar 08 00:20:49.543838 master-0 kubenswrapper[4059]: I0308 00:20:49.542706 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d5jxb" event={"ID":"7ad8b9ea-ba1c-4507-9b70-ce2da170d480","Type":"ContainerDied","Data":"ee1bfab2130a9c72df8adc63c3382589fac2b085c9ce4752d92d10429ef61f76"} Mar 08 00:20:49.549910 master-0 kubenswrapper[4059]: I0308 00:20:49.549694 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" 
event={"ID":"17aa8235-749b-49da-9fcd-cb4bd948f0a5","Type":"ContainerStarted","Data":"d6eff328ddf5c154d1d9faa1e15dce1c9d119fdbc9642cf6132900ba16f4f2f0"} Mar 08 00:20:49.549910 master-0 kubenswrapper[4059]: I0308 00:20:49.549731 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" event={"ID":"17aa8235-749b-49da-9fcd-cb4bd948f0a5","Type":"ContainerStarted","Data":"0dd7bb2a0dfc9063dde38d2646f27b410a8fc9cd8b6415aa5596ba64aa7ab6bd"} Mar 08 00:20:49.549910 master-0 kubenswrapper[4059]: I0308 00:20:49.549744 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" event={"ID":"17aa8235-749b-49da-9fcd-cb4bd948f0a5","Type":"ContainerStarted","Data":"4d8ea9cb9beab0a422f7804da4b819ae9934f5441003e47d5346b1954ca42d4a"} Mar 08 00:20:49.549910 master-0 kubenswrapper[4059]: I0308 00:20:49.549756 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" event={"ID":"17aa8235-749b-49da-9fcd-cb4bd948f0a5","Type":"ContainerStarted","Data":"e8c793f3a4e6d5f624df14908f64885c436913b6ed379ffbe261c0ca922df7e5"} Mar 08 00:20:49.549910 master-0 kubenswrapper[4059]: I0308 00:20:49.549766 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" event={"ID":"17aa8235-749b-49da-9fcd-cb4bd948f0a5","Type":"ContainerStarted","Data":"b78d898694475d4fa6d413095c98ad96577df52273779c125c249edfadc5bb65"} Mar 08 00:20:49.549910 master-0 kubenswrapper[4059]: I0308 00:20:49.549773 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" event={"ID":"17aa8235-749b-49da-9fcd-cb4bd948f0a5","Type":"ContainerStarted","Data":"596783bce11e4d4ed8c62c1ce5ddbd210cfb9ea1414f739c6d6c6cea3f3f5ef7"} Mar 08 00:20:50.107739 master-0 kubenswrapper[4059]: I0308 00:20:50.107656 4059 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-tf5qg"] Mar 08 
00:20:50.134451 master-0 kubenswrapper[4059]: I0308 00:20:50.134162 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-krv7c" Mar 08 00:20:50.134451 master-0 kubenswrapper[4059]: E0308 00:20:50.134302 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-krv7c" podUID="815fd565-0609-4d8f-ac05-8656f198b008" Mar 08 00:20:50.557452 master-0 kubenswrapper[4059]: I0308 00:20:50.556703 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d5jxb" event={"ID":"7ad8b9ea-ba1c-4507-9b70-ce2da170d480","Type":"ContainerStarted","Data":"67889792ebb5d4e854f7fdede5676d644567a2db7df33390da8134c0d480ee11"} Mar 08 00:20:50.584175 master-0 kubenswrapper[4059]: I0308 00:20:50.584057 4059 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-d5jxb" podStartSLOduration=3.259227219 podStartE2EDuration="41.58401704s" podCreationTimestamp="2026-03-08 00:20:09 +0000 UTC" firstStartedPulling="2026-03-08 00:20:09.755243709 +0000 UTC m=+53.466843271" lastFinishedPulling="2026-03-08 00:20:48.08003355 +0000 UTC m=+91.791633092" observedRunningTime="2026-03-08 00:20:50.58325778 +0000 UTC m=+94.294857312" watchObservedRunningTime="2026-03-08 00:20:50.58401704 +0000 UTC m=+94.295616602" Mar 08 00:20:51.133876 master-0 kubenswrapper[4059]: I0308 00:20:51.133811 4059 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w5fjg" Mar 08 00:20:51.134055 master-0 kubenswrapper[4059]: E0308 00:20:51.134005 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w5fjg" podUID="1f63cb2f-779f-4fde-bf92-cf0414844a77" Mar 08 00:20:51.563350 master-0 kubenswrapper[4059]: I0308 00:20:51.563044 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" event={"ID":"17aa8235-749b-49da-9fcd-cb4bd948f0a5","Type":"ContainerStarted","Data":"285f8a92befddee11e1b44cebb2bda2b34e3135f45619495ec8126c91cff97ca"} Mar 08 00:20:52.134042 master-0 kubenswrapper[4059]: I0308 00:20:52.133982 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-krv7c" Mar 08 00:20:52.136734 master-0 kubenswrapper[4059]: E0308 00:20:52.136673 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-krv7c" podUID="815fd565-0609-4d8f-ac05-8656f198b008" Mar 08 00:20:52.147323 master-0 kubenswrapper[4059]: W0308 00:20:52.147245 4059 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost") Mar 08 00:20:52.148036 master-0 kubenswrapper[4059]: I0308 00:20:52.147596 4059 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0-master-0"] Mar 08 00:20:53.134308 master-0 kubenswrapper[4059]: I0308 00:20:53.134246 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w5fjg" Mar 08 00:20:53.135090 master-0 kubenswrapper[4059]: E0308 00:20:53.134376 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-w5fjg" podUID="1f63cb2f-779f-4fde-bf92-cf0414844a77" Mar 08 00:20:53.577001 master-0 kubenswrapper[4059]: I0308 00:20:53.576849 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" event={"ID":"17aa8235-749b-49da-9fcd-cb4bd948f0a5","Type":"ContainerStarted","Data":"8a206df9a5e00bc2f0057ebb5c80783d7e9aaab9c7600ea52983392822a721b1"} Mar 08 00:20:53.577230 master-0 kubenswrapper[4059]: I0308 00:20:53.577160 4059 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" podUID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerName="ovn-controller" containerID="cri-o://596783bce11e4d4ed8c62c1ce5ddbd210cfb9ea1414f739c6d6c6cea3f3f5ef7" gracePeriod=30 Mar 08 00:20:53.577357 master-0 kubenswrapper[4059]: I0308 00:20:53.577285 4059 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" podUID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerName="northd" containerID="cri-o://0dd7bb2a0dfc9063dde38d2646f27b410a8fc9cd8b6415aa5596ba64aa7ab6bd" gracePeriod=30 Mar 08 00:20:53.577387 master-0 kubenswrapper[4059]: I0308 00:20:53.577304 4059 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" podUID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerName="kube-rbac-proxy-node" containerID="cri-o://e8c793f3a4e6d5f624df14908f64885c436913b6ed379ffbe261c0ca922df7e5" gracePeriod=30 Mar 08 00:20:53.577484 master-0 kubenswrapper[4059]: I0308 00:20:53.577394 4059 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" podUID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://4d8ea9cb9beab0a422f7804da4b819ae9934f5441003e47d5346b1954ca42d4a" gracePeriod=30 Mar 08 00:20:53.577484 master-0 
kubenswrapper[4059]: I0308 00:20:53.577385 4059 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" podUID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerName="ovn-acl-logging" containerID="cri-o://b78d898694475d4fa6d413095c98ad96577df52273779c125c249edfadc5bb65" gracePeriod=30 Mar 08 00:20:53.577484 master-0 kubenswrapper[4059]: I0308 00:20:53.577346 4059 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:53.577561 master-0 kubenswrapper[4059]: I0308 00:20:53.577282 4059 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" podUID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerName="sbdb" containerID="cri-o://285f8a92befddee11e1b44cebb2bda2b34e3135f45619495ec8126c91cff97ca" gracePeriod=30 Mar 08 00:20:53.578533 master-0 kubenswrapper[4059]: I0308 00:20:53.578399 4059 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" podUID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerName="nbdb" containerID="cri-o://d6eff328ddf5c154d1d9faa1e15dce1c9d119fdbc9642cf6132900ba16f4f2f0" gracePeriod=30 Mar 08 00:20:53.581999 master-0 kubenswrapper[4059]: E0308 00:20:53.581917 4059 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="285f8a92befddee11e1b44cebb2bda2b34e3135f45619495ec8126c91cff97ca" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Mar 08 00:20:53.583975 master-0 kubenswrapper[4059]: E0308 00:20:53.583921 4059 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="285f8a92befddee11e1b44cebb2bda2b34e3135f45619495ec8126c91cff97ca" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Mar 08 00:20:53.587607 master-0 kubenswrapper[4059]: E0308 00:20:53.587553 4059 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="285f8a92befddee11e1b44cebb2bda2b34e3135f45619495ec8126c91cff97ca" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Mar 08 00:20:53.587680 master-0 kubenswrapper[4059]: E0308 00:20:53.587619 4059 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" podUID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerName="sbdb" Mar 08 00:20:53.601057 master-0 kubenswrapper[4059]: I0308 00:20:53.600974 4059 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" podUID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerName="ovnkube-controller" containerID="cri-o://8a206df9a5e00bc2f0057ebb5c80783d7e9aaab9c7600ea52983392822a721b1" gracePeriod=30 Mar 08 00:20:53.857170 master-0 kubenswrapper[4059]: I0308 00:20:53.857092 4059 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0-master-0" 
podStartSLOduration=1.857075705 podStartE2EDuration="1.857075705s" podCreationTimestamp="2026-03-08 00:20:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:20:53.855582407 +0000 UTC m=+97.567181919" watchObservedRunningTime="2026-03-08 00:20:53.857075705 +0000 UTC m=+97.568675227" Mar 08 00:20:53.908487 master-0 kubenswrapper[4059]: I0308 00:20:53.908331 4059 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" podStartSLOduration=9.303798847 podStartE2EDuration="32.908275487s" podCreationTimestamp="2026-03-08 00:20:21 +0000 UTC" firstStartedPulling="2026-03-08 00:20:23.344142997 +0000 UTC m=+67.055742529" lastFinishedPulling="2026-03-08 00:20:46.948619627 +0000 UTC m=+90.660219169" observedRunningTime="2026-03-08 00:20:53.906148715 +0000 UTC m=+97.617748247" watchObservedRunningTime="2026-03-08 00:20:53.908275487 +0000 UTC m=+97.619875009" Mar 08 00:20:54.134285 master-0 kubenswrapper[4059]: I0308 00:20:54.134229 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-krv7c" Mar 08 00:20:54.134456 master-0 kubenswrapper[4059]: E0308 00:20:54.134381 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-krv7c" podUID="815fd565-0609-4d8f-ac05-8656f198b008" Mar 08 00:20:54.312254 master-0 kubenswrapper[4059]: I0308 00:20:54.312227 4059 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tf5qg_17aa8235-749b-49da-9fcd-cb4bd948f0a5/ovnkube-controller/0.log" Mar 08 00:20:54.313369 master-0 kubenswrapper[4059]: I0308 00:20:54.313351 4059 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tf5qg_17aa8235-749b-49da-9fcd-cb4bd948f0a5/kube-rbac-proxy-ovn-metrics/0.log" Mar 08 00:20:54.313685 master-0 kubenswrapper[4059]: I0308 00:20:54.313663 4059 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tf5qg_17aa8235-749b-49da-9fcd-cb4bd948f0a5/kube-rbac-proxy-node/0.log" Mar 08 00:20:54.314056 master-0 kubenswrapper[4059]: I0308 00:20:54.314035 4059 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tf5qg_17aa8235-749b-49da-9fcd-cb4bd948f0a5/ovn-acl-logging/0.log" Mar 08 00:20:54.314582 master-0 kubenswrapper[4059]: I0308 00:20:54.314555 4059 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tf5qg_17aa8235-749b-49da-9fcd-cb4bd948f0a5/ovn-controller/0.log" Mar 08 00:20:54.314891 master-0 kubenswrapper[4059]: I0308 00:20:54.314867 4059 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:54.374358 master-0 kubenswrapper[4059]: I0308 00:20:54.374146 4059 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-2w9mf"] Mar 08 00:20:54.375105 master-0 kubenswrapper[4059]: E0308 00:20:54.375078 4059 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerName="ovn-controller" Mar 08 00:20:54.375105 master-0 kubenswrapper[4059]: I0308 00:20:54.375096 4059 state_mem.go:107] "Deleted CPUSet assignment" podUID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerName="ovn-controller" Mar 08 00:20:54.375105 master-0 kubenswrapper[4059]: E0308 00:20:54.375104 4059 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerName="ovn-acl-logging" Mar 08 00:20:54.375243 master-0 kubenswrapper[4059]: I0308 00:20:54.375111 4059 state_mem.go:107] "Deleted CPUSet assignment" podUID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerName="ovn-acl-logging" Mar 08 00:20:54.375243 master-0 kubenswrapper[4059]: E0308 00:20:54.375117 4059 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerName="kube-rbac-proxy-node" Mar 08 00:20:54.375243 master-0 kubenswrapper[4059]: I0308 00:20:54.375125 4059 state_mem.go:107] "Deleted CPUSet assignment" podUID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerName="kube-rbac-proxy-node" Mar 08 00:20:54.375243 master-0 kubenswrapper[4059]: E0308 00:20:54.375133 4059 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerName="northd" Mar 08 00:20:54.375243 master-0 kubenswrapper[4059]: I0308 00:20:54.375139 4059 state_mem.go:107] "Deleted CPUSet assignment" podUID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerName="northd" Mar 08 00:20:54.375243 master-0 kubenswrapper[4059]: E0308 
00:20:54.375146 4059 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerName="nbdb"
Mar 08 00:20:54.375243 master-0 kubenswrapper[4059]: I0308 00:20:54.375152 4059 state_mem.go:107] "Deleted CPUSet assignment" podUID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerName="nbdb"
Mar 08 00:20:54.375243 master-0 kubenswrapper[4059]: E0308 00:20:54.375160 4059 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerName="sbdb"
Mar 08 00:20:54.375243 master-0 kubenswrapper[4059]: I0308 00:20:54.375165 4059 state_mem.go:107] "Deleted CPUSet assignment" podUID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerName="sbdb"
Mar 08 00:20:54.375243 master-0 kubenswrapper[4059]: E0308 00:20:54.375173 4059 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerName="kube-rbac-proxy-ovn-metrics"
Mar 08 00:20:54.375243 master-0 kubenswrapper[4059]: I0308 00:20:54.375179 4059 state_mem.go:107] "Deleted CPUSet assignment" podUID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerName="kube-rbac-proxy-ovn-metrics"
Mar 08 00:20:54.375243 master-0 kubenswrapper[4059]: E0308 00:20:54.375184 4059 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerName="ovnkube-controller"
Mar 08 00:20:54.375243 master-0 kubenswrapper[4059]: I0308 00:20:54.375191 4059 state_mem.go:107] "Deleted CPUSet assignment" podUID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerName="ovnkube-controller"
Mar 08 00:20:54.375243 master-0 kubenswrapper[4059]: E0308 00:20:54.375212 4059 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerName="kubecfg-setup"
Mar 08 00:20:54.375243 master-0 kubenswrapper[4059]: I0308 00:20:54.375243 4059 state_mem.go:107] "Deleted CPUSet assignment" podUID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerName="kubecfg-setup"
Mar 08 00:20:54.375607 master-0 kubenswrapper[4059]: I0308 00:20:54.375283 4059 memory_manager.go:354] "RemoveStaleState removing state" podUID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerName="ovn-acl-logging"
Mar 08 00:20:54.375607 master-0 kubenswrapper[4059]: I0308 00:20:54.375290 4059 memory_manager.go:354] "RemoveStaleState removing state" podUID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerName="kube-rbac-proxy-ovn-metrics"
Mar 08 00:20:54.375607 master-0 kubenswrapper[4059]: I0308 00:20:54.375296 4059 memory_manager.go:354] "RemoveStaleState removing state" podUID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerName="northd"
Mar 08 00:20:54.375607 master-0 kubenswrapper[4059]: I0308 00:20:54.375301 4059 memory_manager.go:354] "RemoveStaleState removing state" podUID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerName="kube-rbac-proxy-node"
Mar 08 00:20:54.375607 master-0 kubenswrapper[4059]: I0308 00:20:54.375306 4059 memory_manager.go:354] "RemoveStaleState removing state" podUID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerName="ovn-controller"
Mar 08 00:20:54.375607 master-0 kubenswrapper[4059]: I0308 00:20:54.375312 4059 memory_manager.go:354] "RemoveStaleState removing state" podUID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerName="sbdb"
Mar 08 00:20:54.375607 master-0 kubenswrapper[4059]: I0308 00:20:54.375318 4059 memory_manager.go:354] "RemoveStaleState removing state" podUID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerName="ovnkube-controller"
Mar 08 00:20:54.375607 master-0 kubenswrapper[4059]: I0308 00:20:54.375323 4059 memory_manager.go:354] "RemoveStaleState removing state" podUID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerName="nbdb"
Mar 08 00:20:54.375919 master-0 kubenswrapper[4059]: I0308 00:20:54.375897 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.414562 master-0 kubenswrapper[4059]: I0308 00:20:54.414504 4059 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-etc-openvswitch\") pod \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") "
Mar 08 00:20:54.414562 master-0 kubenswrapper[4059]: I0308 00:20:54.414550 4059 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-node-log\") pod \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") "
Mar 08 00:20:54.414813 master-0 kubenswrapper[4059]: I0308 00:20:54.414580 4059 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/17aa8235-749b-49da-9fcd-cb4bd948f0a5-ovn-node-metrics-cert\") pod \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") "
Mar 08 00:20:54.414813 master-0 kubenswrapper[4059]: I0308 00:20:54.414603 4059 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-run-ovn-kubernetes\") pod \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") "
Mar 08 00:20:54.414813 master-0 kubenswrapper[4059]: I0308 00:20:54.414625 4059 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/17aa8235-749b-49da-9fcd-cb4bd948f0a5-ovnkube-config\") pod \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") "
Mar 08 00:20:54.414813 master-0 kubenswrapper[4059]: I0308 00:20:54.414642 4059 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-cni-netd\") pod \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") "
Mar 08 00:20:54.414813 master-0 kubenswrapper[4059]: I0308 00:20:54.414660 4059 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-run-systemd\") pod \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") "
Mar 08 00:20:54.414813 master-0 kubenswrapper[4059]: I0308 00:20:54.414680 4059 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-run-openvswitch\") pod \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") "
Mar 08 00:20:54.414813 master-0 kubenswrapper[4059]: I0308 00:20:54.414697 4059 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-cni-bin\") pod \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") "
Mar 08 00:20:54.414813 master-0 kubenswrapper[4059]: I0308 00:20:54.414715 4059 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-slash\") pod \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") "
Mar 08 00:20:54.414813 master-0 kubenswrapper[4059]: I0308 00:20:54.414734 4059 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-run-ovn\") pod \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") "
Mar 08 00:20:54.414813 master-0 kubenswrapper[4059]: I0308 00:20:54.414755 4059 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6n4s\" (UniqueName: \"kubernetes.io/projected/17aa8235-749b-49da-9fcd-cb4bd948f0a5-kube-api-access-w6n4s\") pod \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") "
Mar 08 00:20:54.414813 master-0 kubenswrapper[4059]: I0308 00:20:54.414772 4059 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-log-socket\") pod \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") "
Mar 08 00:20:54.414813 master-0 kubenswrapper[4059]: I0308 00:20:54.414793 4059 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-var-lib-openvswitch\") pod \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") "
Mar 08 00:20:54.414813 master-0 kubenswrapper[4059]: I0308 00:20:54.414814 4059 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-kubelet\") pod \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") "
Mar 08 00:20:54.415342 master-0 kubenswrapper[4059]: I0308 00:20:54.414836 4059 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-run-netns\") pod \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") "
Mar 08 00:20:54.415342 master-0 kubenswrapper[4059]: I0308 00:20:54.415075 4059 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "17aa8235-749b-49da-9fcd-cb4bd948f0a5" (UID: "17aa8235-749b-49da-9fcd-cb4bd948f0a5"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:20:54.415342 master-0 kubenswrapper[4059]: I0308 00:20:54.415101 4059 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/17aa8235-749b-49da-9fcd-cb4bd948f0a5-env-overrides\") pod \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") "
Mar 08 00:20:54.415342 master-0 kubenswrapper[4059]: I0308 00:20:54.415111 4059 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "17aa8235-749b-49da-9fcd-cb4bd948f0a5" (UID: "17aa8235-749b-49da-9fcd-cb4bd948f0a5"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:20:54.415342 master-0 kubenswrapper[4059]: I0308 00:20:54.415128 4059 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-systemd-units\") pod \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") "
Mar 08 00:20:54.415342 master-0 kubenswrapper[4059]: I0308 00:20:54.415149 4059 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") "
Mar 08 00:20:54.415342 master-0 kubenswrapper[4059]: I0308 00:20:54.415172 4059 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/17aa8235-749b-49da-9fcd-cb4bd948f0a5-ovnkube-script-lib\") pod \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\" (UID: \"17aa8235-749b-49da-9fcd-cb4bd948f0a5\") "
Mar 08 00:20:54.415342 master-0 kubenswrapper[4059]: I0308 00:20:54.415236 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-node-log\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.415342 master-0 kubenswrapper[4059]: I0308 00:20:54.415261 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-ovnkube-script-lib\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.415342 master-0 kubenswrapper[4059]: I0308 00:20:54.415299 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-cni-bin\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.415342 master-0 kubenswrapper[4059]: I0308 00:20:54.415318 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-kubelet\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.415342 master-0 kubenswrapper[4059]: I0308 00:20:54.415338 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-slash\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.415791 master-0 kubenswrapper[4059]: I0308 00:20:54.415356 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-systemd-units\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.415791 master-0 kubenswrapper[4059]: I0308 00:20:54.415380 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-ovn-node-metrics-cert\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.415791 master-0 kubenswrapper[4059]: I0308 00:20:54.415400 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-log-socket\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.415791 master-0 kubenswrapper[4059]: I0308 00:20:54.415418 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-run-ovn-kubernetes\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.415791 master-0 kubenswrapper[4059]: I0308 00:20:54.415447 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-etc-openvswitch\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.415791 master-0 kubenswrapper[4059]: I0308 00:20:54.415478 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-run-netns\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.415791 master-0 kubenswrapper[4059]: I0308 00:20:54.415497 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s99rr\" (UniqueName: \"kubernetes.io/projected/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-kube-api-access-s99rr\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.415791 master-0 kubenswrapper[4059]: I0308 00:20:54.415517 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-var-lib-openvswitch\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.415791 master-0 kubenswrapper[4059]: I0308 00:20:54.415539 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-run-openvswitch\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.415791 master-0 kubenswrapper[4059]: I0308 00:20:54.415557 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-cni-netd\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.415791 master-0 kubenswrapper[4059]: I0308 00:20:54.415578 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-run-ovn\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.415791 master-0 kubenswrapper[4059]: I0308 00:20:54.415596 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-ovnkube-config\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.415791 master-0 kubenswrapper[4059]: I0308 00:20:54.415617 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-run-systemd\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.415791 master-0 kubenswrapper[4059]: I0308 00:20:54.415640 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.415791 master-0 kubenswrapper[4059]: I0308 00:20:54.415662 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-env-overrides\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.415791 master-0 kubenswrapper[4059]: I0308 00:20:54.415688 4059 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-run-openvswitch\") on node \"master-0\" DevicePath \"\""
Mar 08 00:20:54.415791 master-0 kubenswrapper[4059]: I0308 00:20:54.415702 4059 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-etc-openvswitch\") on node \"master-0\" DevicePath \"\""
Mar 08 00:20:54.416518 master-0 kubenswrapper[4059]: I0308 00:20:54.415129 4059 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-node-log" (OuterVolumeSpecName: "node-log") pod "17aa8235-749b-49da-9fcd-cb4bd948f0a5" (UID: "17aa8235-749b-49da-9fcd-cb4bd948f0a5"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:20:54.416518 master-0 kubenswrapper[4059]: I0308 00:20:54.416484 4059 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17aa8235-749b-49da-9fcd-cb4bd948f0a5-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "17aa8235-749b-49da-9fcd-cb4bd948f0a5" (UID: "17aa8235-749b-49da-9fcd-cb4bd948f0a5"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:20:54.416518 master-0 kubenswrapper[4059]: I0308 00:20:54.415754 4059 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-slash" (OuterVolumeSpecName: "host-slash") pod "17aa8235-749b-49da-9fcd-cb4bd948f0a5" (UID: "17aa8235-749b-49da-9fcd-cb4bd948f0a5"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:20:54.416639 master-0 kubenswrapper[4059]: I0308 00:20:54.415778 4059 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "17aa8235-749b-49da-9fcd-cb4bd948f0a5" (UID: "17aa8235-749b-49da-9fcd-cb4bd948f0a5"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:20:54.416639 master-0 kubenswrapper[4059]: I0308 00:20:54.415868 4059 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "17aa8235-749b-49da-9fcd-cb4bd948f0a5" (UID: "17aa8235-749b-49da-9fcd-cb4bd948f0a5"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:20:54.416639 master-0 kubenswrapper[4059]: I0308 00:20:54.415943 4059 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "17aa8235-749b-49da-9fcd-cb4bd948f0a5" (UID: "17aa8235-749b-49da-9fcd-cb4bd948f0a5"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:20:54.416639 master-0 kubenswrapper[4059]: I0308 00:20:54.416021 4059 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "17aa8235-749b-49da-9fcd-cb4bd948f0a5" (UID: "17aa8235-749b-49da-9fcd-cb4bd948f0a5"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:20:54.416639 master-0 kubenswrapper[4059]: I0308 00:20:54.416519 4059 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17aa8235-749b-49da-9fcd-cb4bd948f0a5-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "17aa8235-749b-49da-9fcd-cb4bd948f0a5" (UID: "17aa8235-749b-49da-9fcd-cb4bd948f0a5"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:20:54.416639 master-0 kubenswrapper[4059]: I0308 00:20:54.416028 4059 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "17aa8235-749b-49da-9fcd-cb4bd948f0a5" (UID: "17aa8235-749b-49da-9fcd-cb4bd948f0a5"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:20:54.416639 master-0 kubenswrapper[4059]: I0308 00:20:54.416042 4059 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-log-socket" (OuterVolumeSpecName: "log-socket") pod "17aa8235-749b-49da-9fcd-cb4bd948f0a5" (UID: "17aa8235-749b-49da-9fcd-cb4bd948f0a5"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:20:54.416639 master-0 kubenswrapper[4059]: I0308 00:20:54.416055 4059 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "17aa8235-749b-49da-9fcd-cb4bd948f0a5" (UID: "17aa8235-749b-49da-9fcd-cb4bd948f0a5"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:20:54.416639 master-0 kubenswrapper[4059]: I0308 00:20:54.416062 4059 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "17aa8235-749b-49da-9fcd-cb4bd948f0a5" (UID: "17aa8235-749b-49da-9fcd-cb4bd948f0a5"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:20:54.416639 master-0 kubenswrapper[4059]: I0308 00:20:54.415739 4059 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "17aa8235-749b-49da-9fcd-cb4bd948f0a5" (UID: "17aa8235-749b-49da-9fcd-cb4bd948f0a5"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:20:54.416639 master-0 kubenswrapper[4059]: I0308 00:20:54.416540 4059 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "17aa8235-749b-49da-9fcd-cb4bd948f0a5" (UID: "17aa8235-749b-49da-9fcd-cb4bd948f0a5"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:20:54.417075 master-0 kubenswrapper[4059]: I0308 00:20:54.416738 4059 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17aa8235-749b-49da-9fcd-cb4bd948f0a5-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "17aa8235-749b-49da-9fcd-cb4bd948f0a5" (UID: "17aa8235-749b-49da-9fcd-cb4bd948f0a5"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:20:54.419080 master-0 kubenswrapper[4059]: I0308 00:20:54.418710 4059 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17aa8235-749b-49da-9fcd-cb4bd948f0a5-kube-api-access-w6n4s" (OuterVolumeSpecName: "kube-api-access-w6n4s") pod "17aa8235-749b-49da-9fcd-cb4bd948f0a5" (UID: "17aa8235-749b-49da-9fcd-cb4bd948f0a5"). InnerVolumeSpecName "kube-api-access-w6n4s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:20:54.419151 master-0 kubenswrapper[4059]: I0308 00:20:54.418878 4059 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17aa8235-749b-49da-9fcd-cb4bd948f0a5-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "17aa8235-749b-49da-9fcd-cb4bd948f0a5" (UID: "17aa8235-749b-49da-9fcd-cb4bd948f0a5"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:20:54.426853 master-0 kubenswrapper[4059]: I0308 00:20:54.424737 4059 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "17aa8235-749b-49da-9fcd-cb4bd948f0a5" (UID: "17aa8235-749b-49da-9fcd-cb4bd948f0a5"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:20:54.516522 master-0 kubenswrapper[4059]: I0308 00:20:54.516402 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.516522 master-0 kubenswrapper[4059]: I0308 00:20:54.516473 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-run-systemd\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.516808 master-0 kubenswrapper[4059]: I0308 00:20:54.516537 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.516808 master-0 kubenswrapper[4059]: I0308 00:20:54.516620 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-env-overrides\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.516808 master-0 kubenswrapper[4059]: I0308 00:20:54.516646 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-node-log\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.516808 master-0 kubenswrapper[4059]: I0308 00:20:54.516661 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-ovnkube-script-lib\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.516808 master-0 kubenswrapper[4059]: I0308 00:20:54.516715 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-cni-bin\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.516808 master-0 kubenswrapper[4059]: I0308 00:20:54.516733 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-kubelet\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.516808 master-0 kubenswrapper[4059]: I0308 00:20:54.516749 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-slash\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.516808 master-0 kubenswrapper[4059]: I0308 00:20:54.516762 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-systemd-units\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.516808 master-0 kubenswrapper[4059]: I0308 00:20:54.516777 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-ovn-node-metrics-cert\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.516808 master-0 kubenswrapper[4059]: I0308 00:20:54.516790 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-log-socket\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.516808 master-0 kubenswrapper[4059]: I0308 00:20:54.516805 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-run-ovn-kubernetes\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.517191 master-0 kubenswrapper[4059]: I0308 00:20:54.516827 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-etc-openvswitch\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.517191 master-0 kubenswrapper[4059]: I0308 00:20:54.516852 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s99rr\" (UniqueName: \"kubernetes.io/projected/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-kube-api-access-s99rr\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.517191 master-0 kubenswrapper[4059]: I0308 00:20:54.516867 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-run-netns\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.517191 master-0 kubenswrapper[4059]: I0308 00:20:54.516880 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-var-lib-openvswitch\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.517191 master-0 kubenswrapper[4059]: I0308 00:20:54.516894 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-run-openvswitch\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.517191 master-0 kubenswrapper[4059]: I0308 00:20:54.516907 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-cni-netd\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.517191 master-0 kubenswrapper[4059]: I0308 00:20:54.516921 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-run-ovn\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.517191 master-0 kubenswrapper[4059]: I0308 00:20:54.516936 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-ovnkube-config\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.517191 master-0 kubenswrapper[4059]: I0308 00:20:54.516960 4059 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-slash\") on node \"master-0\" DevicePath \"\""
Mar 08 00:20:54.517191 master-0 kubenswrapper[4059]: I0308 00:20:54.516969 4059 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-run-ovn\") on node \"master-0\" DevicePath \"\""
Mar 08 00:20:54.517191 master-0 kubenswrapper[4059]: I0308 00:20:54.516978 4059 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6n4s\" (UniqueName: \"kubernetes.io/projected/17aa8235-749b-49da-9fcd-cb4bd948f0a5-kube-api-access-w6n4s\") on node \"master-0\" DevicePath \"\""
Mar 08 00:20:54.517191 master-0 kubenswrapper[4059]: I0308 00:20:54.516987 4059 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-log-socket\") on node \"master-0\" DevicePath \"\""
Mar 08 00:20:54.517191 master-0 kubenswrapper[4059]: I0308 00:20:54.516996 4059 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-var-lib-openvswitch\") on node \"master-0\" DevicePath \"\""
Mar 08 00:20:54.517191 master-0 kubenswrapper[4059]: I0308 00:20:54.517006 4059 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-kubelet\") on node \"master-0\" DevicePath \"\""
Mar 08 00:20:54.517191 master-0 kubenswrapper[4059]: I0308 00:20:54.517013 4059 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-run-netns\") on node \"master-0\" DevicePath \"\""
Mar 08 00:20:54.517191 master-0 kubenswrapper[4059]: I0308 00:20:54.517021 4059 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/17aa8235-749b-49da-9fcd-cb4bd948f0a5-env-overrides\") on node \"master-0\" DevicePath \"\""
Mar 08 00:20:54.517191 master-0 kubenswrapper[4059]: I0308 00:20:54.517030 4059 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-systemd-units\") on node
\"master-0\" DevicePath \"\"" Mar 08 00:20:54.517191 master-0 kubenswrapper[4059]: I0308 00:20:54.517041 4059 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/17aa8235-749b-49da-9fcd-cb4bd948f0a5-ovnkube-script-lib\") on node \"master-0\" DevicePath \"\"" Mar 08 00:20:54.517191 master-0 kubenswrapper[4059]: I0308 00:20:54.517049 4059 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-var-lib-cni-networks-ovn-kubernetes\") on node \"master-0\" DevicePath \"\"" Mar 08 00:20:54.517191 master-0 kubenswrapper[4059]: I0308 00:20:54.517058 4059 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-node-log\") on node \"master-0\" DevicePath \"\"" Mar 08 00:20:54.517191 master-0 kubenswrapper[4059]: I0308 00:20:54.517067 4059 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/17aa8235-749b-49da-9fcd-cb4bd948f0a5-ovn-node-metrics-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 00:20:54.517191 master-0 kubenswrapper[4059]: I0308 00:20:54.517075 4059 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/17aa8235-749b-49da-9fcd-cb4bd948f0a5-ovnkube-config\") on node \"master-0\" DevicePath \"\"" Mar 08 00:20:54.517959 master-0 kubenswrapper[4059]: I0308 00:20:54.517083 4059 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-run-ovn-kubernetes\") on node \"master-0\" DevicePath \"\"" Mar 08 00:20:54.517959 master-0 kubenswrapper[4059]: I0308 00:20:54.517092 4059 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-cni-netd\") on node \"master-0\" DevicePath \"\"" Mar 08 00:20:54.517959 master-0 kubenswrapper[4059]: I0308 00:20:54.517099 4059 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-run-systemd\") on node \"master-0\" DevicePath \"\"" Mar 08 00:20:54.517959 master-0 kubenswrapper[4059]: I0308 00:20:54.517108 4059 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/17aa8235-749b-49da-9fcd-cb4bd948f0a5-host-cni-bin\") on node \"master-0\" DevicePath \"\"" Mar 08 00:20:54.517959 master-0 kubenswrapper[4059]: I0308 00:20:54.517337 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-run-openvswitch\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:20:54.517959 master-0 kubenswrapper[4059]: I0308 00:20:54.517369 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-var-lib-openvswitch\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:20:54.517959 master-0 kubenswrapper[4059]: I0308 00:20:54.517358 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-run-netns\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:20:54.517959 master-0 kubenswrapper[4059]: I0308 00:20:54.517399 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-run-ovn\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:20:54.517959 master-0 kubenswrapper[4059]: I0308 00:20:54.517373 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-cni-netd\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:20:54.517959 master-0 kubenswrapper[4059]: I0308 00:20:54.517420 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-kubelet\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:20:54.517959 master-0 kubenswrapper[4059]: I0308 00:20:54.517481 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-node-log\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:20:54.517959 master-0 kubenswrapper[4059]: I0308 00:20:54.517505 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-log-socket\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:20:54.517959 master-0 kubenswrapper[4059]: I0308 00:20:54.517528 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-run-systemd\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:20:54.517959 master-0 kubenswrapper[4059]: I0308 00:20:54.517694 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-run-ovn-kubernetes\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:20:54.517959 master-0 kubenswrapper[4059]: I0308 00:20:54.517739 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-etc-openvswitch\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:20:54.517959 master-0 kubenswrapper[4059]: I0308 00:20:54.517761 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-cni-bin\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:20:54.517959 master-0 kubenswrapper[4059]: I0308 00:20:54.517763 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-ovnkube-config\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:20:54.517959 master-0 kubenswrapper[4059]: I0308 00:20:54.517786 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-slash\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:20:54.517959 master-0 kubenswrapper[4059]: I0308 00:20:54.517868 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-env-overrides\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:20:54.518628 master-0 kubenswrapper[4059]: I0308 00:20:54.517922 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-systemd-units\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:20:54.518628 master-0 kubenswrapper[4059]: I0308 00:20:54.518388 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-ovnkube-script-lib\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:20:54.519905 master-0 kubenswrapper[4059]: I0308 00:20:54.519854 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-ovn-node-metrics-cert\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:20:54.536077 master-0 kubenswrapper[4059]: I0308 00:20:54.536034 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s99rr\" (UniqueName: 
\"kubernetes.io/projected/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-kube-api-access-s99rr\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:20:54.580696 master-0 kubenswrapper[4059]: I0308 00:20:54.580608 4059 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tf5qg_17aa8235-749b-49da-9fcd-cb4bd948f0a5/ovnkube-controller/0.log" Mar 08 00:20:54.582368 master-0 kubenswrapper[4059]: I0308 00:20:54.582298 4059 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tf5qg_17aa8235-749b-49da-9fcd-cb4bd948f0a5/kube-rbac-proxy-ovn-metrics/0.log" Mar 08 00:20:54.582684 master-0 kubenswrapper[4059]: I0308 00:20:54.582639 4059 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tf5qg_17aa8235-749b-49da-9fcd-cb4bd948f0a5/kube-rbac-proxy-node/0.log" Mar 08 00:20:54.583143 master-0 kubenswrapper[4059]: I0308 00:20:54.583093 4059 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tf5qg_17aa8235-749b-49da-9fcd-cb4bd948f0a5/ovn-acl-logging/0.log" Mar 08 00:20:54.583518 master-0 kubenswrapper[4059]: I0308 00:20:54.583446 4059 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tf5qg_17aa8235-749b-49da-9fcd-cb4bd948f0a5/ovn-controller/0.log" Mar 08 00:20:54.585458 master-0 kubenswrapper[4059]: I0308 00:20:54.583744 4059 generic.go:334] "Generic (PLEG): container finished" podID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerID="8a206df9a5e00bc2f0057ebb5c80783d7e9aaab9c7600ea52983392822a721b1" exitCode=143 Mar 08 00:20:54.585458 master-0 kubenswrapper[4059]: I0308 00:20:54.583763 4059 generic.go:334] "Generic (PLEG): container finished" podID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerID="285f8a92befddee11e1b44cebb2bda2b34e3135f45619495ec8126c91cff97ca" exitCode=0 Mar 08 
00:20:54.585458 master-0 kubenswrapper[4059]: I0308 00:20:54.583770 4059 generic.go:334] "Generic (PLEG): container finished" podID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerID="d6eff328ddf5c154d1d9faa1e15dce1c9d119fdbc9642cf6132900ba16f4f2f0" exitCode=0 Mar 08 00:20:54.585458 master-0 kubenswrapper[4059]: I0308 00:20:54.583775 4059 generic.go:334] "Generic (PLEG): container finished" podID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerID="0dd7bb2a0dfc9063dde38d2646f27b410a8fc9cd8b6415aa5596ba64aa7ab6bd" exitCode=0 Mar 08 00:20:54.585458 master-0 kubenswrapper[4059]: I0308 00:20:54.583781 4059 generic.go:334] "Generic (PLEG): container finished" podID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerID="4d8ea9cb9beab0a422f7804da4b819ae9934f5441003e47d5346b1954ca42d4a" exitCode=143 Mar 08 00:20:54.585458 master-0 kubenswrapper[4059]: I0308 00:20:54.583787 4059 generic.go:334] "Generic (PLEG): container finished" podID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerID="e8c793f3a4e6d5f624df14908f64885c436913b6ed379ffbe261c0ca922df7e5" exitCode=143 Mar 08 00:20:54.585458 master-0 kubenswrapper[4059]: I0308 00:20:54.583792 4059 generic.go:334] "Generic (PLEG): container finished" podID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerID="b78d898694475d4fa6d413095c98ad96577df52273779c125c249edfadc5bb65" exitCode=143 Mar 08 00:20:54.585458 master-0 kubenswrapper[4059]: I0308 00:20:54.583798 4059 generic.go:334] "Generic (PLEG): container finished" podID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" containerID="596783bce11e4d4ed8c62c1ce5ddbd210cfb9ea1414f739c6d6c6cea3f3f5ef7" exitCode=143 Mar 08 00:20:54.585458 master-0 kubenswrapper[4059]: I0308 00:20:54.583813 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" event={"ID":"17aa8235-749b-49da-9fcd-cb4bd948f0a5","Type":"ContainerDied","Data":"8a206df9a5e00bc2f0057ebb5c80783d7e9aaab9c7600ea52983392822a721b1"} Mar 08 00:20:54.585458 master-0 kubenswrapper[4059]: I0308 
00:20:54.583835 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" event={"ID":"17aa8235-749b-49da-9fcd-cb4bd948f0a5","Type":"ContainerDied","Data":"285f8a92befddee11e1b44cebb2bda2b34e3135f45619495ec8126c91cff97ca"} Mar 08 00:20:54.585458 master-0 kubenswrapper[4059]: I0308 00:20:54.583845 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" event={"ID":"17aa8235-749b-49da-9fcd-cb4bd948f0a5","Type":"ContainerDied","Data":"d6eff328ddf5c154d1d9faa1e15dce1c9d119fdbc9642cf6132900ba16f4f2f0"} Mar 08 00:20:54.585458 master-0 kubenswrapper[4059]: I0308 00:20:54.583855 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" event={"ID":"17aa8235-749b-49da-9fcd-cb4bd948f0a5","Type":"ContainerDied","Data":"0dd7bb2a0dfc9063dde38d2646f27b410a8fc9cd8b6415aa5596ba64aa7ab6bd"} Mar 08 00:20:54.585458 master-0 kubenswrapper[4059]: I0308 00:20:54.583864 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" event={"ID":"17aa8235-749b-49da-9fcd-cb4bd948f0a5","Type":"ContainerDied","Data":"4d8ea9cb9beab0a422f7804da4b819ae9934f5441003e47d5346b1954ca42d4a"} Mar 08 00:20:54.585458 master-0 kubenswrapper[4059]: I0308 00:20:54.583873 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" event={"ID":"17aa8235-749b-49da-9fcd-cb4bd948f0a5","Type":"ContainerDied","Data":"e8c793f3a4e6d5f624df14908f64885c436913b6ed379ffbe261c0ca922df7e5"} Mar 08 00:20:54.585458 master-0 kubenswrapper[4059]: I0308 00:20:54.583883 4059 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b78d898694475d4fa6d413095c98ad96577df52273779c125c249edfadc5bb65"} Mar 08 00:20:54.585458 master-0 kubenswrapper[4059]: I0308 00:20:54.583954 4059 pod_container_deletor.go:114] "Failed to issue the request to remove 
container" containerID={"Type":"cri-o","ID":"596783bce11e4d4ed8c62c1ce5ddbd210cfb9ea1414f739c6d6c6cea3f3f5ef7"} Mar 08 00:20:54.585458 master-0 kubenswrapper[4059]: I0308 00:20:54.583960 4059 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4434fcf7523899c16bdc9ff2514392013087b11cd9c0ebc889187563c92c108a"} Mar 08 00:20:54.585458 master-0 kubenswrapper[4059]: I0308 00:20:54.583967 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" event={"ID":"17aa8235-749b-49da-9fcd-cb4bd948f0a5","Type":"ContainerDied","Data":"b78d898694475d4fa6d413095c98ad96577df52273779c125c249edfadc5bb65"} Mar 08 00:20:54.585458 master-0 kubenswrapper[4059]: I0308 00:20:54.583975 4059 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8a206df9a5e00bc2f0057ebb5c80783d7e9aaab9c7600ea52983392822a721b1"} Mar 08 00:20:54.585458 master-0 kubenswrapper[4059]: I0308 00:20:54.583981 4059 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"285f8a92befddee11e1b44cebb2bda2b34e3135f45619495ec8126c91cff97ca"} Mar 08 00:20:54.585458 master-0 kubenswrapper[4059]: I0308 00:20:54.583987 4059 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d6eff328ddf5c154d1d9faa1e15dce1c9d119fdbc9642cf6132900ba16f4f2f0"} Mar 08 00:20:54.585458 master-0 kubenswrapper[4059]: I0308 00:20:54.583992 4059 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0dd7bb2a0dfc9063dde38d2646f27b410a8fc9cd8b6415aa5596ba64aa7ab6bd"} Mar 08 00:20:54.585458 master-0 kubenswrapper[4059]: I0308 00:20:54.583997 4059 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"4d8ea9cb9beab0a422f7804da4b819ae9934f5441003e47d5346b1954ca42d4a"} Mar 08 00:20:54.585458 master-0 kubenswrapper[4059]: I0308 00:20:54.584002 4059 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e8c793f3a4e6d5f624df14908f64885c436913b6ed379ffbe261c0ca922df7e5"} Mar 08 00:20:54.585458 master-0 kubenswrapper[4059]: I0308 00:20:54.584007 4059 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b78d898694475d4fa6d413095c98ad96577df52273779c125c249edfadc5bb65"} Mar 08 00:20:54.585458 master-0 kubenswrapper[4059]: I0308 00:20:54.584012 4059 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"596783bce11e4d4ed8c62c1ce5ddbd210cfb9ea1414f739c6d6c6cea3f3f5ef7"} Mar 08 00:20:54.586558 master-0 kubenswrapper[4059]: I0308 00:20:54.584017 4059 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4434fcf7523899c16bdc9ff2514392013087b11cd9c0ebc889187563c92c108a"} Mar 08 00:20:54.586558 master-0 kubenswrapper[4059]: I0308 00:20:54.584023 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" event={"ID":"17aa8235-749b-49da-9fcd-cb4bd948f0a5","Type":"ContainerDied","Data":"596783bce11e4d4ed8c62c1ce5ddbd210cfb9ea1414f739c6d6c6cea3f3f5ef7"} Mar 08 00:20:54.586558 master-0 kubenswrapper[4059]: I0308 00:20:54.584030 4059 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8a206df9a5e00bc2f0057ebb5c80783d7e9aaab9c7600ea52983392822a721b1"} Mar 08 00:20:54.586558 master-0 kubenswrapper[4059]: I0308 00:20:54.584036 4059 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"285f8a92befddee11e1b44cebb2bda2b34e3135f45619495ec8126c91cff97ca"} Mar 
08 00:20:54.586558 master-0 kubenswrapper[4059]: I0308 00:20:54.584042 4059 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d6eff328ddf5c154d1d9faa1e15dce1c9d119fdbc9642cf6132900ba16f4f2f0"} Mar 08 00:20:54.586558 master-0 kubenswrapper[4059]: I0308 00:20:54.584047 4059 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0dd7bb2a0dfc9063dde38d2646f27b410a8fc9cd8b6415aa5596ba64aa7ab6bd"} Mar 08 00:20:54.586558 master-0 kubenswrapper[4059]: I0308 00:20:54.584052 4059 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4d8ea9cb9beab0a422f7804da4b819ae9934f5441003e47d5346b1954ca42d4a"} Mar 08 00:20:54.586558 master-0 kubenswrapper[4059]: I0308 00:20:54.584057 4059 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e8c793f3a4e6d5f624df14908f64885c436913b6ed379ffbe261c0ca922df7e5"} Mar 08 00:20:54.586558 master-0 kubenswrapper[4059]: I0308 00:20:54.584061 4059 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b78d898694475d4fa6d413095c98ad96577df52273779c125c249edfadc5bb65"} Mar 08 00:20:54.586558 master-0 kubenswrapper[4059]: I0308 00:20:54.584067 4059 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"596783bce11e4d4ed8c62c1ce5ddbd210cfb9ea1414f739c6d6c6cea3f3f5ef7"} Mar 08 00:20:54.586558 master-0 kubenswrapper[4059]: I0308 00:20:54.584072 4059 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4434fcf7523899c16bdc9ff2514392013087b11cd9c0ebc889187563c92c108a"} Mar 08 00:20:54.586558 master-0 kubenswrapper[4059]: I0308 00:20:54.584080 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" event={"ID":"17aa8235-749b-49da-9fcd-cb4bd948f0a5","Type":"ContainerDied","Data":"754fcfaa5e2efe3eedf9b613fba862c812cd8de913348e84d01b6035a79a3ae6"} Mar 08 00:20:54.586558 master-0 kubenswrapper[4059]: I0308 00:20:54.584088 4059 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8a206df9a5e00bc2f0057ebb5c80783d7e9aaab9c7600ea52983392822a721b1"} Mar 08 00:20:54.586558 master-0 kubenswrapper[4059]: I0308 00:20:54.584094 4059 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"285f8a92befddee11e1b44cebb2bda2b34e3135f45619495ec8126c91cff97ca"} Mar 08 00:20:54.586558 master-0 kubenswrapper[4059]: I0308 00:20:54.584099 4059 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d6eff328ddf5c154d1d9faa1e15dce1c9d119fdbc9642cf6132900ba16f4f2f0"} Mar 08 00:20:54.586558 master-0 kubenswrapper[4059]: I0308 00:20:54.584104 4059 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0dd7bb2a0dfc9063dde38d2646f27b410a8fc9cd8b6415aa5596ba64aa7ab6bd"} Mar 08 00:20:54.586558 master-0 kubenswrapper[4059]: I0308 00:20:54.584109 4059 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4d8ea9cb9beab0a422f7804da4b819ae9934f5441003e47d5346b1954ca42d4a"} Mar 08 00:20:54.586558 master-0 kubenswrapper[4059]: I0308 00:20:54.584113 4059 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e8c793f3a4e6d5f624df14908f64885c436913b6ed379ffbe261c0ca922df7e5"} Mar 08 00:20:54.586558 master-0 kubenswrapper[4059]: I0308 00:20:54.584118 4059 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"b78d898694475d4fa6d413095c98ad96577df52273779c125c249edfadc5bb65"} Mar 08 00:20:54.586558 master-0 kubenswrapper[4059]: I0308 00:20:54.584123 4059 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"596783bce11e4d4ed8c62c1ce5ddbd210cfb9ea1414f739c6d6c6cea3f3f5ef7"} Mar 08 00:20:54.586558 master-0 kubenswrapper[4059]: I0308 00:20:54.584128 4059 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4434fcf7523899c16bdc9ff2514392013087b11cd9c0ebc889187563c92c108a"} Mar 08 00:20:54.586558 master-0 kubenswrapper[4059]: I0308 00:20:54.584141 4059 scope.go:117] "RemoveContainer" containerID="8a206df9a5e00bc2f0057ebb5c80783d7e9aaab9c7600ea52983392822a721b1" Mar 08 00:20:54.586558 master-0 kubenswrapper[4059]: I0308 00:20:54.584272 4059 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tf5qg" Mar 08 00:20:54.597639 master-0 kubenswrapper[4059]: I0308 00:20:54.597548 4059 scope.go:117] "RemoveContainer" containerID="285f8a92befddee11e1b44cebb2bda2b34e3135f45619495ec8126c91cff97ca" Mar 08 00:20:54.606538 master-0 kubenswrapper[4059]: I0308 00:20:54.606462 4059 scope.go:117] "RemoveContainer" containerID="d6eff328ddf5c154d1d9faa1e15dce1c9d119fdbc9642cf6132900ba16f4f2f0" Mar 08 00:20:54.615251 master-0 kubenswrapper[4059]: I0308 00:20:54.615194 4059 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-tf5qg"] Mar 08 00:20:54.621246 master-0 kubenswrapper[4059]: I0308 00:20:54.619082 4059 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-tf5qg"] Mar 08 00:20:54.623411 master-0 kubenswrapper[4059]: I0308 00:20:54.623374 4059 scope.go:117] "RemoveContainer" containerID="0dd7bb2a0dfc9063dde38d2646f27b410a8fc9cd8b6415aa5596ba64aa7ab6bd" Mar 08 00:20:54.632154 master-0 
kubenswrapper[4059]: I0308 00:20:54.632119 4059 scope.go:117] "RemoveContainer" containerID="4d8ea9cb9beab0a422f7804da4b819ae9934f5441003e47d5346b1954ca42d4a"
Mar 08 00:20:54.641951 master-0 kubenswrapper[4059]: I0308 00:20:54.641895 4059 scope.go:117] "RemoveContainer" containerID="e8c793f3a4e6d5f624df14908f64885c436913b6ed379ffbe261c0ca922df7e5"
Mar 08 00:20:54.649094 master-0 kubenswrapper[4059]: I0308 00:20:54.649059 4059 scope.go:117] "RemoveContainer" containerID="b78d898694475d4fa6d413095c98ad96577df52273779c125c249edfadc5bb65"
Mar 08 00:20:54.656947 master-0 kubenswrapper[4059]: I0308 00:20:54.656928 4059 scope.go:117] "RemoveContainer" containerID="596783bce11e4d4ed8c62c1ce5ddbd210cfb9ea1414f739c6d6c6cea3f3f5ef7"
Mar 08 00:20:54.666445 master-0 kubenswrapper[4059]: I0308 00:20:54.666426 4059 scope.go:117] "RemoveContainer" containerID="4434fcf7523899c16bdc9ff2514392013087b11cd9c0ebc889187563c92c108a"
Mar 08 00:20:54.676938 master-0 kubenswrapper[4059]: I0308 00:20:54.676920 4059 scope.go:117] "RemoveContainer" containerID="8a206df9a5e00bc2f0057ebb5c80783d7e9aaab9c7600ea52983392822a721b1"
Mar 08 00:20:54.677369 master-0 kubenswrapper[4059]: E0308 00:20:54.677350 4059 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a206df9a5e00bc2f0057ebb5c80783d7e9aaab9c7600ea52983392822a721b1\": container with ID starting with 8a206df9a5e00bc2f0057ebb5c80783d7e9aaab9c7600ea52983392822a721b1 not found: ID does not exist" containerID="8a206df9a5e00bc2f0057ebb5c80783d7e9aaab9c7600ea52983392822a721b1"
Mar 08 00:20:54.677458 master-0 kubenswrapper[4059]: I0308 00:20:54.677434 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a206df9a5e00bc2f0057ebb5c80783d7e9aaab9c7600ea52983392822a721b1"} err="failed to get container status \"8a206df9a5e00bc2f0057ebb5c80783d7e9aaab9c7600ea52983392822a721b1\": rpc error: code = NotFound desc = could not find container \"8a206df9a5e00bc2f0057ebb5c80783d7e9aaab9c7600ea52983392822a721b1\": container with ID starting with 8a206df9a5e00bc2f0057ebb5c80783d7e9aaab9c7600ea52983392822a721b1 not found: ID does not exist"
Mar 08 00:20:54.677528 master-0 kubenswrapper[4059]: I0308 00:20:54.677518 4059 scope.go:117] "RemoveContainer" containerID="285f8a92befddee11e1b44cebb2bda2b34e3135f45619495ec8126c91cff97ca"
Mar 08 00:20:54.677931 master-0 kubenswrapper[4059]: E0308 00:20:54.677892 4059 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"285f8a92befddee11e1b44cebb2bda2b34e3135f45619495ec8126c91cff97ca\": container with ID starting with 285f8a92befddee11e1b44cebb2bda2b34e3135f45619495ec8126c91cff97ca not found: ID does not exist" containerID="285f8a92befddee11e1b44cebb2bda2b34e3135f45619495ec8126c91cff97ca"
Mar 08 00:20:54.677984 master-0 kubenswrapper[4059]: I0308 00:20:54.677941 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"285f8a92befddee11e1b44cebb2bda2b34e3135f45619495ec8126c91cff97ca"} err="failed to get container status \"285f8a92befddee11e1b44cebb2bda2b34e3135f45619495ec8126c91cff97ca\": rpc error: code = NotFound desc = could not find container \"285f8a92befddee11e1b44cebb2bda2b34e3135f45619495ec8126c91cff97ca\": container with ID starting with 285f8a92befddee11e1b44cebb2bda2b34e3135f45619495ec8126c91cff97ca not found: ID does not exist"
Mar 08 00:20:54.677984 master-0 kubenswrapper[4059]: I0308 00:20:54.677974 4059 scope.go:117] "RemoveContainer" containerID="d6eff328ddf5c154d1d9faa1e15dce1c9d119fdbc9642cf6132900ba16f4f2f0"
Mar 08 00:20:54.678330 master-0 kubenswrapper[4059]: E0308 00:20:54.678311 4059 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6eff328ddf5c154d1d9faa1e15dce1c9d119fdbc9642cf6132900ba16f4f2f0\": container with ID starting with d6eff328ddf5c154d1d9faa1e15dce1c9d119fdbc9642cf6132900ba16f4f2f0 not found: ID does not exist" containerID="d6eff328ddf5c154d1d9faa1e15dce1c9d119fdbc9642cf6132900ba16f4f2f0"
Mar 08 00:20:54.678406 master-0 kubenswrapper[4059]: I0308 00:20:54.678388 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6eff328ddf5c154d1d9faa1e15dce1c9d119fdbc9642cf6132900ba16f4f2f0"} err="failed to get container status \"d6eff328ddf5c154d1d9faa1e15dce1c9d119fdbc9642cf6132900ba16f4f2f0\": rpc error: code = NotFound desc = could not find container \"d6eff328ddf5c154d1d9faa1e15dce1c9d119fdbc9642cf6132900ba16f4f2f0\": container with ID starting with d6eff328ddf5c154d1d9faa1e15dce1c9d119fdbc9642cf6132900ba16f4f2f0 not found: ID does not exist"
Mar 08 00:20:54.678471 master-0 kubenswrapper[4059]: I0308 00:20:54.678461 4059 scope.go:117] "RemoveContainer" containerID="0dd7bb2a0dfc9063dde38d2646f27b410a8fc9cd8b6415aa5596ba64aa7ab6bd"
Mar 08 00:20:54.678867 master-0 kubenswrapper[4059]: E0308 00:20:54.678847 4059 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dd7bb2a0dfc9063dde38d2646f27b410a8fc9cd8b6415aa5596ba64aa7ab6bd\": container with ID starting with 0dd7bb2a0dfc9063dde38d2646f27b410a8fc9cd8b6415aa5596ba64aa7ab6bd not found: ID does not exist" containerID="0dd7bb2a0dfc9063dde38d2646f27b410a8fc9cd8b6415aa5596ba64aa7ab6bd"
Mar 08 00:20:54.678945 master-0 kubenswrapper[4059]: I0308 00:20:54.678929 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dd7bb2a0dfc9063dde38d2646f27b410a8fc9cd8b6415aa5596ba64aa7ab6bd"} err="failed to get container status \"0dd7bb2a0dfc9063dde38d2646f27b410a8fc9cd8b6415aa5596ba64aa7ab6bd\": rpc error: code = NotFound desc = could not find container \"0dd7bb2a0dfc9063dde38d2646f27b410a8fc9cd8b6415aa5596ba64aa7ab6bd\": container with ID starting with 0dd7bb2a0dfc9063dde38d2646f27b410a8fc9cd8b6415aa5596ba64aa7ab6bd not found: ID does not exist"
Mar 08 00:20:54.678998 master-0 kubenswrapper[4059]: I0308 00:20:54.678988 4059 scope.go:117] "RemoveContainer" containerID="4d8ea9cb9beab0a422f7804da4b819ae9934f5441003e47d5346b1954ca42d4a"
Mar 08 00:20:54.679329 master-0 kubenswrapper[4059]: E0308 00:20:54.679299 4059 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d8ea9cb9beab0a422f7804da4b819ae9934f5441003e47d5346b1954ca42d4a\": container with ID starting with 4d8ea9cb9beab0a422f7804da4b819ae9934f5441003e47d5346b1954ca42d4a not found: ID does not exist" containerID="4d8ea9cb9beab0a422f7804da4b819ae9934f5441003e47d5346b1954ca42d4a"
Mar 08 00:20:54.679392 master-0 kubenswrapper[4059]: I0308 00:20:54.679332 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d8ea9cb9beab0a422f7804da4b819ae9934f5441003e47d5346b1954ca42d4a"} err="failed to get container status \"4d8ea9cb9beab0a422f7804da4b819ae9934f5441003e47d5346b1954ca42d4a\": rpc error: code = NotFound desc = could not find container \"4d8ea9cb9beab0a422f7804da4b819ae9934f5441003e47d5346b1954ca42d4a\": container with ID starting with 4d8ea9cb9beab0a422f7804da4b819ae9934f5441003e47d5346b1954ca42d4a not found: ID does not exist"
Mar 08 00:20:54.679392 master-0 kubenswrapper[4059]: I0308 00:20:54.679352 4059 scope.go:117] "RemoveContainer" containerID="e8c793f3a4e6d5f624df14908f64885c436913b6ed379ffbe261c0ca922df7e5"
Mar 08 00:20:54.679694 master-0 kubenswrapper[4059]: E0308 00:20:54.679649 4059 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8c793f3a4e6d5f624df14908f64885c436913b6ed379ffbe261c0ca922df7e5\": container with ID starting with e8c793f3a4e6d5f624df14908f64885c436913b6ed379ffbe261c0ca922df7e5 not found: ID does not exist" containerID="e8c793f3a4e6d5f624df14908f64885c436913b6ed379ffbe261c0ca922df7e5"
Mar 08 00:20:54.679694 master-0 kubenswrapper[4059]: I0308 00:20:54.679675 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8c793f3a4e6d5f624df14908f64885c436913b6ed379ffbe261c0ca922df7e5"} err="failed to get container status \"e8c793f3a4e6d5f624df14908f64885c436913b6ed379ffbe261c0ca922df7e5\": rpc error: code = NotFound desc = could not find container \"e8c793f3a4e6d5f624df14908f64885c436913b6ed379ffbe261c0ca922df7e5\": container with ID starting with e8c793f3a4e6d5f624df14908f64885c436913b6ed379ffbe261c0ca922df7e5 not found: ID does not exist"
Mar 08 00:20:54.679694 master-0 kubenswrapper[4059]: I0308 00:20:54.679691 4059 scope.go:117] "RemoveContainer" containerID="b78d898694475d4fa6d413095c98ad96577df52273779c125c249edfadc5bb65"
Mar 08 00:20:54.679961 master-0 kubenswrapper[4059]: E0308 00:20:54.679939 4059 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b78d898694475d4fa6d413095c98ad96577df52273779c125c249edfadc5bb65\": container with ID starting with b78d898694475d4fa6d413095c98ad96577df52273779c125c249edfadc5bb65 not found: ID does not exist" containerID="b78d898694475d4fa6d413095c98ad96577df52273779c125c249edfadc5bb65"
Mar 08 00:20:54.680036 master-0 kubenswrapper[4059]: I0308 00:20:54.680019 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b78d898694475d4fa6d413095c98ad96577df52273779c125c249edfadc5bb65"} err="failed to get container status \"b78d898694475d4fa6d413095c98ad96577df52273779c125c249edfadc5bb65\": rpc error: code = NotFound desc = could not find container \"b78d898694475d4fa6d413095c98ad96577df52273779c125c249edfadc5bb65\": container with ID starting with b78d898694475d4fa6d413095c98ad96577df52273779c125c249edfadc5bb65 not found: ID does not exist"
Mar 08 00:20:54.680088 master-0 kubenswrapper[4059]: I0308 00:20:54.680079 4059 scope.go:117] "RemoveContainer" containerID="596783bce11e4d4ed8c62c1ce5ddbd210cfb9ea1414f739c6d6c6cea3f3f5ef7"
Mar 08 00:20:54.680516 master-0 kubenswrapper[4059]: E0308 00:20:54.680493 4059 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"596783bce11e4d4ed8c62c1ce5ddbd210cfb9ea1414f739c6d6c6cea3f3f5ef7\": container with ID starting with 596783bce11e4d4ed8c62c1ce5ddbd210cfb9ea1414f739c6d6c6cea3f3f5ef7 not found: ID does not exist" containerID="596783bce11e4d4ed8c62c1ce5ddbd210cfb9ea1414f739c6d6c6cea3f3f5ef7"
Mar 08 00:20:54.680571 master-0 kubenswrapper[4059]: I0308 00:20:54.680517 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"596783bce11e4d4ed8c62c1ce5ddbd210cfb9ea1414f739c6d6c6cea3f3f5ef7"} err="failed to get container status \"596783bce11e4d4ed8c62c1ce5ddbd210cfb9ea1414f739c6d6c6cea3f3f5ef7\": rpc error: code = NotFound desc = could not find container \"596783bce11e4d4ed8c62c1ce5ddbd210cfb9ea1414f739c6d6c6cea3f3f5ef7\": container with ID starting with 596783bce11e4d4ed8c62c1ce5ddbd210cfb9ea1414f739c6d6c6cea3f3f5ef7 not found: ID does not exist"
Mar 08 00:20:54.680571 master-0 kubenswrapper[4059]: I0308 00:20:54.680537 4059 scope.go:117] "RemoveContainer" containerID="4434fcf7523899c16bdc9ff2514392013087b11cd9c0ebc889187563c92c108a"
Mar 08 00:20:54.680888 master-0 kubenswrapper[4059]: E0308 00:20:54.680864 4059 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4434fcf7523899c16bdc9ff2514392013087b11cd9c0ebc889187563c92c108a\": container with ID starting with 4434fcf7523899c16bdc9ff2514392013087b11cd9c0ebc889187563c92c108a not found: ID does not exist" containerID="4434fcf7523899c16bdc9ff2514392013087b11cd9c0ebc889187563c92c108a"
Mar 08 00:20:54.680966 master-0 kubenswrapper[4059]: I0308 00:20:54.680947 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4434fcf7523899c16bdc9ff2514392013087b11cd9c0ebc889187563c92c108a"} err="failed to get container status \"4434fcf7523899c16bdc9ff2514392013087b11cd9c0ebc889187563c92c108a\": rpc error: code = NotFound desc = could not find container \"4434fcf7523899c16bdc9ff2514392013087b11cd9c0ebc889187563c92c108a\": container with ID starting with 4434fcf7523899c16bdc9ff2514392013087b11cd9c0ebc889187563c92c108a not found: ID does not exist"
Mar 08 00:20:54.681021 master-0 kubenswrapper[4059]: I0308 00:20:54.681011 4059 scope.go:117] "RemoveContainer" containerID="8a206df9a5e00bc2f0057ebb5c80783d7e9aaab9c7600ea52983392822a721b1"
Mar 08 00:20:54.681375 master-0 kubenswrapper[4059]: I0308 00:20:54.681346 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a206df9a5e00bc2f0057ebb5c80783d7e9aaab9c7600ea52983392822a721b1"} err="failed to get container status \"8a206df9a5e00bc2f0057ebb5c80783d7e9aaab9c7600ea52983392822a721b1\": rpc error: code = NotFound desc = could not find container \"8a206df9a5e00bc2f0057ebb5c80783d7e9aaab9c7600ea52983392822a721b1\": container with ID starting with 8a206df9a5e00bc2f0057ebb5c80783d7e9aaab9c7600ea52983392822a721b1 not found: ID does not exist"
Mar 08 00:20:54.681444 master-0 kubenswrapper[4059]: I0308 00:20:54.681376 4059 scope.go:117] "RemoveContainer" containerID="285f8a92befddee11e1b44cebb2bda2b34e3135f45619495ec8126c91cff97ca"
Mar 08 00:20:54.681660 master-0 kubenswrapper[4059]: I0308 00:20:54.681632 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"285f8a92befddee11e1b44cebb2bda2b34e3135f45619495ec8126c91cff97ca"} err="failed to get container status \"285f8a92befddee11e1b44cebb2bda2b34e3135f45619495ec8126c91cff97ca\": rpc error: code = NotFound desc = could not find container \"285f8a92befddee11e1b44cebb2bda2b34e3135f45619495ec8126c91cff97ca\": container with ID starting with 285f8a92befddee11e1b44cebb2bda2b34e3135f45619495ec8126c91cff97ca not found: ID does not exist"
Mar 08 00:20:54.681660 master-0 kubenswrapper[4059]: I0308 00:20:54.681657 4059 scope.go:117] "RemoveContainer" containerID="d6eff328ddf5c154d1d9faa1e15dce1c9d119fdbc9642cf6132900ba16f4f2f0"
Mar 08 00:20:54.681942 master-0 kubenswrapper[4059]: I0308 00:20:54.681895 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6eff328ddf5c154d1d9faa1e15dce1c9d119fdbc9642cf6132900ba16f4f2f0"} err="failed to get container status \"d6eff328ddf5c154d1d9faa1e15dce1c9d119fdbc9642cf6132900ba16f4f2f0\": rpc error: code = NotFound desc = could not find container \"d6eff328ddf5c154d1d9faa1e15dce1c9d119fdbc9642cf6132900ba16f4f2f0\": container with ID starting with d6eff328ddf5c154d1d9faa1e15dce1c9d119fdbc9642cf6132900ba16f4f2f0 not found: ID does not exist"
Mar 08 00:20:54.681942 master-0 kubenswrapper[4059]: I0308 00:20:54.681937 4059 scope.go:117] "RemoveContainer" containerID="0dd7bb2a0dfc9063dde38d2646f27b410a8fc9cd8b6415aa5596ba64aa7ab6bd"
Mar 08 00:20:54.682264 master-0 kubenswrapper[4059]: I0308 00:20:54.682225 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dd7bb2a0dfc9063dde38d2646f27b410a8fc9cd8b6415aa5596ba64aa7ab6bd"} err="failed to get container status \"0dd7bb2a0dfc9063dde38d2646f27b410a8fc9cd8b6415aa5596ba64aa7ab6bd\": rpc error: code = NotFound desc = could not find container \"0dd7bb2a0dfc9063dde38d2646f27b410a8fc9cd8b6415aa5596ba64aa7ab6bd\": container with ID starting with 0dd7bb2a0dfc9063dde38d2646f27b410a8fc9cd8b6415aa5596ba64aa7ab6bd not found: ID does not exist"
Mar 08 00:20:54.682264 master-0 kubenswrapper[4059]: I0308 00:20:54.682260 4059 scope.go:117] "RemoveContainer" containerID="4d8ea9cb9beab0a422f7804da4b819ae9934f5441003e47d5346b1954ca42d4a"
Mar 08 00:20:54.682604 master-0 kubenswrapper[4059]: I0308 00:20:54.682561 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d8ea9cb9beab0a422f7804da4b819ae9934f5441003e47d5346b1954ca42d4a"} err="failed to get container status \"4d8ea9cb9beab0a422f7804da4b819ae9934f5441003e47d5346b1954ca42d4a\": rpc error: code = NotFound desc = could not find container \"4d8ea9cb9beab0a422f7804da4b819ae9934f5441003e47d5346b1954ca42d4a\": container with ID starting with 4d8ea9cb9beab0a422f7804da4b819ae9934f5441003e47d5346b1954ca42d4a not found: ID does not exist"
Mar 08 00:20:54.682604 master-0 kubenswrapper[4059]: I0308 00:20:54.682599 4059 scope.go:117] "RemoveContainer" containerID="e8c793f3a4e6d5f624df14908f64885c436913b6ed379ffbe261c0ca922df7e5"
Mar 08 00:20:54.682861 master-0 kubenswrapper[4059]: I0308 00:20:54.682827 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8c793f3a4e6d5f624df14908f64885c436913b6ed379ffbe261c0ca922df7e5"} err="failed to get container status \"e8c793f3a4e6d5f624df14908f64885c436913b6ed379ffbe261c0ca922df7e5\": rpc error: code = NotFound desc = could not find container \"e8c793f3a4e6d5f624df14908f64885c436913b6ed379ffbe261c0ca922df7e5\": container with ID starting with e8c793f3a4e6d5f624df14908f64885c436913b6ed379ffbe261c0ca922df7e5 not found: ID does not exist"
Mar 08 00:20:54.682861 master-0 kubenswrapper[4059]: I0308 00:20:54.682853 4059 scope.go:117] "RemoveContainer" containerID="b78d898694475d4fa6d413095c98ad96577df52273779c125c249edfadc5bb65"
Mar 08 00:20:54.683145 master-0 kubenswrapper[4059]: I0308 00:20:54.683099 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b78d898694475d4fa6d413095c98ad96577df52273779c125c249edfadc5bb65"} err="failed to get container status \"b78d898694475d4fa6d413095c98ad96577df52273779c125c249edfadc5bb65\": rpc error: code = NotFound desc = could not find container \"b78d898694475d4fa6d413095c98ad96577df52273779c125c249edfadc5bb65\": container with ID starting with b78d898694475d4fa6d413095c98ad96577df52273779c125c249edfadc5bb65 not found: ID does not exist"
Mar 08 00:20:54.683145 master-0 kubenswrapper[4059]: I0308 00:20:54.683140 4059 scope.go:117] "RemoveContainer" containerID="596783bce11e4d4ed8c62c1ce5ddbd210cfb9ea1414f739c6d6c6cea3f3f5ef7"
Mar 08 00:20:54.683546 master-0 kubenswrapper[4059]: I0308 00:20:54.683511 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"596783bce11e4d4ed8c62c1ce5ddbd210cfb9ea1414f739c6d6c6cea3f3f5ef7"} err="failed to get container status \"596783bce11e4d4ed8c62c1ce5ddbd210cfb9ea1414f739c6d6c6cea3f3f5ef7\": rpc error: code = NotFound desc = could not find container \"596783bce11e4d4ed8c62c1ce5ddbd210cfb9ea1414f739c6d6c6cea3f3f5ef7\": container with ID starting with 596783bce11e4d4ed8c62c1ce5ddbd210cfb9ea1414f739c6d6c6cea3f3f5ef7 not found: ID does not exist"
Mar 08 00:20:54.683546 master-0 kubenswrapper[4059]: I0308 00:20:54.683539 4059 scope.go:117] "RemoveContainer" containerID="4434fcf7523899c16bdc9ff2514392013087b11cd9c0ebc889187563c92c108a"
Mar 08 00:20:54.683806 master-0 kubenswrapper[4059]: I0308 00:20:54.683766 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4434fcf7523899c16bdc9ff2514392013087b11cd9c0ebc889187563c92c108a"} err="failed to get container status \"4434fcf7523899c16bdc9ff2514392013087b11cd9c0ebc889187563c92c108a\": rpc error: code = NotFound desc = could not find container \"4434fcf7523899c16bdc9ff2514392013087b11cd9c0ebc889187563c92c108a\": container with ID starting with 4434fcf7523899c16bdc9ff2514392013087b11cd9c0ebc889187563c92c108a not found: ID does not exist"
Mar 08 00:20:54.683806 master-0 kubenswrapper[4059]: I0308 00:20:54.683798 4059 scope.go:117] "RemoveContainer" containerID="8a206df9a5e00bc2f0057ebb5c80783d7e9aaab9c7600ea52983392822a721b1"
Mar 08 00:20:54.684051 master-0 kubenswrapper[4059]: I0308 00:20:54.684018 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a206df9a5e00bc2f0057ebb5c80783d7e9aaab9c7600ea52983392822a721b1"} err="failed to get container status \"8a206df9a5e00bc2f0057ebb5c80783d7e9aaab9c7600ea52983392822a721b1\": rpc error: code = NotFound desc = could not find container \"8a206df9a5e00bc2f0057ebb5c80783d7e9aaab9c7600ea52983392822a721b1\": container with ID starting with 8a206df9a5e00bc2f0057ebb5c80783d7e9aaab9c7600ea52983392822a721b1 not found: ID does not exist"
Mar 08 00:20:54.684098 master-0 kubenswrapper[4059]: I0308 00:20:54.684050 4059 scope.go:117] "RemoveContainer" containerID="285f8a92befddee11e1b44cebb2bda2b34e3135f45619495ec8126c91cff97ca"
Mar 08 00:20:54.684378 master-0 kubenswrapper[4059]: I0308 00:20:54.684347 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"285f8a92befddee11e1b44cebb2bda2b34e3135f45619495ec8126c91cff97ca"} err="failed to get container status \"285f8a92befddee11e1b44cebb2bda2b34e3135f45619495ec8126c91cff97ca\": rpc error: code = NotFound desc = could not find container \"285f8a92befddee11e1b44cebb2bda2b34e3135f45619495ec8126c91cff97ca\": container with ID starting with 285f8a92befddee11e1b44cebb2bda2b34e3135f45619495ec8126c91cff97ca not found: ID does not exist"
Mar 08 00:20:54.684426 master-0 kubenswrapper[4059]: I0308 00:20:54.684378 4059 scope.go:117] "RemoveContainer" containerID="d6eff328ddf5c154d1d9faa1e15dce1c9d119fdbc9642cf6132900ba16f4f2f0"
Mar 08 00:20:54.684684 master-0 kubenswrapper[4059]: I0308 00:20:54.684641 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6eff328ddf5c154d1d9faa1e15dce1c9d119fdbc9642cf6132900ba16f4f2f0"} err="failed to get container status \"d6eff328ddf5c154d1d9faa1e15dce1c9d119fdbc9642cf6132900ba16f4f2f0\": rpc error: code = NotFound desc = could not find container \"d6eff328ddf5c154d1d9faa1e15dce1c9d119fdbc9642cf6132900ba16f4f2f0\": container with ID starting with d6eff328ddf5c154d1d9faa1e15dce1c9d119fdbc9642cf6132900ba16f4f2f0 not found: ID does not exist"
Mar 08 00:20:54.684684 master-0 kubenswrapper[4059]: I0308 00:20:54.684676 4059 scope.go:117] "RemoveContainer" containerID="0dd7bb2a0dfc9063dde38d2646f27b410a8fc9cd8b6415aa5596ba64aa7ab6bd"
Mar 08 00:20:54.684965 master-0 kubenswrapper[4059]: I0308 00:20:54.684924 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dd7bb2a0dfc9063dde38d2646f27b410a8fc9cd8b6415aa5596ba64aa7ab6bd"} err="failed to get container status \"0dd7bb2a0dfc9063dde38d2646f27b410a8fc9cd8b6415aa5596ba64aa7ab6bd\": rpc error: code = NotFound desc = could not find container \"0dd7bb2a0dfc9063dde38d2646f27b410a8fc9cd8b6415aa5596ba64aa7ab6bd\": container with ID starting with 0dd7bb2a0dfc9063dde38d2646f27b410a8fc9cd8b6415aa5596ba64aa7ab6bd not found: ID does not exist"
Mar 08 00:20:54.684965 master-0 kubenswrapper[4059]: I0308 00:20:54.684961 4059 scope.go:117] "RemoveContainer" containerID="4d8ea9cb9beab0a422f7804da4b819ae9934f5441003e47d5346b1954ca42d4a"
Mar 08 00:20:54.685230 master-0 kubenswrapper[4059]: I0308 00:20:54.685174 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d8ea9cb9beab0a422f7804da4b819ae9934f5441003e47d5346b1954ca42d4a"} err="failed to get container status \"4d8ea9cb9beab0a422f7804da4b819ae9934f5441003e47d5346b1954ca42d4a\": rpc error: code = NotFound desc = could not find container \"4d8ea9cb9beab0a422f7804da4b819ae9934f5441003e47d5346b1954ca42d4a\": container with ID starting with 4d8ea9cb9beab0a422f7804da4b819ae9934f5441003e47d5346b1954ca42d4a not found: ID does not exist"
Mar 08 00:20:54.685277 master-0 kubenswrapper[4059]: I0308 00:20:54.685243 4059 scope.go:117] "RemoveContainer" containerID="e8c793f3a4e6d5f624df14908f64885c436913b6ed379ffbe261c0ca922df7e5"
Mar 08 00:20:54.685536 master-0 kubenswrapper[4059]: I0308 00:20:54.685498 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8c793f3a4e6d5f624df14908f64885c436913b6ed379ffbe261c0ca922df7e5"} err="failed to get container status \"e8c793f3a4e6d5f624df14908f64885c436913b6ed379ffbe261c0ca922df7e5\": rpc error: code = NotFound desc = could not find container \"e8c793f3a4e6d5f624df14908f64885c436913b6ed379ffbe261c0ca922df7e5\": container with ID starting with e8c793f3a4e6d5f624df14908f64885c436913b6ed379ffbe261c0ca922df7e5 not found: ID does not exist"
Mar 08 00:20:54.685627 master-0 kubenswrapper[4059]: I0308 00:20:54.685536 4059 scope.go:117] "RemoveContainer" containerID="b78d898694475d4fa6d413095c98ad96577df52273779c125c249edfadc5bb65"
Mar 08 00:20:54.685754 master-0 kubenswrapper[4059]: I0308 00:20:54.685728 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b78d898694475d4fa6d413095c98ad96577df52273779c125c249edfadc5bb65"} err="failed to get container status \"b78d898694475d4fa6d413095c98ad96577df52273779c125c249edfadc5bb65\": rpc error: code = NotFound desc = could not find container \"b78d898694475d4fa6d413095c98ad96577df52273779c125c249edfadc5bb65\": container with ID starting with b78d898694475d4fa6d413095c98ad96577df52273779c125c249edfadc5bb65 not found: ID does not exist"
Mar 08 00:20:54.685754 master-0 kubenswrapper[4059]: I0308 00:20:54.685751 4059 scope.go:117] "RemoveContainer" containerID="596783bce11e4d4ed8c62c1ce5ddbd210cfb9ea1414f739c6d6c6cea3f3f5ef7"
Mar 08 00:20:54.685929 master-0 kubenswrapper[4059]: I0308 00:20:54.685905 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"596783bce11e4d4ed8c62c1ce5ddbd210cfb9ea1414f739c6d6c6cea3f3f5ef7"} err="failed to get container status \"596783bce11e4d4ed8c62c1ce5ddbd210cfb9ea1414f739c6d6c6cea3f3f5ef7\": rpc error: code = NotFound desc = could not find container \"596783bce11e4d4ed8c62c1ce5ddbd210cfb9ea1414f739c6d6c6cea3f3f5ef7\": container with ID starting with 596783bce11e4d4ed8c62c1ce5ddbd210cfb9ea1414f739c6d6c6cea3f3f5ef7 not found: ID does not exist"
Mar 08 00:20:54.685929 master-0 kubenswrapper[4059]: I0308 00:20:54.685926 4059 scope.go:117] "RemoveContainer" containerID="4434fcf7523899c16bdc9ff2514392013087b11cd9c0ebc889187563c92c108a"
Mar 08 00:20:54.686149 master-0 kubenswrapper[4059]: I0308 00:20:54.686124 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4434fcf7523899c16bdc9ff2514392013087b11cd9c0ebc889187563c92c108a"} err="failed to get container status \"4434fcf7523899c16bdc9ff2514392013087b11cd9c0ebc889187563c92c108a\": rpc error: code = NotFound desc = could not find container \"4434fcf7523899c16bdc9ff2514392013087b11cd9c0ebc889187563c92c108a\": container with ID starting with 4434fcf7523899c16bdc9ff2514392013087b11cd9c0ebc889187563c92c108a not found: ID does not exist"
Mar 08 00:20:54.686149 master-0 kubenswrapper[4059]: I0308 00:20:54.686146 4059 scope.go:117] "RemoveContainer" containerID="8a206df9a5e00bc2f0057ebb5c80783d7e9aaab9c7600ea52983392822a721b1"
Mar 08 00:20:54.686351 master-0 kubenswrapper[4059]: I0308 00:20:54.686326 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a206df9a5e00bc2f0057ebb5c80783d7e9aaab9c7600ea52983392822a721b1"} err="failed to get container status \"8a206df9a5e00bc2f0057ebb5c80783d7e9aaab9c7600ea52983392822a721b1\": rpc error: code = NotFound desc = could not find container \"8a206df9a5e00bc2f0057ebb5c80783d7e9aaab9c7600ea52983392822a721b1\": container with ID starting with 8a206df9a5e00bc2f0057ebb5c80783d7e9aaab9c7600ea52983392822a721b1 not found: ID does not exist"
Mar 08 00:20:54.686351 master-0 kubenswrapper[4059]: I0308 00:20:54.686348 4059 scope.go:117] "RemoveContainer" containerID="285f8a92befddee11e1b44cebb2bda2b34e3135f45619495ec8126c91cff97ca"
Mar 08 00:20:54.686572 master-0 kubenswrapper[4059]: I0308 00:20:54.686546 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"285f8a92befddee11e1b44cebb2bda2b34e3135f45619495ec8126c91cff97ca"} err="failed to get container status \"285f8a92befddee11e1b44cebb2bda2b34e3135f45619495ec8126c91cff97ca\": rpc error: code = NotFound desc = could not find container \"285f8a92befddee11e1b44cebb2bda2b34e3135f45619495ec8126c91cff97ca\": container with ID starting with 285f8a92befddee11e1b44cebb2bda2b34e3135f45619495ec8126c91cff97ca not found: ID does not exist"
Mar 08 00:20:54.686572 master-0 kubenswrapper[4059]: I0308 00:20:54.686570 4059 scope.go:117] "RemoveContainer" containerID="d6eff328ddf5c154d1d9faa1e15dce1c9d119fdbc9642cf6132900ba16f4f2f0"
Mar 08 00:20:54.686808 master-0 kubenswrapper[4059]: I0308 00:20:54.686776 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6eff328ddf5c154d1d9faa1e15dce1c9d119fdbc9642cf6132900ba16f4f2f0"} err="failed to get container status \"d6eff328ddf5c154d1d9faa1e15dce1c9d119fdbc9642cf6132900ba16f4f2f0\": rpc error: code = NotFound desc = could not find container \"d6eff328ddf5c154d1d9faa1e15dce1c9d119fdbc9642cf6132900ba16f4f2f0\": container with ID starting with d6eff328ddf5c154d1d9faa1e15dce1c9d119fdbc9642cf6132900ba16f4f2f0 not found: ID does not exist"
Mar 08 00:20:54.686856 master-0 kubenswrapper[4059]: I0308 00:20:54.686807 4059 scope.go:117] "RemoveContainer" containerID="0dd7bb2a0dfc9063dde38d2646f27b410a8fc9cd8b6415aa5596ba64aa7ab6bd"
Mar 08 00:20:54.687070 master-0 kubenswrapper[4059]: I0308 00:20:54.687044 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dd7bb2a0dfc9063dde38d2646f27b410a8fc9cd8b6415aa5596ba64aa7ab6bd"} err="failed to get container status \"0dd7bb2a0dfc9063dde38d2646f27b410a8fc9cd8b6415aa5596ba64aa7ab6bd\": rpc error: code = NotFound desc = could not find container \"0dd7bb2a0dfc9063dde38d2646f27b410a8fc9cd8b6415aa5596ba64aa7ab6bd\": container with ID starting with 0dd7bb2a0dfc9063dde38d2646f27b410a8fc9cd8b6415aa5596ba64aa7ab6bd not found: ID does not exist"
Mar 08 00:20:54.687070 master-0 kubenswrapper[4059]: I0308 00:20:54.687068 4059 scope.go:117] "RemoveContainer" containerID="4d8ea9cb9beab0a422f7804da4b819ae9934f5441003e47d5346b1954ca42d4a"
Mar 08 00:20:54.687335 master-0 kubenswrapper[4059]: I0308 00:20:54.687310 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d8ea9cb9beab0a422f7804da4b819ae9934f5441003e47d5346b1954ca42d4a"} err="failed to get container status \"4d8ea9cb9beab0a422f7804da4b819ae9934f5441003e47d5346b1954ca42d4a\": rpc error: code = NotFound desc = could not find container \"4d8ea9cb9beab0a422f7804da4b819ae9934f5441003e47d5346b1954ca42d4a\": container with ID starting with 4d8ea9cb9beab0a422f7804da4b819ae9934f5441003e47d5346b1954ca42d4a not found: ID does not exist"
Mar 08 00:20:54.687335 master-0 kubenswrapper[4059]: I0308 00:20:54.687333 4059 scope.go:117] "RemoveContainer" containerID="e8c793f3a4e6d5f624df14908f64885c436913b6ed379ffbe261c0ca922df7e5"
Mar 08 00:20:54.687561 master-0 kubenswrapper[4059]: I0308 00:20:54.687536 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8c793f3a4e6d5f624df14908f64885c436913b6ed379ffbe261c0ca922df7e5"} err="failed to get container status \"e8c793f3a4e6d5f624df14908f64885c436913b6ed379ffbe261c0ca922df7e5\": rpc error: code = NotFound desc = could not find container \"e8c793f3a4e6d5f624df14908f64885c436913b6ed379ffbe261c0ca922df7e5\": container with ID starting with e8c793f3a4e6d5f624df14908f64885c436913b6ed379ffbe261c0ca922df7e5 not found: ID does not exist"
Mar 08 00:20:54.687561 master-0 kubenswrapper[4059]: I0308 00:20:54.687560 4059 scope.go:117] "RemoveContainer" containerID="b78d898694475d4fa6d413095c98ad96577df52273779c125c249edfadc5bb65"
Mar 08 00:20:54.687783 master-0 kubenswrapper[4059]: I0308 00:20:54.687759 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b78d898694475d4fa6d413095c98ad96577df52273779c125c249edfadc5bb65"} err="failed to get container status \"b78d898694475d4fa6d413095c98ad96577df52273779c125c249edfadc5bb65\": rpc error: code = NotFound desc = could not find container \"b78d898694475d4fa6d413095c98ad96577df52273779c125c249edfadc5bb65\": container with ID starting with b78d898694475d4fa6d413095c98ad96577df52273779c125c249edfadc5bb65 not found: ID does not exist"
Mar 08 00:20:54.687783 master-0 kubenswrapper[4059]: I0308 00:20:54.687780 4059 scope.go:117] "RemoveContainer" containerID="596783bce11e4d4ed8c62c1ce5ddbd210cfb9ea1414f739c6d6c6cea3f3f5ef7"
Mar 08 00:20:54.688044 master-0 kubenswrapper[4059]: I0308 00:20:54.688019 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"596783bce11e4d4ed8c62c1ce5ddbd210cfb9ea1414f739c6d6c6cea3f3f5ef7"} err="failed to get container status \"596783bce11e4d4ed8c62c1ce5ddbd210cfb9ea1414f739c6d6c6cea3f3f5ef7\": rpc error: code = NotFound desc = could not find container \"596783bce11e4d4ed8c62c1ce5ddbd210cfb9ea1414f739c6d6c6cea3f3f5ef7\": container with ID starting with 596783bce11e4d4ed8c62c1ce5ddbd210cfb9ea1414f739c6d6c6cea3f3f5ef7 not found: ID does not exist"
Mar 08 00:20:54.688044 master-0 kubenswrapper[4059]: I0308 00:20:54.688040 4059 scope.go:117] "RemoveContainer" containerID="4434fcf7523899c16bdc9ff2514392013087b11cd9c0ebc889187563c92c108a"
Mar 08 00:20:54.688288 master-0 kubenswrapper[4059]: I0308 00:20:54.688264 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4434fcf7523899c16bdc9ff2514392013087b11cd9c0ebc889187563c92c108a"} err="failed to get container status \"4434fcf7523899c16bdc9ff2514392013087b11cd9c0ebc889187563c92c108a\": rpc error: code = NotFound desc = could not find container \"4434fcf7523899c16bdc9ff2514392013087b11cd9c0ebc889187563c92c108a\": container with ID starting with 4434fcf7523899c16bdc9ff2514392013087b11cd9c0ebc889187563c92c108a not found: ID does not exist"
Mar 08 00:20:54.688288 master-0 kubenswrapper[4059]: I0308 00:20:54.688285 4059 scope.go:117] "RemoveContainer" containerID="8a206df9a5e00bc2f0057ebb5c80783d7e9aaab9c7600ea52983392822a721b1"
Mar 08 00:20:54.688524 master-0 kubenswrapper[4059]: I0308 00:20:54.688496 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a206df9a5e00bc2f0057ebb5c80783d7e9aaab9c7600ea52983392822a721b1"} err="failed to get container status \"8a206df9a5e00bc2f0057ebb5c80783d7e9aaab9c7600ea52983392822a721b1\": rpc error: code = NotFound desc = could not find container \"8a206df9a5e00bc2f0057ebb5c80783d7e9aaab9c7600ea52983392822a721b1\": container with ID starting with 8a206df9a5e00bc2f0057ebb5c80783d7e9aaab9c7600ea52983392822a721b1 not found: ID does not exist"
Mar 08 00:20:54.688524 master-0 kubenswrapper[4059]: I0308 00:20:54.688522 4059 scope.go:117] "RemoveContainer" containerID="285f8a92befddee11e1b44cebb2bda2b34e3135f45619495ec8126c91cff97ca"
Mar 08 00:20:54.688805 master-0 kubenswrapper[4059]: I0308 00:20:54.688770 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"285f8a92befddee11e1b44cebb2bda2b34e3135f45619495ec8126c91cff97ca"} err="failed to get container status \"285f8a92befddee11e1b44cebb2bda2b34e3135f45619495ec8126c91cff97ca\": rpc error: code = NotFound desc = could not find container \"285f8a92befddee11e1b44cebb2bda2b34e3135f45619495ec8126c91cff97ca\": container with ID starting with 285f8a92befddee11e1b44cebb2bda2b34e3135f45619495ec8126c91cff97ca not found: ID does not exist"
Mar 08 00:20:54.688864 master-0 kubenswrapper[4059]: I0308 00:20:54.688804 4059 scope.go:117] "RemoveContainer" containerID="d6eff328ddf5c154d1d9faa1e15dce1c9d119fdbc9642cf6132900ba16f4f2f0"
Mar 08 00:20:54.689036 master-0 kubenswrapper[4059]: I0308 00:20:54.689005 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6eff328ddf5c154d1d9faa1e15dce1c9d119fdbc9642cf6132900ba16f4f2f0"} err="failed to get container status \"d6eff328ddf5c154d1d9faa1e15dce1c9d119fdbc9642cf6132900ba16f4f2f0\": rpc error: code = NotFound desc = could not find container \"d6eff328ddf5c154d1d9faa1e15dce1c9d119fdbc9642cf6132900ba16f4f2f0\": container with ID starting with d6eff328ddf5c154d1d9faa1e15dce1c9d119fdbc9642cf6132900ba16f4f2f0 not found: ID does not exist"
Mar 08 00:20:54.689077 master-0 kubenswrapper[4059]: I0308 00:20:54.689036 4059 scope.go:117] "RemoveContainer" containerID="0dd7bb2a0dfc9063dde38d2646f27b410a8fc9cd8b6415aa5596ba64aa7ab6bd"
Mar 08 00:20:54.689317 master-0 kubenswrapper[4059]: I0308 00:20:54.689284 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dd7bb2a0dfc9063dde38d2646f27b410a8fc9cd8b6415aa5596ba64aa7ab6bd"} err="failed to get container status \"0dd7bb2a0dfc9063dde38d2646f27b410a8fc9cd8b6415aa5596ba64aa7ab6bd\": rpc error: code = NotFound desc = could not find container \"0dd7bb2a0dfc9063dde38d2646f27b410a8fc9cd8b6415aa5596ba64aa7ab6bd\": container with ID starting with 0dd7bb2a0dfc9063dde38d2646f27b410a8fc9cd8b6415aa5596ba64aa7ab6bd not found: ID does not exist"
Mar 08 00:20:54.689380 master-0 kubenswrapper[4059]: I0308 00:20:54.689317 4059 scope.go:117] "RemoveContainer" containerID="4d8ea9cb9beab0a422f7804da4b819ae9934f5441003e47d5346b1954ca42d4a"
Mar 08 00:20:54.689612 master-0 kubenswrapper[4059]: I0308 00:20:54.689578 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d8ea9cb9beab0a422f7804da4b819ae9934f5441003e47d5346b1954ca42d4a"} err="failed to get container status \"4d8ea9cb9beab0a422f7804da4b819ae9934f5441003e47d5346b1954ca42d4a\": rpc error: code = NotFound desc = could not find container \"4d8ea9cb9beab0a422f7804da4b819ae9934f5441003e47d5346b1954ca42d4a\": container with ID starting with 4d8ea9cb9beab0a422f7804da4b819ae9934f5441003e47d5346b1954ca42d4a not found: ID does not exist"
Mar 08 00:20:54.689658 master-0 kubenswrapper[4059]: I0308 00:20:54.689614 4059 scope.go:117] "RemoveContainer" containerID="e8c793f3a4e6d5f624df14908f64885c436913b6ed379ffbe261c0ca922df7e5"
Mar 08 00:20:54.689883 master-0 kubenswrapper[4059]: I0308 00:20:54.689857 4059 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8c793f3a4e6d5f624df14908f64885c436913b6ed379ffbe261c0ca922df7e5"} err="failed to get container status \"e8c793f3a4e6d5f624df14908f64885c436913b6ed379ffbe261c0ca922df7e5\": rpc error: code = NotFound desc = could not find container \"e8c793f3a4e6d5f624df14908f64885c436913b6ed379ffbe261c0ca922df7e5\": container with ID starting with e8c793f3a4e6d5f624df14908f64885c436913b6ed379ffbe261c0ca922df7e5 not found: ID does not exist"
Mar 08 00:20:54.691146 master-0 kubenswrapper[4059]: I0308 00:20:54.691113 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:20:54.702108 master-0 kubenswrapper[4059]: W0308 00:20:54.702074 4059 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdf0db9d_51bb_41c8_a2f8_3aaa0df679e9.slice/crio-b3e77cc21c0092533e2573fd7bc828eb1f314192461aa4ad0d7a1a79afb2a5b9 WatchSource:0}: Error finding container b3e77cc21c0092533e2573fd7bc828eb1f314192461aa4ad0d7a1a79afb2a5b9: Status 404 returned error can't find the container with id b3e77cc21c0092533e2573fd7bc828eb1f314192461aa4ad0d7a1a79afb2a5b9
Mar 08 00:20:55.134637 master-0 kubenswrapper[4059]: I0308 00:20:55.134571 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w5fjg"
Mar 08 00:20:55.135040 master-0 kubenswrapper[4059]: E0308 00:20:55.134727 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w5fjg" podUID="1f63cb2f-779f-4fde-bf92-cf0414844a77"
Mar 08 00:20:55.140610 master-0 kubenswrapper[4059]: I0308 00:20:55.140566 4059 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17aa8235-749b-49da-9fcd-cb4bd948f0a5" path="/var/lib/kubelet/pods/17aa8235-749b-49da-9fcd-cb4bd948f0a5/volumes"
Mar 08 00:20:55.588926 master-0 kubenswrapper[4059]: I0308 00:20:55.588840 4059 generic.go:334] "Generic (PLEG): container finished" podID="fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9" containerID="9caf746e34f3ceb9b7a0c15d058a8c3ef6549037b6840e762c5d26db1b3afa1f" exitCode=0
Mar 08 00:20:55.589100 master-0 kubenswrapper[4059]: I0308 00:20:55.588948 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" event={"ID":"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9","Type":"ContainerDied","Data":"9caf746e34f3ceb9b7a0c15d058a8c3ef6549037b6840e762c5d26db1b3afa1f"}
Mar 08 00:20:55.589100 master-0 kubenswrapper[4059]: I0308 00:20:55.589046 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" event={"ID":"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9","Type":"ContainerStarted","Data":"b3e77cc21c0092533e2573fd7bc828eb1f314192461aa4ad0d7a1a79afb2a5b9"}
Mar 08 00:20:56.133828 master-0 kubenswrapper[4059]: I0308 00:20:56.133702 4059 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-krv7c" Mar 08 00:20:56.133828 master-0 kubenswrapper[4059]: E0308 00:20:56.133812 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-krv7c" podUID="815fd565-0609-4d8f-ac05-8656f198b008" Mar 08 00:20:56.596940 master-0 kubenswrapper[4059]: I0308 00:20:56.596873 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" event={"ID":"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9","Type":"ContainerStarted","Data":"a02e1889bcd7a1cba9295de5ccf81a5e8bc3df65e6e184f470b35e714f23b1f8"} Mar 08 00:20:56.596940 master-0 kubenswrapper[4059]: I0308 00:20:56.596920 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" event={"ID":"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9","Type":"ContainerStarted","Data":"0e49be2e23bf477ec14120fb40ddb29719e2b5af3f6beddaffe4770b79d6d46c"} Mar 08 00:20:56.596940 master-0 kubenswrapper[4059]: I0308 00:20:56.596929 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" event={"ID":"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9","Type":"ContainerStarted","Data":"02f576e5daa548d8e13a03a2b6ed259a4ebcc6364353cb4ddfacfe054ec613fd"} Mar 08 00:20:56.596940 master-0 kubenswrapper[4059]: I0308 00:20:56.596937 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" event={"ID":"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9","Type":"ContainerStarted","Data":"9c94225f58476ed65c6fffafd91e571dd0b2ef9f295936cd52d8b1901c360298"} Mar 08 00:20:56.596940 master-0 kubenswrapper[4059]: I0308 00:20:56.596946 4059 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" event={"ID":"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9","Type":"ContainerStarted","Data":"2d125229cb85fc818282339c6e80d7cb921f3ddfa9564d713ed3dd5e74ec9a38"} Mar 08 00:20:56.596940 master-0 kubenswrapper[4059]: I0308 00:20:56.596954 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" event={"ID":"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9","Type":"ContainerStarted","Data":"54c6e47bd54c96d470a2821fbe979f217369d59cac4fe994745beff2a29276c1"} Mar 08 00:20:56.838445 master-0 kubenswrapper[4059]: I0308 00:20:56.838386 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert\") pod \"cluster-version-operator-745944c6b7-dcbvq\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq" Mar 08 00:20:56.838628 master-0 kubenswrapper[4059]: E0308 00:20:56.838521 4059 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 08 00:20:56.838628 master-0 kubenswrapper[4059]: E0308 00:20:56.838576 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert podName:32c19760-2cb2-4690-be8e-cba3c517c60e nodeName:}" failed. No retries permitted until 2026-03-08 00:22:00.838558369 +0000 UTC m=+164.550157891 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert") pod "cluster-version-operator-745944c6b7-dcbvq" (UID: "32c19760-2cb2-4690-be8e-cba3c517c60e") : secret "cluster-version-operator-serving-cert" not found Mar 08 00:20:57.040257 master-0 kubenswrapper[4059]: I0308 00:20:57.040148 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh9cz\" (UniqueName: \"kubernetes.io/projected/1f63cb2f-779f-4fde-bf92-cf0414844a77-kube-api-access-wh9cz\") pod \"network-check-target-w5fjg\" (UID: \"1f63cb2f-779f-4fde-bf92-cf0414844a77\") " pod="openshift-network-diagnostics/network-check-target-w5fjg" Mar 08 00:20:57.042348 master-0 kubenswrapper[4059]: E0308 00:20:57.040309 4059 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 00:20:57.042348 master-0 kubenswrapper[4059]: E0308 00:20:57.040335 4059 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 00:20:57.042348 master-0 kubenswrapper[4059]: E0308 00:20:57.040350 4059 projected.go:194] Error preparing data for projected volume kube-api-access-wh9cz for pod openshift-network-diagnostics/network-check-target-w5fjg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:20:57.042348 master-0 kubenswrapper[4059]: E0308 00:20:57.040403 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1f63cb2f-779f-4fde-bf92-cf0414844a77-kube-api-access-wh9cz podName:1f63cb2f-779f-4fde-bf92-cf0414844a77 nodeName:}" failed. 
No retries permitted until 2026-03-08 00:21:29.040386624 +0000 UTC m=+132.751986146 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-wh9cz" (UniqueName: "kubernetes.io/projected/1f63cb2f-779f-4fde-bf92-cf0414844a77-kube-api-access-wh9cz") pod "network-check-target-w5fjg" (UID: "1f63cb2f-779f-4fde-bf92-cf0414844a77") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:20:57.134642 master-0 kubenswrapper[4059]: I0308 00:20:57.134574 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w5fjg" Mar 08 00:20:57.135988 master-0 kubenswrapper[4059]: E0308 00:20:57.135864 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w5fjg" podUID="1f63cb2f-779f-4fde-bf92-cf0414844a77" Mar 08 00:20:58.133868 master-0 kubenswrapper[4059]: I0308 00:20:58.133818 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-krv7c" Mar 08 00:20:58.134491 master-0 kubenswrapper[4059]: E0308 00:20:58.133926 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-krv7c" podUID="815fd565-0609-4d8f-ac05-8656f198b008" Mar 08 00:20:58.610953 master-0 kubenswrapper[4059]: I0308 00:20:58.610855 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" event={"ID":"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9","Type":"ContainerStarted","Data":"ceff6be6c2bd2d352cdfcc056386b4f3985ee7a4045231ee2b8afcebd43ff3a7"} Mar 08 00:20:59.133991 master-0 kubenswrapper[4059]: I0308 00:20:59.133507 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w5fjg" Mar 08 00:20:59.135275 master-0 kubenswrapper[4059]: E0308 00:20:59.134159 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-w5fjg" podUID="1f63cb2f-779f-4fde-bf92-cf0414844a77" Mar 08 00:20:59.527694 master-0 kubenswrapper[4059]: I0308 00:20:59.527596 4059 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Mar 08 00:20:59.621322 master-0 kubenswrapper[4059]: I0308 00:20:59.621232 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" event={"ID":"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9","Type":"ContainerStarted","Data":"c407babaf75d2857cc7e7f6f987ae592ab0417bd9fa8a7e43b350cf7332b8d44"} Mar 08 00:20:59.622047 master-0 kubenswrapper[4059]: I0308 00:20:59.621682 4059 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:20:59.622047 master-0 kubenswrapper[4059]: I0308 00:20:59.621716 4059 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:20:59.622047 master-0 kubenswrapper[4059]: I0308 00:20:59.621733 4059 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:20:59.652686 master-0 kubenswrapper[4059]: I0308 00:20:59.651499 4059 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:20:59.655449 master-0 kubenswrapper[4059]: I0308 00:20:59.655335 4059 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:00.099298 master-0 kubenswrapper[4059]: I0308 00:21:00.099212 4059 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/bootstrap-kube-scheduler-master-0" podStartSLOduration=1.099175014 podStartE2EDuration="1.099175014s" podCreationTimestamp="2026-03-08 00:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:20:59.938017085 +0000 UTC m=+103.649616627" watchObservedRunningTime="2026-03-08 00:21:00.099175014 +0000 UTC m=+103.810774536" Mar 08 00:21:00.134670 master-0 kubenswrapper[4059]: I0308 00:21:00.134535 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-krv7c" Mar 08 00:21:00.135521 master-0 kubenswrapper[4059]: E0308 00:21:00.134818 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-krv7c" podUID="815fd565-0609-4d8f-ac05-8656f198b008" Mar 08 00:21:00.872317 master-0 kubenswrapper[4059]: I0308 00:21:00.870451 4059 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" podStartSLOduration=6.8704315000000005 podStartE2EDuration="6.8704315s" podCreationTimestamp="2026-03-08 00:20:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:21:00.100030148 +0000 UTC m=+103.811629680" watchObservedRunningTime="2026-03-08 00:21:00.8704315 +0000 UTC m=+104.582031032" Mar 08 00:21:01.134437 master-0 kubenswrapper[4059]: I0308 00:21:01.134286 4059 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w5fjg" Mar 08 00:21:01.134437 master-0 kubenswrapper[4059]: E0308 00:21:01.134417 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w5fjg" podUID="1f63cb2f-779f-4fde-bf92-cf0414844a77" Mar 08 00:21:02.134044 master-0 kubenswrapper[4059]: I0308 00:21:02.133970 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-krv7c" Mar 08 00:21:02.134624 master-0 kubenswrapper[4059]: E0308 00:21:02.134087 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-krv7c" podUID="815fd565-0609-4d8f-ac05-8656f198b008" Mar 08 00:21:02.194613 master-0 kubenswrapper[4059]: I0308 00:21:02.194437 4059 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-krv7c"] Mar 08 00:21:02.198721 master-0 kubenswrapper[4059]: I0308 00:21:02.198659 4059 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-w5fjg"] Mar 08 00:21:02.198863 master-0 kubenswrapper[4059]: I0308 00:21:02.198805 4059 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w5fjg" Mar 08 00:21:02.199606 master-0 kubenswrapper[4059]: E0308 00:21:02.198929 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w5fjg" podUID="1f63cb2f-779f-4fde-bf92-cf0414844a77" Mar 08 00:21:02.631292 master-0 kubenswrapper[4059]: I0308 00:21:02.631239 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-krv7c" Mar 08 00:21:02.631533 master-0 kubenswrapper[4059]: E0308 00:21:02.631420 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-krv7c" podUID="815fd565-0609-4d8f-ac05-8656f198b008" Mar 08 00:21:04.133588 master-0 kubenswrapper[4059]: I0308 00:21:04.133477 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w5fjg" Mar 08 00:21:04.134182 master-0 kubenswrapper[4059]: I0308 00:21:04.133522 4059 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-krv7c" Mar 08 00:21:04.134182 master-0 kubenswrapper[4059]: E0308 00:21:04.133641 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w5fjg" podUID="1f63cb2f-779f-4fde-bf92-cf0414844a77" Mar 08 00:21:04.134182 master-0 kubenswrapper[4059]: E0308 00:21:04.133787 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-krv7c" podUID="815fd565-0609-4d8f-ac05-8656f198b008" Mar 08 00:21:06.134518 master-0 kubenswrapper[4059]: I0308 00:21:06.134421 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w5fjg" Mar 08 00:21:06.135676 master-0 kubenswrapper[4059]: I0308 00:21:06.134447 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-krv7c" Mar 08 00:21:06.135676 master-0 kubenswrapper[4059]: E0308 00:21:06.134622 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-w5fjg" podUID="1f63cb2f-779f-4fde-bf92-cf0414844a77" Mar 08 00:21:06.135676 master-0 kubenswrapper[4059]: E0308 00:21:06.134717 4059 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-krv7c" podUID="815fd565-0609-4d8f-ac05-8656f198b008" Mar 08 00:21:07.093651 master-0 kubenswrapper[4059]: I0308 00:21:07.093495 4059 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeReady" Mar 08 00:21:07.093868 master-0 kubenswrapper[4059]: I0308 00:21:07.093702 4059 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Mar 08 00:21:07.802313 master-0 kubenswrapper[4059]: I0308 00:21:07.802271 4059 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Mar 08 00:21:08.134509 master-0 kubenswrapper[4059]: I0308 00:21:08.134383 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w5fjg" Mar 08 00:21:08.134733 master-0 kubenswrapper[4059]: I0308 00:21:08.134490 4059 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-krv7c" Mar 08 00:21:08.136977 master-0 kubenswrapper[4059]: I0308 00:21:08.136946 4059 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 08 00:21:08.137189 master-0 kubenswrapper[4059]: I0308 00:21:08.136977 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 08 00:21:08.138019 master-0 kubenswrapper[4059]: I0308 00:21:08.137991 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 08 00:21:09.217127 master-0 kubenswrapper[4059]: I0308 00:21:09.215605 4059 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-49hzm"] Mar 08 00:21:09.217127 master-0 kubenswrapper[4059]: I0308 00:21:09.216144 4059 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-49hzm" Mar 08 00:21:09.219025 master-0 kubenswrapper[4059]: I0308 00:21:09.219003 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 08 00:21:09.219185 master-0 kubenswrapper[4059]: I0308 00:21:09.219012 4059 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 08 00:21:09.219311 master-0 kubenswrapper[4059]: I0308 00:21:09.219014 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 08 00:21:09.219623 master-0 kubenswrapper[4059]: I0308 00:21:09.219549 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 08 00:21:09.252088 master-0 kubenswrapper[4059]: I0308 00:21:09.252036 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef0a3c84-98bb-4915-9010-d66fcbeafe09-config\") pod \"openshift-controller-manager-operator-8565d84698-49hzm\" (UID: \"ef0a3c84-98bb-4915-9010-d66fcbeafe09\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-49hzm" Mar 08 00:21:09.252240 master-0 kubenswrapper[4059]: I0308 00:21:09.252143 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef0a3c84-98bb-4915-9010-d66fcbeafe09-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-49hzm\" (UID: \"ef0a3c84-98bb-4915-9010-d66fcbeafe09\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-49hzm" Mar 08 
00:21:09.252284 master-0 kubenswrapper[4059]: I0308 00:21:09.252259 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fstf\" (UniqueName: \"kubernetes.io/projected/ef0a3c84-98bb-4915-9010-d66fcbeafe09-kube-api-access-8fstf\") pod \"openshift-controller-manager-operator-8565d84698-49hzm\" (UID: \"ef0a3c84-98bb-4915-9010-d66fcbeafe09\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-49hzm" Mar 08 00:21:09.353038 master-0 kubenswrapper[4059]: I0308 00:21:09.352917 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef0a3c84-98bb-4915-9010-d66fcbeafe09-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-49hzm\" (UID: \"ef0a3c84-98bb-4915-9010-d66fcbeafe09\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-49hzm" Mar 08 00:21:09.353038 master-0 kubenswrapper[4059]: I0308 00:21:09.353000 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fstf\" (UniqueName: \"kubernetes.io/projected/ef0a3c84-98bb-4915-9010-d66fcbeafe09-kube-api-access-8fstf\") pod \"openshift-controller-manager-operator-8565d84698-49hzm\" (UID: \"ef0a3c84-98bb-4915-9010-d66fcbeafe09\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-49hzm" Mar 08 00:21:09.353405 master-0 kubenswrapper[4059]: I0308 00:21:09.353079 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef0a3c84-98bb-4915-9010-d66fcbeafe09-config\") pod \"openshift-controller-manager-operator-8565d84698-49hzm\" (UID: \"ef0a3c84-98bb-4915-9010-d66fcbeafe09\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-49hzm" Mar 08 00:21:09.354937 master-0 kubenswrapper[4059]: I0308 
00:21:09.354865 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef0a3c84-98bb-4915-9010-d66fcbeafe09-config\") pod \"openshift-controller-manager-operator-8565d84698-49hzm\" (UID: \"ef0a3c84-98bb-4915-9010-d66fcbeafe09\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-49hzm" Mar 08 00:21:09.363340 master-0 kubenswrapper[4059]: I0308 00:21:09.362291 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef0a3c84-98bb-4915-9010-d66fcbeafe09-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-49hzm\" (UID: \"ef0a3c84-98bb-4915-9010-d66fcbeafe09\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-49hzm" Mar 08 00:21:09.595231 master-0 kubenswrapper[4059]: I0308 00:21:09.595141 4059 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-pfdrx"] Mar 08 00:21:09.595658 master-0 kubenswrapper[4059]: I0308 00:21:09.595622 4059 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2"] Mar 08 00:21:09.595887 master-0 kubenswrapper[4059]: I0308 00:21:09.595853 4059 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4"] Mar 08 00:21:09.596235 master-0 kubenswrapper[4059]: I0308 00:21:09.596175 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4" Mar 08 00:21:09.597144 master-0 kubenswrapper[4059]: I0308 00:21:09.596718 4059 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-pfdrx"
Mar 08 00:21:09.597248 master-0 kubenswrapper[4059]: I0308 00:21:09.597127 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2"
Mar 08 00:21:09.601122 master-0 kubenswrapper[4059]: I0308 00:21:09.601059 4059 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-rj9cl"]
Mar 08 00:21:09.602050 master-0 kubenswrapper[4059]: I0308 00:21:09.602007 4059 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-7gtw2"]
Mar 08 00:21:09.602323 master-0 kubenswrapper[4059]: I0308 00:21:09.602247 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-rj9cl"
Mar 08 00:21:09.602414 master-0 kubenswrapper[4059]: I0308 00:21:09.602368 4059 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-5884b9cd56-27phk"]
Mar 08 00:21:09.602600 master-0 kubenswrapper[4059]: I0308 00:21:09.602540 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-7gtw2"
Mar 08 00:21:09.603073 master-0 kubenswrapper[4059]: I0308 00:21:09.603027 4059 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-589895fbb7-gmvnl"]
Mar 08 00:21:09.603694 master-0 kubenswrapper[4059]: I0308 00:21:09.603657 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-589895fbb7-gmvnl"
Mar 08 00:21:09.603913 master-0 kubenswrapper[4059]: I0308 00:21:09.603870 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk"
Mar 08 00:21:09.604106 master-0 kubenswrapper[4059]: I0308 00:21:09.604064 4059 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-677db989d6-blw5x"]
Mar 08 00:21:09.604633 master-0 kubenswrapper[4059]: I0308 00:21:09.604591 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x"
Mar 08 00:21:09.608337 master-0 kubenswrapper[4059]: I0308 00:21:09.608284 4059 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v"]
Mar 08 00:21:09.608921 master-0 kubenswrapper[4059]: I0308 00:21:09.608871 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v"
Mar 08 00:21:09.609395 master-0 kubenswrapper[4059]: I0308 00:21:09.609336 4059 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-bh886"]
Mar 08 00:21:09.609977 master-0 kubenswrapper[4059]: I0308 00:21:09.609923 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-bh886"
Mar 08 00:21:09.611799 master-0 kubenswrapper[4059]: I0308 00:21:09.611765 4059 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj"]
Mar 08 00:21:09.612636 master-0 kubenswrapper[4059]: I0308 00:21:09.612577 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj"
Mar 08 00:21:09.625250 master-0 kubenswrapper[4059]: I0308 00:21:09.623612 4059 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s"]
Mar 08 00:21:09.625250 master-0 kubenswrapper[4059]: I0308 00:21:09.624286 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s"
Mar 08 00:21:09.629099 master-0 kubenswrapper[4059]: I0308 00:21:09.625994 4059 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-r9zcq"]
Mar 08 00:21:09.629099 master-0 kubenswrapper[4059]: I0308 00:21:09.626934 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-r9zcq"
Mar 08 00:21:09.634043 master-0 kubenswrapper[4059]: I0308 00:21:09.632031 4059 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 08 00:21:09.634043 master-0 kubenswrapper[4059]: I0308 00:21:09.632371 4059 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 08 00:21:09.634043 master-0 kubenswrapper[4059]: I0308 00:21:09.632521 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 08 00:21:09.634043 master-0 kubenswrapper[4059]: I0308 00:21:09.632740 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 08 00:21:09.634043 master-0 kubenswrapper[4059]: I0308 00:21:09.632763 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 08 00:21:09.634043 master-0 kubenswrapper[4059]: I0308 00:21:09.632908 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 08 00:21:09.634043 master-0 kubenswrapper[4059]: I0308 00:21:09.632951 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 08 00:21:09.634043 master-0 kubenswrapper[4059]: I0308 00:21:09.633068 4059 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 08 00:21:09.634043 master-0 kubenswrapper[4059]: I0308 00:21:09.633090 4059 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-st7mk"]
Mar 08 00:21:09.634043 master-0 kubenswrapper[4059]: I0308 00:21:09.633122 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 08 00:21:09.634043 master-0 kubenswrapper[4059]: I0308 00:21:09.633490 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 08 00:21:09.634043 master-0 kubenswrapper[4059]: I0308 00:21:09.633686 4059 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-64488f9d78-vnl28"]
Mar 08 00:21:09.634043 master-0 kubenswrapper[4059]: I0308 00:21:09.633717 4059 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 08 00:21:09.634043 master-0 kubenswrapper[4059]: I0308 00:21:09.633735 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 08 00:21:09.634043 master-0 kubenswrapper[4059]: I0308 00:21:09.633904 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 08 00:21:09.634043 master-0 kubenswrapper[4059]: I0308 00:21:09.633962 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 08 00:21:09.634043 master-0 kubenswrapper[4059]: I0308 00:21:09.633997 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28"
Mar 08 00:21:09.634043 master-0 kubenswrapper[4059]: I0308 00:21:09.634067 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 08 00:21:09.634809 master-0 kubenswrapper[4059]: I0308 00:21:09.634121 4059 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 08 00:21:09.634809 master-0 kubenswrapper[4059]: I0308 00:21:09.634255 4059 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 08 00:21:09.634809 master-0 kubenswrapper[4059]: I0308 00:21:09.634349 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-st7mk"
Mar 08 00:21:09.634809 master-0 kubenswrapper[4059]: I0308 00:21:09.634472 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt"
Mar 08 00:21:09.634809 master-0 kubenswrapper[4059]: I0308 00:21:09.634500 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 08 00:21:09.634809 master-0 kubenswrapper[4059]: I0308 00:21:09.634473 4059 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Mar 08 00:21:09.634809 master-0 kubenswrapper[4059]: I0308 00:21:09.634573 4059 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 08 00:21:09.634809 master-0 kubenswrapper[4059]: I0308 00:21:09.634607 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 08 00:21:09.634809 master-0 kubenswrapper[4059]: I0308 00:21:09.633908 4059 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 08 00:21:09.634809 master-0 kubenswrapper[4059]: I0308 00:21:09.634763 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 08 00:21:09.635105 master-0 kubenswrapper[4059]: I0308 00:21:09.634868 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 08 00:21:09.635105 master-0 kubenswrapper[4059]: I0308 00:21:09.634974 4059 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 08 00:21:09.635170 master-0 kubenswrapper[4059]: I0308 00:21:09.635153 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 08 00:21:09.635419 master-0 kubenswrapper[4059]: I0308 00:21:09.635380 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 08 00:21:09.635419 master-0 kubenswrapper[4059]: I0308 00:21:09.635386 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 08 00:21:09.635498 master-0 kubenswrapper[4059]: I0308 00:21:09.635435 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 08 00:21:09.635571 master-0 kubenswrapper[4059]: I0308 00:21:09.635549 4059 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 08 00:21:09.635802 master-0 kubenswrapper[4059]: I0308 00:21:09.635755 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 08 00:21:09.636106 master-0 kubenswrapper[4059]: I0308 00:21:09.635955 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 08 00:21:09.636106 master-0 kubenswrapper[4059]: I0308 00:21:09.636039 4059 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 08 00:21:09.636191 master-0 kubenswrapper[4059]: I0308 00:21:09.636164 4059 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 08 00:21:09.641563 master-0 kubenswrapper[4059]: I0308 00:21:09.636453 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 08 00:21:09.646681 master-0 kubenswrapper[4059]: I0308 00:21:09.646606 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 08 00:21:09.647005 master-0 kubenswrapper[4059]: I0308 00:21:09.646970 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 08 00:21:09.648831 master-0 kubenswrapper[4059]: I0308 00:21:09.647318 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 08 00:21:09.659045 master-0 kubenswrapper[4059]: I0308 00:21:09.658993 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 08 00:21:09.659529 master-0 kubenswrapper[4059]: I0308 00:21:09.659490 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt"
Mar 08 00:21:09.659574 master-0 kubenswrapper[4059]: I0308 00:21:09.659512 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fstf\" (UniqueName: \"kubernetes.io/projected/ef0a3c84-98bb-4915-9010-d66fcbeafe09-kube-api-access-8fstf\") pod \"openshift-controller-manager-operator-8565d84698-49hzm\" (UID: \"ef0a3c84-98bb-4915-9010-d66fcbeafe09\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-49hzm"
Mar 08 00:21:09.671416 master-0 kubenswrapper[4059]: I0308 00:21:09.671359 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 08 00:21:09.672162 master-0 kubenswrapper[4059]: I0308 00:21:09.672107 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 08 00:21:09.674936 master-0 kubenswrapper[4059]: I0308 00:21:09.673858 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config"
Mar 08 00:21:09.677399 master-0 kubenswrapper[4059]: I0308 00:21:09.677371 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 08 00:21:09.677469 master-0 kubenswrapper[4059]: I0308 00:21:09.677401 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt"
Mar 08 00:21:09.677616 master-0 kubenswrapper[4059]: I0308 00:21:09.677587 4059 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert"
Mar 08 00:21:09.677693 master-0 kubenswrapper[4059]: I0308 00:21:09.677669 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 08 00:21:09.677773 master-0 kubenswrapper[4059]: I0308 00:21:09.677751 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt"
Mar 08 00:21:09.677811 master-0 kubenswrapper[4059]: I0308 00:21:09.677785 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 08 00:21:09.677853 master-0 kubenswrapper[4059]: I0308 00:21:09.677827 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 08 00:21:09.678026 master-0 kubenswrapper[4059]: I0308 00:21:09.677989 4059 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9"]
Mar 08 00:21:09.678162 master-0 kubenswrapper[4059]: I0308 00:21:09.678132 4059 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 08 00:21:09.679749 master-0 kubenswrapper[4059]: I0308 00:21:09.678755 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9"
Mar 08 00:21:09.681551 master-0 kubenswrapper[4059]: I0308 00:21:09.681512 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 08 00:21:09.681636 master-0 kubenswrapper[4059]: I0308 00:21:09.681610 4059 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 08 00:21:09.684781 master-0 kubenswrapper[4059]: I0308 00:21:09.684729 4059 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69b6fc6b88-p8hlq"]
Mar 08 00:21:09.685274 master-0 kubenswrapper[4059]: I0308 00:21:09.685227 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt"
Mar 08 00:21:09.685378 master-0 kubenswrapper[4059]: I0308 00:21:09.685314 4059 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-5v8g4"]
Mar 08 00:21:09.685579 master-0 kubenswrapper[4059]: I0308 00:21:09.685499 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt"
Mar 08 00:21:09.685687 master-0 kubenswrapper[4059]: I0308 00:21:09.685649 4059 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-jgdmb"]
Mar 08 00:21:09.685783 master-0 kubenswrapper[4059]: I0308 00:21:09.685730 4059 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert"
Mar 08 00:21:09.685898 master-0 kubenswrapper[4059]: I0308 00:21:09.685866 4059 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls"
Mar 08 00:21:09.686100 master-0 kubenswrapper[4059]: I0308 00:21:09.686042 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-p8hlq"
Mar 08 00:21:09.686259 master-0 kubenswrapper[4059]: I0308 00:21:09.686227 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-8d675b596-jgdmb"
Mar 08 00:21:09.686680 master-0 kubenswrapper[4059]: I0308 00:21:09.686636 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-5v8g4"
Mar 08 00:21:09.688480 master-0 kubenswrapper[4059]: I0308 00:21:09.688420 4059 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-8jr6f"]
Mar 08 00:21:09.689021 master-0 kubenswrapper[4059]: I0308 00:21:09.688979 4059 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8qtmf"]
Mar 08 00:21:09.689227 master-0 kubenswrapper[4059]: I0308 00:21:09.689157 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-8jr6f"
Mar 08 00:21:09.689338 master-0 kubenswrapper[4059]: I0308 00:21:09.689311 4059 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-49hzm"]
Mar 08 00:21:09.689408 master-0 kubenswrapper[4059]: I0308 00:21:09.689396 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8qtmf"
Mar 08 00:21:09.692588 master-0 kubenswrapper[4059]: I0308 00:21:09.692554 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 08 00:21:09.692729 master-0 kubenswrapper[4059]: I0308 00:21:09.692548 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 08 00:21:09.692772 master-0 kubenswrapper[4059]: I0308 00:21:09.692737 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 08 00:21:09.696671 master-0 kubenswrapper[4059]: I0308 00:21:09.696631 4059 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 08 00:21:09.696868 master-0 kubenswrapper[4059]: I0308 00:21:09.696811 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 08 00:21:09.696939 master-0 kubenswrapper[4059]: I0308 00:21:09.696871 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt"
Mar 08 00:21:09.697031 master-0 kubenswrapper[4059]: I0308 00:21:09.696994 4059 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 08 00:21:09.697031 master-0 kubenswrapper[4059]: I0308 00:21:09.697027 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 08 00:21:09.697187 master-0 kubenswrapper[4059]: I0308 00:21:09.697165 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 08 00:21:09.697187 master-0 kubenswrapper[4059]: I0308 00:21:09.697174 4059 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 08 00:21:09.697301 master-0 kubenswrapper[4059]: I0308 00:21:09.697221 4059 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 08 00:21:09.697424 master-0 kubenswrapper[4059]: I0308 00:21:09.697396 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt"
Mar 08 00:21:09.698042 master-0 kubenswrapper[4059]: I0308 00:21:09.698016 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 08 00:21:09.703780 master-0 kubenswrapper[4059]: I0308 00:21:09.703712 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca"
Mar 08 00:21:09.772059 master-0 kubenswrapper[4059]: I0308 00:21:09.772027 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac523956-c8a3-4794-a1fa-660cd14966bb-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-st7mk\" (UID: \"ac523956-c8a3-4794-a1fa-660cd14966bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-st7mk"
Mar 08 00:21:09.772153 master-0 kubenswrapper[4059]: I0308 00:21:09.772110 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-etcd-ca\") pod \"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk"
Mar 08 00:21:09.772310 master-0 kubenswrapper[4059]: I0308 00:21:09.772292 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjcjb\" (UniqueName: \"kubernetes.io/projected/ac523956-c8a3-4794-a1fa-660cd14966bb-kube-api-access-wjcjb\") pod \"kube-storage-version-migrator-operator-7f65c457f5-st7mk\" (UID: \"ac523956-c8a3-4794-a1fa-660cd14966bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-st7mk"
Mar 08 00:21:09.772478 master-0 kubenswrapper[4059]: I0308 00:21:09.772456 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2b1a69b5-c946-495d-ae02-c56f788279e8-available-featuregates\") pod \"openshift-config-operator-64488f9d78-vnl28\" (UID: \"2b1a69b5-c946-495d-ae02-c56f788279e8\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28"
Mar 08 00:21:09.772529 master-0 kubenswrapper[4059]: I0308 00:21:09.772491 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-trusted-ca\") pod \"ingress-operator-677db989d6-blw5x\" (UID: \"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x"
Mar 08 00:21:09.772622 master-0 kubenswrapper[4059]: I0308 00:21:09.772605 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-bound-sa-token\") pod \"ingress-operator-677db989d6-blw5x\" (UID: \"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x"
Mar 08 00:21:09.772670 master-0 kubenswrapper[4059]: I0308 00:21:09.772631 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/6d770808-d390-41c1-a9d9-fc12b99fa9a9-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-cxs8s\" (UID: \"6d770808-d390-41c1-a9d9-fc12b99fa9a9\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s"
Mar 08 00:21:09.772698 master-0 kubenswrapper[4059]: I0308 00:21:09.772671 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b1a69b5-c946-495d-ae02-c56f788279e8-serving-cert\") pod \"openshift-config-operator-64488f9d78-vnl28\" (UID: \"2b1a69b5-c946-495d-ae02-c56f788279e8\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28"
Mar 08 00:21:09.772870 master-0 kubenswrapper[4059]: I0308 00:21:09.772804 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58333089-2456-4a25-8ba7-6d557eefa177-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-dkqc4\" (UID: \"58333089-2456-4a25-8ba7-6d557eefa177\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4"
Mar 08 00:21:09.772917 master-0 kubenswrapper[4059]: I0308 00:21:09.772890 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhckc\" (UniqueName: \"kubernetes.io/projected/58333089-2456-4a25-8ba7-6d557eefa177-kube-api-access-hhckc\") pod \"authentication-operator-7c6989d6c4-dkqc4\" (UID: \"58333089-2456-4a25-8ba7-6d557eefa177\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4"
Mar 08 00:21:09.772946 master-0 kubenswrapper[4059]: I0308 00:21:09.772927 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6999cf38-e317-4727-98c9-d4e348e9e16a-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-k7dp2\" (UID: \"6999cf38-e317-4727-98c9-d4e348e9e16a\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2"
Mar 08 00:21:09.772980 master-0 kubenswrapper[4059]: I0308 00:21:09.772963 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6999cf38-e317-4727-98c9-d4e348e9e16a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-k7dp2\" (UID: \"6999cf38-e317-4727-98c9-d4e348e9e16a\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2"
Mar 08 00:21:09.773012 master-0 kubenswrapper[4059]: I0308 00:21:09.772999 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e76bc134-2a88-4f92-9aa7-f6854941b98f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-bh886\" (UID: \"e76bc134-2a88-4f92-9aa7-f6854941b98f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-bh886"
Mar 08 00:21:09.773061 master-0 kubenswrapper[4059]: I0308 00:21:09.773040 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-mgb5v\" (UID: \"5cf5a2ef-2498-40a0-a189-0753076fd3b6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v"
Mar 08 00:21:09.773097 master-0 kubenswrapper[4059]: I0308 00:21:09.773071 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/db164b32-e20e-4d07-a9ae-98720321621d-operand-assets\") pod \"cluster-olm-operator-77899cf6d-r9zcq\" (UID: \"db164b32-e20e-4d07-a9ae-98720321621d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-r9zcq"
Mar 08 00:21:09.773138 master-0 kubenswrapper[4059]: I0308 00:21:09.773106 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/db164b32-e20e-4d07-a9ae-98720321621d-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-r9zcq\" (UID: \"db164b32-e20e-4d07-a9ae-98720321621d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-r9zcq"
Mar 08 00:21:09.773168 master-0 kubenswrapper[4059]: I0308 00:21:09.773135 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chnhh\" (UniqueName: \"kubernetes.io/projected/2b1a69b5-c946-495d-ae02-c56f788279e8-kube-api-access-chnhh\") pod \"openshift-config-operator-64488f9d78-vnl28\" (UID: \"2b1a69b5-c946-495d-ae02-c56f788279e8\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28"
Mar 08 00:21:09.773168 master-0 kubenswrapper[4059]: I0308 00:21:09.773156 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e76bc134-2a88-4f92-9aa7-f6854941b98f-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-bh886\" (UID: \"e76bc134-2a88-4f92-9aa7-f6854941b98f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-bh886"
Mar 08 00:21:09.773261 master-0 kubenswrapper[4059]: I0308 00:21:09.773225 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b100ce12-965e-409e-8cdb-8f99ef51a82b-serving-cert\") pod \"kube-apiserver-operator-68bd585b-7gtw2\" (UID: \"b100ce12-965e-409e-8cdb-8f99ef51a82b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-7gtw2"
Mar 08 00:21:09.773302 master-0 kubenswrapper[4059]: I0308 00:21:09.773278 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58333089-2456-4a25-8ba7-6d557eefa177-serving-cert\") pod \"authentication-operator-7c6989d6c4-dkqc4\" (UID: \"58333089-2456-4a25-8ba7-6d557eefa177\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4"
Mar 08 00:21:09.773344 master-0 kubenswrapper[4059]: I0308 00:21:09.773319 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab-config\") pod \"openshift-apiserver-operator-799b6db4d7-rj9cl\" (UID: \"3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-rj9cl"
Mar 08 00:21:09.773412 master-0 kubenswrapper[4059]: I0308 00:21:09.773393 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b100ce12-965e-409e-8cdb-8f99ef51a82b-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-7gtw2\" (UID: \"b100ce12-965e-409e-8cdb-8f99ef51a82b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-7gtw2"
Mar 08 00:21:09.773455 master-0 kubenswrapper[4059]: I0308 00:21:09.773418 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwsqr\" (UniqueName: \"kubernetes.io/projected/6999cf38-e317-4727-98c9-d4e348e9e16a-kube-api-access-pwsqr\") pod \"cluster-image-registry-operator-86d6d77c7c-k7dp2\" (UID: \"6999cf38-e317-4727-98c9-d4e348e9e16a\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2"
Mar 08 00:21:09.773455 master-0 kubenswrapper[4059]: I0308 00:21:09.773448 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5knc\" (UniqueName: \"kubernetes.io/projected/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-kube-api-access-d5knc\") pod \"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk"
Mar 08 00:21:09.773515 master-0 kubenswrapper[4059]: I0308 00:21:09.773468 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-mgb5v\" (UID: \"5cf5a2ef-2498-40a0-a189-0753076fd3b6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v"
Mar 08 00:21:09.773515 master-0 kubenswrapper[4059]: I0308 00:21:09.773487 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89wj5\" (UniqueName: \"kubernetes.io/projected/db164b32-e20e-4d07-a9ae-98720321621d-kube-api-access-89wj5\") pod \"cluster-olm-operator-77899cf6d-r9zcq\" (UID: \"db164b32-e20e-4d07-a9ae-98720321621d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-r9zcq"
Mar 08 00:21:09.773515 master-0 kubenswrapper[4059]: I0308 00:21:09.773507 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e76bc134-2a88-4f92-9aa7-f6854941b98f-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-bh886\" (UID: \"e76bc134-2a88-4f92-9aa7-f6854941b98f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-bh886"
Mar 08 00:21:09.773594 master-0 kubenswrapper[4059]: I0308 00:21:09.773528 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/365dc4ac-fbc8-4589-a799-8327b3ebd0a5-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-pfdrx\" (UID: \"365dc4ac-fbc8-4589-a799-8327b3ebd0a5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-pfdrx"
Mar 08 00:21:09.773594 master-0 kubenswrapper[4059]: I0308 00:21:09.773552 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6d770808-d390-41c1-a9d9-fc12b99fa9a9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-cxs8s\" (UID: \"6d770808-d390-41c1-a9d9-fc12b99fa9a9\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s"
Mar 08 00:21:09.773594 master-0 kubenswrapper[4059]: I0308 00:21:09.773575 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/365dc4ac-fbc8-4589-a799-8327b3ebd0a5-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-pfdrx\" (UID: \"365dc4ac-fbc8-4589-a799-8327b3ebd0a5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-pfdrx"
Mar 08 00:21:09.773594 master-0 kubenswrapper[4059]: I0308 00:21:09.773591 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk"
Mar 08 00:21:09.773695 master-0 kubenswrapper[4059]: I0308 00:21:09.773641 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rfqt\" (UniqueName: \"kubernetes.io/projected/6d770808-d390-41c1-a9d9-fc12b99fa9a9-kube-api-access-6rfqt\") pod \"cluster-monitoring-operator-674cbfbd9d-cxs8s\" (UID: \"6d770808-d390-41c1-a9d9-fc12b99fa9a9\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s"
Mar 08 00:21:09.773727 master-0 kubenswrapper[4059]: I0308 00:21:09.773696 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b100ce12-965e-409e-8cdb-8f99ef51a82b-config\") pod \"kube-apiserver-operator-68bd585b-7gtw2\" (UID: \"b100ce12-965e-409e-8cdb-8f99ef51a82b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-7gtw2"
Mar 08 00:21:09.773760 master-0 kubenswrapper[4059]: I0308 00:21:09.773730 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfdxc\" (UniqueName: \"kubernetes.io/projected/03f4bafb-c270-428a-bacf-8a424b3d1a05-kube-api-access-pfdxc\") pod \"dns-operator-589895fbb7-gmvnl\" (UID: \"03f4bafb-c270-428a-bacf-8a424b3d1a05\") " pod="openshift-dns-operator/dns-operator-589895fbb7-gmvnl"
Mar 08 00:21:09.773791 master-0 kubenswrapper[4059]: I0308 00:21:09.773760 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-serving-cert\") pod \"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk"
Mar 08 00:21:09.773791 master-0 kubenswrapper[4059]: I0308 00:21:09.773781 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03f4bafb-c270-428a-bacf-8a424b3d1a05-metrics-tls\") pod \"dns-operator-589895fbb7-gmvnl\" (UID: \"03f4bafb-c270-428a-bacf-8a424b3d1a05\") " 
pod="openshift-dns-operator/dns-operator-589895fbb7-gmvnl" Mar 08 00:21:09.773844 master-0 kubenswrapper[4059]: I0308 00:21:09.773805 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58333089-2456-4a25-8ba7-6d557eefa177-config\") pod \"authentication-operator-7c6989d6c4-dkqc4\" (UID: \"58333089-2456-4a25-8ba7-6d557eefa177\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4" Mar 08 00:21:09.773844 master-0 kubenswrapper[4059]: I0308 00:21:09.773827 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58333089-2456-4a25-8ba7-6d557eefa177-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-dkqc4\" (UID: \"58333089-2456-4a25-8ba7-6d557eefa177\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4" Mar 08 00:21:09.773906 master-0 kubenswrapper[4059]: I0308 00:21:09.773878 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac523956-c8a3-4794-a1fa-660cd14966bb-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-st7mk\" (UID: \"ac523956-c8a3-4794-a1fa-660cd14966bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-st7mk" Mar 08 00:21:09.773938 master-0 kubenswrapper[4059]: I0308 00:21:09.773914 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/365dc4ac-fbc8-4589-a799-8327b3ebd0a5-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-pfdrx\" (UID: \"365dc4ac-fbc8-4589-a799-8327b3ebd0a5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-pfdrx" Mar 08 00:21:09.773938 
master-0 kubenswrapper[4059]: I0308 00:21:09.773935 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-rj9cl\" (UID: \"3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-rj9cl" Mar 08 00:21:09.773990 master-0 kubenswrapper[4059]: I0308 00:21:09.773957 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k88m9\" (UniqueName: \"kubernetes.io/projected/5cf5a2ef-2498-40a0-a189-0753076fd3b6-kube-api-access-k88m9\") pod \"marketplace-operator-64bf9778cb-mgb5v\" (UID: \"5cf5a2ef-2498-40a0-a189-0753076fd3b6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" Mar 08 00:21:09.773990 master-0 kubenswrapper[4059]: I0308 00:21:09.773979 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-phgxj\" (UID: \"8f71fd39-a16b-47d2-b781-c8ce37bcb9b2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj" Mar 08 00:21:09.774050 master-0 kubenswrapper[4059]: I0308 00:21:09.773998 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f9kl\" (UniqueName: \"kubernetes.io/projected/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-kube-api-access-2f9kl\") pod \"package-server-manager-854648ff6d-phgxj\" (UID: \"8f71fd39-a16b-47d2-b781-c8ce37bcb9b2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj" Mar 08 00:21:09.774077 master-0 kubenswrapper[4059]: I0308 00:21:09.774048 4059 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6999cf38-e317-4727-98c9-d4e348e9e16a-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-k7dp2\" (UID: \"6999cf38-e317-4727-98c9-d4e348e9e16a\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2" Mar 08 00:21:09.774105 master-0 kubenswrapper[4059]: I0308 00:21:09.774086 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-config\") pod \"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk" Mar 08 00:21:09.774132 master-0 kubenswrapper[4059]: I0308 00:21:09.774116 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-metrics-tls\") pod \"ingress-operator-677db989d6-blw5x\" (UID: \"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x" Mar 08 00:21:09.774234 master-0 kubenswrapper[4059]: I0308 00:21:09.774187 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9l64\" (UniqueName: \"kubernetes.io/projected/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-kube-api-access-z9l64\") pod \"ingress-operator-677db989d6-blw5x\" (UID: \"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x" Mar 08 00:21:09.774309 master-0 kubenswrapper[4059]: I0308 00:21:09.774252 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-etcd-client\") pod 
\"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk" Mar 08 00:21:09.774309 master-0 kubenswrapper[4059]: I0308 00:21:09.774282 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44jml\" (UniqueName: \"kubernetes.io/projected/3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab-kube-api-access-44jml\") pod \"openshift-apiserver-operator-799b6db4d7-rj9cl\" (UID: \"3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-rj9cl" Mar 08 00:21:09.829933 master-0 kubenswrapper[4059]: I0308 00:21:09.829910 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-49hzm" Mar 08 00:21:09.875240 master-0 kubenswrapper[4059]: I0308 00:21:09.875082 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/365dc4ac-fbc8-4589-a799-8327b3ebd0a5-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-pfdrx\" (UID: \"365dc4ac-fbc8-4589-a799-8327b3ebd0a5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-pfdrx" Mar 08 00:21:09.875240 master-0 kubenswrapper[4059]: I0308 00:21:09.875178 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6d770808-d390-41c1-a9d9-fc12b99fa9a9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-cxs8s\" (UID: \"6d770808-d390-41c1-a9d9-fc12b99fa9a9\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s" Mar 08 00:21:09.875452 master-0 kubenswrapper[4059]: I0308 00:21:09.875265 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2ce2ea7-bd25-4294-8f3a-11ce53577830-serving-cert\") pod \"service-ca-operator-69b6fc6b88-p8hlq\" (UID: \"c2ce2ea7-bd25-4294-8f3a-11ce53577830\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-p8hlq" Mar 08 00:21:09.875452 master-0 kubenswrapper[4059]: I0308 00:21:09.875309 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q6hn\" (UniqueName: \"kubernetes.io/projected/c1abfb79-2c86-4ccb-bf91-7c48ad8c78d8-kube-api-access-5q6hn\") pod \"csi-snapshot-controller-operator-5685fbc7d-5v8g4\" (UID: \"c1abfb79-2c86-4ccb-bf91-7c48ad8c78d8\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-5v8g4" Mar 08 00:21:09.875452 master-0 kubenswrapper[4059]: E0308 00:21:09.875431 4059 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 08 00:21:09.875564 master-0 kubenswrapper[4059]: I0308 00:21:09.875481 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2ce2ea7-bd25-4294-8f3a-11ce53577830-config\") pod \"service-ca-operator-69b6fc6b88-p8hlq\" (UID: \"c2ce2ea7-bd25-4294-8f3a-11ce53577830\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-p8hlq" Mar 08 00:21:09.875564 master-0 kubenswrapper[4059]: E0308 00:21:09.875527 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d770808-d390-41c1-a9d9-fc12b99fa9a9-cluster-monitoring-operator-tls podName:6d770808-d390-41c1-a9d9-fc12b99fa9a9 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:10.375499657 +0000 UTC m=+114.087099269 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6d770808-d390-41c1-a9d9-fc12b99fa9a9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-cxs8s" (UID: "6d770808-d390-41c1-a9d9-fc12b99fa9a9") : secret "cluster-monitoring-operator-tls" not found Mar 08 00:21:09.875652 master-0 kubenswrapper[4059]: I0308 00:21:09.875562 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rfqt\" (UniqueName: \"kubernetes.io/projected/6d770808-d390-41c1-a9d9-fc12b99fa9a9-kube-api-access-6rfqt\") pod \"cluster-monitoring-operator-674cbfbd9d-cxs8s\" (UID: \"6d770808-d390-41c1-a9d9-fc12b99fa9a9\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s" Mar 08 00:21:09.875747 master-0 kubenswrapper[4059]: I0308 00:21:09.875715 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b100ce12-965e-409e-8cdb-8f99ef51a82b-config\") pod \"kube-apiserver-operator-68bd585b-7gtw2\" (UID: \"b100ce12-965e-409e-8cdb-8f99ef51a82b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-7gtw2" Mar 08 00:21:09.875794 master-0 kubenswrapper[4059]: I0308 00:21:09.875757 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfdxc\" (UniqueName: \"kubernetes.io/projected/03f4bafb-c270-428a-bacf-8a424b3d1a05-kube-api-access-pfdxc\") pod \"dns-operator-589895fbb7-gmvnl\" (UID: \"03f4bafb-c270-428a-bacf-8a424b3d1a05\") " pod="openshift-dns-operator/dns-operator-589895fbb7-gmvnl" Mar 08 00:21:09.875829 master-0 kubenswrapper[4059]: I0308 00:21:09.875789 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b94acad3-cf4e-443d-80fb-5e68a4074336-srv-cert\") pod \"catalog-operator-7d9c49f57b-8jr6f\" (UID: 
\"b94acad3-cf4e-443d-80fb-5e68a4074336\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-8jr6f" Mar 08 00:21:09.875911 master-0 kubenswrapper[4059]: I0308 00:21:09.875867 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/365dc4ac-fbc8-4589-a799-8327b3ebd0a5-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-pfdrx\" (UID: \"365dc4ac-fbc8-4589-a799-8327b3ebd0a5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-pfdrx" Mar 08 00:21:09.875960 master-0 kubenswrapper[4059]: I0308 00:21:09.875928 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk" Mar 08 00:21:09.876002 master-0 kubenswrapper[4059]: I0308 00:21:09.875960 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03f4bafb-c270-428a-bacf-8a424b3d1a05-metrics-tls\") pod \"dns-operator-589895fbb7-gmvnl\" (UID: \"03f4bafb-c270-428a-bacf-8a424b3d1a05\") " pod="openshift-dns-operator/dns-operator-589895fbb7-gmvnl" Mar 08 00:21:09.876002 master-0 kubenswrapper[4059]: I0308 00:21:09.875991 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-serving-cert\") pod \"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk" Mar 08 00:21:09.876078 master-0 kubenswrapper[4059]: I0308 00:21:09.876025 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58333089-2456-4a25-8ba7-6d557eefa177-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-dkqc4\" (UID: \"58333089-2456-4a25-8ba7-6d557eefa177\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4" Mar 08 00:21:09.876078 master-0 kubenswrapper[4059]: I0308 00:21:09.876059 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58333089-2456-4a25-8ba7-6d557eefa177-config\") pod \"authentication-operator-7c6989d6c4-dkqc4\" (UID: \"58333089-2456-4a25-8ba7-6d557eefa177\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4" Mar 08 00:21:09.876166 master-0 kubenswrapper[4059]: I0308 00:21:09.876089 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/365dc4ac-fbc8-4589-a799-8327b3ebd0a5-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-pfdrx\" (UID: \"365dc4ac-fbc8-4589-a799-8327b3ebd0a5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-pfdrx" Mar 08 00:21:09.876166 master-0 kubenswrapper[4059]: I0308 00:21:09.876120 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-rj9cl\" (UID: \"3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-rj9cl" Mar 08 00:21:09.876166 master-0 kubenswrapper[4059]: I0308 00:21:09.876153 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac523956-c8a3-4794-a1fa-660cd14966bb-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-st7mk\" (UID: 
\"ac523956-c8a3-4794-a1fa-660cd14966bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-st7mk" Mar 08 00:21:09.876296 master-0 kubenswrapper[4059]: I0308 00:21:09.876184 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k88m9\" (UniqueName: \"kubernetes.io/projected/5cf5a2ef-2498-40a0-a189-0753076fd3b6-kube-api-access-k88m9\") pod \"marketplace-operator-64bf9778cb-mgb5v\" (UID: \"5cf5a2ef-2498-40a0-a189-0753076fd3b6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" Mar 08 00:21:09.876296 master-0 kubenswrapper[4059]: I0308 00:21:09.876246 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wrq9\" (UniqueName: \"kubernetes.io/projected/4ad37f40-c533-4a1e-882a-2e0973eff86d-kube-api-access-6wrq9\") pod \"olm-operator-d64cfc9db-8qtmf\" (UID: \"4ad37f40-c533-4a1e-882a-2e0973eff86d\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8qtmf" Mar 08 00:21:09.876296 master-0 kubenswrapper[4059]: I0308 00:21:09.876281 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-phgxj\" (UID: \"8f71fd39-a16b-47d2-b781-c8ce37bcb9b2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj" Mar 08 00:21:09.876398 master-0 kubenswrapper[4059]: I0308 00:21:09.876315 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f9kl\" (UniqueName: \"kubernetes.io/projected/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-kube-api-access-2f9kl\") pod \"package-server-manager-854648ff6d-phgxj\" (UID: \"8f71fd39-a16b-47d2-b781-c8ce37bcb9b2\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj" Mar 08 00:21:09.876398 master-0 kubenswrapper[4059]: I0308 00:21:09.876347 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6999cf38-e317-4727-98c9-d4e348e9e16a-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-k7dp2\" (UID: \"6999cf38-e317-4727-98c9-d4e348e9e16a\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2" Mar 08 00:21:09.876398 master-0 kubenswrapper[4059]: I0308 00:21:09.876376 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-config\") pod \"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk" Mar 08 00:21:09.876510 master-0 kubenswrapper[4059]: I0308 00:21:09.876410 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-metrics-tls\") pod \"ingress-operator-677db989d6-blw5x\" (UID: \"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x" Mar 08 00:21:09.876510 master-0 kubenswrapper[4059]: I0308 00:21:09.876442 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4ad37f40-c533-4a1e-882a-2e0973eff86d-srv-cert\") pod \"olm-operator-d64cfc9db-8qtmf\" (UID: \"4ad37f40-c533-4a1e-882a-2e0973eff86d\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8qtmf" Mar 08 00:21:09.876510 master-0 kubenswrapper[4059]: I0308 00:21:09.876485 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbdd4\" 
(UniqueName: \"kubernetes.io/projected/1abf904b-0b8d-4d61-8231-0e8d00933192-kube-api-access-dbdd4\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9" Mar 08 00:21:09.877773 master-0 kubenswrapper[4059]: I0308 00:21:09.876486 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b100ce12-965e-409e-8cdb-8f99ef51a82b-config\") pod \"kube-apiserver-operator-68bd585b-7gtw2\" (UID: \"b100ce12-965e-409e-8cdb-8f99ef51a82b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-7gtw2" Mar 08 00:21:09.877773 master-0 kubenswrapper[4059]: E0308 00:21:09.876734 4059 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 08 00:21:09.877773 master-0 kubenswrapper[4059]: E0308 00:21:09.876787 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-metrics-tls podName:4d0b9fbc-a1f8-4a98-99de-758734bd1a5b nodeName:}" failed. No retries permitted until 2026-03-08 00:21:10.376769136 +0000 UTC m=+114.088368658 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-metrics-tls") pod "ingress-operator-677db989d6-blw5x" (UID: "4d0b9fbc-a1f8-4a98-99de-758734bd1a5b") : secret "metrics-tls" not found Mar 08 00:21:09.877773 master-0 kubenswrapper[4059]: I0308 00:21:09.877481 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58333089-2456-4a25-8ba7-6d557eefa177-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-dkqc4\" (UID: \"58333089-2456-4a25-8ba7-6d557eefa177\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4" Mar 08 00:21:09.877773 master-0 kubenswrapper[4059]: I0308 00:21:09.877523 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/365dc4ac-fbc8-4589-a799-8327b3ebd0a5-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-pfdrx\" (UID: \"365dc4ac-fbc8-4589-a799-8327b3ebd0a5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-pfdrx" Mar 08 00:21:09.877773 master-0 kubenswrapper[4059]: I0308 00:21:09.877536 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1abf904b-0b8d-4d61-8231-0e8d00933192-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9" Mar 08 00:21:09.877773 master-0 kubenswrapper[4059]: I0308 00:21:09.877574 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9l64\" (UniqueName: \"kubernetes.io/projected/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-kube-api-access-z9l64\") pod \"ingress-operator-677db989d6-blw5x\" (UID: 
\"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x" Mar 08 00:21:09.878081 master-0 kubenswrapper[4059]: E0308 00:21:09.877900 4059 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 08 00:21:09.878081 master-0 kubenswrapper[4059]: E0308 00:21:09.877941 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-package-server-manager-serving-cert podName:8f71fd39-a16b-47d2-b781-c8ce37bcb9b2 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:10.377927542 +0000 UTC m=+114.089527064 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-phgxj" (UID: "8f71fd39-a16b-47d2-b781-c8ce37bcb9b2") : secret "package-server-manager-serving-cert" not found Mar 08 00:21:09.878395 master-0 kubenswrapper[4059]: E0308 00:21:09.878189 4059 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 08 00:21:09.879783 master-0 kubenswrapper[4059]: I0308 00:21:09.879739 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/365dc4ac-fbc8-4589-a799-8327b3ebd0a5-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-pfdrx\" (UID: \"365dc4ac-fbc8-4589-a799-8327b3ebd0a5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-pfdrx" Mar 08 00:21:09.880102 master-0 kubenswrapper[4059]: I0308 00:21:09.880059 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk" Mar 08 00:21:09.880277 master-0 kubenswrapper[4059]: I0308 00:21:09.880115 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-config\") pod \"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk" Mar 08 00:21:09.880277 master-0 kubenswrapper[4059]: I0308 00:21:09.880175 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-etcd-client\") pod \"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk" Mar 08 00:21:09.880277 master-0 kubenswrapper[4059]: I0308 00:21:09.880251 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44jml\" (UniqueName: \"kubernetes.io/projected/3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab-kube-api-access-44jml\") pod \"openshift-apiserver-operator-799b6db4d7-rj9cl\" (UID: \"3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-rj9cl" Mar 08 00:21:09.880395 master-0 kubenswrapper[4059]: I0308 00:21:09.880293 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " 
pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9"
Mar 08 00:21:09.880395 master-0 kubenswrapper[4059]: I0308 00:21:09.880382 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-webhook-certs\") pod \"multus-admission-controller-8d675b596-jgdmb\" (UID: \"d7a0bdcc-92f5-41e6-ab47-ee48a5788bac\") " pod="openshift-multus/multus-admission-controller-8d675b596-jgdmb"
Mar 08 00:21:09.880476 master-0 kubenswrapper[4059]: I0308 00:21:09.880424 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac523956-c8a3-4794-a1fa-660cd14966bb-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-st7mk\" (UID: \"ac523956-c8a3-4794-a1fa-660cd14966bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-st7mk"
Mar 08 00:21:09.880476 master-0 kubenswrapper[4059]: I0308 00:21:09.880463 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjcjb\" (UniqueName: \"kubernetes.io/projected/ac523956-c8a3-4794-a1fa-660cd14966bb-kube-api-access-wjcjb\") pod \"kube-storage-version-migrator-operator-7f65c457f5-st7mk\" (UID: \"ac523956-c8a3-4794-a1fa-660cd14966bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-st7mk"
Mar 08 00:21:09.880559 master-0 kubenswrapper[4059]: I0308 00:21:09.880490 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-rj9cl\" (UID: \"3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-rj9cl"
Mar 08 00:21:09.880559 master-0 kubenswrapper[4059]: E0308 00:21:09.880494 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03f4bafb-c270-428a-bacf-8a424b3d1a05-metrics-tls podName:03f4bafb-c270-428a-bacf-8a424b3d1a05 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:10.38045767 +0000 UTC m=+114.092057212 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/03f4bafb-c270-428a-bacf-8a424b3d1a05-metrics-tls") pod "dns-operator-589895fbb7-gmvnl" (UID: "03f4bafb-c270-428a-bacf-8a424b3d1a05") : secret "metrics-tls" not found
Mar 08 00:21:09.880559 master-0 kubenswrapper[4059]: I0308 00:21:09.880544 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-etcd-ca\") pod \"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk"
Mar 08 00:21:09.880695 master-0 kubenswrapper[4059]: I0308 00:21:09.880576 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2b1a69b5-c946-495d-ae02-c56f788279e8-available-featuregates\") pod \"openshift-config-operator-64488f9d78-vnl28\" (UID: \"2b1a69b5-c946-495d-ae02-c56f788279e8\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28"
Mar 08 00:21:09.880695 master-0 kubenswrapper[4059]: I0308 00:21:09.880636 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/6d770808-d390-41c1-a9d9-fc12b99fa9a9-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-cxs8s\" (UID: \"6d770808-d390-41c1-a9d9-fc12b99fa9a9\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s"
Mar 08 00:21:09.880695 master-0 kubenswrapper[4059]: I0308 00:21:09.880654 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-trusted-ca\") pod \"ingress-operator-677db989d6-blw5x\" (UID: \"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x"
Mar 08 00:21:09.882091 master-0 kubenswrapper[4059]: I0308 00:21:09.880862 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-bound-sa-token\") pod \"ingress-operator-677db989d6-blw5x\" (UID: \"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x"
Mar 08 00:21:09.882091 master-0 kubenswrapper[4059]: I0308 00:21:09.880917 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b1a69b5-c946-495d-ae02-c56f788279e8-serving-cert\") pod \"openshift-config-operator-64488f9d78-vnl28\" (UID: \"2b1a69b5-c946-495d-ae02-c56f788279e8\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28"
Mar 08 00:21:09.882091 master-0 kubenswrapper[4059]: I0308 00:21:09.880954 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9"
Mar 08 00:21:09.882091 master-0 kubenswrapper[4059]: I0308 00:21:09.880984 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58333089-2456-4a25-8ba7-6d557eefa177-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-dkqc4\" (UID: \"58333089-2456-4a25-8ba7-6d557eefa177\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4"
Mar 08 00:21:09.882091 master-0 kubenswrapper[4059]: I0308 00:21:09.881008 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhckc\" (UniqueName: \"kubernetes.io/projected/58333089-2456-4a25-8ba7-6d557eefa177-kube-api-access-hhckc\") pod \"authentication-operator-7c6989d6c4-dkqc4\" (UID: \"58333089-2456-4a25-8ba7-6d557eefa177\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4"
Mar 08 00:21:09.882091 master-0 kubenswrapper[4059]: I0308 00:21:09.881031 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6999cf38-e317-4727-98c9-d4e348e9e16a-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-k7dp2\" (UID: \"6999cf38-e317-4727-98c9-d4e348e9e16a\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2"
Mar 08 00:21:09.882091 master-0 kubenswrapper[4059]: I0308 00:21:09.881055 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e76bc134-2a88-4f92-9aa7-f6854941b98f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-bh886\" (UID: \"e76bc134-2a88-4f92-9aa7-f6854941b98f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-bh886"
Mar 08 00:21:09.882091 master-0 kubenswrapper[4059]: I0308 00:21:09.881075 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-mgb5v\" (UID: \"5cf5a2ef-2498-40a0-a189-0753076fd3b6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v"
Mar 08 00:21:09.882091 master-0 kubenswrapper[4059]: I0308 00:21:09.881096 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/db164b32-e20e-4d07-a9ae-98720321621d-operand-assets\") pod \"cluster-olm-operator-77899cf6d-r9zcq\" (UID: \"db164b32-e20e-4d07-a9ae-98720321621d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-r9zcq"
Mar 08 00:21:09.882091 master-0 kubenswrapper[4059]: I0308 00:21:09.881118 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6999cf38-e317-4727-98c9-d4e348e9e16a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-k7dp2\" (UID: \"6999cf38-e317-4727-98c9-d4e348e9e16a\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2"
Mar 08 00:21:09.882091 master-0 kubenswrapper[4059]: I0308 00:21:09.881144 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chnhh\" (UniqueName: \"kubernetes.io/projected/2b1a69b5-c946-495d-ae02-c56f788279e8-kube-api-access-chnhh\") pod \"openshift-config-operator-64488f9d78-vnl28\" (UID: \"2b1a69b5-c946-495d-ae02-c56f788279e8\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28"
Mar 08 00:21:09.882091 master-0 kubenswrapper[4059]: I0308 00:21:09.881165 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e76bc134-2a88-4f92-9aa7-f6854941b98f-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-bh886\" (UID: \"e76bc134-2a88-4f92-9aa7-f6854941b98f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-bh886"
Mar 08 00:21:09.882091 master-0 kubenswrapper[4059]: I0308 00:21:09.881189 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b100ce12-965e-409e-8cdb-8f99ef51a82b-serving-cert\") pod \"kube-apiserver-operator-68bd585b-7gtw2\" (UID: \"b100ce12-965e-409e-8cdb-8f99ef51a82b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-7gtw2"
Mar 08 00:21:09.882091 master-0 kubenswrapper[4059]: I0308 00:21:09.881228 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/db164b32-e20e-4d07-a9ae-98720321621d-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-r9zcq\" (UID: \"db164b32-e20e-4d07-a9ae-98720321621d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-r9zcq"
Mar 08 00:21:09.882091 master-0 kubenswrapper[4059]: I0308 00:21:09.881259 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fljc9\" (UniqueName: \"kubernetes.io/projected/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-kube-api-access-fljc9\") pod \"multus-admission-controller-8d675b596-jgdmb\" (UID: \"d7a0bdcc-92f5-41e6-ab47-ee48a5788bac\") " pod="openshift-multus/multus-admission-controller-8d675b596-jgdmb"
Mar 08 00:21:09.882674 master-0 kubenswrapper[4059]: I0308 00:21:09.881253 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2b1a69b5-c946-495d-ae02-c56f788279e8-available-featuregates\") pod \"openshift-config-operator-64488f9d78-vnl28\" (UID: \"2b1a69b5-c946-495d-ae02-c56f788279e8\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28"
Mar 08 00:21:09.882674 master-0 kubenswrapper[4059]: I0308 00:21:09.881283 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-etcd-ca\") pod \"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk"
Mar 08 00:21:09.882674 master-0 kubenswrapper[4059]: I0308 00:21:09.881391 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac523956-c8a3-4794-a1fa-660cd14966bb-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-st7mk\" (UID: \"ac523956-c8a3-4794-a1fa-660cd14966bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-st7mk"
Mar 08 00:21:09.882674 master-0 kubenswrapper[4059]: I0308 00:21:09.881684 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/db164b32-e20e-4d07-a9ae-98720321621d-operand-assets\") pod \"cluster-olm-operator-77899cf6d-r9zcq\" (UID: \"db164b32-e20e-4d07-a9ae-98720321621d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-r9zcq"
Mar 08 00:21:09.882674 master-0 kubenswrapper[4059]: I0308 00:21:09.881972 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-trusted-ca\") pod \"ingress-operator-677db989d6-blw5x\" (UID: \"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x"
Mar 08 00:21:09.882674 master-0 kubenswrapper[4059]: E0308 00:21:09.882137 4059 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 08 00:21:09.882674 master-0 kubenswrapper[4059]: E0308 00:21:09.882254 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6999cf38-e317-4727-98c9-d4e348e9e16a-image-registry-operator-tls podName:6999cf38-e317-4727-98c9-d4e348e9e16a nodeName:}" failed. No retries permitted until 2026-03-08 00:21:10.382224185 +0000 UTC m=+114.093823717 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/6999cf38-e317-4727-98c9-d4e348e9e16a-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-k7dp2" (UID: "6999cf38-e317-4727-98c9-d4e348e9e16a") : secret "image-registry-operator-tls" not found
Mar 08 00:21:09.882674 master-0 kubenswrapper[4059]: I0308 00:21:09.882141 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/6d770808-d390-41c1-a9d9-fc12b99fa9a9-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-cxs8s\" (UID: \"6d770808-d390-41c1-a9d9-fc12b99fa9a9\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s"
Mar 08 00:21:09.882674 master-0 kubenswrapper[4059]: I0308 00:21:09.882291 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58333089-2456-4a25-8ba7-6d557eefa177-serving-cert\") pod \"authentication-operator-7c6989d6c4-dkqc4\" (UID: \"58333089-2456-4a25-8ba7-6d557eefa177\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4"
Mar 08 00:21:09.882674 master-0 kubenswrapper[4059]: I0308 00:21:09.882335 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab-config\") pod \"openshift-apiserver-operator-799b6db4d7-rj9cl\" (UID: \"3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-rj9cl"
Mar 08 00:21:09.882674 master-0 kubenswrapper[4059]: I0308 00:21:09.882379 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tml5\" (UniqueName: \"kubernetes.io/projected/b94acad3-cf4e-443d-80fb-5e68a4074336-kube-api-access-7tml5\") pod \"catalog-operator-7d9c49f57b-8jr6f\" (UID: \"b94acad3-cf4e-443d-80fb-5e68a4074336\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-8jr6f"
Mar 08 00:21:09.882674 master-0 kubenswrapper[4059]: I0308 00:21:09.882418 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qpkj\" (UniqueName: \"kubernetes.io/projected/c2ce2ea7-bd25-4294-8f3a-11ce53577830-kube-api-access-9qpkj\") pod \"service-ca-operator-69b6fc6b88-p8hlq\" (UID: \"c2ce2ea7-bd25-4294-8f3a-11ce53577830\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-p8hlq"
Mar 08 00:21:09.882674 master-0 kubenswrapper[4059]: I0308 00:21:09.882466 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b100ce12-965e-409e-8cdb-8f99ef51a82b-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-7gtw2\" (UID: \"b100ce12-965e-409e-8cdb-8f99ef51a82b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-7gtw2"
Mar 08 00:21:09.882674 master-0 kubenswrapper[4059]: I0308 00:21:09.882499 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwsqr\" (UniqueName: \"kubernetes.io/projected/6999cf38-e317-4727-98c9-d4e348e9e16a-kube-api-access-pwsqr\") pod \"cluster-image-registry-operator-86d6d77c7c-k7dp2\" (UID: \"6999cf38-e317-4727-98c9-d4e348e9e16a\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2"
Mar 08 00:21:09.882674 master-0 kubenswrapper[4059]: I0308 00:21:09.882532 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5knc\" (UniqueName: \"kubernetes.io/projected/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-kube-api-access-d5knc\") pod \"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk"
Mar 08 00:21:09.883297 master-0 kubenswrapper[4059]: I0308 00:21:09.882563 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-mgb5v\" (UID: \"5cf5a2ef-2498-40a0-a189-0753076fd3b6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v"
Mar 08 00:21:09.883297 master-0 kubenswrapper[4059]: I0308 00:21:09.882595 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89wj5\" (UniqueName: \"kubernetes.io/projected/db164b32-e20e-4d07-a9ae-98720321621d-kube-api-access-89wj5\") pod \"cluster-olm-operator-77899cf6d-r9zcq\" (UID: \"db164b32-e20e-4d07-a9ae-98720321621d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-r9zcq"
Mar 08 00:21:09.883297 master-0 kubenswrapper[4059]: I0308 00:21:09.882633 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e76bc134-2a88-4f92-9aa7-f6854941b98f-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-bh886\" (UID: \"e76bc134-2a88-4f92-9aa7-f6854941b98f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-bh886"
Mar 08 00:21:09.883410 master-0 kubenswrapper[4059]: I0308 00:21:09.883310 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-mgb5v\" (UID: \"5cf5a2ef-2498-40a0-a189-0753076fd3b6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v"
Mar 08 00:21:09.883904 master-0 kubenswrapper[4059]: E0308 00:21:09.883777 4059 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 08 00:21:09.883904 master-0 kubenswrapper[4059]: E0308 00:21:09.883826 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-operator-metrics podName:5cf5a2ef-2498-40a0-a189-0753076fd3b6 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:10.383812314 +0000 UTC m=+114.095411836 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-mgb5v" (UID: "5cf5a2ef-2498-40a0-a189-0753076fd3b6") : secret "marketplace-operator-metrics" not found
Mar 08 00:21:09.883904 master-0 kubenswrapper[4059]: I0308 00:21:09.883874 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6999cf38-e317-4727-98c9-d4e348e9e16a-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-k7dp2\" (UID: \"6999cf38-e317-4727-98c9-d4e348e9e16a\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2"
Mar 08 00:21:09.885487 master-0 kubenswrapper[4059]: I0308 00:21:09.885455 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab-config\") pod \"openshift-apiserver-operator-799b6db4d7-rj9cl\" (UID: \"3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-rj9cl"
Mar 08 00:21:09.885707 master-0 kubenswrapper[4059]: I0308 00:21:09.885677 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-etcd-client\") pod \"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk"
Mar 08 00:21:09.885797 master-0 kubenswrapper[4059]: I0308 00:21:09.885767 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e76bc134-2a88-4f92-9aa7-f6854941b98f-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-bh886\" (UID: \"e76bc134-2a88-4f92-9aa7-f6854941b98f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-bh886"
Mar 08 00:21:09.885848 master-0 kubenswrapper[4059]: I0308 00:21:09.885814 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e76bc134-2a88-4f92-9aa7-f6854941b98f-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-bh886\" (UID: \"e76bc134-2a88-4f92-9aa7-f6854941b98f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-bh886"
Mar 08 00:21:09.886319 master-0 kubenswrapper[4059]: I0308 00:21:09.886291 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58333089-2456-4a25-8ba7-6d557eefa177-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-dkqc4\" (UID: \"58333089-2456-4a25-8ba7-6d557eefa177\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4"
Mar 08 00:21:09.886398 master-0 kubenswrapper[4059]: I0308 00:21:09.886337 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b100ce12-965e-409e-8cdb-8f99ef51a82b-serving-cert\") pod \"kube-apiserver-operator-68bd585b-7gtw2\" (UID: \"b100ce12-965e-409e-8cdb-8f99ef51a82b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-7gtw2"
Mar 08 00:21:09.886767 master-0 kubenswrapper[4059]: I0308 00:21:09.886686 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b1a69b5-c946-495d-ae02-c56f788279e8-serving-cert\") pod \"openshift-config-operator-64488f9d78-vnl28\" (UID: \"2b1a69b5-c946-495d-ae02-c56f788279e8\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28"
Mar 08 00:21:09.886767 master-0 kubenswrapper[4059]: I0308 00:21:09.886749 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58333089-2456-4a25-8ba7-6d557eefa177-config\") pod \"authentication-operator-7c6989d6c4-dkqc4\" (UID: \"58333089-2456-4a25-8ba7-6d557eefa177\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4"
Mar 08 00:21:09.888144 master-0 kubenswrapper[4059]: I0308 00:21:09.887502 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac523956-c8a3-4794-a1fa-660cd14966bb-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-st7mk\" (UID: \"ac523956-c8a3-4794-a1fa-660cd14966bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-st7mk"
Mar 08 00:21:09.888979 master-0 kubenswrapper[4059]: I0308 00:21:09.888946 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-serving-cert\") pod \"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk"
Mar 08 00:21:09.889854 master-0 kubenswrapper[4059]: I0308 00:21:09.889763 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/db164b32-e20e-4d07-a9ae-98720321621d-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-r9zcq\" (UID: \"db164b32-e20e-4d07-a9ae-98720321621d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-r9zcq"
Mar 08 00:21:09.889854 master-0 kubenswrapper[4059]: I0308 00:21:09.889826 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58333089-2456-4a25-8ba7-6d557eefa177-serving-cert\") pod \"authentication-operator-7c6989d6c4-dkqc4\" (UID: \"58333089-2456-4a25-8ba7-6d557eefa177\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4"
Mar 08 00:21:09.983416 master-0 kubenswrapper[4059]: I0308 00:21:09.983349 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9"
Mar 08 00:21:09.983416 master-0 kubenswrapper[4059]: I0308 00:21:09.983404 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-webhook-certs\") pod \"multus-admission-controller-8d675b596-jgdmb\" (UID: \"d7a0bdcc-92f5-41e6-ab47-ee48a5788bac\") " pod="openshift-multus/multus-admission-controller-8d675b596-jgdmb"
Mar 08 00:21:09.983756 master-0 kubenswrapper[4059]: E0308 00:21:09.983548 4059 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 08 00:21:09.983756 master-0 kubenswrapper[4059]: I0308 00:21:09.983572 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9"
Mar 08 00:21:09.983756 master-0 kubenswrapper[4059]: E0308 00:21:09.983645 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-node-tuning-operator-tls podName:1abf904b-0b8d-4d61-8231-0e8d00933192 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:10.483620007 +0000 UTC m=+114.195219599 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-9vjl9" (UID: "1abf904b-0b8d-4d61-8231-0e8d00933192") : secret "node-tuning-operator-tls" not found
Mar 08 00:21:09.983756 master-0 kubenswrapper[4059]: E0308 00:21:09.983664 4059 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 08 00:21:09.983756 master-0 kubenswrapper[4059]: E0308 00:21:09.983679 4059 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 08 00:21:09.983756 master-0 kubenswrapper[4059]: I0308 00:21:09.983716 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fljc9\" (UniqueName: \"kubernetes.io/projected/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-kube-api-access-fljc9\") pod \"multus-admission-controller-8d675b596-jgdmb\" (UID: \"d7a0bdcc-92f5-41e6-ab47-ee48a5788bac\") " pod="openshift-multus/multus-admission-controller-8d675b596-jgdmb"
Mar 08 00:21:09.983756 master-0 kubenswrapper[4059]: E0308 00:21:09.983725 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-apiservice-cert podName:1abf904b-0b8d-4d61-8231-0e8d00933192 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:10.48370746 +0000 UTC m=+114.195307092 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-9vjl9" (UID: "1abf904b-0b8d-4d61-8231-0e8d00933192") : secret "performance-addon-operator-webhook-cert" not found
Mar 08 00:21:09.983756 master-0 kubenswrapper[4059]: E0308 00:21:09.983765 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-webhook-certs podName:d7a0bdcc-92f5-41e6-ab47-ee48a5788bac nodeName:}" failed. No retries permitted until 2026-03-08 00:21:10.483754811 +0000 UTC m=+114.195354473 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-webhook-certs") pod "multus-admission-controller-8d675b596-jgdmb" (UID: "d7a0bdcc-92f5-41e6-ab47-ee48a5788bac") : secret "multus-admission-controller-secret" not found
Mar 08 00:21:09.984063 master-0 kubenswrapper[4059]: I0308 00:21:09.983781 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tml5\" (UniqueName: \"kubernetes.io/projected/b94acad3-cf4e-443d-80fb-5e68a4074336-kube-api-access-7tml5\") pod \"catalog-operator-7d9c49f57b-8jr6f\" (UID: \"b94acad3-cf4e-443d-80fb-5e68a4074336\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-8jr6f"
Mar 08 00:21:09.984063 master-0 kubenswrapper[4059]: I0308 00:21:09.983827 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qpkj\" (UniqueName: \"kubernetes.io/projected/c2ce2ea7-bd25-4294-8f3a-11ce53577830-kube-api-access-9qpkj\") pod \"service-ca-operator-69b6fc6b88-p8hlq\" (UID: \"c2ce2ea7-bd25-4294-8f3a-11ce53577830\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-p8hlq"
Mar 08 00:21:09.984063 master-0 kubenswrapper[4059]: I0308 00:21:09.984005 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2ce2ea7-bd25-4294-8f3a-11ce53577830-serving-cert\") pod \"service-ca-operator-69b6fc6b88-p8hlq\" (UID: \"c2ce2ea7-bd25-4294-8f3a-11ce53577830\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-p8hlq"
Mar 08 00:21:09.984063 master-0 kubenswrapper[4059]: I0308 00:21:09.984034 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q6hn\" (UniqueName: \"kubernetes.io/projected/c1abfb79-2c86-4ccb-bf91-7c48ad8c78d8-kube-api-access-5q6hn\") pod \"csi-snapshot-controller-operator-5685fbc7d-5v8g4\" (UID: \"c1abfb79-2c86-4ccb-bf91-7c48ad8c78d8\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-5v8g4"
Mar 08 00:21:09.984236 master-0 kubenswrapper[4059]: I0308 00:21:09.984068 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2ce2ea7-bd25-4294-8f3a-11ce53577830-config\") pod \"service-ca-operator-69b6fc6b88-p8hlq\" (UID: \"c2ce2ea7-bd25-4294-8f3a-11ce53577830\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-p8hlq"
Mar 08 00:21:09.984282 master-0 kubenswrapper[4059]: I0308 00:21:09.984247 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b94acad3-cf4e-443d-80fb-5e68a4074336-srv-cert\") pod \"catalog-operator-7d9c49f57b-8jr6f\" (UID: \"b94acad3-cf4e-443d-80fb-5e68a4074336\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-8jr6f"
Mar 08 00:21:09.984329 master-0 kubenswrapper[4059]: I0308 00:21:09.984316 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wrq9\" (UniqueName: \"kubernetes.io/projected/4ad37f40-c533-4a1e-882a-2e0973eff86d-kube-api-access-6wrq9\") pod \"olm-operator-d64cfc9db-8qtmf\" (UID: \"4ad37f40-c533-4a1e-882a-2e0973eff86d\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8qtmf"
Mar 08 00:21:09.984429 master-0 kubenswrapper[4059]: E0308 00:21:09.984396 4059 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 08 00:21:09.984478 master-0 kubenswrapper[4059]: E0308 00:21:09.984439 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b94acad3-cf4e-443d-80fb-5e68a4074336-srv-cert podName:b94acad3-cf4e-443d-80fb-5e68a4074336 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:10.484427862 +0000 UTC m=+114.196027374 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/b94acad3-cf4e-443d-80fb-5e68a4074336-srv-cert") pod "catalog-operator-7d9c49f57b-8jr6f" (UID: "b94acad3-cf4e-443d-80fb-5e68a4074336") : secret "catalog-operator-serving-cert" not found
Mar 08 00:21:09.984521 master-0 kubenswrapper[4059]: I0308 00:21:09.984482 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4ad37f40-c533-4a1e-882a-2e0973eff86d-srv-cert\") pod \"olm-operator-d64cfc9db-8qtmf\" (UID: \"4ad37f40-c533-4a1e-882a-2e0973eff86d\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8qtmf"
Mar 08 00:21:09.984550 master-0 kubenswrapper[4059]: I0308 00:21:09.984519 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1abf904b-0b8d-4d61-8231-0e8d00933192-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9"
Mar 08 00:21:09.984550 master-0 kubenswrapper[4059]: I0308 00:21:09.984539 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbdd4\" (UniqueName: \"kubernetes.io/projected/1abf904b-0b8d-4d61-8231-0e8d00933192-kube-api-access-dbdd4\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9"
Mar 08 00:21:09.984690 master-0 kubenswrapper[4059]: E0308 00:21:09.984650 4059 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 08 00:21:09.984765 master-0 kubenswrapper[4059]: E0308 00:21:09.984740 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ad37f40-c533-4a1e-882a-2e0973eff86d-srv-cert podName:4ad37f40-c533-4a1e-882a-2e0973eff86d nodeName:}" failed. No retries permitted until 2026-03-08 00:21:10.484716931 +0000 UTC m=+114.196316493 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/4ad37f40-c533-4a1e-882a-2e0973eff86d-srv-cert") pod "olm-operator-d64cfc9db-8qtmf" (UID: "4ad37f40-c533-4a1e-882a-2e0973eff86d") : secret "olm-operator-serving-cert" not found
Mar 08 00:21:09.984856 master-0 kubenswrapper[4059]: I0308 00:21:09.984817 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2ce2ea7-bd25-4294-8f3a-11ce53577830-config\") pod \"service-ca-operator-69b6fc6b88-p8hlq\" (UID: \"c2ce2ea7-bd25-4294-8f3a-11ce53577830\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-p8hlq"
Mar 08 00:21:09.986846 master-0 kubenswrapper[4059]: I0308 00:21:09.986800 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1abf904b-0b8d-4d61-8231-0e8d00933192-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9"
Mar 08 00:21:09.989558 master-0 kubenswrapper[4059]: I0308 00:21:09.989527 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2ce2ea7-bd25-4294-8f3a-11ce53577830-serving-cert\") pod \"service-ca-operator-69b6fc6b88-p8hlq\" (UID: \"c2ce2ea7-bd25-4294-8f3a-11ce53577830\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-p8hlq"
Mar 08 00:21:10.209343 master-0 kubenswrapper[4059]: I0308 00:21:10.204332 4059 kubelet.go:2428]
"SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2"] Mar 08 00:21:10.209343 master-0 kubenswrapper[4059]: I0308 00:21:10.204628 4059 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4"] Mar 08 00:21:10.215510 master-0 kubenswrapper[4059]: I0308 00:21:10.213721 4059 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podStartSLOduration=3.213698387 podStartE2EDuration="3.213698387s" podCreationTimestamp="2026-03-08 00:21:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:21:10.211758926 +0000 UTC m=+113.923358448" watchObservedRunningTime="2026-03-08 00:21:10.213698387 +0000 UTC m=+113.925297909" Mar 08 00:21:10.215510 master-0 kubenswrapper[4059]: I0308 00:21:10.215375 4059 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-5884b9cd56-27phk"] Mar 08 00:21:10.215510 master-0 kubenswrapper[4059]: I0308 00:21:10.215405 4059 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-7gtw2"] Mar 08 00:21:10.215510 master-0 kubenswrapper[4059]: I0308 00:21:10.215415 4059 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-rfnqf"] Mar 08 00:21:10.215874 master-0 kubenswrapper[4059]: I0308 00:21:10.215758 4059 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-rfnqf"
Mar 08 00:21:10.219290 master-0 kubenswrapper[4059]: I0308 00:21:10.217732 4059 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-pfdrx"]
Mar 08 00:21:10.219290 master-0 kubenswrapper[4059]: I0308 00:21:10.218742 4059 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 08 00:21:10.226337 master-0 kubenswrapper[4059]: I0308 00:21:10.221713 4059 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-677db989d6-blw5x"]
Mar 08 00:21:10.226337 master-0 kubenswrapper[4059]: I0308 00:21:10.221766 4059 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v"]
Mar 08 00:21:10.226337 master-0 kubenswrapper[4059]: I0308 00:21:10.221778 4059 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-589895fbb7-gmvnl"]
Mar 08 00:21:10.233564 master-0 kubenswrapper[4059]: I0308 00:21:10.233485 4059 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-bh886"]
Mar 08 00:21:10.233564 master-0 kubenswrapper[4059]: I0308 00:21:10.233559 4059 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9"]
Mar 08 00:21:10.236713 master-0 kubenswrapper[4059]: I0308 00:21:10.236656 4059 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-jgdmb"]
Mar 08 00:21:10.240073 master-0 kubenswrapper[4059]: I0308 00:21:10.240033 4059 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69b6fc6b88-p8hlq"]
Mar 08 00:21:10.289649 master-0 kubenswrapper[4059]: I0308 00:21:10.289557 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0e52cbdc-1d46-4cc9-85ee-535aa449992f-iptables-alerter-script\") pod \"iptables-alerter-rfnqf\" (UID: \"0e52cbdc-1d46-4cc9-85ee-535aa449992f\") " pod="openshift-network-operator/iptables-alerter-rfnqf"
Mar 08 00:21:10.289865 master-0 kubenswrapper[4059]: I0308 00:21:10.289754 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0e52cbdc-1d46-4cc9-85ee-535aa449992f-host-slash\") pod \"iptables-alerter-rfnqf\" (UID: \"0e52cbdc-1d46-4cc9-85ee-535aa449992f\") " pod="openshift-network-operator/iptables-alerter-rfnqf"
Mar 08 00:21:10.289945 master-0 kubenswrapper[4059]: I0308 00:21:10.289868 4059 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqkqn\" (UniqueName: \"kubernetes.io/projected/0e52cbdc-1d46-4cc9-85ee-535aa449992f-kube-api-access-xqkqn\") pod \"iptables-alerter-rfnqf\" (UID: \"0e52cbdc-1d46-4cc9-85ee-535aa449992f\") " pod="openshift-network-operator/iptables-alerter-rfnqf"
Mar 08 00:21:10.302580 master-0 kubenswrapper[4059]: I0308 00:21:10.302500 4059 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-rj9cl"]
Mar 08 00:21:10.303561 master-0 kubenswrapper[4059]: I0308 00:21:10.303478 4059 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-5v8g4"]
Mar 08 00:21:10.390901 master-0 kubenswrapper[4059]: I0308 00:21:10.390844 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0e52cbdc-1d46-4cc9-85ee-535aa449992f-host-slash\") pod \"iptables-alerter-rfnqf\" (UID: \"0e52cbdc-1d46-4cc9-85ee-535aa449992f\") " pod="openshift-network-operator/iptables-alerter-rfnqf"
Mar 08 00:21:10.391037 master-0 kubenswrapper[4059]: I0308 00:21:10.390913 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0e52cbdc-1d46-4cc9-85ee-535aa449992f-host-slash\") pod \"iptables-alerter-rfnqf\" (UID: \"0e52cbdc-1d46-4cc9-85ee-535aa449992f\") " pod="openshift-network-operator/iptables-alerter-rfnqf"
Mar 08 00:21:10.391037 master-0 kubenswrapper[4059]: I0308 00:21:10.390912 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6999cf38-e317-4727-98c9-d4e348e9e16a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-k7dp2\" (UID: \"6999cf38-e317-4727-98c9-d4e348e9e16a\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2"
Mar 08 00:21:10.391037 master-0 kubenswrapper[4059]: E0308 00:21:10.391005 4059 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 08 00:21:10.391037 master-0 kubenswrapper[4059]: I0308 00:21:10.391037 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-mgb5v\" (UID: \"5cf5a2ef-2498-40a0-a189-0753076fd3b6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v"
Mar 08 00:21:10.391265 master-0 kubenswrapper[4059]: E0308 00:21:10.391060 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6999cf38-e317-4727-98c9-d4e348e9e16a-image-registry-operator-tls podName:6999cf38-e317-4727-98c9-d4e348e9e16a nodeName:}" failed. No retries permitted until 2026-03-08 00:21:11.391045692 +0000 UTC m=+115.102645214 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/6999cf38-e317-4727-98c9-d4e348e9e16a-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-k7dp2" (UID: "6999cf38-e317-4727-98c9-d4e348e9e16a") : secret "image-registry-operator-tls" not found
Mar 08 00:21:10.391265 master-0 kubenswrapper[4059]: E0308 00:21:10.391093 4059 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 08 00:21:10.391265 master-0 kubenswrapper[4059]: E0308 00:21:10.391120 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-operator-metrics podName:5cf5a2ef-2498-40a0-a189-0753076fd3b6 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:11.391112044 +0000 UTC m=+115.102711566 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-mgb5v" (UID: "5cf5a2ef-2498-40a0-a189-0753076fd3b6") : secret "marketplace-operator-metrics" not found
Mar 08 00:21:10.391265 master-0 kubenswrapper[4059]: I0308 00:21:10.391123 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqkqn\" (UniqueName: \"kubernetes.io/projected/0e52cbdc-1d46-4cc9-85ee-535aa449992f-kube-api-access-xqkqn\") pod \"iptables-alerter-rfnqf\" (UID: \"0e52cbdc-1d46-4cc9-85ee-535aa449992f\") " pod="openshift-network-operator/iptables-alerter-rfnqf"
Mar 08 00:21:10.391265 master-0 kubenswrapper[4059]: I0308 00:21:10.391184 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6d770808-d390-41c1-a9d9-fc12b99fa9a9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-cxs8s\" (UID: \"6d770808-d390-41c1-a9d9-fc12b99fa9a9\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s"
Mar 08 00:21:10.391464 master-0 kubenswrapper[4059]: I0308 00:21:10.391271 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03f4bafb-c270-428a-bacf-8a424b3d1a05-metrics-tls\") pod \"dns-operator-589895fbb7-gmvnl\" (UID: \"03f4bafb-c270-428a-bacf-8a424b3d1a05\") " pod="openshift-dns-operator/dns-operator-589895fbb7-gmvnl"
Mar 08 00:21:10.391464 master-0 kubenswrapper[4059]: I0308 00:21:10.391303 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0e52cbdc-1d46-4cc9-85ee-535aa449992f-iptables-alerter-script\") pod \"iptables-alerter-rfnqf\" (UID: \"0e52cbdc-1d46-4cc9-85ee-535aa449992f\") "
pod="openshift-network-operator/iptables-alerter-rfnqf"
Mar 08 00:21:10.391464 master-0 kubenswrapper[4059]: E0308 00:21:10.391383 4059 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Mar 08 00:21:10.391464 master-0 kubenswrapper[4059]: E0308 00:21:10.391416 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03f4bafb-c270-428a-bacf-8a424b3d1a05-metrics-tls podName:03f4bafb-c270-428a-bacf-8a424b3d1a05 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:11.391405873 +0000 UTC m=+115.103005395 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/03f4bafb-c270-428a-bacf-8a424b3d1a05-metrics-tls") pod "dns-operator-589895fbb7-gmvnl" (UID: "03f4bafb-c270-428a-bacf-8a424b3d1a05") : secret "metrics-tls" not found
Mar 08 00:21:10.391464 master-0 kubenswrapper[4059]: E0308 00:21:10.391445 4059 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 08 00:21:10.391647 master-0 kubenswrapper[4059]: E0308 00:21:10.391475 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d770808-d390-41c1-a9d9-fc12b99fa9a9-cluster-monitoring-operator-tls podName:6d770808-d390-41c1-a9d9-fc12b99fa9a9 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:11.391465925 +0000 UTC m=+115.103065447 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6d770808-d390-41c1-a9d9-fc12b99fa9a9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-cxs8s" (UID: "6d770808-d390-41c1-a9d9-fc12b99fa9a9") : secret "cluster-monitoring-operator-tls" not found
Mar 08 00:21:10.391647 master-0 kubenswrapper[4059]: I0308 00:21:10.391533 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-phgxj\" (UID: \"8f71fd39-a16b-47d2-b781-c8ce37bcb9b2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj"
Mar 08 00:21:10.391647 master-0 kubenswrapper[4059]: I0308 00:21:10.391593 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-metrics-tls\") pod \"ingress-operator-677db989d6-blw5x\" (UID: \"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x"
Mar 08 00:21:10.391647 master-0 kubenswrapper[4059]: E0308 00:21:10.391646 4059 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 08 00:21:10.391799 master-0 kubenswrapper[4059]: E0308 00:21:10.391667 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-package-server-manager-serving-cert podName:8f71fd39-a16b-47d2-b781-c8ce37bcb9b2 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:11.391661171 +0000 UTC m=+115.103260693 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-phgxj" (UID: "8f71fd39-a16b-47d2-b781-c8ce37bcb9b2") : secret "package-server-manager-serving-cert" not found
Mar 08 00:21:10.391799 master-0 kubenswrapper[4059]: E0308 00:21:10.391766 4059 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 08 00:21:10.391799 master-0 kubenswrapper[4059]: E0308 00:21:10.391784 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-metrics-tls podName:4d0b9fbc-a1f8-4a98-99de-758734bd1a5b nodeName:}" failed. No retries permitted until 2026-03-08 00:21:11.391778905 +0000 UTC m=+115.103378427 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-metrics-tls") pod "ingress-operator-677db989d6-blw5x" (UID: "4d0b9fbc-a1f8-4a98-99de-758734bd1a5b") : secret "metrics-tls" not found
Mar 08 00:21:10.392093 master-0 kubenswrapper[4059]: I0308 00:21:10.392057 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0e52cbdc-1d46-4cc9-85ee-535aa449992f-iptables-alerter-script\") pod \"iptables-alerter-rfnqf\" (UID: \"0e52cbdc-1d46-4cc9-85ee-535aa449992f\") " pod="openshift-network-operator/iptables-alerter-rfnqf"
Mar 08 00:21:10.476553 master-0 kubenswrapper[4059]: I0308 00:21:10.476391 4059 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-r9zcq"]
Mar 08 00:21:10.476553 master-0 kubenswrapper[4059]: I0308 00:21:10.476493 4059 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s"]
Mar 08 00:21:10.476553 master-0 kubenswrapper[4059]: I0308 00:21:10.476506 4059 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-8jr6f"]
Mar 08 00:21:10.477649 master-0 kubenswrapper[4059]: I0308 00:21:10.477592 4059 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-49hzm"]
Mar 08 00:21:10.497355 master-0 kubenswrapper[4059]: I0308 00:21:10.497283 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4ad37f40-c533-4a1e-882a-2e0973eff86d-srv-cert\") pod \"olm-operator-d64cfc9db-8qtmf\" (UID: \"4ad37f40-c533-4a1e-882a-2e0973eff86d\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8qtmf"
Mar 08 00:21:10.497729 master-0 kubenswrapper[4059]: E0308 00:21:10.497445 4059 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 08 00:21:10.497729 master-0 kubenswrapper[4059]: E0308 00:21:10.497539 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ad37f40-c533-4a1e-882a-2e0973eff86d-srv-cert podName:4ad37f40-c533-4a1e-882a-2e0973eff86d nodeName:}" failed. No retries permitted until 2026-03-08 00:21:11.497516851 +0000 UTC m=+115.209116443 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/4ad37f40-c533-4a1e-882a-2e0973eff86d-srv-cert") pod "olm-operator-d64cfc9db-8qtmf" (UID: "4ad37f40-c533-4a1e-882a-2e0973eff86d") : secret "olm-operator-serving-cert" not found
Mar 08 00:21:10.497729 master-0 kubenswrapper[4059]: I0308 00:21:10.497604 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9"
Mar 08 00:21:10.497729 master-0 kubenswrapper[4059]: I0308 00:21:10.497637 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-webhook-certs\") pod \"multus-admission-controller-8d675b596-jgdmb\" (UID: \"d7a0bdcc-92f5-41e6-ab47-ee48a5788bac\") " pod="openshift-multus/multus-admission-controller-8d675b596-jgdmb"
Mar 08 00:21:10.497729 master-0 kubenswrapper[4059]: I0308 00:21:10.497699 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9"
Mar 08 00:21:10.498054 master-0 kubenswrapper[4059]: I0308 00:21:10.497954 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b94acad3-cf4e-443d-80fb-5e68a4074336-srv-cert\") pod \"catalog-operator-7d9c49f57b-8jr6f\" (UID: \"b94acad3-cf4e-443d-80fb-5e68a4074336\") "
pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-8jr6f"
Mar 08 00:21:10.498125 master-0 kubenswrapper[4059]: E0308 00:21:10.498089 4059 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 08 00:21:10.498125 master-0 kubenswrapper[4059]: E0308 00:21:10.498118 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b94acad3-cf4e-443d-80fb-5e68a4074336-srv-cert podName:b94acad3-cf4e-443d-80fb-5e68a4074336 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:11.49810954 +0000 UTC m=+115.209709172 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/b94acad3-cf4e-443d-80fb-5e68a4074336-srv-cert") pod "catalog-operator-7d9c49f57b-8jr6f" (UID: "b94acad3-cf4e-443d-80fb-5e68a4074336") : secret "catalog-operator-serving-cert" not found
Mar 08 00:21:10.498445 master-0 kubenswrapper[4059]: E0308 00:21:10.498161 4059 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 08 00:21:10.498445 master-0 kubenswrapper[4059]: E0308 00:21:10.498186 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-node-tuning-operator-tls podName:1abf904b-0b8d-4d61-8231-0e8d00933192 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:11.498178612 +0000 UTC m=+115.209778144 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-9vjl9" (UID: "1abf904b-0b8d-4d61-8231-0e8d00933192") : secret "node-tuning-operator-tls" not found
Mar 08 00:21:10.498445 master-0 kubenswrapper[4059]: E0308 00:21:10.498269 4059 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 08 00:21:10.498445 master-0 kubenswrapper[4059]: E0308 00:21:10.498298 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-webhook-certs podName:d7a0bdcc-92f5-41e6-ab47-ee48a5788bac nodeName:}" failed. No retries permitted until 2026-03-08 00:21:11.498289315 +0000 UTC m=+115.209888837 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-webhook-certs") pod "multus-admission-controller-8d675b596-jgdmb" (UID: "d7a0bdcc-92f5-41e6-ab47-ee48a5788bac") : secret "multus-admission-controller-secret" not found
Mar 08 00:21:10.498445 master-0 kubenswrapper[4059]: E0308 00:21:10.498338 4059 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 08 00:21:10.498445 master-0 kubenswrapper[4059]: E0308 00:21:10.498360 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-apiservice-cert podName:1abf904b-0b8d-4d61-8231-0e8d00933192 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:11.498353117 +0000 UTC m=+115.209952749 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-9vjl9" (UID: "1abf904b-0b8d-4d61-8231-0e8d00933192") : secret "performance-addon-operator-webhook-cert" not found
Mar 08 00:21:10.501503 master-0 kubenswrapper[4059]: W0308 00:21:10.501458 4059 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef0a3c84_98bb_4915_9010_d66fcbeafe09.slice/crio-3c8994f66c1270da68fac1ff2499afd806b950d0568c9f85327b0714473db68c WatchSource:0}: Error finding container 3c8994f66c1270da68fac1ff2499afd806b950d0568c9f85327b0714473db68c: Status 404 returned error can't find the container with id 3c8994f66c1270da68fac1ff2499afd806b950d0568c9f85327b0714473db68c
Mar 08 00:21:10.515173 master-0 kubenswrapper[4059]: I0308 00:21:10.504436 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6999cf38-e317-4727-98c9-d4e348e9e16a-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-k7dp2\" (UID: \"6999cf38-e317-4727-98c9-d4e348e9e16a\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2"
Mar 08 00:21:10.515173 master-0 kubenswrapper[4059]: I0308 00:21:10.511758 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfdxc\" (UniqueName: \"kubernetes.io/projected/03f4bafb-c270-428a-bacf-8a424b3d1a05-kube-api-access-pfdxc\") pod \"dns-operator-589895fbb7-gmvnl\" (UID: \"03f4bafb-c270-428a-bacf-8a424b3d1a05\") " pod="openshift-dns-operator/dns-operator-589895fbb7-gmvnl"
Mar 08 00:21:10.525831 master-0 kubenswrapper[4059]: I0308 00:21:10.521749 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fljc9\" (UniqueName: \"kubernetes.io/projected/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-kube-api-access-fljc9\") pod \"multus-admission-controller-8d675b596-jgdmb\" (UID: \"d7a0bdcc-92f5-41e6-ab47-ee48a5788bac\") " pod="openshift-multus/multus-admission-controller-8d675b596-jgdmb"
Mar 08 00:21:10.525831 master-0 kubenswrapper[4059]: I0308 00:21:10.522439 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b100ce12-965e-409e-8cdb-8f99ef51a82b-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-7gtw2\" (UID: \"b100ce12-965e-409e-8cdb-8f99ef51a82b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-7gtw2"
Mar 08 00:21:10.525831 master-0 kubenswrapper[4059]: I0308 00:21:10.523453 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjcjb\" (UniqueName: \"kubernetes.io/projected/ac523956-c8a3-4794-a1fa-660cd14966bb-kube-api-access-wjcjb\") pod \"kube-storage-version-migrator-operator-7f65c457f5-st7mk\" (UID: \"ac523956-c8a3-4794-a1fa-660cd14966bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-st7mk"
Mar 08 00:21:10.526046 master-0 kubenswrapper[4059]: I0308 00:21:10.525938 4059 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj"]
Mar 08 00:21:10.526720 master-0 kubenswrapper[4059]: I0308 00:21:10.526493 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qpkj\" (UniqueName: \"kubernetes.io/projected/c2ce2ea7-bd25-4294-8f3a-11ce53577830-kube-api-access-9qpkj\") pod \"service-ca-operator-69b6fc6b88-p8hlq\" (UID: \"c2ce2ea7-bd25-4294-8f3a-11ce53577830\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-p8hlq"
Mar 08 00:21:10.526720 master-0 kubenswrapper[4059]: I0308 00:21:10.526502 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9l64\" (UniqueName: \"kubernetes.io/projected/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-kube-api-access-z9l64\") pod \"ingress-operator-677db989d6-blw5x\" (UID: \"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x"
Mar 08 00:21:10.527270 master-0 kubenswrapper[4059]: I0308 00:21:10.527179 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wrq9\" (UniqueName: \"kubernetes.io/projected/4ad37f40-c533-4a1e-882a-2e0973eff86d-kube-api-access-6wrq9\") pod \"olm-operator-d64cfc9db-8qtmf\" (UID: \"4ad37f40-c533-4a1e-882a-2e0973eff86d\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8qtmf"
Mar 08 00:21:10.529239 master-0 kubenswrapper[4059]: I0308 00:21:10.529136 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tml5\" (UniqueName: \"kubernetes.io/projected/b94acad3-cf4e-443d-80fb-5e68a4074336-kube-api-access-7tml5\") pod \"catalog-operator-7d9c49f57b-8jr6f\" (UID: \"b94acad3-cf4e-443d-80fb-5e68a4074336\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-8jr6f"
Mar 08 00:21:10.529239 master-0 kubenswrapper[4059]: I0308 00:21:10.529211 4059 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8qtmf"]
Mar 08 00:21:10.529239 master-0 kubenswrapper[4059]: I0308 00:21:10.529235 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-bound-sa-token\") pod \"ingress-operator-677db989d6-blw5x\" (UID: \"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x"
Mar 08 00:21:10.529616 master-0 kubenswrapper[4059]: I0308 00:21:10.529570 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rfqt\" (UniqueName: \"kubernetes.io/projected/6d770808-d390-41c1-a9d9-fc12b99fa9a9-kube-api-access-6rfqt\") pod \"cluster-monitoring-operator-674cbfbd9d-cxs8s\" (UID: \"6d770808-d390-41c1-a9d9-fc12b99fa9a9\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s"
Mar 08 00:21:10.530109 master-0 kubenswrapper[4059]: I0308 00:21:10.530050 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f9kl\" (UniqueName: \"kubernetes.io/projected/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-kube-api-access-2f9kl\") pod \"package-server-manager-854648ff6d-phgxj\" (UID: \"8f71fd39-a16b-47d2-b781-c8ce37bcb9b2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj"
Mar 08 00:21:10.530423 master-0 kubenswrapper[4059]: I0308 00:21:10.530372 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e76bc134-2a88-4f92-9aa7-f6854941b98f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-bh886\" (UID: \"e76bc134-2a88-4f92-9aa7-f6854941b98f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-bh886"
Mar 08 00:21:10.530682 master-0 kubenswrapper[4059]: I0308 00:21:10.530615 4059 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-64488f9d78-vnl28"]
Mar 08 00:21:10.531857 master-0 kubenswrapper[4059]: I0308 00:21:10.531807 4059 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-st7mk"]
Mar 08 00:21:10.534609 master-0 kubenswrapper[4059]: I0308 00:21:10.534079 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/365dc4ac-fbc8-4589-a799-8327b3ebd0a5-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-pfdrx\" (UID: \"365dc4ac-fbc8-4589-a799-8327b3ebd0a5\")
" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-pfdrx" Mar 08 00:21:10.534609 master-0 kubenswrapper[4059]: I0308 00:21:10.534126 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chnhh\" (UniqueName: \"kubernetes.io/projected/2b1a69b5-c946-495d-ae02-c56f788279e8-kube-api-access-chnhh\") pod \"openshift-config-operator-64488f9d78-vnl28\" (UID: \"2b1a69b5-c946-495d-ae02-c56f788279e8\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" Mar 08 00:21:10.534609 master-0 kubenswrapper[4059]: I0308 00:21:10.534381 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5knc\" (UniqueName: \"kubernetes.io/projected/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-kube-api-access-d5knc\") pod \"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk" Mar 08 00:21:10.534609 master-0 kubenswrapper[4059]: I0308 00:21:10.534385 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k88m9\" (UniqueName: \"kubernetes.io/projected/5cf5a2ef-2498-40a0-a189-0753076fd3b6-kube-api-access-k88m9\") pod \"marketplace-operator-64bf9778cb-mgb5v\" (UID: \"5cf5a2ef-2498-40a0-a189-0753076fd3b6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" Mar 08 00:21:10.534609 master-0 kubenswrapper[4059]: I0308 00:21:10.534606 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q6hn\" (UniqueName: \"kubernetes.io/projected/c1abfb79-2c86-4ccb-bf91-7c48ad8c78d8-kube-api-access-5q6hn\") pod \"csi-snapshot-controller-operator-5685fbc7d-5v8g4\" (UID: \"c1abfb79-2c86-4ccb-bf91-7c48ad8c78d8\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-5v8g4" Mar 08 00:21:10.534975 master-0 kubenswrapper[4059]: I0308 00:21:10.534708 4059 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hhckc\" (UniqueName: \"kubernetes.io/projected/58333089-2456-4a25-8ba7-6d557eefa177-kube-api-access-hhckc\") pod \"authentication-operator-7c6989d6c4-dkqc4\" (UID: \"58333089-2456-4a25-8ba7-6d557eefa177\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4" Mar 08 00:21:10.535492 master-0 kubenswrapper[4059]: I0308 00:21:10.535457 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwsqr\" (UniqueName: \"kubernetes.io/projected/6999cf38-e317-4727-98c9-d4e348e9e16a-kube-api-access-pwsqr\") pod \"cluster-image-registry-operator-86d6d77c7c-k7dp2\" (UID: \"6999cf38-e317-4727-98c9-d4e348e9e16a\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2" Mar 08 00:21:10.537444 master-0 kubenswrapper[4059]: I0308 00:21:10.537410 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89wj5\" (UniqueName: \"kubernetes.io/projected/db164b32-e20e-4d07-a9ae-98720321621d-kube-api-access-89wj5\") pod \"cluster-olm-operator-77899cf6d-r9zcq\" (UID: \"db164b32-e20e-4d07-a9ae-98720321621d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-r9zcq" Mar 08 00:21:10.538144 master-0 kubenswrapper[4059]: I0308 00:21:10.538118 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44jml\" (UniqueName: \"kubernetes.io/projected/3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab-kube-api-access-44jml\") pod \"openshift-apiserver-operator-799b6db4d7-rj9cl\" (UID: \"3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-rj9cl" Mar 08 00:21:10.539162 master-0 kubenswrapper[4059]: I0308 00:21:10.539132 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbdd4\" (UniqueName: \"kubernetes.io/projected/1abf904b-0b8d-4d61-8231-0e8d00933192-kube-api-access-dbdd4\") pod 
\"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9" Mar 08 00:21:10.579753 master-0 kubenswrapper[4059]: I0308 00:21:10.579344 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk" Mar 08 00:21:10.596596 master-0 kubenswrapper[4059]: I0308 00:21:10.596554 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-7gtw2" Mar 08 00:21:10.614374 master-0 kubenswrapper[4059]: I0308 00:21:10.614321 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-rj9cl" Mar 08 00:21:10.633968 master-0 kubenswrapper[4059]: I0308 00:21:10.633934 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-bh886" Mar 08 00:21:10.647487 master-0 kubenswrapper[4059]: I0308 00:21:10.647114 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-pfdrx" Mar 08 00:21:10.661850 master-0 kubenswrapper[4059]: I0308 00:21:10.661086 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-r9zcq" Mar 08 00:21:10.670329 master-0 kubenswrapper[4059]: I0308 00:21:10.668806 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" Mar 08 00:21:10.676255 master-0 kubenswrapper[4059]: I0308 00:21:10.674418 4059 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-st7mk" Mar 08 00:21:10.676945 master-0 kubenswrapper[4059]: I0308 00:21:10.676879 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-49hzm" event={"ID":"ef0a3c84-98bb-4915-9010-d66fcbeafe09","Type":"ContainerStarted","Data":"3c8994f66c1270da68fac1ff2499afd806b950d0568c9f85327b0714473db68c"} Mar 08 00:21:10.687938 master-0 kubenswrapper[4059]: I0308 00:21:10.687866 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-p8hlq" Mar 08 00:21:10.714942 master-0 kubenswrapper[4059]: I0308 00:21:10.714893 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-5v8g4" Mar 08 00:21:10.829241 master-0 kubenswrapper[4059]: I0308 00:21:10.828615 4059 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4" Mar 08 00:21:11.204674 master-0 kubenswrapper[4059]: I0308 00:21:11.204597 4059 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqkqn\" (UniqueName: \"kubernetes.io/projected/0e52cbdc-1d46-4cc9-85ee-535aa449992f-kube-api-access-xqkqn\") pod \"iptables-alerter-rfnqf\" (UID: \"0e52cbdc-1d46-4cc9-85ee-535aa449992f\") " pod="openshift-network-operator/iptables-alerter-rfnqf" Mar 08 00:21:11.408340 master-0 kubenswrapper[4059]: I0308 00:21:11.408149 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-phgxj\" (UID: \"8f71fd39-a16b-47d2-b781-c8ce37bcb9b2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj" Mar 08 00:21:11.408340 master-0 kubenswrapper[4059]: I0308 00:21:11.408226 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-metrics-tls\") pod \"ingress-operator-677db989d6-blw5x\" (UID: \"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x" Mar 08 00:21:11.409859 master-0 kubenswrapper[4059]: E0308 00:21:11.408524 4059 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 08 00:21:11.409859 master-0 kubenswrapper[4059]: E0308 00:21:11.408684 4059 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 08 00:21:11.409859 master-0 kubenswrapper[4059]: E0308 00:21:11.408721 4059 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-package-server-manager-serving-cert podName:8f71fd39-a16b-47d2-b781-c8ce37bcb9b2 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:13.408675085 +0000 UTC m=+117.120274787 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-phgxj" (UID: "8f71fd39-a16b-47d2-b781-c8ce37bcb9b2") : secret "package-server-manager-serving-cert" not found Mar 08 00:21:11.409859 master-0 kubenswrapper[4059]: I0308 00:21:11.408790 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6999cf38-e317-4727-98c9-d4e348e9e16a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-k7dp2\" (UID: \"6999cf38-e317-4727-98c9-d4e348e9e16a\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2" Mar 08 00:21:11.409859 master-0 kubenswrapper[4059]: I0308 00:21:11.408957 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-mgb5v\" (UID: \"5cf5a2ef-2498-40a0-a189-0753076fd3b6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" Mar 08 00:21:11.409859 master-0 kubenswrapper[4059]: E0308 00:21:11.408960 4059 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 08 00:21:11.409859 master-0 kubenswrapper[4059]: E0308 00:21:11.408982 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-metrics-tls 
podName:4d0b9fbc-a1f8-4a98-99de-758734bd1a5b nodeName:}" failed. No retries permitted until 2026-03-08 00:21:13.408947334 +0000 UTC m=+117.120546886 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-metrics-tls") pod "ingress-operator-677db989d6-blw5x" (UID: "4d0b9fbc-a1f8-4a98-99de-758734bd1a5b") : secret "metrics-tls" not found Mar 08 00:21:11.409859 master-0 kubenswrapper[4059]: I0308 00:21:11.409122 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6d770808-d390-41c1-a9d9-fc12b99fa9a9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-cxs8s\" (UID: \"6d770808-d390-41c1-a9d9-fc12b99fa9a9\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s" Mar 08 00:21:11.409859 master-0 kubenswrapper[4059]: E0308 00:21:11.409159 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6999cf38-e317-4727-98c9-d4e348e9e16a-image-registry-operator-tls podName:6999cf38-e317-4727-98c9-d4e348e9e16a nodeName:}" failed. No retries permitted until 2026-03-08 00:21:13.40914456 +0000 UTC m=+117.120744122 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/6999cf38-e317-4727-98c9-d4e348e9e16a-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-k7dp2" (UID: "6999cf38-e317-4727-98c9-d4e348e9e16a") : secret "image-registry-operator-tls" not found Mar 08 00:21:11.409859 master-0 kubenswrapper[4059]: E0308 00:21:11.409017 4059 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 08 00:21:11.409859 master-0 kubenswrapper[4059]: I0308 00:21:11.409263 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03f4bafb-c270-428a-bacf-8a424b3d1a05-metrics-tls\") pod \"dns-operator-589895fbb7-gmvnl\" (UID: \"03f4bafb-c270-428a-bacf-8a424b3d1a05\") " pod="openshift-dns-operator/dns-operator-589895fbb7-gmvnl" Mar 08 00:21:11.409859 master-0 kubenswrapper[4059]: E0308 00:21:11.409311 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-operator-metrics podName:5cf5a2ef-2498-40a0-a189-0753076fd3b6 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:13.409276694 +0000 UTC m=+117.120876236 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-mgb5v" (UID: "5cf5a2ef-2498-40a0-a189-0753076fd3b6") : secret "marketplace-operator-metrics" not found Mar 08 00:21:11.409859 master-0 kubenswrapper[4059]: E0308 00:21:11.409317 4059 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 08 00:21:11.409859 master-0 kubenswrapper[4059]: E0308 00:21:11.409393 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d770808-d390-41c1-a9d9-fc12b99fa9a9-cluster-monitoring-operator-tls podName:6d770808-d390-41c1-a9d9-fc12b99fa9a9 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:13.409370527 +0000 UTC m=+117.120970289 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6d770808-d390-41c1-a9d9-fc12b99fa9a9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-cxs8s" (UID: "6d770808-d390-41c1-a9d9-fc12b99fa9a9") : secret "cluster-monitoring-operator-tls" not found Mar 08 00:21:11.409859 master-0 kubenswrapper[4059]: E0308 00:21:11.409444 4059 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 08 00:21:11.410784 master-0 kubenswrapper[4059]: E0308 00:21:11.409512 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03f4bafb-c270-428a-bacf-8a424b3d1a05-metrics-tls podName:03f4bafb-c270-428a-bacf-8a424b3d1a05 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:13.409494331 +0000 UTC m=+117.121093893 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/03f4bafb-c270-428a-bacf-8a424b3d1a05-metrics-tls") pod "dns-operator-589895fbb7-gmvnl" (UID: "03f4bafb-c270-428a-bacf-8a424b3d1a05") : secret "metrics-tls" not found Mar 08 00:21:11.442386 master-0 kubenswrapper[4059]: I0308 00:21:11.442311 4059 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-rfnqf" Mar 08 00:21:11.460444 master-0 kubenswrapper[4059]: W0308 00:21:11.460179 4059 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e52cbdc_1d46_4cc9_85ee_535aa449992f.slice/crio-e4c3df22ea5b25cdb4fb25d7746e4d1c319e0fa007db70463be2670c88f00662 WatchSource:0}: Error finding container e4c3df22ea5b25cdb4fb25d7746e4d1c319e0fa007db70463be2670c88f00662: Status 404 returned error can't find the container with id e4c3df22ea5b25cdb4fb25d7746e4d1c319e0fa007db70463be2670c88f00662 Mar 08 00:21:11.509984 master-0 kubenswrapper[4059]: I0308 00:21:11.509890 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9" Mar 08 00:21:11.510265 master-0 kubenswrapper[4059]: E0308 00:21:11.510108 4059 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 08 00:21:11.510265 master-0 kubenswrapper[4059]: I0308 00:21:11.510176 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b94acad3-cf4e-443d-80fb-5e68a4074336-srv-cert\") pod 
\"catalog-operator-7d9c49f57b-8jr6f\" (UID: \"b94acad3-cf4e-443d-80fb-5e68a4074336\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-8jr6f" Mar 08 00:21:11.510265 master-0 kubenswrapper[4059]: E0308 00:21:11.510242 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-apiservice-cert podName:1abf904b-0b8d-4d61-8231-0e8d00933192 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:13.51017148 +0000 UTC m=+117.221771022 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-9vjl9" (UID: "1abf904b-0b8d-4d61-8231-0e8d00933192") : secret "performance-addon-operator-webhook-cert" not found Mar 08 00:21:11.510518 master-0 kubenswrapper[4059]: I0308 00:21:11.510379 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4ad37f40-c533-4a1e-882a-2e0973eff86d-srv-cert\") pod \"olm-operator-d64cfc9db-8qtmf\" (UID: \"4ad37f40-c533-4a1e-882a-2e0973eff86d\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8qtmf" Mar 08 00:21:11.510518 master-0 kubenswrapper[4059]: E0308 00:21:11.510422 4059 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 08 00:21:11.510518 master-0 kubenswrapper[4059]: I0308 00:21:11.510442 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9" Mar 08 
00:21:11.510518 master-0 kubenswrapper[4059]: E0308 00:21:11.510465 4059 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 08 00:21:11.510518 master-0 kubenswrapper[4059]: E0308 00:21:11.510471 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b94acad3-cf4e-443d-80fb-5e68a4074336-srv-cert podName:b94acad3-cf4e-443d-80fb-5e68a4074336 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:13.510456079 +0000 UTC m=+117.222055621 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/b94acad3-cf4e-443d-80fb-5e68a4074336-srv-cert") pod "catalog-operator-7d9c49f57b-8jr6f" (UID: "b94acad3-cf4e-443d-80fb-5e68a4074336") : secret "catalog-operator-serving-cert" not found Mar 08 00:21:11.510518 master-0 kubenswrapper[4059]: I0308 00:21:11.510512 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-webhook-certs\") pod \"multus-admission-controller-8d675b596-jgdmb\" (UID: \"d7a0bdcc-92f5-41e6-ab47-ee48a5788bac\") " pod="openshift-multus/multus-admission-controller-8d675b596-jgdmb" Mar 08 00:21:11.510915 master-0 kubenswrapper[4059]: E0308 00:21:11.510548 4059 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 08 00:21:11.510915 master-0 kubenswrapper[4059]: E0308 00:21:11.510552 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ad37f40-c533-4a1e-882a-2e0973eff86d-srv-cert podName:4ad37f40-c533-4a1e-882a-2e0973eff86d nodeName:}" failed. No retries permitted until 2026-03-08 00:21:13.510543592 +0000 UTC m=+117.222143114 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/4ad37f40-c533-4a1e-882a-2e0973eff86d-srv-cert") pod "olm-operator-d64cfc9db-8qtmf" (UID: "4ad37f40-c533-4a1e-882a-2e0973eff86d") : secret "olm-operator-serving-cert" not found Mar 08 00:21:11.510915 master-0 kubenswrapper[4059]: E0308 00:21:11.510622 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-node-tuning-operator-tls podName:1abf904b-0b8d-4d61-8231-0e8d00933192 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:13.510606564 +0000 UTC m=+117.222206096 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-9vjl9" (UID: "1abf904b-0b8d-4d61-8231-0e8d00933192") : secret "node-tuning-operator-tls" not found Mar 08 00:21:11.510915 master-0 kubenswrapper[4059]: E0308 00:21:11.510721 4059 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 08 00:21:11.510915 master-0 kubenswrapper[4059]: E0308 00:21:11.510851 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-webhook-certs podName:d7a0bdcc-92f5-41e6-ab47-ee48a5788bac nodeName:}" failed. No retries permitted until 2026-03-08 00:21:13.51082141 +0000 UTC m=+117.222421002 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-webhook-certs") pod "multus-admission-controller-8d675b596-jgdmb" (UID: "d7a0bdcc-92f5-41e6-ab47-ee48a5788bac") : secret "multus-admission-controller-secret" not found Mar 08 00:21:11.683307 master-0 kubenswrapper[4059]: I0308 00:21:11.683145 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rfnqf" event={"ID":"0e52cbdc-1d46-4cc9-85ee-535aa449992f","Type":"ContainerStarted","Data":"e4c3df22ea5b25cdb4fb25d7746e4d1c319e0fa007db70463be2670c88f00662"} Mar 08 00:21:11.907663 master-0 kubenswrapper[4059]: I0308 00:21:11.903637 4059 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-5884b9cd56-27phk"] Mar 08 00:21:11.907663 master-0 kubenswrapper[4059]: I0308 00:21:11.906079 4059 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-7gtw2"] Mar 08 00:21:11.917575 master-0 kubenswrapper[4059]: I0308 00:21:11.915488 4059 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-rj9cl"] Mar 08 00:21:11.917575 master-0 kubenswrapper[4059]: I0308 00:21:11.915556 4059 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-5v8g4"] Mar 08 00:21:11.919309 master-0 kubenswrapper[4059]: I0308 00:21:11.917792 4059 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-st7mk"] Mar 08 00:21:11.919714 master-0 kubenswrapper[4059]: I0308 00:21:11.919658 4059 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-pfdrx"] Mar 08 00:21:11.929142 master-0 kubenswrapper[4059]: 
I0308 00:21:11.928638 4059 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-bh886"] Mar 08 00:21:11.929142 master-0 kubenswrapper[4059]: I0308 00:21:11.928718 4059 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-r9zcq"] Mar 08 00:21:11.929142 master-0 kubenswrapper[4059]: I0308 00:21:11.928743 4059 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69b6fc6b88-p8hlq"] Mar 08 00:21:11.929142 master-0 kubenswrapper[4059]: I0308 00:21:11.928755 4059 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-64488f9d78-vnl28"] Mar 08 00:21:11.929142 master-0 kubenswrapper[4059]: I0308 00:21:11.928766 4059 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4"] Mar 08 00:21:11.939235 master-0 kubenswrapper[4059]: W0308 00:21:11.939022 4059 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb164b32_e20e_4d07_a9ae_98720321621d.slice/crio-90c63e0b66f405ad9ba1342c113ed69565fb8227cabd7f3b8504079a44ce002c WatchSource:0}: Error finding container 90c63e0b66f405ad9ba1342c113ed69565fb8227cabd7f3b8504079a44ce002c: Status 404 returned error can't find the container with id 90c63e0b66f405ad9ba1342c113ed69565fb8227cabd7f3b8504079a44ce002c Mar 08 00:21:12.688054 master-0 kubenswrapper[4059]: I0308 00:21:12.687697 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" event={"ID":"2b1a69b5-c946-495d-ae02-c56f788279e8","Type":"ContainerStarted","Data":"7bcc330c034a7032e8bd43ea29408b50fdad12339c2d89f6fc2a01fc9d43af95"} Mar 08 00:21:12.688824 master-0 kubenswrapper[4059]: I0308 00:21:12.688706 4059 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-7gtw2" event={"ID":"b100ce12-965e-409e-8cdb-8f99ef51a82b","Type":"ContainerStarted","Data":"5883c7f053a567c57162616ec25d9b4c38f468aaa6a93afc0931684514320848"} Mar 08 00:21:12.688824 master-0 kubenswrapper[4059]: I0308 00:21:12.688742 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-7gtw2" event={"ID":"b100ce12-965e-409e-8cdb-8f99ef51a82b","Type":"ContainerStarted","Data":"8493b96f9e2317bb2258ca024aff023f604de77234681da55a05bccbc932bc9a"} Mar 08 00:21:12.690434 master-0 kubenswrapper[4059]: I0308 00:21:12.690411 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-p8hlq" event={"ID":"c2ce2ea7-bd25-4294-8f3a-11ce53577830","Type":"ContainerStarted","Data":"f37ac8237d1707faf128fbd37cb4fc4383ed09260c056f6f33db8e0a42308015"} Mar 08 00:21:12.691631 master-0 kubenswrapper[4059]: I0308 00:21:12.691411 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-st7mk" event={"ID":"ac523956-c8a3-4794-a1fa-660cd14966bb","Type":"ContainerStarted","Data":"16a0ef8737c1e2416e14cc076fc6b1d7ef645b2043e268561b096173dd7a6b0e"} Mar 08 00:21:12.697341 master-0 kubenswrapper[4059]: I0308 00:21:12.692391 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk" event={"ID":"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f","Type":"ContainerStarted","Data":"813c8ed04b18f307078b38a00cf3865fc1feedea034a383e0342d8429ae20e6b"} Mar 08 00:21:12.697341 master-0 kubenswrapper[4059]: I0308 00:21:12.693272 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-pfdrx" 
event={"ID":"365dc4ac-fbc8-4589-a799-8327b3ebd0a5","Type":"ContainerStarted","Data":"8f1055f3dc7c655a333a3fa311c8f94b2ceda0b473d7673f490a6875c1158919"} Mar 08 00:21:12.697341 master-0 kubenswrapper[4059]: I0308 00:21:12.694124 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-rj9cl" event={"ID":"3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab","Type":"ContainerStarted","Data":"31406fc5b2c5472ac716e4c8cdca7909539075e5cc335f68e4b469dfc56a38f1"} Mar 08 00:21:12.697341 master-0 kubenswrapper[4059]: I0308 00:21:12.695069 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4" event={"ID":"58333089-2456-4a25-8ba7-6d557eefa177","Type":"ContainerStarted","Data":"733e43352408d7f83022f1e2789901cb1e3830089ecad3dc5ac2ffbae10f60ad"} Mar 08 00:21:12.697341 master-0 kubenswrapper[4059]: I0308 00:21:12.695812 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-bh886" event={"ID":"e76bc134-2a88-4f92-9aa7-f6854941b98f","Type":"ContainerStarted","Data":"7f9bd3b95fa9a96d599ef5d38ab2c65bfd39d0c75616669dcd2a59a811c0de79"} Mar 08 00:21:12.697341 master-0 kubenswrapper[4059]: I0308 00:21:12.696518 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-5v8g4" event={"ID":"c1abfb79-2c86-4ccb-bf91-7c48ad8c78d8","Type":"ContainerStarted","Data":"c9dc377ca2fdac8594f81d6df8e7c069a1b5189bee06d288ed063183ce36a834"} Mar 08 00:21:12.697341 master-0 kubenswrapper[4059]: I0308 00:21:12.697260 4059 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-r9zcq" event={"ID":"db164b32-e20e-4d07-a9ae-98720321621d","Type":"ContainerStarted","Data":"90c63e0b66f405ad9ba1342c113ed69565fb8227cabd7f3b8504079a44ce002c"} Mar 08 
00:21:12.704538 master-0 kubenswrapper[4059]: I0308 00:21:12.703252 4059 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-7gtw2" podStartSLOduration=81.70324027 podStartE2EDuration="1m21.70324027s" podCreationTimestamp="2026-03-08 00:19:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:21:12.701511727 +0000 UTC m=+116.413111269" watchObservedRunningTime="2026-03-08 00:21:12.70324027 +0000 UTC m=+116.414839792" Mar 08 00:21:13.433156 master-0 kubenswrapper[4059]: I0308 00:21:13.432849 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-mgb5v\" (UID: \"5cf5a2ef-2498-40a0-a189-0753076fd3b6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" Mar 08 00:21:13.433156 master-0 kubenswrapper[4059]: I0308 00:21:13.432923 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6d770808-d390-41c1-a9d9-fc12b99fa9a9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-cxs8s\" (UID: \"6d770808-d390-41c1-a9d9-fc12b99fa9a9\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s" Mar 08 00:21:13.433156 master-0 kubenswrapper[4059]: E0308 00:21:13.433064 4059 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 08 00:21:13.433156 master-0 kubenswrapper[4059]: E0308 00:21:13.433082 4059 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 08 00:21:13.433156 
master-0 kubenswrapper[4059]: E0308 00:21:13.433110 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d770808-d390-41c1-a9d9-fc12b99fa9a9-cluster-monitoring-operator-tls podName:6d770808-d390-41c1-a9d9-fc12b99fa9a9 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:17.433095335 +0000 UTC m=+121.144694857 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6d770808-d390-41c1-a9d9-fc12b99fa9a9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-cxs8s" (UID: "6d770808-d390-41c1-a9d9-fc12b99fa9a9") : secret "cluster-monitoring-operator-tls" not found Mar 08 00:21:13.434349 master-0 kubenswrapper[4059]: E0308 00:21:13.433185 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-operator-metrics podName:5cf5a2ef-2498-40a0-a189-0753076fd3b6 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:17.433152507 +0000 UTC m=+121.144752069 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-mgb5v" (UID: "5cf5a2ef-2498-40a0-a189-0753076fd3b6") : secret "marketplace-operator-metrics" not found Mar 08 00:21:13.434349 master-0 kubenswrapper[4059]: I0308 00:21:13.433249 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03f4bafb-c270-428a-bacf-8a424b3d1a05-metrics-tls\") pod \"dns-operator-589895fbb7-gmvnl\" (UID: \"03f4bafb-c270-428a-bacf-8a424b3d1a05\") " pod="openshift-dns-operator/dns-operator-589895fbb7-gmvnl" Mar 08 00:21:13.434349 master-0 kubenswrapper[4059]: I0308 00:21:13.433332 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-phgxj\" (UID: \"8f71fd39-a16b-47d2-b781-c8ce37bcb9b2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj" Mar 08 00:21:13.434349 master-0 kubenswrapper[4059]: I0308 00:21:13.433368 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-metrics-tls\") pod \"ingress-operator-677db989d6-blw5x\" (UID: \"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x" Mar 08 00:21:13.434349 master-0 kubenswrapper[4059]: E0308 00:21:13.433392 4059 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 08 00:21:13.434349 master-0 kubenswrapper[4059]: E0308 00:21:13.433431 4059 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/03f4bafb-c270-428a-bacf-8a424b3d1a05-metrics-tls podName:03f4bafb-c270-428a-bacf-8a424b3d1a05 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:17.433419645 +0000 UTC m=+121.145019247 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/03f4bafb-c270-428a-bacf-8a424b3d1a05-metrics-tls") pod "dns-operator-589895fbb7-gmvnl" (UID: "03f4bafb-c270-428a-bacf-8a424b3d1a05") : secret "metrics-tls" not found Mar 08 00:21:13.434349 master-0 kubenswrapper[4059]: E0308 00:21:13.433502 4059 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 08 00:21:13.434349 master-0 kubenswrapper[4059]: E0308 00:21:13.433545 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-metrics-tls podName:4d0b9fbc-a1f8-4a98-99de-758734bd1a5b nodeName:}" failed. No retries permitted until 2026-03-08 00:21:17.433534159 +0000 UTC m=+121.145133761 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-metrics-tls") pod "ingress-operator-677db989d6-blw5x" (UID: "4d0b9fbc-a1f8-4a98-99de-758734bd1a5b") : secret "metrics-tls" not found Mar 08 00:21:13.434349 master-0 kubenswrapper[4059]: I0308 00:21:13.433504 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6999cf38-e317-4727-98c9-d4e348e9e16a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-k7dp2\" (UID: \"6999cf38-e317-4727-98c9-d4e348e9e16a\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2" Mar 08 00:21:13.434349 master-0 kubenswrapper[4059]: E0308 00:21:13.433553 4059 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 08 00:21:13.434349 master-0 kubenswrapper[4059]: E0308 00:21:13.433621 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-package-server-manager-serving-cert podName:8f71fd39-a16b-47d2-b781-c8ce37bcb9b2 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:17.433610991 +0000 UTC m=+121.145210593 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-phgxj" (UID: "8f71fd39-a16b-47d2-b781-c8ce37bcb9b2") : secret "package-server-manager-serving-cert" not found Mar 08 00:21:13.434349 master-0 kubenswrapper[4059]: E0308 00:21:13.433555 4059 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 08 00:21:13.434349 master-0 kubenswrapper[4059]: E0308 00:21:13.433660 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6999cf38-e317-4727-98c9-d4e348e9e16a-image-registry-operator-tls podName:6999cf38-e317-4727-98c9-d4e348e9e16a nodeName:}" failed. No retries permitted until 2026-03-08 00:21:17.433652043 +0000 UTC m=+121.145251675 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/6999cf38-e317-4727-98c9-d4e348e9e16a-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-k7dp2" (UID: "6999cf38-e317-4727-98c9-d4e348e9e16a") : secret "image-registry-operator-tls" not found Mar 08 00:21:13.535645 master-0 kubenswrapper[4059]: I0308 00:21:13.535591 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9" Mar 08 00:21:13.535645 master-0 kubenswrapper[4059]: I0308 00:21:13.535650 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-webhook-certs\") pod \"multus-admission-controller-8d675b596-jgdmb\" (UID: \"d7a0bdcc-92f5-41e6-ab47-ee48a5788bac\") " pod="openshift-multus/multus-admission-controller-8d675b596-jgdmb" Mar 08 00:21:13.535892 master-0 kubenswrapper[4059]: I0308 00:21:13.535692 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9" Mar 08 00:21:13.535892 master-0 kubenswrapper[4059]: I0308 00:21:13.535768 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b94acad3-cf4e-443d-80fb-5e68a4074336-srv-cert\") pod \"catalog-operator-7d9c49f57b-8jr6f\" (UID: \"b94acad3-cf4e-443d-80fb-5e68a4074336\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-8jr6f" Mar 08 00:21:13.535892 master-0 kubenswrapper[4059]: I0308 00:21:13.535821 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4ad37f40-c533-4a1e-882a-2e0973eff86d-srv-cert\") pod \"olm-operator-d64cfc9db-8qtmf\" (UID: \"4ad37f40-c533-4a1e-882a-2e0973eff86d\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8qtmf" Mar 08 00:21:13.536107 master-0 kubenswrapper[4059]: E0308 00:21:13.535960 4059 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 08 00:21:13.536107 master-0 kubenswrapper[4059]: E0308 00:21:13.536014 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ad37f40-c533-4a1e-882a-2e0973eff86d-srv-cert 
podName:4ad37f40-c533-4a1e-882a-2e0973eff86d nodeName:}" failed. No retries permitted until 2026-03-08 00:21:17.535997444 +0000 UTC m=+121.247596966 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/4ad37f40-c533-4a1e-882a-2e0973eff86d-srv-cert") pod "olm-operator-d64cfc9db-8qtmf" (UID: "4ad37f40-c533-4a1e-882a-2e0973eff86d") : secret "olm-operator-serving-cert" not found Mar 08 00:21:13.536107 master-0 kubenswrapper[4059]: E0308 00:21:13.536069 4059 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 08 00:21:13.536107 master-0 kubenswrapper[4059]: E0308 00:21:13.536094 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-node-tuning-operator-tls podName:1abf904b-0b8d-4d61-8231-0e8d00933192 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:17.536086077 +0000 UTC m=+121.247685609 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-9vjl9" (UID: "1abf904b-0b8d-4d61-8231-0e8d00933192") : secret "node-tuning-operator-tls" not found Mar 08 00:21:13.536821 master-0 kubenswrapper[4059]: E0308 00:21:13.536140 4059 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 08 00:21:13.536821 master-0 kubenswrapper[4059]: E0308 00:21:13.536163 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-webhook-certs podName:d7a0bdcc-92f5-41e6-ab47-ee48a5788bac nodeName:}" failed. No retries permitted until 2026-03-08 00:21:17.536155969 +0000 UTC m=+121.247755491 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-webhook-certs") pod "multus-admission-controller-8d675b596-jgdmb" (UID: "d7a0bdcc-92f5-41e6-ab47-ee48a5788bac") : secret "multus-admission-controller-secret" not found Mar 08 00:21:13.536821 master-0 kubenswrapper[4059]: E0308 00:21:13.536221 4059 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 08 00:21:13.536821 master-0 kubenswrapper[4059]: E0308 00:21:13.536246 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-apiservice-cert podName:1abf904b-0b8d-4d61-8231-0e8d00933192 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:17.536238311 +0000 UTC m=+121.247837833 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-9vjl9" (UID: "1abf904b-0b8d-4d61-8231-0e8d00933192") : secret "performance-addon-operator-webhook-cert" not found Mar 08 00:21:13.536821 master-0 kubenswrapper[4059]: E0308 00:21:13.536288 4059 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 08 00:21:13.536821 master-0 kubenswrapper[4059]: E0308 00:21:13.536311 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b94acad3-cf4e-443d-80fb-5e68a4074336-srv-cert podName:b94acad3-cf4e-443d-80fb-5e68a4074336 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:17.536303483 +0000 UTC m=+121.247903005 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/b94acad3-cf4e-443d-80fb-5e68a4074336-srv-cert") pod "catalog-operator-7d9c49f57b-8jr6f" (UID: "b94acad3-cf4e-443d-80fb-5e68a4074336") : secret "catalog-operator-serving-cert" not found Mar 08 00:21:14.353072 master-0 kubenswrapper[4059]: I0308 00:21:14.352938 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs\") pod \"network-metrics-daemon-krv7c\" (UID: \"815fd565-0609-4d8f-ac05-8656f198b008\") " pod="openshift-multus/network-metrics-daemon-krv7c" Mar 08 00:21:14.353072 master-0 kubenswrapper[4059]: E0308 00:21:14.353153 4059 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 08 00:21:14.353072 master-0 kubenswrapper[4059]: E0308 00:21:14.353229 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs podName:815fd565-0609-4d8f-ac05-8656f198b008 nodeName:}" failed. No retries permitted until 2026-03-08 00:22:18.353195934 +0000 UTC m=+182.064795456 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs") pod "network-metrics-daemon-krv7c" (UID: "815fd565-0609-4d8f-ac05-8656f198b008") : secret "metrics-daemon-secret" not found Mar 08 00:21:17.448162 master-0 kubenswrapper[4059]: I0308 00:21:17.448109 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-metrics-tls\") pod \"ingress-operator-677db989d6-blw5x\" (UID: \"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x" Mar 08 00:21:17.449014 master-0 kubenswrapper[4059]: E0308 00:21:17.448336 4059 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 08 00:21:17.449014 master-0 kubenswrapper[4059]: I0308 00:21:17.448354 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6999cf38-e317-4727-98c9-d4e348e9e16a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-k7dp2\" (UID: \"6999cf38-e317-4727-98c9-d4e348e9e16a\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2" Mar 08 00:21:17.449014 master-0 kubenswrapper[4059]: I0308 00:21:17.448391 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-mgb5v\" (UID: \"5cf5a2ef-2498-40a0-a189-0753076fd3b6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" Mar 08 00:21:17.449014 master-0 kubenswrapper[4059]: E0308 00:21:17.448409 4059 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-metrics-tls podName:4d0b9fbc-a1f8-4a98-99de-758734bd1a5b nodeName:}" failed. No retries permitted until 2026-03-08 00:21:25.448390007 +0000 UTC m=+129.159989529 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-metrics-tls") pod "ingress-operator-677db989d6-blw5x" (UID: "4d0b9fbc-a1f8-4a98-99de-758734bd1a5b") : secret "metrics-tls" not found Mar 08 00:21:17.449014 master-0 kubenswrapper[4059]: I0308 00:21:17.448508 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6d770808-d390-41c1-a9d9-fc12b99fa9a9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-cxs8s\" (UID: \"6d770808-d390-41c1-a9d9-fc12b99fa9a9\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s" Mar 08 00:21:17.449014 master-0 kubenswrapper[4059]: E0308 00:21:17.448767 4059 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 08 00:21:17.449014 master-0 kubenswrapper[4059]: E0308 00:21:17.448833 4059 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 08 00:21:17.449014 master-0 kubenswrapper[4059]: E0308 00:21:17.448891 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-operator-metrics podName:5cf5a2ef-2498-40a0-a189-0753076fd3b6 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:25.448855631 +0000 UTC m=+129.160455143 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-mgb5v" (UID: "5cf5a2ef-2498-40a0-a189-0753076fd3b6") : secret "marketplace-operator-metrics" not found Mar 08 00:21:17.449014 master-0 kubenswrapper[4059]: E0308 00:21:17.448947 4059 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 08 00:21:17.449014 master-0 kubenswrapper[4059]: E0308 00:21:17.448988 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d770808-d390-41c1-a9d9-fc12b99fa9a9-cluster-monitoring-operator-tls podName:6d770808-d390-41c1-a9d9-fc12b99fa9a9 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:25.448976005 +0000 UTC m=+129.160575527 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6d770808-d390-41c1-a9d9-fc12b99fa9a9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-cxs8s" (UID: "6d770808-d390-41c1-a9d9-fc12b99fa9a9") : secret "cluster-monitoring-operator-tls" not found Mar 08 00:21:17.449014 master-0 kubenswrapper[4059]: I0308 00:21:17.449015 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03f4bafb-c270-428a-bacf-8a424b3d1a05-metrics-tls\") pod \"dns-operator-589895fbb7-gmvnl\" (UID: \"03f4bafb-c270-428a-bacf-8a424b3d1a05\") " pod="openshift-dns-operator/dns-operator-589895fbb7-gmvnl" Mar 08 00:21:17.449627 master-0 kubenswrapper[4059]: I0308 00:21:17.449039 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-package-server-manager-serving-cert\") pod 
\"package-server-manager-854648ff6d-phgxj\" (UID: \"8f71fd39-a16b-47d2-b781-c8ce37bcb9b2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj" Mar 08 00:21:17.449627 master-0 kubenswrapper[4059]: E0308 00:21:17.449064 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6999cf38-e317-4727-98c9-d4e348e9e16a-image-registry-operator-tls podName:6999cf38-e317-4727-98c9-d4e348e9e16a nodeName:}" failed. No retries permitted until 2026-03-08 00:21:25.449051347 +0000 UTC m=+129.160650869 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/6999cf38-e317-4727-98c9-d4e348e9e16a-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-k7dp2" (UID: "6999cf38-e317-4727-98c9-d4e348e9e16a") : secret "image-registry-operator-tls" not found Mar 08 00:21:17.449627 master-0 kubenswrapper[4059]: E0308 00:21:17.449103 4059 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 08 00:21:17.449627 master-0 kubenswrapper[4059]: E0308 00:21:17.449131 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03f4bafb-c270-428a-bacf-8a424b3d1a05-metrics-tls podName:03f4bafb-c270-428a-bacf-8a424b3d1a05 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:25.449120359 +0000 UTC m=+129.160719881 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/03f4bafb-c270-428a-bacf-8a424b3d1a05-metrics-tls") pod "dns-operator-589895fbb7-gmvnl" (UID: "03f4bafb-c270-428a-bacf-8a424b3d1a05") : secret "metrics-tls" not found Mar 08 00:21:17.449627 master-0 kubenswrapper[4059]: E0308 00:21:17.449167 4059 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 08 00:21:17.449627 master-0 kubenswrapper[4059]: E0308 00:21:17.449189 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-package-server-manager-serving-cert podName:8f71fd39-a16b-47d2-b781-c8ce37bcb9b2 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:25.449183411 +0000 UTC m=+129.160782933 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-phgxj" (UID: "8f71fd39-a16b-47d2-b781-c8ce37bcb9b2") : secret "package-server-manager-serving-cert" not found Mar 08 00:21:17.549654 master-0 kubenswrapper[4059]: I0308 00:21:17.549415 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b94acad3-cf4e-443d-80fb-5e68a4074336-srv-cert\") pod \"catalog-operator-7d9c49f57b-8jr6f\" (UID: \"b94acad3-cf4e-443d-80fb-5e68a4074336\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-8jr6f" Mar 08 00:21:17.549654 master-0 kubenswrapper[4059]: E0308 00:21:17.549644 4059 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 08 00:21:17.549849 master-0 kubenswrapper[4059]: I0308 00:21:17.549626 4059 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4ad37f40-c533-4a1e-882a-2e0973eff86d-srv-cert\") pod \"olm-operator-d64cfc9db-8qtmf\" (UID: \"4ad37f40-c533-4a1e-882a-2e0973eff86d\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8qtmf" Mar 08 00:21:17.549849 master-0 kubenswrapper[4059]: E0308 00:21:17.549717 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b94acad3-cf4e-443d-80fb-5e68a4074336-srv-cert podName:b94acad3-cf4e-443d-80fb-5e68a4074336 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:25.549698166 +0000 UTC m=+129.261297688 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/b94acad3-cf4e-443d-80fb-5e68a4074336-srv-cert") pod "catalog-operator-7d9c49f57b-8jr6f" (UID: "b94acad3-cf4e-443d-80fb-5e68a4074336") : secret "catalog-operator-serving-cert" not found Mar 08 00:21:17.549849 master-0 kubenswrapper[4059]: E0308 00:21:17.549741 4059 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 08 00:21:17.549849 master-0 kubenswrapper[4059]: I0308 00:21:17.549755 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9" Mar 08 00:21:17.549849 master-0 kubenswrapper[4059]: I0308 00:21:17.549778 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-webhook-certs\") pod 
\"multus-admission-controller-8d675b596-jgdmb\" (UID: \"d7a0bdcc-92f5-41e6-ab47-ee48a5788bac\") " pod="openshift-multus/multus-admission-controller-8d675b596-jgdmb" Mar 08 00:21:17.549849 master-0 kubenswrapper[4059]: E0308 00:21:17.549792 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ad37f40-c533-4a1e-882a-2e0973eff86d-srv-cert podName:4ad37f40-c533-4a1e-882a-2e0973eff86d nodeName:}" failed. No retries permitted until 2026-03-08 00:21:25.549777278 +0000 UTC m=+129.261376800 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/4ad37f40-c533-4a1e-882a-2e0973eff86d-srv-cert") pod "olm-operator-d64cfc9db-8qtmf" (UID: "4ad37f40-c533-4a1e-882a-2e0973eff86d") : secret "olm-operator-serving-cert" not found Mar 08 00:21:17.549849 master-0 kubenswrapper[4059]: I0308 00:21:17.549808 4059 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9" Mar 08 00:21:17.549849 master-0 kubenswrapper[4059]: E0308 00:21:17.549835 4059 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 08 00:21:17.550120 master-0 kubenswrapper[4059]: E0308 00:21:17.549868 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-webhook-certs podName:d7a0bdcc-92f5-41e6-ab47-ee48a5788bac nodeName:}" failed. No retries permitted until 2026-03-08 00:21:25.549857001 +0000 UTC m=+129.261456613 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-webhook-certs") pod "multus-admission-controller-8d675b596-jgdmb" (UID: "d7a0bdcc-92f5-41e6-ab47-ee48a5788bac") : secret "multus-admission-controller-secret" not found Mar 08 00:21:17.550120 master-0 kubenswrapper[4059]: E0308 00:21:17.549871 4059 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 08 00:21:17.550120 master-0 kubenswrapper[4059]: E0308 00:21:17.549918 4059 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 08 00:21:17.550120 master-0 kubenswrapper[4059]: E0308 00:21:17.549919 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-node-tuning-operator-tls podName:1abf904b-0b8d-4d61-8231-0e8d00933192 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:25.549910012 +0000 UTC m=+129.261509534 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-9vjl9" (UID: "1abf904b-0b8d-4d61-8231-0e8d00933192") : secret "node-tuning-operator-tls" not found Mar 08 00:21:17.550120 master-0 kubenswrapper[4059]: E0308 00:21:17.550075 4059 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-apiservice-cert podName:1abf904b-0b8d-4d61-8231-0e8d00933192 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:25.549968834 +0000 UTC m=+129.261568346 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-9vjl9" (UID: "1abf904b-0b8d-4d61-8231-0e8d00933192") : secret "performance-addon-operator-webhook-cert" not found Mar 08 00:21:23.593059 master-0 kubenswrapper[4059]: I0308 00:21:23.592685 4059 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 08 00:21:23.593066 master-0 systemd[1]: Stopping Kubernetes Kubelet... Mar 08 00:21:23.612735 master-0 systemd[1]: kubelet.service: Deactivated successfully. Mar 08 00:21:23.612996 master-0 systemd[1]: Stopped Kubernetes Kubelet. Mar 08 00:21:23.614260 master-0 systemd[1]: kubelet.service: Consumed 9.110s CPU time. Mar 08 00:21:23.628182 master-0 systemd[1]: Starting Kubernetes Kubelet... Mar 08 00:21:23.775411 master-0 kubenswrapper[7479]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 08 00:21:23.775411 master-0 kubenswrapper[7479]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 08 00:21:23.775411 master-0 kubenswrapper[7479]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 08 00:21:23.775411 master-0 kubenswrapper[7479]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 08 00:21:23.775411 master-0 kubenswrapper[7479]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 08 00:21:23.775411 master-0 kubenswrapper[7479]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 08 00:21:23.776249 master-0 kubenswrapper[7479]: I0308 00:21:23.775493 7479 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 08 00:21:23.778069 master-0 kubenswrapper[7479]: W0308 00:21:23.778039 7479 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 08 00:21:23.778122 master-0 kubenswrapper[7479]: W0308 00:21:23.778088 7479 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 08 00:21:23.778122 master-0 kubenswrapper[7479]: W0308 00:21:23.778094 7479 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 08 00:21:23.778122 master-0 kubenswrapper[7479]: W0308 00:21:23.778098 7479 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 08 00:21:23.778122 master-0 kubenswrapper[7479]: W0308 00:21:23.778102 7479 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 08 00:21:23.778122 master-0 kubenswrapper[7479]: W0308 00:21:23.778106 7479 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 08 00:21:23.778122 master-0 kubenswrapper[7479]: W0308 00:21:23.778115 7479 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 08 00:21:23.778122 master-0 kubenswrapper[7479]: W0308 00:21:23.778120 7479 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 08 00:21:23.778122 master-0 kubenswrapper[7479]: W0308 00:21:23.778124 7479 
feature_gate.go:330] unrecognized feature gate: Example Mar 08 00:21:23.778122 master-0 kubenswrapper[7479]: W0308 00:21:23.778128 7479 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 08 00:21:23.778383 master-0 kubenswrapper[7479]: W0308 00:21:23.778133 7479 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 08 00:21:23.778383 master-0 kubenswrapper[7479]: W0308 00:21:23.778137 7479 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 08 00:21:23.778383 master-0 kubenswrapper[7479]: W0308 00:21:23.778141 7479 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 08 00:21:23.778383 master-0 kubenswrapper[7479]: W0308 00:21:23.778145 7479 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 08 00:21:23.778383 master-0 kubenswrapper[7479]: W0308 00:21:23.778148 7479 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 08 00:21:23.778383 master-0 kubenswrapper[7479]: W0308 00:21:23.778152 7479 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 08 00:21:23.778383 master-0 kubenswrapper[7479]: W0308 00:21:23.778156 7479 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 08 00:21:23.778383 master-0 kubenswrapper[7479]: W0308 00:21:23.778166 7479 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 08 00:21:23.778383 master-0 kubenswrapper[7479]: W0308 00:21:23.778170 7479 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 08 00:21:23.778383 master-0 kubenswrapper[7479]: W0308 00:21:23.778178 7479 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 08 00:21:23.778383 master-0 kubenswrapper[7479]: W0308 00:21:23.778182 7479 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 08 00:21:23.778383 master-0 kubenswrapper[7479]: W0308 00:21:23.778186 7479 feature_gate.go:330] unrecognized feature gate: 
VolumeGroupSnapshot Mar 08 00:21:23.778383 master-0 kubenswrapper[7479]: W0308 00:21:23.778190 7479 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 08 00:21:23.778383 master-0 kubenswrapper[7479]: W0308 00:21:23.778194 7479 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 08 00:21:23.778383 master-0 kubenswrapper[7479]: W0308 00:21:23.778212 7479 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 08 00:21:23.778383 master-0 kubenswrapper[7479]: W0308 00:21:23.778215 7479 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 08 00:21:23.778383 master-0 kubenswrapper[7479]: W0308 00:21:23.778220 7479 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 08 00:21:23.778383 master-0 kubenswrapper[7479]: W0308 00:21:23.778224 7479 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 08 00:21:23.778383 master-0 kubenswrapper[7479]: W0308 00:21:23.778228 7479 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 08 00:21:23.778383 master-0 kubenswrapper[7479]: W0308 00:21:23.778232 7479 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 08 00:21:23.778844 master-0 kubenswrapper[7479]: W0308 00:21:23.778236 7479 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 08 00:21:23.778844 master-0 kubenswrapper[7479]: W0308 00:21:23.778242 7479 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 08 00:21:23.778844 master-0 kubenswrapper[7479]: W0308 00:21:23.778248 7479 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 08 00:21:23.778844 master-0 kubenswrapper[7479]: W0308 00:21:23.778254 7479 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 08 00:21:23.778844 master-0 kubenswrapper[7479]: W0308 00:21:23.778258 7479 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 08 00:21:23.778844 master-0 kubenswrapper[7479]: W0308 00:21:23.778264 7479 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 08 00:21:23.778844 master-0 kubenswrapper[7479]: W0308 00:21:23.778268 7479 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 08 00:21:23.778844 master-0 kubenswrapper[7479]: W0308 00:21:23.778274 7479 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 08 00:21:23.778844 master-0 kubenswrapper[7479]: W0308 00:21:23.778279 7479 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 08 00:21:23.778844 master-0 kubenswrapper[7479]: W0308 00:21:23.778283 7479 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 08 00:21:23.778844 master-0 kubenswrapper[7479]: W0308 00:21:23.778287 7479 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 08 00:21:23.778844 master-0 kubenswrapper[7479]: W0308 00:21:23.778290 7479 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 08 00:21:23.778844 master-0 kubenswrapper[7479]: W0308 00:21:23.778297 7479 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 08 00:21:23.778844 master-0 kubenswrapper[7479]: W0308 00:21:23.778302 7479 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 08 00:21:23.778844 master-0 kubenswrapper[7479]: W0308 00:21:23.778305 7479 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 08 00:21:23.778844 master-0 kubenswrapper[7479]: W0308 00:21:23.778310 7479 feature_gate.go:330] unrecognized feature 
gate: ConsolePluginContentSecurityPolicy Mar 08 00:21:23.778844 master-0 kubenswrapper[7479]: W0308 00:21:23.778314 7479 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 08 00:21:23.778844 master-0 kubenswrapper[7479]: W0308 00:21:23.778317 7479 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 08 00:21:23.778844 master-0 kubenswrapper[7479]: W0308 00:21:23.778321 7479 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 08 00:21:23.779430 master-0 kubenswrapper[7479]: W0308 00:21:23.778325 7479 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 08 00:21:23.779430 master-0 kubenswrapper[7479]: W0308 00:21:23.778329 7479 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 08 00:21:23.779430 master-0 kubenswrapper[7479]: W0308 00:21:23.778333 7479 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 08 00:21:23.779430 master-0 kubenswrapper[7479]: W0308 00:21:23.778336 7479 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 08 00:21:23.779430 master-0 kubenswrapper[7479]: W0308 00:21:23.778341 7479 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 08 00:21:23.779430 master-0 kubenswrapper[7479]: W0308 00:21:23.778349 7479 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 08 00:21:23.779430 master-0 kubenswrapper[7479]: W0308 00:21:23.778353 7479 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 08 00:21:23.779430 master-0 kubenswrapper[7479]: W0308 00:21:23.778357 7479 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 08 00:21:23.779430 master-0 kubenswrapper[7479]: W0308 00:21:23.778361 7479 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 08 00:21:23.779430 master-0 kubenswrapper[7479]: W0308 00:21:23.778364 7479 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 08 00:21:23.779430 
master-0 kubenswrapper[7479]: W0308 00:21:23.778368 7479 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 08 00:21:23.779430 master-0 kubenswrapper[7479]: W0308 00:21:23.778373 7479 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 08 00:21:23.779430 master-0 kubenswrapper[7479]: W0308 00:21:23.778377 7479 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 08 00:21:23.779430 master-0 kubenswrapper[7479]: W0308 00:21:23.778382 7479 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 08 00:21:23.779430 master-0 kubenswrapper[7479]: W0308 00:21:23.778386 7479 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 08 00:21:23.779430 master-0 kubenswrapper[7479]: W0308 00:21:23.778391 7479 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 08 00:21:23.779430 master-0 kubenswrapper[7479]: W0308 00:21:23.778395 7479 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 08 00:21:23.779430 master-0 kubenswrapper[7479]: W0308 00:21:23.778402 7479 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 08 00:21:23.779430 master-0 kubenswrapper[7479]: W0308 00:21:23.778406 7479 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 08 00:21:23.779430 master-0 kubenswrapper[7479]: W0308 00:21:23.778410 7479 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 08 00:21:23.779945 master-0 kubenswrapper[7479]: W0308 00:21:23.778414 7479 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 08 00:21:23.779945 master-0 kubenswrapper[7479]: W0308 00:21:23.778449 7479 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 08 00:21:23.779945 master-0 kubenswrapper[7479]: W0308 00:21:23.778454 7479 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 08 00:21:23.779945 master-0 kubenswrapper[7479]: I0308 00:21:23.778591 7479 flags.go:64] FLAG: --address="0.0.0.0" Mar 08 00:21:23.779945 master-0 kubenswrapper[7479]: I0308 00:21:23.778601 7479 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 08 00:21:23.779945 master-0 kubenswrapper[7479]: I0308 00:21:23.778613 7479 flags.go:64] FLAG: --anonymous-auth="true" Mar 08 00:21:23.779945 master-0 kubenswrapper[7479]: I0308 00:21:23.778621 7479 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 08 00:21:23.779945 master-0 kubenswrapper[7479]: I0308 00:21:23.778627 7479 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 08 00:21:23.779945 master-0 kubenswrapper[7479]: I0308 00:21:23.778631 7479 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 08 00:21:23.779945 master-0 kubenswrapper[7479]: I0308 00:21:23.778637 7479 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 08 00:21:23.779945 master-0 kubenswrapper[7479]: I0308 00:21:23.778642 7479 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 08 00:21:23.779945 master-0 kubenswrapper[7479]: I0308 00:21:23.778647 7479 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 08 00:21:23.779945 master-0 kubenswrapper[7479]: I0308 00:21:23.778652 7479 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 08 00:21:23.779945 master-0 kubenswrapper[7479]: I0308 00:21:23.778716 7479 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 08 00:21:23.779945 master-0 kubenswrapper[7479]: I0308 00:21:23.778724 7479 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 08 00:21:23.779945 master-0 kubenswrapper[7479]: I0308 00:21:23.778728 7479 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 08 00:21:23.779945 master-0 kubenswrapper[7479]: I0308 
00:21:23.778734 7479 flags.go:64] FLAG: --cgroup-root="" Mar 08 00:21:23.779945 master-0 kubenswrapper[7479]: I0308 00:21:23.778739 7479 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 08 00:21:23.779945 master-0 kubenswrapper[7479]: I0308 00:21:23.778745 7479 flags.go:64] FLAG: --client-ca-file="" Mar 08 00:21:23.779945 master-0 kubenswrapper[7479]: I0308 00:21:23.778750 7479 flags.go:64] FLAG: --cloud-config="" Mar 08 00:21:23.779945 master-0 kubenswrapper[7479]: I0308 00:21:23.778756 7479 flags.go:64] FLAG: --cloud-provider="" Mar 08 00:21:23.779945 master-0 kubenswrapper[7479]: I0308 00:21:23.778761 7479 flags.go:64] FLAG: --cluster-dns="[]" Mar 08 00:21:23.779945 master-0 kubenswrapper[7479]: I0308 00:21:23.778800 7479 flags.go:64] FLAG: --cluster-domain="" Mar 08 00:21:23.780530 master-0 kubenswrapper[7479]: I0308 00:21:23.778805 7479 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 08 00:21:23.780530 master-0 kubenswrapper[7479]: I0308 00:21:23.778809 7479 flags.go:64] FLAG: --config-dir="" Mar 08 00:21:23.780530 master-0 kubenswrapper[7479]: I0308 00:21:23.778813 7479 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 08 00:21:23.780530 master-0 kubenswrapper[7479]: I0308 00:21:23.778818 7479 flags.go:64] FLAG: --container-log-max-files="5" Mar 08 00:21:23.780530 master-0 kubenswrapper[7479]: I0308 00:21:23.778825 7479 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 08 00:21:23.780530 master-0 kubenswrapper[7479]: I0308 00:21:23.778829 7479 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 08 00:21:23.780530 master-0 kubenswrapper[7479]: I0308 00:21:23.778834 7479 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 08 00:21:23.780530 master-0 kubenswrapper[7479]: I0308 00:21:23.778875 7479 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 08 00:21:23.780530 master-0 kubenswrapper[7479]: I0308 00:21:23.778990 7479 flags.go:64] FLAG: 
--contention-profiling="false" Mar 08 00:21:23.780530 master-0 kubenswrapper[7479]: I0308 00:21:23.779046 7479 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 08 00:21:23.780530 master-0 kubenswrapper[7479]: I0308 00:21:23.779076 7479 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 08 00:21:23.780530 master-0 kubenswrapper[7479]: I0308 00:21:23.779086 7479 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 08 00:21:23.780530 master-0 kubenswrapper[7479]: I0308 00:21:23.779094 7479 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 08 00:21:23.780530 master-0 kubenswrapper[7479]: I0308 00:21:23.779153 7479 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 08 00:21:23.780530 master-0 kubenswrapper[7479]: I0308 00:21:23.779162 7479 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 08 00:21:23.780530 master-0 kubenswrapper[7479]: I0308 00:21:23.779169 7479 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 08 00:21:23.780530 master-0 kubenswrapper[7479]: I0308 00:21:23.779223 7479 flags.go:64] FLAG: --enable-load-reader="false" Mar 08 00:21:23.780530 master-0 kubenswrapper[7479]: I0308 00:21:23.779239 7479 flags.go:64] FLAG: --enable-server="true" Mar 08 00:21:23.780530 master-0 kubenswrapper[7479]: I0308 00:21:23.779247 7479 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 08 00:21:23.780530 master-0 kubenswrapper[7479]: I0308 00:21:23.779260 7479 flags.go:64] FLAG: --event-burst="100" Mar 08 00:21:23.780530 master-0 kubenswrapper[7479]: I0308 00:21:23.779268 7479 flags.go:64] FLAG: --event-qps="50" Mar 08 00:21:23.780530 master-0 kubenswrapper[7479]: I0308 00:21:23.779275 7479 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 08 00:21:23.780530 master-0 kubenswrapper[7479]: I0308 00:21:23.779282 7479 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 08 00:21:23.780530 master-0 kubenswrapper[7479]: I0308 00:21:23.779288 7479 flags.go:64] FLAG: --eviction-hard="" Mar 08 00:21:23.780530 master-0 
kubenswrapper[7479]: I0308 00:21:23.779303 7479 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 08 00:21:23.781170 master-0 kubenswrapper[7479]: I0308 00:21:23.779311 7479 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 08 00:21:23.781170 master-0 kubenswrapper[7479]: I0308 00:21:23.779322 7479 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 08 00:21:23.781170 master-0 kubenswrapper[7479]: I0308 00:21:23.779329 7479 flags.go:64] FLAG: --eviction-soft="" Mar 08 00:21:23.781170 master-0 kubenswrapper[7479]: I0308 00:21:23.779336 7479 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 08 00:21:23.781170 master-0 kubenswrapper[7479]: I0308 00:21:23.779374 7479 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 08 00:21:23.781170 master-0 kubenswrapper[7479]: I0308 00:21:23.779384 7479 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 08 00:21:23.781170 master-0 kubenswrapper[7479]: I0308 00:21:23.779390 7479 flags.go:64] FLAG: --experimental-mounter-path="" Mar 08 00:21:23.781170 master-0 kubenswrapper[7479]: I0308 00:21:23.779397 7479 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 08 00:21:23.781170 master-0 kubenswrapper[7479]: I0308 00:21:23.779403 7479 flags.go:64] FLAG: --fail-swap-on="true" Mar 08 00:21:23.781170 master-0 kubenswrapper[7479]: I0308 00:21:23.779414 7479 flags.go:64] FLAG: --feature-gates="" Mar 08 00:21:23.781170 master-0 kubenswrapper[7479]: I0308 00:21:23.779422 7479 flags.go:64] FLAG: --file-check-frequency="20s" Mar 08 00:21:23.781170 master-0 kubenswrapper[7479]: I0308 00:21:23.779429 7479 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 08 00:21:23.781170 master-0 kubenswrapper[7479]: I0308 00:21:23.779437 7479 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 08 00:21:23.781170 master-0 kubenswrapper[7479]: I0308 00:21:23.779443 7479 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 08 00:21:23.781170 master-0 kubenswrapper[7479]: I0308 
00:21:23.779451 7479 flags.go:64] FLAG: --healthz-port="10248" Mar 08 00:21:23.781170 master-0 kubenswrapper[7479]: I0308 00:21:23.779458 7479 flags.go:64] FLAG: --help="false" Mar 08 00:21:23.781170 master-0 kubenswrapper[7479]: I0308 00:21:23.779465 7479 flags.go:64] FLAG: --hostname-override="" Mar 08 00:21:23.781170 master-0 kubenswrapper[7479]: I0308 00:21:23.779471 7479 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 08 00:21:23.781170 master-0 kubenswrapper[7479]: I0308 00:21:23.779481 7479 flags.go:64] FLAG: --http-check-frequency="20s" Mar 08 00:21:23.781170 master-0 kubenswrapper[7479]: I0308 00:21:23.779488 7479 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 08 00:21:23.781170 master-0 kubenswrapper[7479]: I0308 00:21:23.779496 7479 flags.go:64] FLAG: --image-credential-provider-config="" Mar 08 00:21:23.781170 master-0 kubenswrapper[7479]: I0308 00:21:23.779503 7479 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 08 00:21:23.781170 master-0 kubenswrapper[7479]: I0308 00:21:23.779539 7479 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 08 00:21:23.781170 master-0 kubenswrapper[7479]: I0308 00:21:23.779547 7479 flags.go:64] FLAG: --image-service-endpoint="" Mar 08 00:21:23.781170 master-0 kubenswrapper[7479]: I0308 00:21:23.779553 7479 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 08 00:21:23.781735 master-0 kubenswrapper[7479]: I0308 00:21:23.779560 7479 flags.go:64] FLAG: --kube-api-burst="100" Mar 08 00:21:23.781735 master-0 kubenswrapper[7479]: I0308 00:21:23.779566 7479 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 08 00:21:23.781735 master-0 kubenswrapper[7479]: I0308 00:21:23.779577 7479 flags.go:64] FLAG: --kube-api-qps="50" Mar 08 00:21:23.781735 master-0 kubenswrapper[7479]: I0308 00:21:23.779583 7479 flags.go:64] FLAG: --kube-reserved="" Mar 08 00:21:23.781735 master-0 kubenswrapper[7479]: I0308 00:21:23.779590 7479 flags.go:64] FLAG: --kube-reserved-cgroup="" 
Mar 08 00:21:23.781735 master-0 kubenswrapper[7479]: I0308 00:21:23.779596 7479 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 08 00:21:23.781735 master-0 kubenswrapper[7479]: I0308 00:21:23.779602 7479 flags.go:64] FLAG: --kubelet-cgroups="" Mar 08 00:21:23.781735 master-0 kubenswrapper[7479]: I0308 00:21:23.779609 7479 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 08 00:21:23.781735 master-0 kubenswrapper[7479]: I0308 00:21:23.779615 7479 flags.go:64] FLAG: --lock-file="" Mar 08 00:21:23.781735 master-0 kubenswrapper[7479]: I0308 00:21:23.779621 7479 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 08 00:21:23.781735 master-0 kubenswrapper[7479]: I0308 00:21:23.779631 7479 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 08 00:21:23.781735 master-0 kubenswrapper[7479]: I0308 00:21:23.779638 7479 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 08 00:21:23.781735 master-0 kubenswrapper[7479]: I0308 00:21:23.779649 7479 flags.go:64] FLAG: --log-json-split-stream="false" Mar 08 00:21:23.781735 master-0 kubenswrapper[7479]: I0308 00:21:23.779655 7479 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 08 00:21:23.781735 master-0 kubenswrapper[7479]: I0308 00:21:23.779662 7479 flags.go:64] FLAG: --log-text-split-stream="false" Mar 08 00:21:23.781735 master-0 kubenswrapper[7479]: I0308 00:21:23.779668 7479 flags.go:64] FLAG: --logging-format="text" Mar 08 00:21:23.781735 master-0 kubenswrapper[7479]: I0308 00:21:23.779675 7479 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 08 00:21:23.781735 master-0 kubenswrapper[7479]: I0308 00:21:23.779682 7479 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 08 00:21:23.781735 master-0 kubenswrapper[7479]: I0308 00:21:23.779688 7479 flags.go:64] FLAG: --manifest-url="" Mar 08 00:21:23.781735 master-0 kubenswrapper[7479]: I0308 00:21:23.779698 7479 flags.go:64] FLAG: --manifest-url-header="" Mar 08 00:21:23.781735 master-0 
kubenswrapper[7479]: I0308 00:21:23.779746 7479 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 08 00:21:23.781735 master-0 kubenswrapper[7479]: I0308 00:21:23.779755 7479 flags.go:64] FLAG: --max-open-files="1000000" Mar 08 00:21:23.781735 master-0 kubenswrapper[7479]: I0308 00:21:23.779763 7479 flags.go:64] FLAG: --max-pods="110" Mar 08 00:21:23.781735 master-0 kubenswrapper[7479]: I0308 00:21:23.779770 7479 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 08 00:21:23.781735 master-0 kubenswrapper[7479]: I0308 00:21:23.779776 7479 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 08 00:21:23.782329 master-0 kubenswrapper[7479]: I0308 00:21:23.779783 7479 flags.go:64] FLAG: --memory-manager-policy="None" Mar 08 00:21:23.782329 master-0 kubenswrapper[7479]: I0308 00:21:23.779789 7479 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 08 00:21:23.782329 master-0 kubenswrapper[7479]: I0308 00:21:23.779800 7479 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 08 00:21:23.782329 master-0 kubenswrapper[7479]: I0308 00:21:23.779806 7479 flags.go:64] FLAG: --node-ip="192.168.32.10" Mar 08 00:21:23.782329 master-0 kubenswrapper[7479]: I0308 00:21:23.779813 7479 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 08 00:21:23.782329 master-0 kubenswrapper[7479]: I0308 00:21:23.779834 7479 flags.go:64] FLAG: --node-status-max-images="50" Mar 08 00:21:23.782329 master-0 kubenswrapper[7479]: I0308 00:21:23.779842 7479 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 08 00:21:23.782329 master-0 kubenswrapper[7479]: I0308 00:21:23.779849 7479 flags.go:64] FLAG: --oom-score-adj="-999" Mar 08 00:21:23.782329 master-0 kubenswrapper[7479]: I0308 00:21:23.779859 7479 flags.go:64] FLAG: --pod-cidr="" Mar 08 00:21:23.782329 master-0 kubenswrapper[7479]: I0308 00:21:23.779866 7479 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1d605384f31a8085f78a96145c2c3dc51afe22721144196140a2699b7c07ebe3" Mar 08 00:21:23.782329 master-0 kubenswrapper[7479]: I0308 00:21:23.779876 7479 flags.go:64] FLAG: --pod-manifest-path="" Mar 08 00:21:23.782329 master-0 kubenswrapper[7479]: I0308 00:21:23.779882 7479 flags.go:64] FLAG: --pod-max-pids="-1" Mar 08 00:21:23.782329 master-0 kubenswrapper[7479]: I0308 00:21:23.779888 7479 flags.go:64] FLAG: --pods-per-core="0" Mar 08 00:21:23.782329 master-0 kubenswrapper[7479]: I0308 00:21:23.779894 7479 flags.go:64] FLAG: --port="10250" Mar 08 00:21:23.782329 master-0 kubenswrapper[7479]: I0308 00:21:23.779901 7479 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 08 00:21:23.782329 master-0 kubenswrapper[7479]: I0308 00:21:23.779907 7479 flags.go:64] FLAG: --provider-id="" Mar 08 00:21:23.782329 master-0 kubenswrapper[7479]: I0308 00:21:23.779914 7479 flags.go:64] FLAG: --qos-reserved="" Mar 08 00:21:23.782329 master-0 kubenswrapper[7479]: I0308 00:21:23.779924 7479 flags.go:64] FLAG: --read-only-port="10255" Mar 08 00:21:23.782329 master-0 kubenswrapper[7479]: I0308 00:21:23.779964 7479 flags.go:64] FLAG: --register-node="true" Mar 08 00:21:23.782329 master-0 kubenswrapper[7479]: I0308 00:21:23.779972 7479 flags.go:64] FLAG: --register-schedulable="true" Mar 08 00:21:23.782329 master-0 kubenswrapper[7479]: I0308 00:21:23.779978 7479 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 08 00:21:23.782329 master-0 kubenswrapper[7479]: I0308 00:21:23.779990 7479 flags.go:64] FLAG: --registry-burst="10" Mar 08 00:21:23.782329 master-0 kubenswrapper[7479]: I0308 00:21:23.779997 7479 flags.go:64] FLAG: --registry-qps="5" Mar 08 00:21:23.782329 master-0 kubenswrapper[7479]: I0308 00:21:23.780003 7479 flags.go:64] FLAG: --reserved-cpus="" Mar 08 00:21:23.782870 master-0 kubenswrapper[7479]: I0308 00:21:23.780013 7479 flags.go:64] FLAG: 
--reserved-memory="" Mar 08 00:21:23.782870 master-0 kubenswrapper[7479]: I0308 00:21:23.780021 7479 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 08 00:21:23.782870 master-0 kubenswrapper[7479]: I0308 00:21:23.780028 7479 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 08 00:21:23.782870 master-0 kubenswrapper[7479]: I0308 00:21:23.780035 7479 flags.go:64] FLAG: --rotate-certificates="false" Mar 08 00:21:23.782870 master-0 kubenswrapper[7479]: I0308 00:21:23.780041 7479 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 08 00:21:23.782870 master-0 kubenswrapper[7479]: I0308 00:21:23.780047 7479 flags.go:64] FLAG: --runonce="false" Mar 08 00:21:23.782870 master-0 kubenswrapper[7479]: I0308 00:21:23.780054 7479 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 08 00:21:23.782870 master-0 kubenswrapper[7479]: I0308 00:21:23.780060 7479 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 08 00:21:23.782870 master-0 kubenswrapper[7479]: I0308 00:21:23.780067 7479 flags.go:64] FLAG: --seccomp-default="false" Mar 08 00:21:23.782870 master-0 kubenswrapper[7479]: I0308 00:21:23.780077 7479 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 08 00:21:23.782870 master-0 kubenswrapper[7479]: I0308 00:21:23.780083 7479 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 08 00:21:23.782870 master-0 kubenswrapper[7479]: I0308 00:21:23.780089 7479 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 08 00:21:23.782870 master-0 kubenswrapper[7479]: I0308 00:21:23.780096 7479 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 08 00:21:23.782870 master-0 kubenswrapper[7479]: I0308 00:21:23.780104 7479 flags.go:64] FLAG: --storage-driver-password="root" Mar 08 00:21:23.782870 master-0 kubenswrapper[7479]: I0308 00:21:23.780110 7479 flags.go:64] FLAG: --storage-driver-secure="false" Mar 08 00:21:23.782870 master-0 kubenswrapper[7479]: I0308 00:21:23.780118 7479 flags.go:64] FLAG: --storage-driver-table="stats" 
Mar 08 00:21:23.782870 master-0 kubenswrapper[7479]: I0308 00:21:23.780166 7479 flags.go:64] FLAG: --storage-driver-user="root"
Mar 08 00:21:23.782870 master-0 kubenswrapper[7479]: I0308 00:21:23.780174 7479 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 08 00:21:23.782870 master-0 kubenswrapper[7479]: I0308 00:21:23.780186 7479 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 08 00:21:23.782870 master-0 kubenswrapper[7479]: I0308 00:21:23.780193 7479 flags.go:64] FLAG: --system-cgroups=""
Mar 08 00:21:23.782870 master-0 kubenswrapper[7479]: I0308 00:21:23.780213 7479 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Mar 08 00:21:23.782870 master-0 kubenswrapper[7479]: I0308 00:21:23.780224 7479 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 08 00:21:23.782870 master-0 kubenswrapper[7479]: I0308 00:21:23.780230 7479 flags.go:64] FLAG: --tls-cert-file=""
Mar 08 00:21:23.782870 master-0 kubenswrapper[7479]: I0308 00:21:23.780236 7479 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 08 00:21:23.782870 master-0 kubenswrapper[7479]: I0308 00:21:23.780248 7479 flags.go:64] FLAG: --tls-min-version=""
Mar 08 00:21:23.783499 master-0 kubenswrapper[7479]: I0308 00:21:23.780255 7479 flags.go:64] FLAG: --tls-private-key-file=""
Mar 08 00:21:23.783499 master-0 kubenswrapper[7479]: I0308 00:21:23.780261 7479 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 08 00:21:23.783499 master-0 kubenswrapper[7479]: I0308 00:21:23.780267 7479 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 08 00:21:23.783499 master-0 kubenswrapper[7479]: I0308 00:21:23.780273 7479 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 08 00:21:23.783499 master-0 kubenswrapper[7479]: I0308 00:21:23.780280 7479 flags.go:64] FLAG: --v="2"
Mar 08 00:21:23.783499 master-0 kubenswrapper[7479]: I0308 00:21:23.780288 7479 flags.go:64] FLAG: --version="false"
Mar 08 00:21:23.783499 master-0 kubenswrapper[7479]: I0308 00:21:23.780297 7479 flags.go:64] FLAG: --vmodule=""
Mar 08 00:21:23.783499 master-0 kubenswrapper[7479]: I0308 00:21:23.780308 7479 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 08 00:21:23.783499 master-0 kubenswrapper[7479]: I0308 00:21:23.780315 7479 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 08 00:21:23.783499 master-0 kubenswrapper[7479]: W0308 00:21:23.780709 7479 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 08 00:21:23.783499 master-0 kubenswrapper[7479]: W0308 00:21:23.780722 7479 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 08 00:21:23.783499 master-0 kubenswrapper[7479]: W0308 00:21:23.780729 7479 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 08 00:21:23.783499 master-0 kubenswrapper[7479]: W0308 00:21:23.780736 7479 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 08 00:21:23.783499 master-0 kubenswrapper[7479]: W0308 00:21:23.780742 7479 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 08 00:21:23.783499 master-0 kubenswrapper[7479]: W0308 00:21:23.780747 7479 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 08 00:21:23.783499 master-0 kubenswrapper[7479]: W0308 00:21:23.780753 7479 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 08 00:21:23.783499 master-0 kubenswrapper[7479]: W0308 00:21:23.780759 7479 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 08 00:21:23.783499 master-0 kubenswrapper[7479]: W0308 00:21:23.780764 7479 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 08 00:21:23.783499 master-0 kubenswrapper[7479]: W0308 00:21:23.780770 7479 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 08 00:21:23.783499 master-0 kubenswrapper[7479]: W0308 00:21:23.780775 7479 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 08 00:21:23.783499 master-0 kubenswrapper[7479]: W0308 00:21:23.780781 7479 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 08 00:21:23.783499 master-0 kubenswrapper[7479]: W0308 00:21:23.780786 7479 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 08 00:21:23.784103 master-0 kubenswrapper[7479]: W0308 00:21:23.780796 7479 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 08 00:21:23.784103 master-0 kubenswrapper[7479]: W0308 00:21:23.780801 7479 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 08 00:21:23.784103 master-0 kubenswrapper[7479]: W0308 00:21:23.780806 7479 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 08 00:21:23.784103 master-0 kubenswrapper[7479]: W0308 00:21:23.780812 7479 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 08 00:21:23.784103 master-0 kubenswrapper[7479]: W0308 00:21:23.780817 7479 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 08 00:21:23.784103 master-0 kubenswrapper[7479]: W0308 00:21:23.780856 7479 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 08 00:21:23.784103 master-0 kubenswrapper[7479]: W0308 00:21:23.780862 7479 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 08 00:21:23.784103 master-0 kubenswrapper[7479]: W0308 00:21:23.780870 7479 feature_gate.go:330] unrecognized feature gate: Example
Mar 08 00:21:23.784103 master-0 kubenswrapper[7479]: W0308 00:21:23.780877 7479 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 08 00:21:23.784103 master-0 kubenswrapper[7479]: W0308 00:21:23.780882 7479 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 08 00:21:23.784103 master-0 kubenswrapper[7479]: W0308 00:21:23.780888 7479 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 08 00:21:23.784103 master-0 kubenswrapper[7479]: W0308 00:21:23.780893 7479 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 08 00:21:23.784103 master-0 kubenswrapper[7479]: W0308 00:21:23.780902 7479 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 08 00:21:23.784103 master-0 kubenswrapper[7479]: W0308 00:21:23.780908 7479 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 08 00:21:23.784103 master-0 kubenswrapper[7479]: W0308 00:21:23.780915 7479 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 08 00:21:23.784103 master-0 kubenswrapper[7479]: W0308 00:21:23.780923 7479 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 08 00:21:23.784103 master-0 kubenswrapper[7479]: W0308 00:21:23.780930 7479 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 08 00:21:23.784103 master-0 kubenswrapper[7479]: W0308 00:21:23.780937 7479 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 08 00:21:23.784103 master-0 kubenswrapper[7479]: W0308 00:21:23.780944 7479 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 08 00:21:23.784571 master-0 kubenswrapper[7479]: W0308 00:21:23.780951 7479 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 08 00:21:23.784571 master-0 kubenswrapper[7479]: W0308 00:21:23.780957 7479 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 08 00:21:23.784571 master-0 kubenswrapper[7479]: W0308 00:21:23.780964 7479 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 08 00:21:23.784571 master-0 kubenswrapper[7479]: W0308 00:21:23.780970 7479 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 08 00:21:23.784571 master-0 kubenswrapper[7479]: W0308 00:21:23.780976 7479 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 08 00:21:23.784571 master-0 kubenswrapper[7479]: W0308 00:21:23.780986 7479 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 08 00:21:23.784571 master-0 kubenswrapper[7479]: W0308 00:21:23.780991 7479 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 08 00:21:23.784571 master-0 kubenswrapper[7479]: W0308 00:21:23.781029 7479 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 08 00:21:23.784571 master-0 kubenswrapper[7479]: W0308 00:21:23.781036 7479 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 08 00:21:23.784571 master-0 kubenswrapper[7479]: W0308 00:21:23.781041 7479 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 08 00:21:23.784571 master-0 kubenswrapper[7479]: W0308 00:21:23.781047 7479 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 08 00:21:23.784571 master-0 kubenswrapper[7479]: W0308 00:21:23.781052 7479 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 08 00:21:23.784571 master-0 kubenswrapper[7479]: W0308 00:21:23.781058 7479 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 08 00:21:23.784571 master-0 kubenswrapper[7479]: W0308 00:21:23.781063 7479 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 08 00:21:23.784571 master-0 kubenswrapper[7479]: W0308 00:21:23.781069 7479 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 08 00:21:23.784571 master-0 kubenswrapper[7479]: W0308 00:21:23.781075 7479 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 08 00:21:23.784571 master-0 kubenswrapper[7479]: W0308 00:21:23.781080 7479 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 08 00:21:23.784571 master-0 kubenswrapper[7479]: W0308 00:21:23.781085 7479 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 08 00:21:23.784571 master-0 kubenswrapper[7479]: W0308 00:21:23.781094 7479 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 08 00:21:23.784571 master-0 kubenswrapper[7479]: W0308 00:21:23.781100 7479 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 08 00:21:23.785069 master-0 kubenswrapper[7479]: W0308 00:21:23.781105 7479 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 08 00:21:23.785069 master-0 kubenswrapper[7479]: W0308 00:21:23.781111 7479 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 08 00:21:23.785069 master-0 kubenswrapper[7479]: W0308 00:21:23.781116 7479 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 08 00:21:23.785069 master-0 kubenswrapper[7479]: W0308 00:21:23.781121 7479 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 08 00:21:23.785069 master-0 kubenswrapper[7479]: W0308 00:21:23.781128 7479 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 08 00:21:23.785069 master-0 kubenswrapper[7479]: W0308 00:21:23.781134 7479 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 08 00:21:23.785069 master-0 kubenswrapper[7479]: W0308 00:21:23.781141 7479 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 08 00:21:23.785069 master-0 kubenswrapper[7479]: W0308 00:21:23.781147 7479 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 08 00:21:23.785069 master-0 kubenswrapper[7479]: W0308 00:21:23.781153 7479 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 08 00:21:23.785069 master-0 kubenswrapper[7479]: W0308 00:21:23.781160 7479 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 08 00:21:23.785069 master-0 kubenswrapper[7479]: W0308 00:21:23.781169 7479 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 08 00:21:23.785069 master-0 kubenswrapper[7479]: W0308 00:21:23.781175 7479 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 08 00:21:23.785069 master-0 kubenswrapper[7479]: W0308 00:21:23.781181 7479 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 08 00:21:23.785069 master-0 kubenswrapper[7479]: W0308 00:21:23.781186 7479 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 08 00:21:23.785069 master-0 kubenswrapper[7479]: W0308 00:21:23.781240 7479 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 08 00:21:23.785069 master-0 kubenswrapper[7479]: W0308 00:21:23.781247 7479 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 08 00:21:23.785069 master-0 kubenswrapper[7479]: W0308 00:21:23.781253 7479 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 08 00:21:23.785069 master-0 kubenswrapper[7479]: W0308 00:21:23.781286 7479 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 08 00:21:23.785069 master-0 kubenswrapper[7479]: W0308 00:21:23.781292 7479 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 08 00:21:23.785069 master-0 kubenswrapper[7479]: W0308 00:21:23.781298 7479 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 08 00:21:23.785564 master-0 kubenswrapper[7479]: I0308 00:21:23.781307 7479 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 08 00:21:23.788951 master-0 kubenswrapper[7479]: I0308 00:21:23.788923 7479 server.go:491] "Kubelet version" kubeletVersion="v1.31.14"
Mar 08 00:21:23.789017 master-0 kubenswrapper[7479]: I0308 00:21:23.789008 7479 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 08 00:21:23.789135 master-0 kubenswrapper[7479]: W0308 00:21:23.789123 7479 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 08 00:21:23.789188 master-0 kubenswrapper[7479]: W0308 00:21:23.789180 7479 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 08 00:21:23.789257 master-0 kubenswrapper[7479]: W0308 00:21:23.789249 7479 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 08 00:21:23.789304 master-0 kubenswrapper[7479]: W0308 00:21:23.789297 7479 feature_gate.go:330] unrecognized feature gate: Example
Mar 08 00:21:23.789349 master-0 kubenswrapper[7479]: W0308 00:21:23.789341 7479 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 08 00:21:23.789393 master-0 kubenswrapper[7479]: W0308 00:21:23.789386 7479 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 08 00:21:23.789433 master-0 kubenswrapper[7479]: W0308 00:21:23.789426 7479 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 08 00:21:23.789483 master-0 kubenswrapper[7479]: W0308 00:21:23.789475 7479 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 08 00:21:23.789529 master-0 kubenswrapper[7479]: W0308 00:21:23.789521 7479 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 08 00:21:23.789576 master-0 kubenswrapper[7479]: W0308 00:21:23.789569 7479 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 08 00:21:23.789619 master-0 kubenswrapper[7479]: W0308 00:21:23.789611 7479 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 08 00:21:23.789670 master-0 kubenswrapper[7479]: W0308 00:21:23.789663 7479 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 08 00:21:23.789715 master-0 kubenswrapper[7479]: W0308 00:21:23.789708 7479 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 08 00:21:23.789760 master-0 kubenswrapper[7479]: W0308 00:21:23.789753 7479 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 08 00:21:23.789807 master-0 kubenswrapper[7479]: W0308 00:21:23.789800 7479 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 08 00:21:23.789852 master-0 kubenswrapper[7479]: W0308 00:21:23.789845 7479 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 08 00:21:23.789896 master-0 kubenswrapper[7479]: W0308 00:21:23.789888 7479 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 08 00:21:23.789936 master-0 kubenswrapper[7479]: W0308 00:21:23.789929 7479 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 08 00:21:23.789979 master-0 kubenswrapper[7479]: W0308 00:21:23.789972 7479 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 08 00:21:23.790030 master-0 kubenswrapper[7479]: W0308 00:21:23.790023 7479 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 08 00:21:23.790075 master-0 kubenswrapper[7479]: W0308 00:21:23.790068 7479 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 08 00:21:23.790176 master-0 kubenswrapper[7479]: W0308 00:21:23.790168 7479 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 08 00:21:23.790249 master-0 kubenswrapper[7479]: W0308 00:21:23.790225 7479 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 08 00:21:23.790310 master-0 kubenswrapper[7479]: W0308 00:21:23.790302 7479 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 08 00:21:23.790352 master-0 kubenswrapper[7479]: W0308 00:21:23.790344 7479 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 08 00:21:23.790402 master-0 kubenswrapper[7479]: W0308 00:21:23.790394 7479 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 08 00:21:23.790496 master-0 kubenswrapper[7479]: W0308 00:21:23.790488 7479 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 08 00:21:23.790545 master-0 kubenswrapper[7479]: W0308 00:21:23.790538 7479 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 08 00:21:23.790590 master-0 kubenswrapper[7479]: W0308 00:21:23.790583 7479 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 08 00:21:23.790635 master-0 kubenswrapper[7479]: W0308 00:21:23.790627 7479 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 08 00:21:23.790679 master-0 kubenswrapper[7479]: W0308 00:21:23.790672 7479 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 08 00:21:23.790725 master-0 kubenswrapper[7479]: W0308 00:21:23.790717 7479 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 08 00:21:23.790773 master-0 kubenswrapper[7479]: W0308 00:21:23.790766 7479 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 08 00:21:23.790815 master-0 kubenswrapper[7479]: W0308 00:21:23.790807 7479 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 08 00:21:23.790862 master-0 kubenswrapper[7479]: W0308 00:21:23.790855 7479 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 08 00:21:23.790912 master-0 kubenswrapper[7479]: W0308 00:21:23.790905 7479 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 08 00:21:23.790953 master-0 kubenswrapper[7479]: W0308 00:21:23.790946 7479 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 08 00:21:23.790998 master-0 kubenswrapper[7479]: W0308 00:21:23.790990 7479 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 08 00:21:23.791042 master-0 kubenswrapper[7479]: W0308 00:21:23.791035 7479 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 08 00:21:23.791087 master-0 kubenswrapper[7479]: W0308 00:21:23.791079 7479 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 08 00:21:23.791131 master-0 kubenswrapper[7479]: W0308 00:21:23.791124 7479 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 08 00:21:23.791177 master-0 kubenswrapper[7479]: W0308 00:21:23.791170 7479 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 08 00:21:23.791237 master-0 kubenswrapper[7479]: W0308 00:21:23.791229 7479 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 08 00:21:23.791280 master-0 kubenswrapper[7479]: W0308 00:21:23.791273 7479 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 08 00:21:23.791320 master-0 kubenswrapper[7479]: W0308 00:21:23.791313 7479 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 08 00:21:23.793809 master-0 kubenswrapper[7479]: W0308 00:21:23.791366 7479 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 08 00:21:23.793809 master-0 kubenswrapper[7479]: W0308 00:21:23.791373 7479 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 08 00:21:23.793809 master-0 kubenswrapper[7479]: W0308 00:21:23.791376 7479 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 08 00:21:23.793809 master-0 kubenswrapper[7479]: W0308 00:21:23.791382 7479 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 08 00:21:23.793809 master-0 kubenswrapper[7479]: W0308 00:21:23.791385 7479 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 08 00:21:23.793809 master-0 kubenswrapper[7479]: W0308 00:21:23.791389 7479 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 08 00:21:23.793809 master-0 kubenswrapper[7479]: W0308 00:21:23.791393 7479 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 08 00:21:23.793809 master-0 kubenswrapper[7479]: W0308 00:21:23.791397 7479 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 08 00:21:23.793809 master-0 kubenswrapper[7479]: W0308 00:21:23.791400 7479 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 08 00:21:23.793809 master-0 kubenswrapper[7479]: W0308 00:21:23.791404 7479 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 08 00:21:23.793809 master-0 kubenswrapper[7479]: W0308 00:21:23.791408 7479 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 08 00:21:23.793809 master-0 kubenswrapper[7479]: W0308 00:21:23.791411 7479 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 08 00:21:23.793809 master-0 kubenswrapper[7479]: W0308 00:21:23.791415 7479 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 08 00:21:23.793809 master-0 kubenswrapper[7479]: W0308 00:21:23.791418 7479 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 08 00:21:23.793809 master-0 kubenswrapper[7479]: W0308 00:21:23.791422 7479 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 08 00:21:23.793809 master-0 kubenswrapper[7479]: W0308 00:21:23.791425 7479 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 08 00:21:23.793809 master-0 kubenswrapper[7479]: W0308 00:21:23.791429 7479 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 08 00:21:23.793809 master-0 kubenswrapper[7479]: W0308 00:21:23.791433 7479 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 08 00:21:23.793809 master-0 kubenswrapper[7479]: W0308 00:21:23.791436 7479 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 08 00:21:23.793809 master-0 kubenswrapper[7479]: W0308 00:21:23.791440 7479 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 08 00:21:23.794409 master-0 kubenswrapper[7479]: W0308 00:21:23.791443 7479 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 08 00:21:23.794409 master-0 kubenswrapper[7479]: W0308 00:21:23.791447 7479 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 08 00:21:23.794409 master-0 kubenswrapper[7479]: W0308 00:21:23.791451 7479 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 08 00:21:23.794409 master-0 kubenswrapper[7479]: W0308 00:21:23.791454 7479 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 08 00:21:23.794409 master-0 kubenswrapper[7479]: W0308 00:21:23.791458 7479 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 08 00:21:23.794409 master-0 kubenswrapper[7479]: W0308 00:21:23.791462 7479 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 08 00:21:23.794409 master-0 kubenswrapper[7479]: W0308 00:21:23.791466 7479 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 08 00:21:23.794409 master-0 kubenswrapper[7479]: I0308 00:21:23.791472 7479 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 08 00:21:23.794409 master-0 kubenswrapper[7479]: W0308 00:21:23.791593 7479 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 08 00:21:23.794409 master-0 kubenswrapper[7479]: W0308 00:21:23.791599 7479 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 08 00:21:23.794409 master-0 kubenswrapper[7479]: W0308 00:21:23.791603 7479 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 08 00:21:23.794409 master-0 kubenswrapper[7479]: W0308 00:21:23.791607 7479 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 08 00:21:23.794409 master-0 kubenswrapper[7479]: W0308 00:21:23.791611 7479 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 08 00:21:23.794409 master-0 kubenswrapper[7479]: W0308 00:21:23.791615 7479 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 08 00:21:23.794409 master-0 kubenswrapper[7479]: W0308 00:21:23.791619 7479 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 08 00:21:23.794762 master-0 kubenswrapper[7479]: W0308 00:21:23.791623 7479 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 08 00:21:23.794762 master-0 kubenswrapper[7479]: W0308 00:21:23.791627 7479 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 08 00:21:23.794762 master-0 kubenswrapper[7479]: W0308 00:21:23.791630 7479 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 08 00:21:23.794762 master-0 kubenswrapper[7479]: W0308 00:21:23.791634 7479 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 08 00:21:23.794762 master-0 kubenswrapper[7479]: W0308 00:21:23.791637 7479 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 08 00:21:23.794762 master-0 kubenswrapper[7479]: W0308 00:21:23.791642 7479 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 08 00:21:23.794762 master-0 kubenswrapper[7479]: W0308 00:21:23.791646 7479 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 08 00:21:23.794762 master-0 kubenswrapper[7479]: W0308 00:21:23.791650 7479 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 08 00:21:23.794762 master-0 kubenswrapper[7479]: W0308 00:21:23.791654 7479 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 08 00:21:23.794762 master-0 kubenswrapper[7479]: W0308 00:21:23.791657 7479 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 08 00:21:23.794762 master-0 kubenswrapper[7479]: W0308 00:21:23.791661 7479 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 08 00:21:23.794762 master-0 kubenswrapper[7479]: W0308 00:21:23.791665 7479 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 08 00:21:23.794762 master-0 kubenswrapper[7479]: W0308 00:21:23.791668 7479 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 08 00:21:23.794762 master-0 kubenswrapper[7479]: W0308 00:21:23.791672 7479 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 08 00:21:23.794762 master-0 kubenswrapper[7479]: W0308 00:21:23.791676 7479 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 08 00:21:23.794762 master-0 kubenswrapper[7479]: W0308 00:21:23.791680 7479 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 08 00:21:23.794762 master-0 kubenswrapper[7479]: W0308 00:21:23.791683 7479 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 08 00:21:23.794762 master-0 kubenswrapper[7479]: W0308 00:21:23.791687 7479 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 08 00:21:23.794762 master-0 kubenswrapper[7479]: W0308 00:21:23.791690 7479 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 08 00:21:23.794762 master-0 kubenswrapper[7479]: W0308 00:21:23.791694 7479 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 08 00:21:23.795291 master-0 kubenswrapper[7479]: W0308 00:21:23.791698 7479 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 08 00:21:23.795291 master-0 kubenswrapper[7479]: W0308 00:21:23.791701 7479 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 08 00:21:23.795291 master-0 kubenswrapper[7479]: W0308 00:21:23.791705 7479 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 08 00:21:23.795291 master-0 kubenswrapper[7479]: W0308 00:21:23.791709 7479 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 08 00:21:23.795291 master-0 kubenswrapper[7479]: W0308 00:21:23.791712 7479 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 08 00:21:23.795291 master-0 kubenswrapper[7479]: W0308 00:21:23.791716 7479 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 08 00:21:23.795291 master-0 kubenswrapper[7479]: W0308 00:21:23.791720 7479 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 08 00:21:23.795291 master-0 kubenswrapper[7479]: W0308 00:21:23.791724 7479 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 08 00:21:23.795291 master-0 kubenswrapper[7479]: W0308 00:21:23.791727 7479 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 08 00:21:23.795291 master-0 kubenswrapper[7479]: W0308 00:21:23.791732 7479 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 08 00:21:23.795291 master-0 kubenswrapper[7479]: W0308 00:21:23.791736 7479 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 08 00:21:23.795291 master-0 kubenswrapper[7479]: W0308 00:21:23.791741 7479 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 08 00:21:23.795291 master-0 kubenswrapper[7479]: W0308 00:21:23.791745 7479 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 08 00:21:23.795291 master-0 kubenswrapper[7479]: W0308 00:21:23.791749 7479 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 08 00:21:23.795291 master-0 kubenswrapper[7479]: W0308 00:21:23.791753 7479 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 08 00:21:23.795291 master-0 kubenswrapper[7479]: W0308 00:21:23.791757 7479 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 08 00:21:23.795291 master-0 kubenswrapper[7479]: W0308 00:21:23.791762 7479 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 08 00:21:23.795291 master-0 kubenswrapper[7479]: W0308 00:21:23.791765 7479 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 08 00:21:23.795291 master-0 kubenswrapper[7479]: W0308 00:21:23.791769 7479 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 08 00:21:23.795291 master-0 kubenswrapper[7479]: W0308 00:21:23.791773 7479 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 08 00:21:23.795781 master-0 kubenswrapper[7479]: W0308 00:21:23.791776 7479 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 08 00:21:23.795781 master-0 kubenswrapper[7479]: W0308 00:21:23.791780 7479 feature_gate.go:330] unrecognized feature gate: Example
Mar 08 00:21:23.795781 master-0 kubenswrapper[7479]: W0308 00:21:23.791783 7479 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 08 00:21:23.795781 master-0 kubenswrapper[7479]: W0308 00:21:23.791787 7479 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 08 00:21:23.795781 master-0 kubenswrapper[7479]: W0308 00:21:23.791790 7479 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 08 00:21:23.795781 master-0 kubenswrapper[7479]: W0308 00:21:23.791794 7479 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 08 00:21:23.795781 master-0 kubenswrapper[7479]: W0308 00:21:23.791798 7479 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 08 00:21:23.795781 master-0 kubenswrapper[7479]: W0308 00:21:23.791802 7479 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 08 00:21:23.795781 master-0 kubenswrapper[7479]: W0308 00:21:23.791806 7479 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 08 00:21:23.795781 master-0 kubenswrapper[7479]: W0308 00:21:23.791810 7479 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 08 00:21:23.795781 master-0 kubenswrapper[7479]: W0308 00:21:23.791813 7479 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 08 00:21:23.795781 master-0 kubenswrapper[7479]: W0308 00:21:23.791817 7479 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 08 00:21:23.795781 master-0 kubenswrapper[7479]: W0308 00:21:23.791821 7479 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 08 00:21:23.795781 master-0 kubenswrapper[7479]: W0308 00:21:23.791824 7479 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 08 00:21:23.795781 master-0 kubenswrapper[7479]: W0308 00:21:23.791828 7479 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 08 00:21:23.795781 master-0 kubenswrapper[7479]: W0308 00:21:23.791831 7479 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 08 00:21:23.795781 master-0 kubenswrapper[7479]: W0308 00:21:23.791835 7479 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 08 00:21:23.795781 master-0 kubenswrapper[7479]: W0308 00:21:23.791840 7479 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 08 00:21:23.795781 master-0 kubenswrapper[7479]: W0308 00:21:23.791844 7479 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 08 00:21:23.795781 master-0 kubenswrapper[7479]: W0308 00:21:23.791848 7479 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 08 00:21:23.796259 master-0 kubenswrapper[7479]: W0308 00:21:23.791853 7479 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 08 00:21:23.796259 master-0 kubenswrapper[7479]: W0308 00:21:23.791857 7479 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 08 00:21:23.796259 master-0 kubenswrapper[7479]: W0308 00:21:23.791861 7479 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 08 00:21:23.796259 master-0 kubenswrapper[7479]: W0308 00:21:23.791865 7479 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 08 00:21:23.796259 master-0 kubenswrapper[7479]: W0308 00:21:23.791869 7479 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 08 00:21:23.796259 master-0 kubenswrapper[7479]: I0308 00:21:23.791875 7479 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 08 00:21:23.796259 master-0 kubenswrapper[7479]: I0308 00:21:23.792041 7479 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 08 00:21:23.796259 master-0 kubenswrapper[7479]: I0308 00:21:23.793527 7479 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Mar 08 00:21:23.796259 master-0 kubenswrapper[7479]: I0308 00:21:23.793599 7479 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 08 00:21:23.798463 master-0 kubenswrapper[7479]: I0308 00:21:23.796638 7479 server.go:997] "Starting client certificate rotation"
Mar 08 00:21:23.798463 master-0 kubenswrapper[7479]: I0308 00:21:23.796651 7479 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 08 00:21:23.798463 master-0 kubenswrapper[7479]: I0308 00:21:23.796831 7479 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-03-09 00:11:49 +0000 UTC, rotation deadline is 2026-03-08 21:34:19.09766309 +0000 UTC
Mar 08 00:21:23.798463 master-0 kubenswrapper[7479]: I0308 00:21:23.796890 7479 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 21h12m55.300774894s for next certificate rotation
Mar 08 00:21:23.798463 master-0 kubenswrapper[7479]: I0308 00:21:23.797184 7479 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 08 00:21:23.798463 master-0 kubenswrapper[7479]: I0308 00:21:23.798259 7479 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 08 00:21:23.800802 master-0 kubenswrapper[7479]: I0308 00:21:23.800777 7479 log.go:25] "Validated CRI v1 runtime API"
Mar 08 00:21:23.802709 master-0 kubenswrapper[7479]: I0308 00:21:23.802698 7479 log.go:25] "Validated CRI v1 image API"
Mar 08 00:21:23.803554 master-0 kubenswrapper[7479]: I0308 00:21:23.803524 7479 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 08 00:21:23.806158 master-0 kubenswrapper[7479]: I0308 00:21:23.806125 7479 fs.go:135] Filesystem UUIDs: map[39fc8acc-7a4c-4a2a-a305-ed25849d8805:/dev/vda3 7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4]
Mar 08 00:21:23.806899 master-0 kubenswrapper[7479]: I0308 00:21:23.806149 7479 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0
minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0bdf70a6acef734c900a623db8a8cd37b2a2e6c50fe84f9293c0fc0c5705c71d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0bdf70a6acef734c900a623db8a8cd37b2a2e6c50fe84f9293c0fc0c5705c71d/userdata/shm major:0 minor:58 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/16a0ef8737c1e2416e14cc076fc6b1d7ef645b2043e268561b096173dd7a6b0e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/16a0ef8737c1e2416e14cc076fc6b1d7ef645b2043e268561b096173dd7a6b0e/userdata/shm major:0 minor:266 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/233074eccbbd3406930dc094592b256b0710cbbbba4d96b37f6401353d1f1651/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/233074eccbbd3406930dc094592b256b0710cbbbba4d96b37f6401353d1f1651/userdata/shm major:0 minor:46 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/31406fc5b2c5472ac716e4c8cdca7909539075e5cc335f68e4b469dfc56a38f1/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/31406fc5b2c5472ac716e4c8cdca7909539075e5cc335f68e4b469dfc56a38f1/userdata/shm major:0 minor:256 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3c8994f66c1270da68fac1ff2499afd806b950d0568c9f85327b0714473db68c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3c8994f66c1270da68fac1ff2499afd806b950d0568c9f85327b0714473db68c/userdata/shm major:0 minor:214 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4da316e5c8941b4baace90ce20646816051133ec406a841a63f02453e48ca25a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4da316e5c8941b4baace90ce20646816051133ec406a841a63f02453e48ca25a/userdata/shm major:0 minor:47 
fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/733e43352408d7f83022f1e2789901cb1e3830089ecad3dc5ac2ffbae10f60ad/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/733e43352408d7f83022f1e2789901cb1e3830089ecad3dc5ac2ffbae10f60ad/userdata/shm major:0 minor:272 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7a5857552aa1339fd1907b2666246b77b57ec97f6cccfaf339c644659664d85c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7a5857552aa1339fd1907b2666246b77b57ec97f6cccfaf339c644659664d85c/userdata/shm major:0 minor:119 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7bcc330c034a7032e8bd43ea29408b50fdad12339c2d89f6fc2a01fc9d43af95/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7bcc330c034a7032e8bd43ea29408b50fdad12339c2d89f6fc2a01fc9d43af95/userdata/shm major:0 minor:263 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7f9bd3b95fa9a96d599ef5d38ab2c65bfd39d0c75616669dcd2a59a811c0de79/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7f9bd3b95fa9a96d599ef5d38ab2c65bfd39d0c75616669dcd2a59a811c0de79/userdata/shm major:0 minor:258 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/813c8ed04b18f307078b38a00cf3865fc1feedea034a383e0342d8429ae20e6b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/813c8ed04b18f307078b38a00cf3865fc1feedea034a383e0342d8429ae20e6b/userdata/shm major:0 minor:252 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8493b96f9e2317bb2258ca024aff023f604de77234681da55a05bccbc932bc9a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8493b96f9e2317bb2258ca024aff023f604de77234681da55a05bccbc932bc9a/userdata/shm major:0 minor:254 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/8f1055f3dc7c655a333a3fa311c8f94b2ceda0b473d7673f490a6875c1158919/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8f1055f3dc7c655a333a3fa311c8f94b2ceda0b473d7673f490a6875c1158919/userdata/shm major:0 minor:260 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/90c63e0b66f405ad9ba1342c113ed69565fb8227cabd7f3b8504079a44ce002c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/90c63e0b66f405ad9ba1342c113ed69565fb8227cabd7f3b8504079a44ce002c/userdata/shm major:0 minor:262 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/96c247b918e2a9450964a3ea1162342c6ccc7c2330777e8d76f1128e74c9ecae/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/96c247b918e2a9450964a3ea1162342c6ccc7c2330777e8d76f1128e74c9ecae/userdata/shm major:0 minor:129 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/99e304c6af03a3e08278f5797ee6f99e79aaff1289a963780c8099a86643591f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/99e304c6af03a3e08278f5797ee6f99e79aaff1289a963780c8099a86643591f/userdata/shm major:0 minor:114 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b24cb8b6e833d760382f41e5306d191f11027b327de5f975b19e63833c3ea28b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b24cb8b6e833d760382f41e5306d191f11027b327de5f975b19e63833c3ea28b/userdata/shm major:0 minor:92 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b3e77cc21c0092533e2573fd7bc828eb1f314192461aa4ad0d7a1a79afb2a5b9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b3e77cc21c0092533e2573fd7bc828eb1f314192461aa4ad0d7a1a79afb2a5b9/userdata/shm major:0 minor:128 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/c4b3dad7b177ddc417477ab1f0d5f78969f5ec394aa11addfa7a3ce44aa14aed/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c4b3dad7b177ddc417477ab1f0d5f78969f5ec394aa11addfa7a3ce44aa14aed/userdata/shm major:0 minor:41 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c70a49bdf7ce76b550fe89e6bb326288e459f3c83c699e27a995807b0355a90e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c70a49bdf7ce76b550fe89e6bb326288e459f3c83c699e27a995807b0355a90e/userdata/shm major:0 minor:142 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c9dc377ca2fdac8594f81d6df8e7c069a1b5189bee06d288ed063183ce36a834/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c9dc377ca2fdac8594f81d6df8e7c069a1b5189bee06d288ed063183ce36a834/userdata/shm major:0 minor:270 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d2f5b57940c224986a9226bf1c006a72c2663c4293ddb4cdc327ea534c8cbcb7/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d2f5b57940c224986a9226bf1c006a72c2663c4293ddb4cdc327ea534c8cbcb7/userdata/shm major:0 minor:54 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e4c3df22ea5b25cdb4fb25d7746e4d1c319e0fa007db70463be2670c88f00662/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e4c3df22ea5b25cdb4fb25d7746e4d1c319e0fa007db70463be2670c88f00662/userdata/shm major:0 minor:275 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f37ac8237d1707faf128fbd37cb4fc4383ed09260c056f6f33db8e0a42308015/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f37ac8237d1707faf128fbd37cb4fc4383ed09260c056f6f33db8e0a42308015/userdata/shm major:0 minor:267 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/03f4bafb-c270-428a-bacf-8a424b3d1a05/volumes/kubernetes.io~projected/kube-api-access-pfdxc:{mountpoint:/var/lib/kubelet/pods/03f4bafb-c270-428a-bacf-8a424b3d1a05/volumes/kubernetes.io~projected/kube-api-access-pfdxc major:0 minor:231 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0e52cbdc-1d46-4cc9-85ee-535aa449992f/volumes/kubernetes.io~projected/kube-api-access-xqkqn:{mountpoint:/var/lib/kubelet/pods/0e52cbdc-1d46-4cc9-85ee-535aa449992f/volumes/kubernetes.io~projected/kube-api-access-xqkqn major:0 minor:274 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1abf904b-0b8d-4d61-8231-0e8d00933192/volumes/kubernetes.io~projected/kube-api-access-dbdd4:{mountpoint:/var/lib/kubelet/pods/1abf904b-0b8d-4d61-8231-0e8d00933192/volumes/kubernetes.io~projected/kube-api-access-dbdd4 major:0 minor:251 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2b1a69b5-c946-495d-ae02-c56f788279e8/volumes/kubernetes.io~projected/kube-api-access-chnhh:{mountpoint:/var/lib/kubelet/pods/2b1a69b5-c946-495d-ae02-c56f788279e8/volumes/kubernetes.io~projected/kube-api-access-chnhh major:0 minor:247 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2b1a69b5-c946-495d-ae02-c56f788279e8/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/2b1a69b5-c946-495d-ae02-c56f788279e8/volumes/kubernetes.io~secret/serving-cert major:0 minor:221 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f/volumes/kubernetes.io~projected/kube-api-access-d5knc:{mountpoint:/var/lib/kubelet/pods/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f/volumes/kubernetes.io~projected/kube-api-access-d5knc major:0 minor:244 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f/volumes/kubernetes.io~secret/etcd-client major:0 minor:218 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f/volumes/kubernetes.io~secret/serving-cert major:0 minor:223 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/32c19760-2cb2-4690-be8e-cba3c517c60e/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/32c19760-2cb2-4690-be8e-cba3c517c60e/volumes/kubernetes.io~projected/kube-api-access major:0 minor:102 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/365dc4ac-fbc8-4589-a799-8327b3ebd0a5/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/365dc4ac-fbc8-4589-a799-8327b3ebd0a5/volumes/kubernetes.io~projected/kube-api-access major:0 minor:245 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/365dc4ac-fbc8-4589-a799-8327b3ebd0a5/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/365dc4ac-fbc8-4589-a799-8327b3ebd0a5/volumes/kubernetes.io~secret/serving-cert major:0 minor:216 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab/volumes/kubernetes.io~projected/kube-api-access-44jml:{mountpoint:/var/lib/kubelet/pods/3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab/volumes/kubernetes.io~projected/kube-api-access-44jml major:0 minor:250 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab/volumes/kubernetes.io~secret/serving-cert major:0 minor:217 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3fee96d7-75a7-46e4-9707-7bd292f10b84/volumes/kubernetes.io~projected/kube-api-access-ntks9:{mountpoint:/var/lib/kubelet/pods/3fee96d7-75a7-46e4-9707-7bd292f10b84/volumes/kubernetes.io~projected/kube-api-access-ntks9 major:0 minor:125 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/3fee96d7-75a7-46e4-9707-7bd292f10b84/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert:{mountpoint:/var/lib/kubelet/pods/3fee96d7-75a7-46e4-9707-7bd292f10b84/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert major:0 minor:124 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4ad37f40-c533-4a1e-882a-2e0973eff86d/volumes/kubernetes.io~projected/kube-api-access-6wrq9:{mountpoint:/var/lib/kubelet/pods/4ad37f40-c533-4a1e-882a-2e0973eff86d/volumes/kubernetes.io~projected/kube-api-access-6wrq9 major:0 minor:239 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:238 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b/volumes/kubernetes.io~projected/kube-api-access-z9l64:{mountpoint:/var/lib/kubelet/pods/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b/volumes/kubernetes.io~projected/kube-api-access-z9l64 major:0 minor:235 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/58333089-2456-4a25-8ba7-6d557eefa177/volumes/kubernetes.io~projected/kube-api-access-hhckc:{mountpoint:/var/lib/kubelet/pods/58333089-2456-4a25-8ba7-6d557eefa177/volumes/kubernetes.io~projected/kube-api-access-hhckc major:0 minor:242 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/58333089-2456-4a25-8ba7-6d557eefa177/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/58333089-2456-4a25-8ba7-6d557eefa177/volumes/kubernetes.io~secret/serving-cert major:0 minor:225 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5cf5a2ef-2498-40a0-a189-0753076fd3b6/volumes/kubernetes.io~projected/kube-api-access-k88m9:{mountpoint:/var/lib/kubelet/pods/5cf5a2ef-2498-40a0-a189-0753076fd3b6/volumes/kubernetes.io~projected/kube-api-access-k88m9 major:0 minor:246 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/6999cf38-e317-4727-98c9-d4e348e9e16a/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/6999cf38-e317-4727-98c9-d4e348e9e16a/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:229 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6999cf38-e317-4727-98c9-d4e348e9e16a/volumes/kubernetes.io~projected/kube-api-access-pwsqr:{mountpoint:/var/lib/kubelet/pods/6999cf38-e317-4727-98c9-d4e348e9e16a/volumes/kubernetes.io~projected/kube-api-access-pwsqr major:0 minor:248 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6d770808-d390-41c1-a9d9-fc12b99fa9a9/volumes/kubernetes.io~projected/kube-api-access-6rfqt:{mountpoint:/var/lib/kubelet/pods/6d770808-d390-41c1-a9d9-fc12b99fa9a9/volumes/kubernetes.io~projected/kube-api-access-6rfqt major:0 minor:233 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7ad8b9ea-ba1c-4507-9b70-ce2da170d480/volumes/kubernetes.io~projected/kube-api-access-bxk5x:{mountpoint:/var/lib/kubelet/pods/7ad8b9ea-ba1c-4507-9b70-ce2da170d480/volumes/kubernetes.io~projected/kube-api-access-bxk5x major:0 minor:118 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7da68e85-9170-499d-8050-139ecfac4600/volumes/kubernetes.io~projected/kube-api-access-bg5d9:{mountpoint:/var/lib/kubelet/pods/7da68e85-9170-499d-8050-139ecfac4600/volumes/kubernetes.io~projected/kube-api-access-bg5d9 major:0 minor:105 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/815fd565-0609-4d8f-ac05-8656f198b008/volumes/kubernetes.io~projected/kube-api-access-sh6nz:{mountpoint:/var/lib/kubelet/pods/815fd565-0609-4d8f-ac05-8656f198b008/volumes/kubernetes.io~projected/kube-api-access-sh6nz major:0 minor:123 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2/volumes/kubernetes.io~projected/kube-api-access-2f9kl:{mountpoint:/var/lib/kubelet/pods/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2/volumes/kubernetes.io~projected/kube-api-access-2f9kl major:0 minor:240 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/ac523956-c8a3-4794-a1fa-660cd14966bb/volumes/kubernetes.io~projected/kube-api-access-wjcjb:{mountpoint:/var/lib/kubelet/pods/ac523956-c8a3-4794-a1fa-660cd14966bb/volumes/kubernetes.io~projected/kube-api-access-wjcjb major:0 minor:234 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ac523956-c8a3-4794-a1fa-660cd14966bb/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/ac523956-c8a3-4794-a1fa-660cd14966bb/volumes/kubernetes.io~secret/serving-cert major:0 minor:222 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/af391724-079a-4bac-a89e-978ffd471763/volumes/kubernetes.io~projected/kube-api-access-gkl4m:{mountpoint:/var/lib/kubelet/pods/af391724-079a-4bac-a89e-978ffd471763/volumes/kubernetes.io~projected/kube-api-access-gkl4m major:0 minor:138 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/af391724-079a-4bac-a89e-978ffd471763/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/af391724-079a-4bac-a89e-978ffd471763/volumes/kubernetes.io~secret/webhook-cert major:0 minor:139 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b100ce12-965e-409e-8cdb-8f99ef51a82b/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/b100ce12-965e-409e-8cdb-8f99ef51a82b/volumes/kubernetes.io~projected/kube-api-access major:0 minor:232 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b100ce12-965e-409e-8cdb-8f99ef51a82b/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/b100ce12-965e-409e-8cdb-8f99ef51a82b/volumes/kubernetes.io~secret/serving-cert major:0 minor:220 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b94acad3-cf4e-443d-80fb-5e68a4074336/volumes/kubernetes.io~projected/kube-api-access-7tml5:{mountpoint:/var/lib/kubelet/pods/b94acad3-cf4e-443d-80fb-5e68a4074336/volumes/kubernetes.io~projected/kube-api-access-7tml5 major:0 minor:237 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/c1abfb79-2c86-4ccb-bf91-7c48ad8c78d8/volumes/kubernetes.io~projected/kube-api-access-5q6hn:{mountpoint:/var/lib/kubelet/pods/c1abfb79-2c86-4ccb-bf91-7c48ad8c78d8/volumes/kubernetes.io~projected/kube-api-access-5q6hn major:0 minor:243 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c2ce2ea7-bd25-4294-8f3a-11ce53577830/volumes/kubernetes.io~projected/kube-api-access-9qpkj:{mountpoint:/var/lib/kubelet/pods/c2ce2ea7-bd25-4294-8f3a-11ce53577830/volumes/kubernetes.io~projected/kube-api-access-9qpkj major:0 minor:236 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c2ce2ea7-bd25-4294-8f3a-11ce53577830/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/c2ce2ea7-bd25-4294-8f3a-11ce53577830/volumes/kubernetes.io~secret/serving-cert major:0 minor:226 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac/volumes/kubernetes.io~projected/kube-api-access-fljc9:{mountpoint:/var/lib/kubelet/pods/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac/volumes/kubernetes.io~projected/kube-api-access-fljc9 major:0 minor:230 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/db164b32-e20e-4d07-a9ae-98720321621d/volumes/kubernetes.io~projected/kube-api-access-89wj5:{mountpoint:/var/lib/kubelet/pods/db164b32-e20e-4d07-a9ae-98720321621d/volumes/kubernetes.io~projected/kube-api-access-89wj5 major:0 minor:249 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/db164b32-e20e-4d07-a9ae-98720321621d/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/db164b32-e20e-4d07-a9ae-98720321621d/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert major:0 minor:224 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e76bc134-2a88-4f92-9aa7-f6854941b98f/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/e76bc134-2a88-4f92-9aa7-f6854941b98f/volumes/kubernetes.io~projected/kube-api-access major:0 minor:241 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/e76bc134-2a88-4f92-9aa7-f6854941b98f/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/e76bc134-2a88-4f92-9aa7-f6854941b98f/volumes/kubernetes.io~secret/serving-cert major:0 minor:219 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ec2d22f2-c260-42a6-a9da-ee0f44f42303/volumes/kubernetes.io~projected/kube-api-access-xlzcz:{mountpoint:/var/lib/kubelet/pods/ec2d22f2-c260-42a6-a9da-ee0f44f42303/volumes/kubernetes.io~projected/kube-api-access-xlzcz major:0 minor:91 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ec2d22f2-c260-42a6-a9da-ee0f44f42303/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/ec2d22f2-c260-42a6-a9da-ee0f44f42303/volumes/kubernetes.io~secret/metrics-tls major:0 minor:43 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ef0a3c84-98bb-4915-9010-d66fcbeafe09/volumes/kubernetes.io~projected/kube-api-access-8fstf:{mountpoint:/var/lib/kubelet/pods/ef0a3c84-98bb-4915-9010-d66fcbeafe09/volumes/kubernetes.io~projected/kube-api-access-8fstf major:0 minor:213 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ef0a3c84-98bb-4915-9010-d66fcbeafe09/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/ef0a3c84-98bb-4915-9010-d66fcbeafe09/volumes/kubernetes.io~secret/serving-cert major:0 minor:209 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9/volume-subpaths/run-systemd/ovnkube-controller/6:{mountpoint:/var/lib/kubelet/pods/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9/volume-subpaths/run-systemd/ovnkube-controller/6 major:0 minor:24 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9/volumes/kubernetes.io~projected/kube-api-access-s99rr:{mountpoint:/var/lib/kubelet/pods/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9/volumes/kubernetes.io~projected/kube-api-access-s99rr major:0 minor:127 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9/volumes/kubernetes.io~secret/ovn-node-metrics-cert:{mountpoint:/var/lib/kubelet/pods/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9/volumes/kubernetes.io~secret/ovn-node-metrics-cert major:0 minor:126 fsType:tmpfs blockSize:0} overlay_0-100:{mountpoint:/var/lib/containers/storage/overlay/1d6841e836f2ca030fbe4707f0c45b8976755ac6da9c8ab6125ae7b4f50b5571/merged major:0 minor:100 fsType:overlay blockSize:0} overlay_0-103:{mountpoint:/var/lib/containers/storage/overlay/40c14ef85373f3e6ce41db46b340eabdd8632d207053a144365fea1467e9f497/merged major:0 minor:103 fsType:overlay blockSize:0} overlay_0-108:{mountpoint:/var/lib/containers/storage/overlay/a69aff8900c82d8b87a54052ebd3ddd50adfa0fd532c1bf47d9993b17358ac71/merged major:0 minor:108 fsType:overlay blockSize:0} overlay_0-116:{mountpoint:/var/lib/containers/storage/overlay/6caac3bac4436bbd19a3fbc3a9e0414eba6b0e83129413cf63f2bcbd3a0b0c12/merged major:0 minor:116 fsType:overlay blockSize:0} overlay_0-121:{mountpoint:/var/lib/containers/storage/overlay/7f20b97a50fe58de9ca60734ae0f68a9d31077e4cdad0e3f8fd5762d0d7acd59/merged major:0 minor:121 fsType:overlay blockSize:0} overlay_0-131:{mountpoint:/var/lib/containers/storage/overlay/ee5ac5b49b2fc01de6e2817953974019ef102832ee570cf6a8907eaac81a7257/merged major:0 minor:131 fsType:overlay blockSize:0} overlay_0-134:{mountpoint:/var/lib/containers/storage/overlay/a9606da570bf7bc470010daacf00c94358ed6a5c0a3dabc879817fe291b847ae/merged major:0 minor:134 fsType:overlay blockSize:0} overlay_0-136:{mountpoint:/var/lib/containers/storage/overlay/2fcdc0fba11171fc0a45e0fc3205a3c6480a96ae9c0b723b0e8276e1fdaa3550/merged major:0 minor:136 fsType:overlay blockSize:0} overlay_0-140:{mountpoint:/var/lib/containers/storage/overlay/493d1ebd3aa61321335aecdb8f7621ba334eca73215c1b476883adf33c90ca3f/merged major:0 minor:140 fsType:overlay blockSize:0} 
overlay_0-150:{mountpoint:/var/lib/containers/storage/overlay/f351e59d29694c1c618aa8fabfc5ca5dcdd2abf60e9cbc42f8b30e186ab1aa4f/merged major:0 minor:150 fsType:overlay blockSize:0} overlay_0-152:{mountpoint:/var/lib/containers/storage/overlay/ba5199ce4b7bea885870c83f6247d7990b382a3d3758f9291d5c3cd1e836ae93/merged major:0 minor:152 fsType:overlay blockSize:0} overlay_0-154:{mountpoint:/var/lib/containers/storage/overlay/76a75e52e32a86bc6b0cbe99f5756ea92bfcdc810601d4d2f2df8563231b1d0a/merged major:0 minor:154 fsType:overlay blockSize:0} overlay_0-156:{mountpoint:/var/lib/containers/storage/overlay/6ca277f9f39154392193b18ea0e2f89bd8df5cf180b9c411a3c98050a99f4ab4/merged major:0 minor:156 fsType:overlay blockSize:0} overlay_0-161:{mountpoint:/var/lib/containers/storage/overlay/2546d853e70dbb730e362dce63d0be7451df03183d86d74f09cee19c678cc782/merged major:0 minor:161 fsType:overlay blockSize:0} overlay_0-166:{mountpoint:/var/lib/containers/storage/overlay/1eaa7a0ac67d53479f098279e439d1e633f088d68966ad30c6611190e02057de/merged major:0 minor:166 fsType:overlay blockSize:0} overlay_0-174:{mountpoint:/var/lib/containers/storage/overlay/336b4e6785d0623172d86118f5404257b04a82615d08723fe286f1ac97a9069b/merged major:0 minor:174 fsType:overlay blockSize:0} overlay_0-179:{mountpoint:/var/lib/containers/storage/overlay/82c150753f44271a108ef61377e9af687118c4096ecd307c4461ec71abf50940/merged major:0 minor:179 fsType:overlay blockSize:0} overlay_0-184:{mountpoint:/var/lib/containers/storage/overlay/009a95d07e5675e796b37311f4e796f76ab20edd6ddead69ce2b5b6be06caba0/merged major:0 minor:184 fsType:overlay blockSize:0} overlay_0-189:{mountpoint:/var/lib/containers/storage/overlay/8dcbe071fe2e30ad6d3b67d17d46f78d6dca145a1a04aeae6d8bb8a229095b49/merged major:0 minor:189 fsType:overlay blockSize:0} overlay_0-194:{mountpoint:/var/lib/containers/storage/overlay/c724356aca1e8f5f9da95f0308e28e74519a2ddae2c7d654e2272b7233ab3a5b/merged major:0 minor:194 fsType:overlay blockSize:0} 
overlay_0-195:{mountpoint:/var/lib/containers/storage/overlay/3fafcdb8a90b24c3ecf9e183b62d1e882b4a782547351a2bb13ae53b5948d5b1/merged major:0 minor:195 fsType:overlay blockSize:0} overlay_0-204:{mountpoint:/var/lib/containers/storage/overlay/e4e97cf8e78293e4e1517301c44179463e10ad30a2615838979215e46ca4ad31/merged major:0 minor:204 fsType:overlay blockSize:0} overlay_0-227:{mountpoint:/var/lib/containers/storage/overlay/8c81f9d4cc7a8b8b9d0ce5e446ba4ef8f5ead28044ad67f0c0487ebcc831a7d0/merged major:0 minor:227 fsType:overlay blockSize:0} overlay_0-277:{mountpoint:/var/lib/containers/storage/overlay/590371c6acd7d9b35e6940c1fa3e224ea8b1f2415b86492f8a0c210bda289471/merged major:0 minor:277 fsType:overlay blockSize:0} overlay_0-279:{mountpoint:/var/lib/containers/storage/overlay/78fdfaad447a4c8cc0c1ab588df9cba56b432bf7f456c61db1904eee8cf7f5cf/merged major:0 minor:279 fsType:overlay blockSize:0} overlay_0-281:{mountpoint:/var/lib/containers/storage/overlay/898e9d7a038ebbdd876055c07b3313cdd824a48f79e70c24729b49592fe24cb9/merged major:0 minor:281 fsType:overlay blockSize:0} overlay_0-283:{mountpoint:/var/lib/containers/storage/overlay/0e9df9befa6dcc6396f6e84e5453bc78694609435cc9b5eb0bb5704a9aa677c0/merged major:0 minor:283 fsType:overlay blockSize:0} overlay_0-285:{mountpoint:/var/lib/containers/storage/overlay/96b17ea955812c320c4082272f8615ba756ef2a8a8613b82a765b99a568e4f33/merged major:0 minor:285 fsType:overlay blockSize:0} overlay_0-287:{mountpoint:/var/lib/containers/storage/overlay/848976b7e3bce10fd05acc59b0503244207440508a5fa13b758d58ebe94d7769/merged major:0 minor:287 fsType:overlay blockSize:0} overlay_0-289:{mountpoint:/var/lib/containers/storage/overlay/f0f93abd77eaed83efd65183a638b4dd15c4482efd8f9622c17ce1c8be9e3b27/merged major:0 minor:289 fsType:overlay blockSize:0} overlay_0-291:{mountpoint:/var/lib/containers/storage/overlay/e79c5ea5da9f2aafa3183e940ba72e1dd5ab17a148fc1429a23441784411b53c/merged major:0 minor:291 fsType:overlay blockSize:0} 
overlay_0-293:{mountpoint:/var/lib/containers/storage/overlay/ef13e7cfc28e06cd979113de900ddabc3d40fbbba7bb273ede1c73e6c2274db4/merged major:0 minor:293 fsType:overlay blockSize:0} overlay_0-295:{mountpoint:/var/lib/containers/storage/overlay/6ac13c9cbbec1989d5b7fed65fb097ef697af8ace91a1156831ac485a42b80fa/merged major:0 minor:295 fsType:overlay blockSize:0} overlay_0-297:{mountpoint:/var/lib/containers/storage/overlay/c535dc0b54f1b137c93969c2de24f6b3c2fa73b5094dc4442b7994052d9fb86c/merged major:0 minor:297 fsType:overlay blockSize:0} overlay_0-299:{mountpoint:/var/lib/containers/storage/overlay/9789645af9097533eece7d492c39db240e9660b27d08c1bfcb97828f272b7f21/merged major:0 minor:299 fsType:overlay blockSize:0} overlay_0-301:{mountpoint:/var/lib/containers/storage/overlay/f627aef58559b922600e12535b97f51238ee25d8e3921e9f221b9bc79d55bf40/merged major:0 minor:301 fsType:overlay blockSize:0} overlay_0-44:{mountpoint:/var/lib/containers/storage/overlay/f797e90aaf92e52afe82bbd0c41eb56461ff17b12299db05412b540544e632b9/merged major:0 minor:44 fsType:overlay blockSize:0} overlay_0-49:{mountpoint:/var/lib/containers/storage/overlay/7fa00ef0225d6452471f1deefae4c909afc7f89384f4478e66474b840df6c7cb/merged major:0 minor:49 fsType:overlay blockSize:0} overlay_0-52:{mountpoint:/var/lib/containers/storage/overlay/7633a69451e9d0ac39f306572615092b330eeee6a42acac8ffd439e6adfa81e1/merged major:0 minor:52 fsType:overlay blockSize:0} overlay_0-56:{mountpoint:/var/lib/containers/storage/overlay/81ea989874c5c8892e3ba4561df71c80f136cdf164486f7d563d739846d46f55/merged major:0 minor:56 fsType:overlay blockSize:0} overlay_0-60:{mountpoint:/var/lib/containers/storage/overlay/693ff179785d8bcb514b5cce1cd8fa72974066f2a2ebce268e2417e84a82d330/merged major:0 minor:60 fsType:overlay blockSize:0} overlay_0-62:{mountpoint:/var/lib/containers/storage/overlay/bd9f0085ae8870d81acc30e1b5e9329ebb06323cb36e0de994840984073333c1/merged major:0 minor:62 fsType:overlay blockSize:0} 
overlay_0-64:{mountpoint:/var/lib/containers/storage/overlay/e7dfa3024eecc60a4fc4b9dafb71c61d8bc907611fd30d7a939138a263e7372e/merged major:0 minor:64 fsType:overlay blockSize:0} overlay_0-66:{mountpoint:/var/lib/containers/storage/overlay/94c1a408079aa3ac4702c10e8ab07a97f7c3dafa74cf1c635cfabf7901139199/merged major:0 minor:66 fsType:overlay blockSize:0} overlay_0-67:{mountpoint:/var/lib/containers/storage/overlay/95537e882aaaa1d98d0b3cbe5a3e3dda8b5df7f09f7cdcd4dd1369d95e00e114/merged major:0 minor:67 fsType:overlay blockSize:0} overlay_0-69:{mountpoint:/var/lib/containers/storage/overlay/70b4fa37546e820aac8e98dd540afb46eb5d6098e7d868db5a903b61f64c6a9c/merged major:0 minor:69 fsType:overlay blockSize:0} overlay_0-82:{mountpoint:/var/lib/containers/storage/overlay/bbfc423ac6ab7da163cca65130c7888634de291dcc4b3016162a735e3100ea0d/merged major:0 minor:82 fsType:overlay blockSize:0} overlay_0-89:{mountpoint:/var/lib/containers/storage/overlay/f777c0d2d60e518d1f7ee023e74f5d3ec42d3b6918e6ecd00e91dfdb0d25a03b/merged major:0 minor:89 fsType:overlay blockSize:0}] Mar 08 00:21:23.833632 master-0 kubenswrapper[7479]: I0308 00:21:23.832728 7479 manager.go:217] Machine: {Timestamp:2026-03-08 00:21:23.831444447 +0000 UTC m=+0.144353384 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:3fb2a1568fb24853b5e4190e9ed87031 SystemUUID:3fb2a156-8fb2-4853-b5e4-190e9ed87031 BootID:ae637101-d6c8-4837-b1bb-2909ed5c1c9d Filesystems:[{Device:overlay_0-227 DeviceMajor:0 DeviceMinor:227 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c2ce2ea7-bd25-4294-8f3a-11ce53577830/volumes/kubernetes.io~projected/kube-api-access-9qpkj DeviceMajor:0 DeviceMinor:236 Capacity:32475529216 Type:vfs Inodes:4108170 
HasInodes:true} {Device:/var/lib/kubelet/pods/0e52cbdc-1d46-4cc9-85ee-535aa449992f/volumes/kubernetes.io~projected/kube-api-access-xqkqn DeviceMajor:0 DeviceMinor:274 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/32c19760-2cb2-4690-be8e-cba3c517c60e/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:102 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/7da68e85-9170-499d-8050-139ecfac4600/volumes/kubernetes.io~projected/kube-api-access-bg5d9 DeviceMajor:0 DeviceMinor:105 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-154 DeviceMajor:0 DeviceMinor:154 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3c8994f66c1270da68fac1ff2499afd806b950d0568c9f85327b0714473db68c/userdata/shm DeviceMajor:0 DeviceMinor:214 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/1abf904b-0b8d-4d61-8231-0e8d00933192/volumes/kubernetes.io~projected/kube-api-access-dbdd4 DeviceMajor:0 DeviceMinor:251 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/16a0ef8737c1e2416e14cc076fc6b1d7ef645b2043e268561b096173dd7a6b0e/userdata/shm DeviceMajor:0 DeviceMinor:266 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-56 DeviceMajor:0 DeviceMinor:56 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/db164b32-e20e-4d07-a9ae-98720321621d/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert DeviceMajor:0 DeviceMinor:224 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-293 DeviceMajor:0 DeviceMinor:293 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-299 DeviceMajor:0 DeviceMinor:299 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/0bdf70a6acef734c900a623db8a8cd37b2a2e6c50fe84f9293c0fc0c5705c71d/userdata/shm DeviceMajor:0 DeviceMinor:58 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-121 DeviceMajor:0 DeviceMinor:121 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b3e77cc21c0092533e2573fd7bc828eb1f314192461aa4ad0d7a1a79afb2a5b9/userdata/shm DeviceMajor:0 DeviceMinor:128 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/b100ce12-965e-409e-8cdb-8f99ef51a82b/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:232 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8493b96f9e2317bb2258ca024aff023f604de77234681da55a05bccbc932bc9a/userdata/shm DeviceMajor:0 DeviceMinor:254 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-179 DeviceMajor:0 DeviceMinor:179 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ef0a3c84-98bb-4915-9010-d66fcbeafe09/volumes/kubernetes.io~projected/kube-api-access-8fstf DeviceMajor:0 DeviceMinor:213 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-116 DeviceMajor:0 DeviceMinor:116 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b100ce12-965e-409e-8cdb-8f99ef51a82b/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:220 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ac523956-c8a3-4794-a1fa-660cd14966bb/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:222 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/5cf5a2ef-2498-40a0-a189-0753076fd3b6/volumes/kubernetes.io~projected/kube-api-access-k88m9 DeviceMajor:0 DeviceMinor:246 Capacity:32475529216 Type:vfs 
Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7f9bd3b95fa9a96d599ef5d38ab2c65bfd39d0c75616669dcd2a59a811c0de79/userdata/shm DeviceMajor:0 DeviceMinor:258 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/733e43352408d7f83022f1e2789901cb1e3830089ecad3dc5ac2ffbae10f60ad/userdata/shm DeviceMajor:0 DeviceMinor:272 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c4b3dad7b177ddc417477ab1f0d5f78969f5ec394aa11addfa7a3ce44aa14aed/userdata/shm DeviceMajor:0 DeviceMinor:41 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b24cb8b6e833d760382f41e5306d191f11027b327de5f975b19e63833c3ea28b/userdata/shm DeviceMajor:0 DeviceMinor:92 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-100 DeviceMajor:0 DeviceMinor:100 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-136 DeviceMajor:0 DeviceMinor:136 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e76bc134-2a88-4f92-9aa7-f6854941b98f/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:219 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ac523956-c8a3-4794-a1fa-660cd14966bb/volumes/kubernetes.io~projected/kube-api-access-wjcjb DeviceMajor:0 DeviceMinor:234 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e4c3df22ea5b25cdb4fb25d7746e4d1c319e0fa007db70463be2670c88f00662/userdata/shm DeviceMajor:0 DeviceMinor:275 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/99e304c6af03a3e08278f5797ee6f99e79aaff1289a963780c8099a86643591f/userdata/shm DeviceMajor:0 DeviceMinor:114 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-166 
DeviceMajor:0 DeviceMinor:166 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-174 DeviceMajor:0 DeviceMinor:174 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-184 DeviceMajor:0 DeviceMinor:184 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-195 DeviceMajor:0 DeviceMinor:195 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/7ad8b9ea-ba1c-4507-9b70-ce2da170d480/volumes/kubernetes.io~projected/kube-api-access-bxk5x DeviceMajor:0 DeviceMinor:118 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/96c247b918e2a9450964a3ea1162342c6ccc7c2330777e8d76f1128e74c9ecae/userdata/shm DeviceMajor:0 DeviceMinor:129 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/365dc4ac-fbc8-4589-a799-8327b3ebd0a5/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:216 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/813c8ed04b18f307078b38a00cf3865fc1feedea034a383e0342d8429ae20e6b/userdata/shm DeviceMajor:0 DeviceMinor:252 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-279 DeviceMajor:0 DeviceMinor:279 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-295 DeviceMajor:0 DeviceMinor:295 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4da316e5c8941b4baace90ce20646816051133ec406a841a63f02453e48ca25a/userdata/shm DeviceMajor:0 DeviceMinor:47 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ec2d22f2-c260-42a6-a9da-ee0f44f42303/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:43 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:218 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/c2ce2ea7-bd25-4294-8f3a-11ce53577830/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:226 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/4ad37f40-c533-4a1e-882a-2e0973eff86d/volumes/kubernetes.io~projected/kube-api-access-6wrq9 DeviceMajor:0 DeviceMinor:239 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-277 DeviceMajor:0 DeviceMinor:277 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-49 DeviceMajor:0 DeviceMinor:49 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-60 DeviceMajor:0 DeviceMinor:60 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-204 DeviceMajor:0 DeviceMinor:204 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:238 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2/volumes/kubernetes.io~projected/kube-api-access-2f9kl DeviceMajor:0 DeviceMinor:240 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/db164b32-e20e-4d07-a9ae-98720321621d/volumes/kubernetes.io~projected/kube-api-access-89wj5 DeviceMajor:0 DeviceMinor:249 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-67 DeviceMajor:0 DeviceMinor:67 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-134 DeviceMajor:0 DeviceMinor:134 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/6999cf38-e317-4727-98c9-d4e348e9e16a/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:229 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/233074eccbbd3406930dc094592b256b0710cbbbba4d96b37f6401353d1f1651/userdata/shm DeviceMajor:0 DeviceMinor:46 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ef0a3c84-98bb-4915-9010-d66fcbeafe09/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:209 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac/volumes/kubernetes.io~projected/kube-api-access-fljc9 DeviceMajor:0 DeviceMinor:230 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f37ac8237d1707faf128fbd37cb4fc4383ed09260c056f6f33db8e0a42308015/userdata/shm DeviceMajor:0 DeviceMinor:267 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c9dc377ca2fdac8594f81d6df8e7c069a1b5189bee06d288ed063183ce36a834/userdata/shm DeviceMajor:0 DeviceMinor:270 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-283 DeviceMajor:0 DeviceMinor:283 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/90c63e0b66f405ad9ba1342c113ed69565fb8227cabd7f3b8504079a44ce002c/userdata/shm DeviceMajor:0 DeviceMinor:262 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:overlay_0-131 DeviceMajor:0 DeviceMinor:131 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6d770808-d390-41c1-a9d9-fc12b99fa9a9/volumes/kubernetes.io~projected/kube-api-access-6rfqt DeviceMajor:0 DeviceMinor:233 Capacity:32475529216 Type:vfs 
Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/c1abfb79-2c86-4ccb-bf91-7c48ad8c78d8/volumes/kubernetes.io~projected/kube-api-access-5q6hn DeviceMajor:0 DeviceMinor:243 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/6999cf38-e317-4727-98c9-d4e348e9e16a/volumes/kubernetes.io~projected/kube-api-access-pwsqr DeviceMajor:0 DeviceMinor:248 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8f1055f3dc7c655a333a3fa311c8f94b2ceda0b473d7673f490a6875c1158919/userdata/shm DeviceMajor:0 DeviceMinor:260 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-291 DeviceMajor:0 DeviceMinor:291 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-69 DeviceMajor:0 DeviceMinor:69 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-194 DeviceMajor:0 DeviceMinor:194 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-44 DeviceMajor:0 DeviceMinor:44 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:223 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/e76bc134-2a88-4f92-9aa7-f6854941b98f/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:241 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/365dc4ac-fbc8-4589-a799-8327b3ebd0a5/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:245 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-66 DeviceMajor:0 DeviceMinor:66 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c70a49bdf7ce76b550fe89e6bb326288e459f3c83c699e27a995807b0355a90e/userdata/shm 
DeviceMajor:0 DeviceMinor:142 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/58333089-2456-4a25-8ba7-6d557eefa177/volumes/kubernetes.io~projected/kube-api-access-hhckc DeviceMajor:0 DeviceMinor:242 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-297 DeviceMajor:0 DeviceMinor:297 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-64 DeviceMajor:0 DeviceMinor:64 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-82 DeviceMajor:0 DeviceMinor:82 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-89 DeviceMajor:0 DeviceMinor:89 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-156 DeviceMajor:0 DeviceMinor:156 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-161 DeviceMajor:0 DeviceMinor:161 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/03f4bafb-c270-428a-bacf-8a424b3d1a05/volumes/kubernetes.io~projected/kube-api-access-pfdxc DeviceMajor:0 DeviceMinor:231 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/31406fc5b2c5472ac716e4c8cdca7909539075e5cc335f68e4b469dfc56a38f1/userdata/shm DeviceMajor:0 DeviceMinor:256 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7bcc330c034a7032e8bd43ea29408b50fdad12339c2d89f6fc2a01fc9d43af95/userdata/shm DeviceMajor:0 DeviceMinor:263 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-52 DeviceMajor:0 DeviceMinor:52 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/815fd565-0609-4d8f-ac05-8656f198b008/volumes/kubernetes.io~projected/kube-api-access-sh6nz DeviceMajor:0 DeviceMinor:123 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-150 DeviceMajor:0 DeviceMinor:150 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-152 DeviceMajor:0 DeviceMinor:152 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/2b1a69b5-c946-495d-ae02-c56f788279e8/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:221 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d2f5b57940c224986a9226bf1c006a72c2663c4293ddb4cdc327ea534c8cbcb7/userdata/shm DeviceMajor:0 DeviceMinor:54 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-140 DeviceMajor:0 DeviceMinor:140 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/af391724-079a-4bac-a89e-978ffd471763/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:139 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f/volumes/kubernetes.io~projected/kube-api-access-d5knc DeviceMajor:0 DeviceMinor:244 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/2b1a69b5-c946-495d-ae02-c56f788279e8/volumes/kubernetes.io~projected/kube-api-access-chnhh DeviceMajor:0 DeviceMinor:247 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:217 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9/volumes/kubernetes.io~projected/kube-api-access-s99rr DeviceMajor:0 DeviceMinor:127 Capacity:32475529216 Type:vfs Inodes:4108170 
HasInodes:true} {Device:/var/lib/kubelet/pods/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9/volume-subpaths/run-systemd/ovnkube-controller/6 DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:overlay_0-301 DeviceMajor:0 DeviceMinor:301 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/var/lib/kubelet/pods/af391724-079a-4bac-a89e-978ffd471763/volumes/kubernetes.io~projected/kube-api-access-gkl4m DeviceMajor:0 DeviceMinor:138 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-289 DeviceMajor:0 DeviceMinor:289 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:overlay_0-103 DeviceMajor:0 DeviceMinor:103 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7a5857552aa1339fd1907b2666246b77b57ec97f6cccfaf339c644659664d85c/userdata/shm DeviceMajor:0 DeviceMinor:119 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/3fee96d7-75a7-46e4-9707-7bd292f10b84/volumes/kubernetes.io~projected/kube-api-access-ntks9 DeviceMajor:0 DeviceMinor:125 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9/volumes/kubernetes.io~secret/ovn-node-metrics-cert DeviceMajor:0 DeviceMinor:126 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab/volumes/kubernetes.io~projected/kube-api-access-44jml DeviceMajor:0 DeviceMinor:250 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-62 DeviceMajor:0 DeviceMinor:62 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/ec2d22f2-c260-42a6-a9da-ee0f44f42303/volumes/kubernetes.io~projected/kube-api-access-xlzcz DeviceMajor:0 DeviceMinor:91 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/3fee96d7-75a7-46e4-9707-7bd292f10b84/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert DeviceMajor:0 DeviceMinor:124 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-281 DeviceMajor:0 DeviceMinor:281 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-189 DeviceMajor:0 DeviceMinor:189 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b/volumes/kubernetes.io~projected/kube-api-access-z9l64 DeviceMajor:0 DeviceMinor:235 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/b94acad3-cf4e-443d-80fb-5e68a4074336/volumes/kubernetes.io~projected/kube-api-access-7tml5 DeviceMajor:0 DeviceMinor:237 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-108 DeviceMajor:0 DeviceMinor:108 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-287 DeviceMajor:0 DeviceMinor:287 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/58333089-2456-4a25-8ba7-6d557eefa177/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:225 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-285 DeviceMajor:0 DeviceMinor:285 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] 
NetworkDevices:[{Name:16a0ef8737c1e24 MacAddress:3e:38:c5:93:a5:16 Speed:10000 Mtu:8900} {Name:31406fc5b2c5472 MacAddress:52:38:09:1b:34:fb Speed:10000 Mtu:8900} {Name:3c8994f66c1270d MacAddress:c2:59:ba:f2:c5:74 Speed:10000 Mtu:8900} {Name:733e43352408d7f MacAddress:0e:2a:f7:48:43:31 Speed:10000 Mtu:8900} {Name:7bcc330c034a703 MacAddress:da:c0:90:17:ee:c0 Speed:10000 Mtu:8900} {Name:7f9bd3b95fa9a96 MacAddress:72:00:fc:c6:87:63 Speed:10000 Mtu:8900} {Name:813c8ed04b18f30 MacAddress:16:21:d9:74:39:8c Speed:10000 Mtu:8900} {Name:8493b96f9e2317b MacAddress:c6:29:de:1d:f9:78 Speed:10000 Mtu:8900} {Name:8f1055f3dc7c655 MacAddress:0e:11:fb:cf:46:63 Speed:10000 Mtu:8900} {Name:90c63e0b66f405a MacAddress:3e:0b:09:42:a5:9d Speed:10000 Mtu:8900} {Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:br-int MacAddress:22:2c:d1:1b:4a:52 Speed:0 Mtu:8900} {Name:c9dc377ca2fdac8 MacAddress:aa:a9:a6:46:20:1d Speed:10000 Mtu:8900} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:0f:fb:26 Speed:-1 Mtu:9000} {Name:eth2 MacAddress:fa:16:3e:73:5d:56 Speed:-1 Mtu:9000} {Name:f37ac8237d1707f MacAddress:a2:23:7e:7e:25:d9 Speed:10000 Mtu:8900} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:80:00:02 Speed:0 Mtu:8900} {Name:ovs-system MacAddress:ee:64:ec:05:bf:ed Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction 
Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: 
DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 08 00:21:23.833889 master-0 kubenswrapper[7479]: I0308 00:21:23.833658 7479 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Mar 08 00:21:23.833889 master-0 kubenswrapper[7479]: I0308 00:21:23.833753 7479 manager.go:233] Version: {KernelVersion:5.14.0-427.111.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202602172219-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 08 00:21:23.834242 master-0 kubenswrapper[7479]: I0308 00:21:23.834167 7479 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 08 00:21:23.834611 master-0 kubenswrapper[7479]: I0308 00:21:23.834565 7479 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 08 00:21:23.835282 master-0 kubenswrapper[7479]: I0308 00:21:23.834642 7479 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 08 00:21:23.835347 master-0 kubenswrapper[7479]: I0308 00:21:23.835337 7479 topology_manager.go:138] "Creating topology manager with none policy"
Mar 08 00:21:23.835376 master-0 kubenswrapper[7479]: I0308 00:21:23.835348 7479 container_manager_linux.go:303] "Creating device plugin manager"
Mar 08 00:21:23.835376 master-0 kubenswrapper[7479]: I0308 00:21:23.835358 7479 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 08 00:21:23.835429 master-0 kubenswrapper[7479]: I0308 00:21:23.835378 7479 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 08 00:21:23.835640 master-0 kubenswrapper[7479]: I0308 00:21:23.835615 7479 state_mem.go:36] "Initialized new in-memory state store"
Mar 08 00:21:23.835802 master-0 kubenswrapper[7479]: I0308 00:21:23.835744 7479 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 08 00:21:23.835944 master-0 kubenswrapper[7479]: I0308 00:21:23.835924 7479 kubelet.go:418] "Attempting to sync node with API server"
Mar 08 00:21:23.835944 master-0 kubenswrapper[7479]: I0308 00:21:23.835941 7479 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 08 00:21:23.836003 master-0 kubenswrapper[7479]: I0308 00:21:23.835955 7479 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 08 00:21:23.836003 master-0 kubenswrapper[7479]: I0308 00:21:23.835966 7479 kubelet.go:324] "Adding apiserver pod source"
Mar 08 00:21:23.836063 master-0 kubenswrapper[7479]: I0308 00:21:23.836022 7479 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 08 00:21:23.837651 master-0 kubenswrapper[7479]: I0308 00:21:23.837614 7479 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-8.rhaos4.18.gitd78977c.el9" apiVersion="v1"
Mar 08 00:21:23.837815 master-0 kubenswrapper[7479]: I0308 00:21:23.837792 7479 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 08 00:21:23.838153 master-0 kubenswrapper[7479]: I0308 00:21:23.838086 7479 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 08 00:21:23.838271 master-0 kubenswrapper[7479]: I0308 00:21:23.838244 7479 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 08 00:21:23.838334 master-0 kubenswrapper[7479]: I0308 00:21:23.838272 7479 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 08 00:21:23.838334 master-0 kubenswrapper[7479]: I0308 00:21:23.838283 7479 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 08 00:21:23.838334 master-0 kubenswrapper[7479]: I0308 00:21:23.838300 7479 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 08 00:21:23.838334 master-0 kubenswrapper[7479]: I0308 00:21:23.838309 7479 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 08 00:21:23.838334 master-0 kubenswrapper[7479]: I0308 00:21:23.838317 7479 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 08 00:21:23.838334 master-0 kubenswrapper[7479]: I0308 00:21:23.838326 7479 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 08 00:21:23.838334 master-0 kubenswrapper[7479]: I0308 00:21:23.838333 7479 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 08 00:21:23.838530 master-0 kubenswrapper[7479]: I0308 00:21:23.838343 7479 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 08 00:21:23.838530 master-0 kubenswrapper[7479]: I0308 00:21:23.838356 7479 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 08 00:21:23.838530 master-0 kubenswrapper[7479]: I0308 00:21:23.838379 7479 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 08 00:21:23.838530 master-0 kubenswrapper[7479]: I0308 00:21:23.838391 7479 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 08 00:21:23.838530 master-0 kubenswrapper[7479]: I0308 00:21:23.838415 7479 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 08 00:21:23.838773 master-0 kubenswrapper[7479]: I0308 00:21:23.838748 7479 server.go:1280] "Started kubelet"
Mar 08 00:21:23.840029 master-0 kubenswrapper[7479]: I0308 00:21:23.839974 7479 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 08 00:21:23.840066 master-0 kubenswrapper[7479]: I0308 00:21:23.840038 7479 server_v1.go:47] "podresources" method="list" useActivePods=true
Mar 08 00:21:23.840264 master-0 kubenswrapper[7479]: I0308 00:21:23.840231 7479 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 08 00:21:23.840330 master-0 kubenswrapper[7479]: I0308 00:21:23.840312 7479 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 08 00:21:23.840325 master-0 systemd[1]: Started Kubernetes Kubelet.
Mar 08 00:21:23.846698 master-0 kubenswrapper[7479]: I0308 00:21:23.846498 7479 server.go:449] "Adding debug handlers to kubelet server"
Mar 08 00:21:23.848865 master-0 kubenswrapper[7479]: I0308 00:21:23.848719 7479 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 08 00:21:23.849579 master-0 kubenswrapper[7479]: I0308 00:21:23.849049 7479 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 08 00:21:23.849579 master-0 kubenswrapper[7479]: I0308 00:21:23.849076 7479 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 08 00:21:23.849679 master-0 kubenswrapper[7479]: I0308 00:21:23.849641 7479 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-09 00:11:49 +0000 UTC, rotation deadline is 2026-03-08 20:19:22.456431933 +0000 UTC
Mar 08 00:21:23.849679 master-0 kubenswrapper[7479]: I0308 00:21:23.849674 7479 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 19h57m58.606759901s for next certificate rotation
Mar 08 00:21:23.850373 master-0 kubenswrapper[7479]: I0308 00:21:23.850344 7479 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 08 00:21:23.850373 master-0 kubenswrapper[7479]: I0308 00:21:23.850363 7479 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 08 00:21:23.850450 master-0 kubenswrapper[7479]: I0308 00:21:23.850402 7479 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 08 00:21:23.850550 master-0 kubenswrapper[7479]: I0308 00:21:23.850527 7479 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Mar 08 00:21:23.851691 master-0 kubenswrapper[7479]: I0308 00:21:23.851664 7479 factory.go:153] Registering CRI-O factory
Mar 08 00:21:23.851691 master-0 kubenswrapper[7479]: I0308 00:21:23.851688 7479 factory.go:221] Registration of the crio container factory successfully
Mar 08 00:21:23.851764 master-0 kubenswrapper[7479]: I0308 00:21:23.851750 7479 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 08 00:21:23.851764 master-0 kubenswrapper[7479]: I0308 00:21:23.851758 7479 factory.go:55] Registering systemd factory
Mar 08 00:21:23.851764 master-0 kubenswrapper[7479]: I0308 00:21:23.851764 7479 factory.go:221] Registration of the systemd container factory successfully
Mar 08 00:21:23.851879 master-0 kubenswrapper[7479]: I0308 00:21:23.851783 7479 factory.go:103] Registering Raw factory
Mar 08 00:21:23.851879 master-0 kubenswrapper[7479]: I0308 00:21:23.851796 7479 manager.go:1196] Started watching for new ooms in manager
Mar 08 00:21:23.852261 master-0 kubenswrapper[7479]: I0308 00:21:23.852238 7479 manager.go:319] Starting recovery of all containers
Mar 08 00:21:23.858632 master-0 kubenswrapper[7479]: I0308 00:21:23.857439 7479 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865076 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e76bc134-2a88-4f92-9aa7-f6854941b98f" volumeName="kubernetes.io/secret/e76bc134-2a88-4f92-9aa7-f6854941b98f-serving-cert" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865131 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2b1a69b5-c946-495d-ae02-c56f788279e8" volumeName="kubernetes.io/empty-dir/2b1a69b5-c946-495d-ae02-c56f788279e8-available-featuregates" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865145 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7ad8b9ea-ba1c-4507-9b70-ce2da170d480" volumeName="kubernetes.io/configmap/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-whereabouts-configmap" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865158 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af391724-079a-4bac-a89e-978ffd471763" volumeName="kubernetes.io/configmap/af391724-079a-4bac-a89e-978ffd471763-ovnkube-identity-cm" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865171 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c2ce2ea7-bd25-4294-8f3a-11ce53577830" volumeName="kubernetes.io/configmap/c2ce2ea7-bd25-4294-8f3a-11ce53577830-config" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865184 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="db164b32-e20e-4d07-a9ae-98720321621d" volumeName="kubernetes.io/empty-dir/db164b32-e20e-4d07-a9ae-98720321621d-operand-assets" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865196 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2fbed2b8-f4c5-4f52-b29c-1907a2034f6f" volumeName="kubernetes.io/configmap/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-etcd-ca" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865223 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="32c19760-2cb2-4690-be8e-cba3c517c60e" volumeName="kubernetes.io/configmap/32c19760-2cb2-4690-be8e-cba3c517c60e-service-ca" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865237 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3fee96d7-75a7-46e4-9707-7bd292f10b84" volumeName="kubernetes.io/configmap/3fee96d7-75a7-46e4-9707-7bd292f10b84-ovnkube-config" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865249 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4d0b9fbc-a1f8-4a98-99de-758734bd1a5b" volumeName="kubernetes.io/projected/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-bound-sa-token" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865260 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="58333089-2456-4a25-8ba7-6d557eefa177" volumeName="kubernetes.io/projected/58333089-2456-4a25-8ba7-6d557eefa177-kube-api-access-hhckc" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865272 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7da68e85-9170-499d-8050-139ecfac4600" volumeName="kubernetes.io/configmap/7da68e85-9170-499d-8050-139ecfac4600-cni-binary-copy" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865283 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e76bc134-2a88-4f92-9aa7-f6854941b98f" volumeName="kubernetes.io/configmap/e76bc134-2a88-4f92-9aa7-f6854941b98f-config" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865323 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c2ce2ea7-bd25-4294-8f3a-11ce53577830" volumeName="kubernetes.io/projected/c2ce2ea7-bd25-4294-8f3a-11ce53577830-kube-api-access-9qpkj" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865334 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="db164b32-e20e-4d07-a9ae-98720321621d" volumeName="kubernetes.io/projected/db164b32-e20e-4d07-a9ae-98720321621d-kube-api-access-89wj5" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865344 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9" volumeName="kubernetes.io/projected/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-kube-api-access-s99rr" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865354 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2fbed2b8-f4c5-4f52-b29c-1907a2034f6f" volumeName="kubernetes.io/secret/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-serving-cert" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865366 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="365dc4ac-fbc8-4589-a799-8327b3ebd0a5" volumeName="kubernetes.io/configmap/365dc4ac-fbc8-4589-a799-8327b3ebd0a5-config" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865376 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7da68e85-9170-499d-8050-139ecfac4600" volumeName="kubernetes.io/configmap/7da68e85-9170-499d-8050-139ecfac4600-multus-daemon-config" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865386 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af391724-079a-4bac-a89e-978ffd471763" volumeName="kubernetes.io/projected/af391724-079a-4bac-a89e-978ffd471763-kube-api-access-gkl4m" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865399 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7ad8b9ea-ba1c-4507-9b70-ce2da170d480" volumeName="kubernetes.io/configmap/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-cni-binary-copy" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865409 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c2ce2ea7-bd25-4294-8f3a-11ce53577830" volumeName="kubernetes.io/secret/c2ce2ea7-bd25-4294-8f3a-11ce53577830-serving-cert" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865419 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef0a3c84-98bb-4915-9010-d66fcbeafe09" volumeName="kubernetes.io/secret/ef0a3c84-98bb-4915-9010-d66fcbeafe09-serving-cert" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865428 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2fbed2b8-f4c5-4f52-b29c-1907a2034f6f" volumeName="kubernetes.io/configmap/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-config" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865439 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab" volumeName="kubernetes.io/configmap/3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab-config" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865448 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="58333089-2456-4a25-8ba7-6d557eefa177" volumeName="kubernetes.io/secret/58333089-2456-4a25-8ba7-6d557eefa177-serving-cert" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865464 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6d770808-d390-41c1-a9d9-fc12b99fa9a9" volumeName="kubernetes.io/configmap/6d770808-d390-41c1-a9d9-fc12b99fa9a9-telemetry-config" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865474 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e76bc134-2a88-4f92-9aa7-f6854941b98f" volumeName="kubernetes.io/projected/e76bc134-2a88-4f92-9aa7-f6854941b98f-kube-api-access" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865486 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ec2d22f2-c260-42a6-a9da-ee0f44f42303" volumeName="kubernetes.io/secret/ec2d22f2-c260-42a6-a9da-ee0f44f42303-metrics-tls" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865497 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9" volumeName="kubernetes.io/secret/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-ovn-node-metrics-cert" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865509 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2fbed2b8-f4c5-4f52-b29c-1907a2034f6f" volumeName="kubernetes.io/secret/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-etcd-client" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865518 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="365dc4ac-fbc8-4589-a799-8327b3ebd0a5" volumeName="kubernetes.io/projected/365dc4ac-fbc8-4589-a799-8327b3ebd0a5-kube-api-access" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865529 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6999cf38-e317-4727-98c9-d4e348e9e16a" volumeName="kubernetes.io/configmap/6999cf38-e317-4727-98c9-d4e348e9e16a-trusted-ca" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865542 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7da68e85-9170-499d-8050-139ecfac4600" volumeName="kubernetes.io/projected/7da68e85-9170-499d-8050-139ecfac4600-kube-api-access-bg5d9" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865553 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4d0b9fbc-a1f8-4a98-99de-758734bd1a5b" volumeName="kubernetes.io/configmap/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-trusted-ca" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865563 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b100ce12-965e-409e-8cdb-8f99ef51a82b" volumeName="kubernetes.io/projected/b100ce12-965e-409e-8cdb-8f99ef51a82b-kube-api-access" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865573 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d7a0bdcc-92f5-41e6-ab47-ee48a5788bac" volumeName="kubernetes.io/projected/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-kube-api-access-fljc9" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865583 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0e52cbdc-1d46-4cc9-85ee-535aa449992f" volumeName="kubernetes.io/configmap/0e52cbdc-1d46-4cc9-85ee-535aa449992f-iptables-alerter-script" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865593 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2b1a69b5-c946-495d-ae02-c56f788279e8" volumeName="kubernetes.io/projected/2b1a69b5-c946-495d-ae02-c56f788279e8-kube-api-access-chnhh" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865603 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2fbed2b8-f4c5-4f52-b29c-1907a2034f6f" volumeName="kubernetes.io/configmap/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-etcd-service-ca" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865613 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4ad37f40-c533-4a1e-882a-2e0973eff86d" volumeName="kubernetes.io/projected/4ad37f40-c533-4a1e-882a-2e0973eff86d-kube-api-access-6wrq9" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865623 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="32c19760-2cb2-4690-be8e-cba3c517c60e" volumeName="kubernetes.io/projected/32c19760-2cb2-4690-be8e-cba3c517c60e-kube-api-access" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865633 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ac523956-c8a3-4794-a1fa-660cd14966bb" volumeName="kubernetes.io/projected/ac523956-c8a3-4794-a1fa-660cd14966bb-kube-api-access-wjcjb" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865645 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af391724-079a-4bac-a89e-978ffd471763" volumeName="kubernetes.io/configmap/af391724-079a-4bac-a89e-978ffd471763-env-overrides" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865655 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9" volumeName="kubernetes.io/configmap/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-ovnkube-script-lib" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865666 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab" volumeName="kubernetes.io/projected/3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab-kube-api-access-44jml" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865676 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="58333089-2456-4a25-8ba7-6d557eefa177" volumeName="kubernetes.io/configmap/58333089-2456-4a25-8ba7-6d557eefa177-config" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865687 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7ad8b9ea-ba1c-4507-9b70-ce2da170d480" volumeName="kubernetes.io/projected/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-kube-api-access-bxk5x" seLinuxMountContext=""
Mar 08 00:21:23.865659 master-0 kubenswrapper[7479]: I0308 00:21:23.865698 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b100ce12-965e-409e-8cdb-8f99ef51a82b" volumeName="kubernetes.io/secret/b100ce12-965e-409e-8cdb-8f99ef51a82b-serving-cert" seLinuxMountContext=""
Mar 08 00:21:23.867864 master-0 kubenswrapper[7479]: I0308 00:21:23.865710 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="db164b32-e20e-4d07-a9ae-98720321621d" volumeName="kubernetes.io/secret/db164b32-e20e-4d07-a9ae-98720321621d-cluster-olm-operator-serving-cert" seLinuxMountContext=""
Mar 08 00:21:23.867864 master-0 kubenswrapper[7479]: I0308 00:21:23.865730 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2fbed2b8-f4c5-4f52-b29c-1907a2034f6f" volumeName="kubernetes.io/projected/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-kube-api-access-d5knc" seLinuxMountContext=""
Mar 08 00:21:23.867864 master-0 kubenswrapper[7479]: I0308 00:21:23.865742 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="365dc4ac-fbc8-4589-a799-8327b3ebd0a5" volumeName="kubernetes.io/secret/365dc4ac-fbc8-4589-a799-8327b3ebd0a5-serving-cert" seLinuxMountContext=""
Mar 08 00:21:23.867864 master-0 kubenswrapper[7479]: I0308 00:21:23.865760 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3fee96d7-75a7-46e4-9707-7bd292f10b84" volumeName="kubernetes.io/configmap/3fee96d7-75a7-46e4-9707-7bd292f10b84-env-overrides" seLinuxMountContext=""
Mar 08 00:21:23.867864 master-0 kubenswrapper[7479]: I0308 00:21:23.865772 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3fee96d7-75a7-46e4-9707-7bd292f10b84" volumeName="kubernetes.io/projected/3fee96d7-75a7-46e4-9707-7bd292f10b84-kube-api-access-ntks9" seLinuxMountContext=""
Mar 08 00:21:23.867864 master-0 kubenswrapper[7479]: I0308 00:21:23.865792 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0e52cbdc-1d46-4cc9-85ee-535aa449992f" volumeName="kubernetes.io/projected/0e52cbdc-1d46-4cc9-85ee-535aa449992f-kube-api-access-xqkqn" seLinuxMountContext=""
Mar 08 00:21:23.867864 master-0 kubenswrapper[7479]: I0308 00:21:23.865805 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7ad8b9ea-ba1c-4507-9b70-ce2da170d480" volumeName="kubernetes.io/configmap/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-cni-sysctl-allowlist" seLinuxMountContext=""
Mar 08 00:21:23.867864 master-0 kubenswrapper[7479]: I0308 00:21:23.865817 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="815fd565-0609-4d8f-ac05-8656f198b008" volumeName="kubernetes.io/projected/815fd565-0609-4d8f-ac05-8656f198b008-kube-api-access-sh6nz" seLinuxMountContext=""
Mar 08 00:21:23.867864 master-0 kubenswrapper[7479]: I0308 00:21:23.865831 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b100ce12-965e-409e-8cdb-8f99ef51a82b" volumeName="kubernetes.io/configmap/b100ce12-965e-409e-8cdb-8f99ef51a82b-config" seLinuxMountContext=""
Mar 08 00:21:23.867864 master-0 kubenswrapper[7479]: I0308 00:21:23.865878 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ac523956-c8a3-4794-a1fa-660cd14966bb" volumeName="kubernetes.io/configmap/ac523956-c8a3-4794-a1fa-660cd14966bb-config" seLinuxMountContext=""
Mar 08 00:21:23.867864 master-0 kubenswrapper[7479]: I0308 00:21:23.865893 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b94acad3-cf4e-443d-80fb-5e68a4074336" volumeName="kubernetes.io/projected/b94acad3-cf4e-443d-80fb-5e68a4074336-kube-api-access-7tml5" seLinuxMountContext=""
Mar 08 00:21:23.867864 master-0 kubenswrapper[7479]: I0308 00:21:23.865906 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ec2d22f2-c260-42a6-a9da-ee0f44f42303" volumeName="kubernetes.io/projected/ec2d22f2-c260-42a6-a9da-ee0f44f42303-kube-api-access-xlzcz" seLinuxMountContext=""
Mar 08 00:21:23.867864 master-0 kubenswrapper[7479]: I0308 00:21:23.865920 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3fee96d7-75a7-46e4-9707-7bd292f10b84" volumeName="kubernetes.io/secret/3fee96d7-75a7-46e4-9707-7bd292f10b84-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Mar 08 00:21:23.867864 master-0 kubenswrapper[7479]: I0308 00:21:23.865933 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="58333089-2456-4a25-8ba7-6d557eefa177" volumeName="kubernetes.io/configmap/58333089-2456-4a25-8ba7-6d557eefa177-service-ca-bundle" seLinuxMountContext=""
Mar 08 00:21:23.867864 master-0 kubenswrapper[7479]: I0308 00:21:23.865946 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6d770808-d390-41c1-a9d9-fc12b99fa9a9" volumeName="kubernetes.io/projected/6d770808-d390-41c1-a9d9-fc12b99fa9a9-kube-api-access-6rfqt" seLinuxMountContext=""
Mar 08 00:21:23.867864 master-0 kubenswrapper[7479]: I0308 00:21:23.865959 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f71fd39-a16b-47d2-b781-c8ce37bcb9b2" volumeName="kubernetes.io/projected/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-kube-api-access-2f9kl" seLinuxMountContext=""
Mar 08 00:21:23.867864 master-0 kubenswrapper[7479]: I0308 00:21:23.865974 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1abf904b-0b8d-4d61-8231-0e8d00933192" volumeName="kubernetes.io/configmap/1abf904b-0b8d-4d61-8231-0e8d00933192-trusted-ca" seLinuxMountContext=""
Mar 08 00:21:23.867864 master-0 kubenswrapper[7479]: I0308 00:21:23.865986 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6999cf38-e317-4727-98c9-d4e348e9e16a" volumeName="kubernetes.io/projected/6999cf38-e317-4727-98c9-d4e348e9e16a-bound-sa-token" seLinuxMountContext=""
Mar 08 00:21:23.867864 master-0 kubenswrapper[7479]: I0308 00:21:23.865997 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef0a3c84-98bb-4915-9010-d66fcbeafe09" volumeName="kubernetes.io/projected/ef0a3c84-98bb-4915-9010-d66fcbeafe09-kube-api-access-8fstf" seLinuxMountContext=""
Mar 08 00:21:23.867864 master-0 kubenswrapper[7479]: I0308 00:21:23.866009 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9" volumeName="kubernetes.io/configmap/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-env-overrides" seLinuxMountContext=""
Mar 08 00:21:23.867864 master-0 kubenswrapper[7479]: I0308 00:21:23.866020 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2b1a69b5-c946-495d-ae02-c56f788279e8" volumeName="kubernetes.io/secret/2b1a69b5-c946-495d-ae02-c56f788279e8-serving-cert" seLinuxMountContext=""
Mar 08 00:21:23.867864 master-0 kubenswrapper[7479]: I0308 00:21:23.866032 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5cf5a2ef-2498-40a0-a189-0753076fd3b6" volumeName="kubernetes.io/projected/5cf5a2ef-2498-40a0-a189-0753076fd3b6-kube-api-access-k88m9" seLinuxMountContext=""
Mar 08 00:21:23.867864 master-0 kubenswrapper[7479]: I0308 00:21:23.866045 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ac523956-c8a3-4794-a1fa-660cd14966bb" volumeName="kubernetes.io/secret/ac523956-c8a3-4794-a1fa-660cd14966bb-serving-cert" seLinuxMountContext=""
Mar 08 00:21:23.867864 master-0 kubenswrapper[7479]: I0308 00:21:23.866057 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9" volumeName="kubernetes.io/configmap/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-ovnkube-config" seLinuxMountContext=""
Mar 08 00:21:23.867864 master-0 kubenswrapper[7479]: I0308 00:21:23.866072 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef0a3c84-98bb-4915-9010-d66fcbeafe09" volumeName="kubernetes.io/configmap/ef0a3c84-98bb-4915-9010-d66fcbeafe09-config" seLinuxMountContext=""
Mar 08 00:21:23.867864 master-0 kubenswrapper[7479]: I0308 00:21:23.866084 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="03f4bafb-c270-428a-bacf-8a424b3d1a05" volumeName="kubernetes.io/projected/03f4bafb-c270-428a-bacf-8a424b3d1a05-kube-api-access-pfdxc" seLinuxMountContext=""
Mar 08 00:21:23.867864 master-0 kubenswrapper[7479]: I0308 00:21:23.866097 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4d0b9fbc-a1f8-4a98-99de-758734bd1a5b" volumeName="kubernetes.io/projected/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-kube-api-access-z9l64" seLinuxMountContext=""
Mar 08 00:21:23.867864 master-0 kubenswrapper[7479]: I0308 00:21:23.866108 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6999cf38-e317-4727-98c9-d4e348e9e16a" volumeName="kubernetes.io/projected/6999cf38-e317-4727-98c9-d4e348e9e16a-kube-api-access-pwsqr" seLinuxMountContext=""
Mar 08 00:21:23.867864 master-0 kubenswrapper[7479]: I0308 00:21:23.866121 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af391724-079a-4bac-a89e-978ffd471763" volumeName="kubernetes.io/secret/af391724-079a-4bac-a89e-978ffd471763-webhook-cert" seLinuxMountContext=""
Mar 08 00:21:23.867864 master-0 kubenswrapper[7479]: I0308 00:21:23.866135 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c1abfb79-2c86-4ccb-bf91-7c48ad8c78d8" volumeName="kubernetes.io/projected/c1abfb79-2c86-4ccb-bf91-7c48ad8c78d8-kube-api-access-5q6hn" seLinuxMountContext=""
Mar 08 00:21:23.867864 master-0 kubenswrapper[7479]: I0308 00:21:23.866148 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1abf904b-0b8d-4d61-8231-0e8d00933192" volumeName="kubernetes.io/projected/1abf904b-0b8d-4d61-8231-0e8d00933192-kube-api-access-dbdd4" seLinuxMountContext=""
Mar 08 00:21:23.867864 master-0 kubenswrapper[7479]: I0308 00:21:23.866160 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab" volumeName="kubernetes.io/secret/3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab-serving-cert" seLinuxMountContext=""
Mar 08 00:21:23.867864 master-0 kubenswrapper[7479]: I0308 00:21:23.866172 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="58333089-2456-4a25-8ba7-6d557eefa177" volumeName="kubernetes.io/configmap/58333089-2456-4a25-8ba7-6d557eefa177-trusted-ca-bundle" seLinuxMountContext=""
Mar 08 00:21:23.867864 master-0 kubenswrapper[7479]: I0308 00:21:23.866184 7479 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5cf5a2ef-2498-40a0-a189-0753076fd3b6" volumeName="kubernetes.io/configmap/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-trusted-ca" seLinuxMountContext=""
Mar 08 00:21:23.867864 master-0 kubenswrapper[7479]: I0308 00:21:23.866195 7479 reconstruct.go:97] "Volume reconstruction finished"
Mar 08 00:21:23.867864 master-0 kubenswrapper[7479]: I0308 00:21:23.866223 7479 reconciler.go:26] "Reconciler: start to sync state"
Mar 08 00:21:23.869041 master-0 kubenswrapper[7479]: I0308 00:21:23.868653 7479 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Mar 08 00:21:23.882149 master-0 kubenswrapper[7479]: I0308 00:21:23.880441 7479 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 08 00:21:23.883643 master-0 kubenswrapper[7479]: I0308 00:21:23.883621 7479 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 08 00:21:23.883689 master-0 kubenswrapper[7479]: I0308 00:21:23.883666 7479 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 08 00:21:23.883800 master-0 kubenswrapper[7479]: I0308 00:21:23.883765 7479 kubelet.go:2335] "Starting kubelet main sync loop"
Mar 08 00:21:23.884012 master-0 kubenswrapper[7479]: E0308 00:21:23.883911 7479 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 08 00:21:23.885582 master-0 kubenswrapper[7479]: I0308 00:21:23.885561 7479 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 08 00:21:23.893730 master-0 kubenswrapper[7479]: I0308 00:21:23.893682 7479 generic.go:334] "Generic (PLEG): container finished" podID="c4cab26a-fe31-4cf2-a938-b280f1934d99" containerID="d6af0d3578bc6ae0d4e0f5d4dbddc52dc70217cef15e030aab47b2704363ffe2" exitCode=0
Mar 08 00:21:23.900087 master-0 kubenswrapper[7479]: I0308 00:21:23.900036 7479 generic.go:334] "Generic (PLEG): container finished" podID="ecfff260-be5c-421c-9158-dfd8fa382e4a" containerID="79807bacb8255c5e003178362fd0a6e9b3e5481074aa31458cc27f40ce6114ac" exitCode=0
Mar 08 00:21:23.914856 master-0 kubenswrapper[7479]: I0308 00:21:23.914814 7479 generic.go:334] "Generic (PLEG): container finished" podID="fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9" containerID="9caf746e34f3ceb9b7a0c15d058a8c3ef6549037b6840e762c5d26db1b3afa1f" exitCode=0
Mar 08 00:21:23.918265 master-0 kubenswrapper[7479]: I0308 00:21:23.918220 7479 generic.go:334] "Generic (PLEG): container finished" podID="7ad8b9ea-ba1c-4507-9b70-ce2da170d480" containerID="ee1bfab2130a9c72df8adc63c3382589fac2b085c9ce4752d92d10429ef61f76" exitCode=0
Mar 08 00:21:23.918265 master-0 kubenswrapper[7479]: I0308 00:21:23.918262 7479 generic.go:334] "Generic (PLEG): container finished" podID="7ad8b9ea-ba1c-4507-9b70-ce2da170d480"
containerID="c7031bd4261187339ddcdbbf17642c8a944a5d40ae330e696f51959987e70da4" exitCode=0 Mar 08 00:21:23.918335 master-0 kubenswrapper[7479]: I0308 00:21:23.918273 7479 generic.go:334] "Generic (PLEG): container finished" podID="7ad8b9ea-ba1c-4507-9b70-ce2da170d480" containerID="d4bd6afbd87673cd3e0a5753c92817e5f63b4859d724983c90d010a8db1fe80e" exitCode=0 Mar 08 00:21:23.918335 master-0 kubenswrapper[7479]: I0308 00:21:23.918282 7479 generic.go:334] "Generic (PLEG): container finished" podID="7ad8b9ea-ba1c-4507-9b70-ce2da170d480" containerID="7264af89c3bcf80c9a189b3bddcd203436764c691f9c5c52533e7f598dddfac4" exitCode=0 Mar 08 00:21:23.918335 master-0 kubenswrapper[7479]: I0308 00:21:23.918293 7479 generic.go:334] "Generic (PLEG): container finished" podID="7ad8b9ea-ba1c-4507-9b70-ce2da170d480" containerID="48c6a8c71ab87bd002a24ce7589e179bd20778d506e7cd037500b0c5771c655a" exitCode=0 Mar 08 00:21:23.918335 master-0 kubenswrapper[7479]: I0308 00:21:23.918302 7479 generic.go:334] "Generic (PLEG): container finished" podID="7ad8b9ea-ba1c-4507-9b70-ce2da170d480" containerID="f8e210245fcf5757a0858988b80936bb56e15ab6a7c3881f301f7f4cb8a8f550" exitCode=0 Mar 08 00:21:23.920967 master-0 kubenswrapper[7479]: I0308 00:21:23.920937 7479 generic.go:334] "Generic (PLEG): container finished" podID="5f77c8e18b751d90bc0dfe2d4e304050" containerID="876b4d78a3cb9c09c79646fc0feaa904c1b8712b38b4870f4f9e07763c94bfe0" exitCode=0 Mar 08 00:21:23.927300 master-0 kubenswrapper[7479]: I0308 00:21:23.927266 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/2.log" Mar 08 00:21:23.927629 master-0 kubenswrapper[7479]: I0308 00:21:23.927599 7479 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="3c9cc0ea8b8c8c3c9346819b130170a92470b9a87fb7c1462d7680ef7197ef47" exitCode=1 Mar 08 00:21:23.927629 master-0 
kubenswrapper[7479]: I0308 00:21:23.927623 7479 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="013b718ae531bd264f0d08436f90a352773f432fb8153c8f5baaf771bc43f460" exitCode=0 Mar 08 00:21:23.970161 master-0 kubenswrapper[7479]: I0308 00:21:23.970105 7479 manager.go:324] Recovery completed Mar 08 00:21:23.984662 master-0 kubenswrapper[7479]: E0308 00:21:23.984576 7479 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 08 00:21:23.998450 master-0 kubenswrapper[7479]: I0308 00:21:23.998396 7479 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 08 00:21:23.998450 master-0 kubenswrapper[7479]: I0308 00:21:23.998429 7479 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 08 00:21:23.998450 master-0 kubenswrapper[7479]: I0308 00:21:23.998454 7479 state_mem.go:36] "Initialized new in-memory state store" Mar 08 00:21:23.998668 master-0 kubenswrapper[7479]: I0308 00:21:23.998599 7479 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 08 00:21:23.998668 master-0 kubenswrapper[7479]: I0308 00:21:23.998611 7479 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 08 00:21:23.998668 master-0 kubenswrapper[7479]: I0308 00:21:23.998633 7479 state_checkpoint.go:136] "State checkpoint: restored state from checkpoint" Mar 08 00:21:23.998668 master-0 kubenswrapper[7479]: I0308 00:21:23.998641 7479 state_checkpoint.go:137] "State checkpoint: defaultCPUSet" defaultCpuSet="" Mar 08 00:21:23.998668 master-0 kubenswrapper[7479]: I0308 00:21:23.998649 7479 policy_none.go:49] "None policy: Start" Mar 08 00:21:24.000155 master-0 kubenswrapper[7479]: I0308 00:21:24.000108 7479 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 08 00:21:24.000258 master-0 kubenswrapper[7479]: I0308 00:21:24.000167 7479 state_mem.go:35] "Initializing new in-memory state store" Mar 08 00:21:24.000580 master-0 kubenswrapper[7479]: I0308 
00:21:24.000533 7479 state_mem.go:75] "Updated machine memory state" Mar 08 00:21:24.000580 master-0 kubenswrapper[7479]: I0308 00:21:24.000559 7479 state_checkpoint.go:82] "State checkpoint: restored state from checkpoint" Mar 08 00:21:24.010273 master-0 kubenswrapper[7479]: I0308 00:21:24.010195 7479 manager.go:334] "Starting Device Plugin manager" Mar 08 00:21:24.010273 master-0 kubenswrapper[7479]: I0308 00:21:24.010245 7479 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 08 00:21:24.010273 master-0 kubenswrapper[7479]: I0308 00:21:24.010257 7479 server.go:79] "Starting device plugin registration server" Mar 08 00:21:24.010672 master-0 kubenswrapper[7479]: I0308 00:21:24.010637 7479 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 08 00:21:24.010796 master-0 kubenswrapper[7479]: I0308 00:21:24.010655 7479 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 08 00:21:24.010870 master-0 kubenswrapper[7479]: I0308 00:21:24.010815 7479 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 08 00:21:24.010936 master-0 kubenswrapper[7479]: I0308 00:21:24.010892 7479 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 08 00:21:24.010936 master-0 kubenswrapper[7479]: I0308 00:21:24.010900 7479 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 08 00:21:24.112818 master-0 kubenswrapper[7479]: I0308 00:21:24.112352 7479 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:21:24.114926 master-0 kubenswrapper[7479]: I0308 00:21:24.114872 7479 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 00:21:24.114926 master-0 kubenswrapper[7479]: I0308 00:21:24.114917 7479 kubelet_node_status.go:724] "Recording event message 
for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 00:21:24.114926 master-0 kubenswrapper[7479]: I0308 00:21:24.114931 7479 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 00:21:24.115148 master-0 kubenswrapper[7479]: I0308 00:21:24.114989 7479 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 00:21:24.123342 master-0 kubenswrapper[7479]: I0308 00:21:24.123300 7479 kubelet_node_status.go:115] "Node was previously registered" node="master-0" Mar 08 00:21:24.123645 master-0 kubenswrapper[7479]: I0308 00:21:24.123397 7479 kubelet_node_status.go:79] "Successfully registered node" node="master-0" Mar 08 00:21:24.185177 master-0 kubenswrapper[7479]: I0308 00:21:24.185054 7479 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["kube-system/bootstrap-kube-controller-manager-master-0","kube-system/bootstrap-kube-scheduler-master-0","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0-master-0","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Mar 08 00:21:24.185905 master-0 kubenswrapper[7479]: I0308 00:21:24.185858 7479 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48589610dea61d404b3894a555948d67264374c9f204d16a7ec77740894d856e" Mar 08 00:21:24.185990 master-0 kubenswrapper[7479]: I0308 00:21:24.185898 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"09882f77899e1a73f2e7f7b1d393cad387349597cd777096a1f2accf4684e1d0"} Mar 08 00:21:24.185990 master-0 kubenswrapper[7479]: I0308 00:21:24.185944 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" 
event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"65f78e69463513d95a1d7e0bffe5e5d1bf7a6e5e4e7e1d096d77f2d24eb8e8b4"} Mar 08 00:21:24.185990 master-0 kubenswrapper[7479]: I0308 00:21:24.185954 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"0bdf70a6acef734c900a623db8a8cd37b2a2e6c50fe84f9293c0fc0c5705c71d"} Mar 08 00:21:24.185990 master-0 kubenswrapper[7479]: I0308 00:21:24.185970 7479 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="858860976eccd2c9ae8be3e9bcc229880ee4eb3f7d6a26e66c0b63208465cc57" Mar 08 00:21:24.186269 master-0 kubenswrapper[7479]: I0308 00:21:24.186008 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerStarted","Data":"2e9133d4477bb44d83a396e80738171a7ba17de22760faabb67c1d5a203fddcc"} Mar 08 00:21:24.186269 master-0 kubenswrapper[7479]: I0308 00:21:24.186018 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerStarted","Data":"a58a50d55f092d1761d8dfb057eba161b2adfc3672c9c7a2e15f19538478c7ef"} Mar 08 00:21:24.186269 master-0 kubenswrapper[7479]: I0308 00:21:24.186027 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerDied","Data":"876b4d78a3cb9c09c79646fc0feaa904c1b8712b38b4870f4f9e07763c94bfe0"} Mar 08 00:21:24.186269 master-0 kubenswrapper[7479]: I0308 00:21:24.186036 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" 
event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerStarted","Data":"4da316e5c8941b4baace90ce20646816051133ec406a841a63f02453e48ca25a"} Mar 08 00:21:24.186269 master-0 kubenswrapper[7479]: I0308 00:21:24.186046 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"88fd43c8fda6129c4f06b24e2a215771ea123f05c39828ad062d2af5324239c2"} Mar 08 00:21:24.186269 master-0 kubenswrapper[7479]: I0308 00:21:24.186057 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"233074eccbbd3406930dc094592b256b0710cbbbba4d96b37f6401353d1f1651"} Mar 08 00:21:24.186269 master-0 kubenswrapper[7479]: I0308 00:21:24.186066 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"aa5ad4a36fb34e3b8448dce44870bd90294e9dfdbc77705a2449657049d35017"} Mar 08 00:21:24.186269 master-0 kubenswrapper[7479]: I0308 00:21:24.186078 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"3c9cc0ea8b8c8c3c9346819b130170a92470b9a87fb7c1462d7680ef7197ef47"} Mar 08 00:21:24.186269 master-0 kubenswrapper[7479]: I0308 00:21:24.186087 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"013b718ae531bd264f0d08436f90a352773f432fb8153c8f5baaf771bc43f460"} Mar 08 00:21:24.186269 master-0 kubenswrapper[7479]: I0308 00:21:24.186096 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"c4b3dad7b177ddc417477ab1f0d5f78969f5ec394aa11addfa7a3ce44aa14aed"} Mar 08 00:21:24.186269 master-0 kubenswrapper[7479]: I0308 00:21:24.186108 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"b999c6f84ef35141ea9d9157df896d14bb08340f5b7476591f3ed6362f2a6196"} Mar 08 00:21:24.186269 master-0 kubenswrapper[7479]: I0308 00:21:24.186117 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"da60beba23659d143e9020dc0409825d88a4d10b35b445c12b13ae8fc1310bdf"} Mar 08 00:21:24.186269 master-0 kubenswrapper[7479]: I0308 00:21:24.186125 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"d2f5b57940c224986a9226bf1c006a72c2663c4293ddb4cdc327ea534c8cbcb7"} Mar 08 00:21:24.198404 master-0 kubenswrapper[7479]: E0308 00:21:24.198257 7479 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-apiserver-master-0\" already exists" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 00:21:24.199114 master-0 kubenswrapper[7479]: W0308 00:21:24.199046 7479 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set securityContext.capabilities.drop=["ALL"]), 
restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost") Mar 08 00:21:24.199114 master-0 kubenswrapper[7479]: E0308 00:21:24.199104 7479 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0-master-0\" already exists" pod="openshift-etcd/etcd-master-0-master-0" Mar 08 00:21:24.199329 master-0 kubenswrapper[7479]: E0308 00:21:24.199194 7479 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-controller-manager-master-0\" already exists" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:21:24.199389 master-0 kubenswrapper[7479]: E0308 00:21:24.199335 7479 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 08 00:21:24.199389 master-0 kubenswrapper[7479]: E0308 00:21:24.199355 7479 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-scheduler-master-0\" already exists" pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 08 00:21:24.273158 master-0 kubenswrapper[7479]: I0308 00:21:24.272900 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 08 00:21:24.273158 master-0 kubenswrapper[7479]: I0308 00:21:24.272970 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: 
\"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 00:21:24.273158 master-0 kubenswrapper[7479]: I0308 00:21:24.273013 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:21:24.273158 master-0 kubenswrapper[7479]: I0308 00:21:24.273044 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:21:24.273158 master-0 kubenswrapper[7479]: I0308 00:21:24.273075 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 08 00:21:24.273158 master-0 kubenswrapper[7479]: I0308 00:21:24.273158 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 08 00:21:24.273812 master-0 kubenswrapper[7479]: I0308 
00:21:24.273219 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 08 00:21:24.273812 master-0 kubenswrapper[7479]: I0308 00:21:24.273241 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 08 00:21:24.273812 master-0 kubenswrapper[7479]: I0308 00:21:24.273258 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 00:21:24.273812 master-0 kubenswrapper[7479]: I0308 00:21:24.273286 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 00:21:24.273812 master-0 kubenswrapper[7479]: I0308 00:21:24.273302 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " 
pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:21:24.273812 master-0 kubenswrapper[7479]: I0308 00:21:24.273318 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:21:24.273812 master-0 kubenswrapper[7479]: I0308 00:21:24.273332 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 00:21:24.273812 master-0 kubenswrapper[7479]: I0308 00:21:24.273348 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 00:21:24.273812 master-0 kubenswrapper[7479]: I0308 00:21:24.273362 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 00:21:24.273812 master-0 kubenswrapper[7479]: I0308 00:21:24.273376 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:21:24.273812 master-0 kubenswrapper[7479]: I0308 00:21:24.273397 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 08 00:21:24.373896 master-0 kubenswrapper[7479]: I0308 00:21:24.373749 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:21:24.373896 master-0 kubenswrapper[7479]: I0308 00:21:24.373790 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 08 00:21:24.373896 master-0 kubenswrapper[7479]: I0308 00:21:24.373813 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 00:21:24.373896 master-0 kubenswrapper[7479]: I0308 00:21:24.373829 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: 
\"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:21:24.373896 master-0 kubenswrapper[7479]: I0308 00:21:24.373845 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:21:24.373896 master-0 kubenswrapper[7479]: I0308 00:21:24.373860 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 08 00:21:24.373896 master-0 kubenswrapper[7479]: I0308 00:21:24.373875 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 00:21:24.373896 master-0 kubenswrapper[7479]: I0308 00:21:24.373891 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 00:21:24.373896 master-0 kubenswrapper[7479]: I0308 
00:21:24.373904 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 00:21:24.373896 master-0 kubenswrapper[7479]: I0308 00:21:24.373918 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 00:21:24.374669 master-0 kubenswrapper[7479]: I0308 00:21:24.373933 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 00:21:24.374669 master-0 kubenswrapper[7479]: I0308 00:21:24.373950 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 00:21:24.374669 master-0 kubenswrapper[7479]: I0308 00:21:24.373965 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 08 00:21:24.374669 master-0 kubenswrapper[7479]: I0308 00:21:24.373979 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 08 00:21:24.374669 master-0 kubenswrapper[7479]: I0308 00:21:24.373991 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 00:21:24.374669 master-0 kubenswrapper[7479]: I0308 00:21:24.374004 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 00:21:24.374669 master-0 kubenswrapper[7479]: I0308 00:21:24.374018 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 00:21:24.374669 master-0 kubenswrapper[7479]: I0308 00:21:24.374069 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 00:21:24.374669 master-0 kubenswrapper[7479]: I0308 00:21:24.374112 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 00:21:24.374669 master-0 kubenswrapper[7479]: I0308 00:21:24.374132 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 08 00:21:24.374669 master-0 kubenswrapper[7479]: I0308 00:21:24.374152 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 00:21:24.374669 master-0 kubenswrapper[7479]: I0308 00:21:24.374172 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 00:21:24.374669 master-0 kubenswrapper[7479]: I0308 00:21:24.374192 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 00:21:24.374669 master-0 kubenswrapper[7479]: I0308 00:21:24.374229 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 08 00:21:24.374669 master-0 kubenswrapper[7479]: I0308 00:21:24.374247 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 00:21:24.374669 master-0 kubenswrapper[7479]: I0308 00:21:24.374267 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 00:21:24.374669 master-0 kubenswrapper[7479]: I0308 00:21:24.374287 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 00:21:24.374669 master-0 kubenswrapper[7479]: I0308 00:21:24.374316 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 00:21:24.374669 master-0 kubenswrapper[7479]: I0308 00:21:24.374337 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 00:21:24.374669 master-0 kubenswrapper[7479]: I0308 00:21:24.374356 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 00:21:24.374669 master-0 kubenswrapper[7479]: I0308 00:21:24.374375 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 08 00:21:24.374669 master-0 kubenswrapper[7479]: I0308 00:21:24.374392 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 08 00:21:24.374669 master-0 kubenswrapper[7479]: I0308 00:21:24.374413 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 00:21:24.374669 master-0 kubenswrapper[7479]: I0308 00:21:24.374432 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 00:21:24.836748 master-0 kubenswrapper[7479]: I0308 00:21:24.836650 7479 apiserver.go:52] "Watching apiserver"
Mar 08 00:21:24.845731 master-0 kubenswrapper[7479]: I0308 00:21:24.845693 7479 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 08 00:21:24.846904 master-0 kubenswrapper[7479]: I0308 00:21:24.846833 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-rj9cl","openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-7gtw2","openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v","openshift-network-operator/iptables-alerter-rfnqf","openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2","openshift-ovn-kubernetes/ovnkube-node-2w9mf","kube-system/bootstrap-kube-scheduler-master-0","openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4","openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9","openshift-ingress-operator/ingress-operator-677db989d6-blw5x","openshift-multus/network-metrics-daemon-krv7c","openshift-network-node-identity/network-node-identity-m7549","openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8qtmf","openshift-dns-operator/dns-operator-589895fbb7-gmvnl","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","openshift-service-ca-operator/service-ca-operator-69b6fc6b88-p8hlq","kube-system/bootstrap-kube-controller-manager-master-0","openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-5v8g4","openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq","openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2","openshift-network-diagnostics/network-check-target-w5fjg","openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-bh886","assisted-installer/assisted-installer-controller-v949k","openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-r9zcq","openshift-config-operator/openshift-config-operator-64488f9d78-vnl28","openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-49hzm","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s","openshift-multus/multus-additional-cni-plugins-d5jxb","openshift-etcd-operator/etcd-operator-5884b9cd56-27phk","openshift-multus/multus-admission-controller-8d675b596-jgdmb","openshift-multus/multus-dllkj","openshift-network-operator/network-operator-7c649bf6d4-st2sr","openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj","openshift-etcd/etcd-master-0-master-0","openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-pfdrx","openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-st7mk","openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-8jr6f"]
Mar 08 00:21:24.848220 master-0 kubenswrapper[7479]: I0308 00:21:24.848145 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2"
Mar 08 00:21:24.849270 master-0 kubenswrapper[7479]: I0308 00:21:24.849231 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x"
Mar 08 00:21:24.849358 master-0 kubenswrapper[7479]: I0308 00:21:24.849286 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-589895fbb7-gmvnl"
Mar 08 00:21:24.849358 master-0 kubenswrapper[7479]: I0308 00:21:24.849341 7479 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-v949k"
Mar 08 00:21:24.850154 master-0 kubenswrapper[7479]: I0308 00:21:24.850114 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 08 00:21:24.851196 master-0 kubenswrapper[7479]: I0308 00:21:24.851165 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 08 00:21:24.852590 master-0 kubenswrapper[7479]: I0308 00:21:24.852534 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq"
Mar 08 00:21:24.864311 master-0 kubenswrapper[7479]: I0308 00:21:24.864265 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 08 00:21:24.864728 master-0 kubenswrapper[7479]: I0308 00:21:24.864689 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8qtmf"
Mar 08 00:21:24.865903 master-0 kubenswrapper[7479]: I0308 00:21:24.865874 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-8jr6f"
Mar 08 00:21:24.866350 master-0 kubenswrapper[7479]: I0308 00:21:24.865976 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 08 00:21:24.866350 master-0 kubenswrapper[7479]: I0308 00:21:24.866121 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 08 00:21:24.866509 master-0 kubenswrapper[7479]: I0308 00:21:24.866153 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 08 00:21:24.866509 master-0 kubenswrapper[7479]: I0308 00:21:24.866190 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 08 00:21:24.866509 master-0 kubenswrapper[7479]: I0308 00:21:24.866243 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 08 00:21:24.867124 master-0 kubenswrapper[7479]: I0308 00:21:24.867092 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 08 00:21:24.867534 master-0 kubenswrapper[7479]: I0308 00:21:24.867501 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 08 00:21:24.873047 master-0 kubenswrapper[7479]: I0308 00:21:24.873007 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 08 00:21:24.873047 master-0 kubenswrapper[7479]: I0308 00:21:24.873045 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v"
Mar 08 00:21:24.873286 master-0 kubenswrapper[7479]: I0308 00:21:24.873259 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 08 00:21:24.873505 master-0 kubenswrapper[7479]: I0308 00:21:24.873471 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 08 00:21:24.873898 master-0 kubenswrapper[7479]: I0308 00:21:24.873871 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 08 00:21:24.873981 master-0 kubenswrapper[7479]: I0308 00:21:24.873951 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 08 00:21:24.874048 master-0 kubenswrapper[7479]: I0308 00:21:24.873951 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 08 00:21:24.874048 master-0 kubenswrapper[7479]: I0308 00:21:24.874025 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 08 00:21:24.874509 master-0 kubenswrapper[7479]: I0308 00:21:24.874479 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 08 00:21:24.874619 master-0 kubenswrapper[7479]: I0308 00:21:24.874521 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 08 00:21:24.874677 master-0 kubenswrapper[7479]: I0308 00:21:24.874670 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 08 00:21:24.874738 master-0 kubenswrapper[7479]: I0308 00:21:24.874683 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s"
Mar 08 00:21:24.874875 master-0 kubenswrapper[7479]: I0308 00:21:24.874832 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 08 00:21:24.874977 master-0 kubenswrapper[7479]: I0308 00:21:24.874953 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 08 00:21:24.874977 master-0 kubenswrapper[7479]: I0308 00:21:24.874974 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 08 00:21:24.875106 master-0 kubenswrapper[7479]: I0308 00:21:24.874977 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 08 00:21:24.875106 master-0 kubenswrapper[7479]: I0308 00:21:24.875037 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 08 00:21:24.875106 master-0 kubenswrapper[7479]: I0308 00:21:24.875082 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 08 00:21:24.875106 master-0 kubenswrapper[7479]: I0308 00:21:24.875084 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 08 00:21:24.875354 master-0 kubenswrapper[7479]: I0308 00:21:24.875121 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 08 00:21:24.875354 master-0 kubenswrapper[7479]: I0308 00:21:24.875121 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 08 00:21:24.875354 master-0 kubenswrapper[7479]: I0308 00:21:24.875122 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 08 00:21:24.875354 master-0 kubenswrapper[7479]: I0308 00:21:24.875242 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 08 00:21:24.875354 master-0 kubenswrapper[7479]: I0308 00:21:24.875270 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 08 00:21:24.875615 master-0 kubenswrapper[7479]: I0308 00:21:24.875475 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj"
Mar 08 00:21:24.875730 master-0 kubenswrapper[7479]: I0308 00:21:24.875702 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 08 00:21:24.876021 master-0 kubenswrapper[7479]: I0308 00:21:24.875987 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 08 00:21:24.876548 master-0 kubenswrapper[7479]: I0308 00:21:24.876514 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 08 00:21:24.882329 master-0 kubenswrapper[7479]: I0308 00:21:24.878682 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9"
Mar 08 00:21:24.882606 master-0 kubenswrapper[7479]: I0308 00:21:24.876874 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 08 00:21:24.882829 master-0 kubenswrapper[7479]: I0308 00:21:24.878574 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 08 00:21:24.883031 master-0 kubenswrapper[7479]: I0308 00:21:24.879165 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 08 00:21:24.883266 master-0 kubenswrapper[7479]: I0308 00:21:24.879251 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 08 00:21:24.883373 master-0 kubenswrapper[7479]: I0308 00:21:24.879322 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 08 00:21:24.883433 master-0 kubenswrapper[7479]: I0308 00:21:24.879370 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 08 00:21:24.883496 master-0 kubenswrapper[7479]: I0308 00:21:24.879429 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 08 00:21:24.883557 master-0 kubenswrapper[7479]: I0308 00:21:24.879705 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 08 00:21:24.883557 master-0 kubenswrapper[7479]: I0308 00:21:24.879828 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 08 00:21:24.883663 master-0 kubenswrapper[7479]: I0308 00:21:24.880316 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 08 00:21:24.886174 master-0 kubenswrapper[7479]: I0308 00:21:24.885425 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-etcd-client\") pod \"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk"
Mar 08 00:21:24.888576 master-0 kubenswrapper[7479]: I0308 00:21:24.888504 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-rj9cl\" (UID: \"3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-rj9cl"
Mar 08 00:21:24.888721 master-0 kubenswrapper[7479]: I0308 00:21:24.888675 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef0a3c84-98bb-4915-9010-d66fcbeafe09-config\") pod \"openshift-controller-manager-operator-8565d84698-49hzm\" (UID: \"ef0a3c84-98bb-4915-9010-d66fcbeafe09\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-49hzm"
Mar 08 00:21:24.888721 master-0 kubenswrapper[7479]: I0308 00:21:24.888700 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b100ce12-965e-409e-8cdb-8f99ef51a82b-serving-cert\") pod \"kube-apiserver-operator-68bd585b-7gtw2\" (UID: \"b100ce12-965e-409e-8cdb-8f99ef51a82b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-7gtw2"
Mar 08 00:21:24.889049 master-0 kubenswrapper[7479]: I0308 00:21:24.888861 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b1a69b5-c946-495d-ae02-c56f788279e8-serving-cert\") pod \"openshift-config-operator-64488f9d78-vnl28\" (UID: \"2b1a69b5-c946-495d-ae02-c56f788279e8\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28"
Mar 08 00:21:24.889049 master-0 kubenswrapper[7479]: I0308 00:21:24.888892 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-metrics-tls\") pod \"ingress-operator-677db989d6-blw5x\" (UID: \"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x"
Mar 08 00:21:24.889430 master-0 kubenswrapper[7479]: I0308 00:21:24.888916 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfdxc\" (UniqueName: \"kubernetes.io/projected/03f4bafb-c270-428a-bacf-8a424b3d1a05-kube-api-access-pfdxc\") pod \"dns-operator-589895fbb7-gmvnl\" (UID: \"03f4bafb-c270-428a-bacf-8a424b3d1a05\") " pod="openshift-dns-operator/dns-operator-589895fbb7-gmvnl"
Mar 08 00:21:24.889558 master-0 kubenswrapper[7479]: I0308 00:21:24.889438 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9l64\" (UniqueName: \"kubernetes.io/projected/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-kube-api-access-z9l64\") pod \"ingress-operator-677db989d6-blw5x\" (UID: \"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x"
Mar 08 00:21:24.889558 master-0 kubenswrapper[7479]: I0308 00:21:24.888933 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-rj9cl\" (UID: \"3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-rj9cl"
Mar 08 00:21:24.889558 master-0 kubenswrapper[7479]: I0308 00:21:24.889290 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef0a3c84-98bb-4915-9010-d66fcbeafe09-config\") pod \"openshift-controller-manager-operator-8565d84698-49hzm\" (UID: \"ef0a3c84-98bb-4915-9010-d66fcbeafe09\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-49hzm"
Mar 08 00:21:24.890108 master-0 kubenswrapper[7479]: I0308 00:21:24.889146 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b1a69b5-c946-495d-ae02-c56f788279e8-serving-cert\") pod \"openshift-config-operator-64488f9d78-vnl28\" (UID: \"2b1a69b5-c946-495d-ae02-c56f788279e8\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28"
Mar 08 00:21:24.890108 master-0 kubenswrapper[7479]: I0308 00:21:24.889662 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-config\") pod \"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk"
Mar 08 00:21:24.890108 master-0 kubenswrapper[7479]: I0308 00:21:24.889692 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b100ce12-965e-409e-8cdb-8f99ef51a82b-config\") pod \"kube-apiserver-operator-68bd585b-7gtw2\" (UID: \"b100ce12-965e-409e-8cdb-8f99ef51a82b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-7gtw2"
Mar 08 00:21:24.890108 master-0 kubenswrapper[7479]: I0308 00:21:24.889941 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-etcd-client\") pod \"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk"
Mar 08 00:21:24.891866 master-0 kubenswrapper[7479]: I0308 00:21:24.890338 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-config\") pod \"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk"
Mar 08 00:21:24.891866 master-0 kubenswrapper[7479]: I0308 00:21:24.890443 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-krv7c"
Mar 08 00:21:24.891866 master-0 kubenswrapper[7479]: I0308 00:21:24.890795 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b100ce12-965e-409e-8cdb-8f99ef51a82b-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-7gtw2\" (UID: \"b100ce12-965e-409e-8cdb-8f99ef51a82b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-7gtw2"
Mar 08 00:21:24.891866 master-0 kubenswrapper[7479]: I0308 00:21:24.890962 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-8d675b596-jgdmb"
Mar 08 00:21:24.891866 master-0 kubenswrapper[7479]: I0308 00:21:24.891341 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b100ce12-965e-409e-8cdb-8f99ef51a82b-serving-cert\") pod \"kube-apiserver-operator-68bd585b-7gtw2\" (UID: \"b100ce12-965e-409e-8cdb-8f99ef51a82b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-7gtw2"
Mar 08 00:21:24.891866 master-0 kubenswrapper[7479]: I0308 00:21:24.891398 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w5fjg"
Mar 08 00:21:24.891866 master-0 kubenswrapper[7479]: I0308 00:21:24.891622 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b100ce12-965e-409e-8cdb-8f99ef51a82b-config\") pod \"kube-apiserver-operator-68bd585b-7gtw2\" (UID: \"b100ce12-965e-409e-8cdb-8f99ef51a82b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-7gtw2"
Mar 08 00:21:24.892771 master-0 kubenswrapper[7479]: I0308 00:21:24.891830 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/32c19760-2cb2-4690-be8e-cba3c517c60e-kube-api-access\") pod \"cluster-version-operator-745944c6b7-dcbvq\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq"
Mar 08 00:21:24.892771 master-0 kubenswrapper[7479]: I0308 00:21:24.891979 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlzcz\" (UniqueName: \"kubernetes.io/projected/ec2d22f2-c260-42a6-a9da-ee0f44f42303-kube-api-access-xlzcz\") pod \"network-operator-7c649bf6d4-st2sr\" (UID: \"ec2d22f2-c260-42a6-a9da-ee0f44f42303\") " pod="openshift-network-operator/network-operator-7c649bf6d4-st2sr"
Mar 08 00:21:24.892771 master-0 kubenswrapper[7479]: I0308 00:21:24.892012 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk"
Mar 08 00:21:24.892771 master-0 kubenswrapper[7479]: I0308 00:21:24.892257 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk"
Mar 08 00:21:24.892771 master-0 kubenswrapper[7479]: I0308 00:21:24.892317 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58333089-2456-4a25-8ba7-6d557eefa177-serving-cert\") pod \"authentication-operator-7c6989d6c4-dkqc4\" (UID: \"58333089-2456-4a25-8ba7-6d557eefa177\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4"
Mar 08 00:21:24.892771 master-0 kubenswrapper[7479]: I0308 00:21:24.892429 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab-config\") pod \"openshift-apiserver-operator-799b6db4d7-rj9cl\" (UID: \"3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-rj9cl"
Mar 08 00:21:24.892771 master-0 kubenswrapper[7479]: I0308 00:21:24.892457 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhckc\" (UniqueName: \"kubernetes.io/projected/58333089-2456-4a25-8ba7-6d557eefa177-kube-api-access-hhckc\") pod \"authentication-operator-7c6989d6c4-dkqc4\" (UID: \"58333089-2456-4a25-8ba7-6d557eefa177\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4"
Mar 08 00:21:24.892771 master-0 kubenswrapper[7479]: I0308 00:21:24.892506 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-bound-sa-token\") pod \"ingress-operator-677db989d6-blw5x\" (UID: \"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x"
Mar 08 00:21:24.892771 master-0 kubenswrapper[7479]: I0308 00:21:24.892507 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58333089-2456-4a25-8ba7-6d557eefa177-serving-cert\") pod \"authentication-operator-7c6989d6c4-dkqc4\" (UID: \"58333089-2456-4a25-8ba7-6d557eefa177\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4"
Mar 08 00:21:24.892771 master-0 kubenswrapper[7479]: I0308 00:21:24.892600 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 08 00:21:24.894174 master-0 kubenswrapper[7479]: I0308 00:21:24.892837 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fstf\" (UniqueName: \"kubernetes.io/projected/ef0a3c84-98bb-4915-9010-d66fcbeafe09-kube-api-access-8fstf\") pod \"openshift-controller-manager-operator-8565d84698-49hzm\" (UID: \"ef0a3c84-98bb-4915-9010-d66fcbeafe09\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-49hzm"
Mar 08 00:21:24.894174 master-0 kubenswrapper[7479]: I0308 00:21:24.892863 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6999cf38-e317-4727-98c9-d4e348e9e16a-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-k7dp2\" (UID: \"6999cf38-e317-4727-98c9-d4e348e9e16a\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2"
Mar 08 00:21:24.894174 master-0 kubenswrapper[7479]: I0308 00:21:24.892890 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/ec2d22f2-c260-42a6-a9da-ee0f44f42303-host-etc-kube\") pod \"network-operator-7c649bf6d4-st2sr\" (UID: \"ec2d22f2-c260-42a6-a9da-ee0f44f42303\") " pod="openshift-network-operator/network-operator-7c649bf6d4-st2sr"
Mar 08 00:21:24.894174 master-0 kubenswrapper[7479]: I0308 00:21:24.892908 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03f4bafb-c270-428a-bacf-8a424b3d1a05-metrics-tls\") pod \"dns-operator-589895fbb7-gmvnl\" (UID: \"03f4bafb-c270-428a-bacf-8a424b3d1a05\") " pod="openshift-dns-operator/dns-operator-589895fbb7-gmvnl"
Mar 08 00:21:24.894174 master-0 kubenswrapper[7479]: I0308 00:21:24.892924 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-serving-cert\") pod \"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk"
Mar 08 00:21:24.894174 master-0 kubenswrapper[7479]: I0308 00:21:24.892939 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/365dc4ac-fbc8-4589-a799-8327b3ebd0a5-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-pfdrx\" (UID: \"365dc4ac-fbc8-4589-a799-8327b3ebd0a5\") "
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-pfdrx" Mar 08 00:21:24.894174 master-0 kubenswrapper[7479]: I0308 00:21:24.892954 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chnhh\" (UniqueName: \"kubernetes.io/projected/2b1a69b5-c946-495d-ae02-c56f788279e8-kube-api-access-chnhh\") pod \"openshift-config-operator-64488f9d78-vnl28\" (UID: \"2b1a69b5-c946-495d-ae02-c56f788279e8\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" Mar 08 00:21:24.894174 master-0 kubenswrapper[7479]: I0308 00:21:24.892972 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58333089-2456-4a25-8ba7-6d557eefa177-config\") pod \"authentication-operator-7c6989d6c4-dkqc4\" (UID: \"58333089-2456-4a25-8ba7-6d557eefa177\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4" Mar 08 00:21:24.894174 master-0 kubenswrapper[7479]: I0308 00:21:24.892967 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab-config\") pod \"openshift-apiserver-operator-799b6db4d7-rj9cl\" (UID: \"3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-rj9cl" Mar 08 00:21:24.894174 master-0 kubenswrapper[7479]: I0308 00:21:24.892988 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58333089-2456-4a25-8ba7-6d557eefa177-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-dkqc4\" (UID: \"58333089-2456-4a25-8ba7-6d557eefa177\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4" Mar 08 00:21:24.894174 master-0 kubenswrapper[7479]: I0308 00:21:24.893052 7479 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef0a3c84-98bb-4915-9010-d66fcbeafe09-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-49hzm\" (UID: \"ef0a3c84-98bb-4915-9010-d66fcbeafe09\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-49hzm" Mar 08 00:21:24.894174 master-0 kubenswrapper[7479]: I0308 00:21:24.893100 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert\") pod \"cluster-version-operator-745944c6b7-dcbvq\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq" Mar 08 00:21:24.894174 master-0 kubenswrapper[7479]: I0308 00:21:24.893138 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/365dc4ac-fbc8-4589-a799-8327b3ebd0a5-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-pfdrx\" (UID: \"365dc4ac-fbc8-4589-a799-8327b3ebd0a5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-pfdrx" Mar 08 00:21:24.894174 master-0 kubenswrapper[7479]: I0308 00:21:24.893176 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58333089-2456-4a25-8ba7-6d557eefa177-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-dkqc4\" (UID: \"58333089-2456-4a25-8ba7-6d557eefa177\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4" Mar 08 00:21:24.894174 master-0 kubenswrapper[7479]: I0308 00:21:24.893353 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-serving-cert\") pod 
\"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk" Mar 08 00:21:24.894174 master-0 kubenswrapper[7479]: I0308 00:21:24.893359 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef0a3c84-98bb-4915-9010-d66fcbeafe09-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-49hzm\" (UID: \"ef0a3c84-98bb-4915-9010-d66fcbeafe09\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-49hzm" Mar 08 00:21:24.894174 master-0 kubenswrapper[7479]: I0308 00:21:24.893373 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 08 00:21:24.894174 master-0 kubenswrapper[7479]: I0308 00:21:24.893440 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/365dc4ac-fbc8-4589-a799-8327b3ebd0a5-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-pfdrx\" (UID: \"365dc4ac-fbc8-4589-a799-8327b3ebd0a5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-pfdrx" Mar 08 00:21:24.894174 master-0 kubenswrapper[7479]: I0308 00:21:24.893531 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-trusted-ca\") pod \"ingress-operator-677db989d6-blw5x\" (UID: \"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x" Mar 08 00:21:24.894174 master-0 kubenswrapper[7479]: I0308 00:21:24.893549 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/365dc4ac-fbc8-4589-a799-8327b3ebd0a5-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-pfdrx\" (UID: 
\"365dc4ac-fbc8-4589-a799-8327b3ebd0a5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-pfdrx" Mar 08 00:21:24.894174 master-0 kubenswrapper[7479]: I0308 00:21:24.893558 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwsqr\" (UniqueName: \"kubernetes.io/projected/6999cf38-e317-4727-98c9-d4e348e9e16a-kube-api-access-pwsqr\") pod \"cluster-image-registry-operator-86d6d77c7c-k7dp2\" (UID: \"6999cf38-e317-4727-98c9-d4e348e9e16a\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2" Mar 08 00:21:24.894174 master-0 kubenswrapper[7479]: I0308 00:21:24.893608 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/32c19760-2cb2-4690-be8e-cba3c517c60e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-dcbvq\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq" Mar 08 00:21:24.894174 master-0 kubenswrapper[7479]: I0308 00:21:24.893623 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 08 00:21:24.894174 master-0 kubenswrapper[7479]: I0308 00:21:24.893651 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ec2d22f2-c260-42a6-a9da-ee0f44f42303-metrics-tls\") pod \"network-operator-7c649bf6d4-st2sr\" (UID: \"ec2d22f2-c260-42a6-a9da-ee0f44f42303\") " pod="openshift-network-operator/network-operator-7c649bf6d4-st2sr" Mar 08 00:21:24.894174 master-0 kubenswrapper[7479]: I0308 00:21:24.893689 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2b1a69b5-c946-495d-ae02-c56f788279e8-available-featuregates\") pod 
\"openshift-config-operator-64488f9d78-vnl28\" (UID: \"2b1a69b5-c946-495d-ae02-c56f788279e8\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" Mar 08 00:21:24.894174 master-0 kubenswrapper[7479]: I0308 00:21:24.893695 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58333089-2456-4a25-8ba7-6d557eefa177-config\") pod \"authentication-operator-7c6989d6c4-dkqc4\" (UID: \"58333089-2456-4a25-8ba7-6d557eefa177\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4" Mar 08 00:21:24.894174 master-0 kubenswrapper[7479]: I0308 00:21:24.893725 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58333089-2456-4a25-8ba7-6d557eefa177-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-dkqc4\" (UID: \"58333089-2456-4a25-8ba7-6d557eefa177\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4" Mar 08 00:21:24.894174 master-0 kubenswrapper[7479]: I0308 00:21:24.893767 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5knc\" (UniqueName: \"kubernetes.io/projected/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-kube-api-access-d5knc\") pod \"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk" Mar 08 00:21:24.894174 master-0 kubenswrapper[7479]: I0308 00:21:24.893788 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6999cf38-e317-4727-98c9-d4e348e9e16a-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-k7dp2\" (UID: \"6999cf38-e317-4727-98c9-d4e348e9e16a\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2" Mar 08 00:21:24.894174 master-0 kubenswrapper[7479]: I0308 00:21:24.893805 
7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/32c19760-2cb2-4690-be8e-cba3c517c60e-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-dcbvq\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq" Mar 08 00:21:24.894174 master-0 kubenswrapper[7479]: I0308 00:21:24.893944 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44jml\" (UniqueName: \"kubernetes.io/projected/3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab-kube-api-access-44jml\") pod \"openshift-apiserver-operator-799b6db4d7-rj9cl\" (UID: \"3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-rj9cl" Mar 08 00:21:24.894174 master-0 kubenswrapper[7479]: I0308 00:21:24.893967 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6999cf38-e317-4727-98c9-d4e348e9e16a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-k7dp2\" (UID: \"6999cf38-e317-4727-98c9-d4e348e9e16a\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2" Mar 08 00:21:24.894174 master-0 kubenswrapper[7479]: I0308 00:21:24.894013 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2b1a69b5-c946-495d-ae02-c56f788279e8-available-featuregates\") pod \"openshift-config-operator-64488f9d78-vnl28\" (UID: \"2b1a69b5-c946-495d-ae02-c56f788279e8\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" Mar 08 00:21:24.894174 master-0 kubenswrapper[7479]: I0308 00:21:24.894058 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/32c19760-2cb2-4690-be8e-cba3c517c60e-service-ca\") pod \"cluster-version-operator-745944c6b7-dcbvq\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq" Mar 08 00:21:24.894174 master-0 kubenswrapper[7479]: I0308 00:21:24.894099 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/365dc4ac-fbc8-4589-a799-8327b3ebd0a5-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-pfdrx\" (UID: \"365dc4ac-fbc8-4589-a799-8327b3ebd0a5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-pfdrx" Mar 08 00:21:24.894174 master-0 kubenswrapper[7479]: I0308 00:21:24.894118 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-etcd-ca\") pod \"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk" Mar 08 00:21:24.894174 master-0 kubenswrapper[7479]: I0308 00:21:24.894230 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-etcd-ca\") pod \"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk" Mar 08 00:21:24.894174 master-0 kubenswrapper[7479]: I0308 00:21:24.894247 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/32c19760-2cb2-4690-be8e-cba3c517c60e-service-ca\") pod \"cluster-version-operator-745944c6b7-dcbvq\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq" Mar 08 00:21:24.894174 master-0 
kubenswrapper[7479]: I0308 00:21:24.894258 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6999cf38-e317-4727-98c9-d4e348e9e16a-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-k7dp2\" (UID: \"6999cf38-e317-4727-98c9-d4e348e9e16a\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2" Mar 08 00:21:24.898986 master-0 kubenswrapper[7479]: I0308 00:21:24.894303 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ec2d22f2-c260-42a6-a9da-ee0f44f42303-metrics-tls\") pod \"network-operator-7c649bf6d4-st2sr\" (UID: \"ec2d22f2-c260-42a6-a9da-ee0f44f42303\") " pod="openshift-network-operator/network-operator-7c649bf6d4-st2sr" Mar 08 00:21:24.898986 master-0 kubenswrapper[7479]: I0308 00:21:24.895183 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Mar 08 00:21:24.898986 master-0 kubenswrapper[7479]: I0308 00:21:24.895226 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 08 00:21:24.898986 master-0 kubenswrapper[7479]: I0308 00:21:24.895317 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 08 00:21:24.898986 master-0 kubenswrapper[7479]: I0308 00:21:24.895654 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 08 00:21:24.898986 master-0 kubenswrapper[7479]: I0308 00:21:24.895917 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 08 00:21:24.898986 master-0 kubenswrapper[7479]: I0308 00:21:24.896105 7479 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-monitoring"/"openshift-service-ca.crt" Mar 08 00:21:24.898986 master-0 kubenswrapper[7479]: I0308 00:21:24.896300 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 08 00:21:24.898986 master-0 kubenswrapper[7479]: I0308 00:21:24.896600 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 08 00:21:24.898986 master-0 kubenswrapper[7479]: I0308 00:21:24.896716 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 08 00:21:24.898986 master-0 kubenswrapper[7479]: I0308 00:21:24.896843 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Mar 08 00:21:24.898986 master-0 kubenswrapper[7479]: I0308 00:21:24.896917 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 08 00:21:24.898986 master-0 kubenswrapper[7479]: I0308 00:21:24.897056 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 08 00:21:24.898986 master-0 kubenswrapper[7479]: I0308 00:21:24.897116 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 08 00:21:24.898986 master-0 kubenswrapper[7479]: I0308 00:21:24.897183 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 08 00:21:24.898986 master-0 kubenswrapper[7479]: I0308 00:21:24.897274 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 08 00:21:24.910363 master-0 kubenswrapper[7479]: I0308 00:21:24.902567 7479 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 08 00:21:24.910363 master-0 kubenswrapper[7479]: I0308 00:21:24.903026 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 08 00:21:24.910363 master-0 kubenswrapper[7479]: I0308 00:21:24.903799 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 08 00:21:24.910363 master-0 kubenswrapper[7479]: I0308 00:21:24.904224 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Mar 08 00:21:24.910363 master-0 kubenswrapper[7479]: I0308 00:21:24.904335 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 08 00:21:24.910363 master-0 kubenswrapper[7479]: I0308 00:21:24.904421 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 08 00:21:24.910363 master-0 kubenswrapper[7479]: I0308 00:21:24.904639 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 08 00:21:24.910363 master-0 kubenswrapper[7479]: I0308 00:21:24.904902 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Mar 08 00:21:24.910363 master-0 kubenswrapper[7479]: I0308 00:21:24.905028 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 08 00:21:24.910363 master-0 kubenswrapper[7479]: I0308 00:21:24.905229 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Mar 08 00:21:24.910363 master-0 kubenswrapper[7479]: 
I0308 00:21:24.905269 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 08 00:21:24.910363 master-0 kubenswrapper[7479]: I0308 00:21:24.905468 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Mar 08 00:21:24.910363 master-0 kubenswrapper[7479]: I0308 00:21:24.905698 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Mar 08 00:21:24.910363 master-0 kubenswrapper[7479]: I0308 00:21:24.905964 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Mar 08 00:21:24.910363 master-0 kubenswrapper[7479]: I0308 00:21:24.906164 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58333089-2456-4a25-8ba7-6d557eefa177-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-dkqc4\" (UID: \"58333089-2456-4a25-8ba7-6d557eefa177\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4" Mar 08 00:21:24.910363 master-0 kubenswrapper[7479]: I0308 00:21:24.906561 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config" Mar 08 00:21:24.910363 master-0 kubenswrapper[7479]: I0308 00:21:24.906925 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 08 00:21:24.911756 master-0 kubenswrapper[7479]: I0308 00:21:24.911717 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 08 00:21:24.911901 master-0 kubenswrapper[7479]: I0308 00:21:24.911760 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Mar 08 
00:21:24.911901 master-0 kubenswrapper[7479]: I0308 00:21:24.911738 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 08 00:21:24.911975 master-0 kubenswrapper[7479]: I0308 00:21:24.911944 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfdxc\" (UniqueName: \"kubernetes.io/projected/03f4bafb-c270-428a-bacf-8a424b3d1a05-kube-api-access-pfdxc\") pod \"dns-operator-589895fbb7-gmvnl\" (UID: \"03f4bafb-c270-428a-bacf-8a424b3d1a05\") " pod="openshift-dns-operator/dns-operator-589895fbb7-gmvnl" Mar 08 00:21:24.912311 master-0 kubenswrapper[7479]: I0308 00:21:24.912004 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 08 00:21:24.912311 master-0 kubenswrapper[7479]: I0308 00:21:24.912076 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 08 00:21:24.912311 master-0 kubenswrapper[7479]: I0308 00:21:24.912099 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 08 00:21:24.912311 master-0 kubenswrapper[7479]: I0308 00:21:24.912046 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 08 00:21:24.912311 master-0 kubenswrapper[7479]: I0308 00:21:24.912170 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 08 00:21:24.912311 master-0 kubenswrapper[7479]: I0308 00:21:24.912187 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 08 00:21:24.912311 master-0 kubenswrapper[7479]: I0308 00:21:24.912187 7479 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Mar 08 00:21:24.912311 master-0 kubenswrapper[7479]: I0308 00:21:24.912046 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 08 00:21:24.912311 master-0 kubenswrapper[7479]: I0308 00:21:24.912269 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 08 00:21:24.912861 master-0 kubenswrapper[7479]: I0308 00:21:24.912315 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 08 00:21:24.912861 master-0 kubenswrapper[7479]: I0308 00:21:24.912448 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Mar 08 00:21:24.912861 master-0 kubenswrapper[7479]: I0308 00:21:24.912725 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Mar 08 00:21:24.921849 master-0 kubenswrapper[7479]: I0308 00:21:24.921808 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 08 00:21:24.921971 master-0 kubenswrapper[7479]: I0308 00:21:24.921863 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 08 00:21:24.922100 master-0 kubenswrapper[7479]: I0308 00:21:24.922041 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 08 00:21:24.922748 master-0 kubenswrapper[7479]: I0308 00:21:24.922716 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9l64\" (UniqueName: \"kubernetes.io/projected/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-kube-api-access-z9l64\") pod \"ingress-operator-677db989d6-blw5x\" (UID: \"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b\") " 
pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x" Mar 08 00:21:24.924309 master-0 kubenswrapper[7479]: I0308 00:21:24.924272 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-trusted-ca\") pod \"ingress-operator-677db989d6-blw5x\" (UID: \"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x" Mar 08 00:21:24.924667 master-0 kubenswrapper[7479]: I0308 00:21:24.924647 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 08 00:21:24.926624 master-0 kubenswrapper[7479]: I0308 00:21:24.926588 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Mar 08 00:21:24.938936 master-0 kubenswrapper[7479]: I0308 00:21:24.938896 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 08 00:21:24.953223 master-0 kubenswrapper[7479]: I0308 00:21:24.953172 7479 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Mar 08 00:21:24.972919 master-0 kubenswrapper[7479]: I0308 00:21:24.972871 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b100ce12-965e-409e-8cdb-8f99ef51a82b-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-7gtw2\" (UID: \"b100ce12-965e-409e-8cdb-8f99ef51a82b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-7gtw2" Mar 08 00:21:24.989658 master-0 kubenswrapper[7479]: I0308 00:21:24.989589 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/32c19760-2cb2-4690-be8e-cba3c517c60e-kube-api-access\") pod \"cluster-version-operator-745944c6b7-dcbvq\" 
(UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq" Mar 08 00:21:24.995455 master-0 kubenswrapper[7479]: I0308 00:21:24.995404 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9" Mar 08 00:21:24.995523 master-0 kubenswrapper[7479]: I0308 00:21:24.995469 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0e52cbdc-1d46-4cc9-85ee-535aa449992f-host-slash\") pod \"iptables-alerter-rfnqf\" (UID: \"0e52cbdc-1d46-4cc9-85ee-535aa449992f\") " pod="openshift-network-operator/iptables-alerter-rfnqf" Mar 08 00:21:24.995523 master-0 kubenswrapper[7479]: I0308 00:21:24.995511 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2ce2ea7-bd25-4294-8f3a-11ce53577830-serving-cert\") pod \"service-ca-operator-69b6fc6b88-p8hlq\" (UID: \"c2ce2ea7-bd25-4294-8f3a-11ce53577830\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-p8hlq" Mar 08 00:21:24.995583 master-0 kubenswrapper[7479]: I0308 00:21:24.995544 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-metrics-tls\") pod \"ingress-operator-677db989d6-blw5x\" (UID: \"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x" Mar 08 00:21:24.995615 master-0 kubenswrapper[7479]: I0308 00:21:24.995596 7479 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-run-openvswitch\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:24.995646 master-0 kubenswrapper[7479]: I0308 00:21:24.995630 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6d770808-d390-41c1-a9d9-fc12b99fa9a9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-cxs8s\" (UID: \"6d770808-d390-41c1-a9d9-fc12b99fa9a9\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s" Mar 08 00:21:24.995703 master-0 kubenswrapper[7479]: I0308 00:21:24.995674 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-var-lib-openvswitch\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:24.995703 master-0 kubenswrapper[7479]: I0308 00:21:24.995694 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2ce2ea7-bd25-4294-8f3a-11ce53577830-serving-cert\") pod \"service-ca-operator-69b6fc6b88-p8hlq\" (UID: \"c2ce2ea7-bd25-4294-8f3a-11ce53577830\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-p8hlq" Mar 08 00:21:24.995764 master-0 kubenswrapper[7479]: E0308 00:21:24.995713 7479 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 08 00:21:24.995795 master-0 kubenswrapper[7479]: I0308 00:21:24.995776 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1abf904b-0b8d-4d61-8231-0e8d00933192-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9" Mar 08 00:21:24.995836 master-0 kubenswrapper[7479]: I0308 00:21:24.995815 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e76bc134-2a88-4f92-9aa7-f6854941b98f-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-bh886\" (UID: \"e76bc134-2a88-4f92-9aa7-f6854941b98f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-bh886" Mar 08 00:21:24.995932 master-0 kubenswrapper[7479]: I0308 00:21:24.995902 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0e52cbdc-1d46-4cc9-85ee-535aa449992f-iptables-alerter-script\") pod \"iptables-alerter-rfnqf\" (UID: \"0e52cbdc-1d46-4cc9-85ee-535aa449992f\") " pod="openshift-network-operator/iptables-alerter-rfnqf" Mar 08 00:21:24.996034 master-0 kubenswrapper[7479]: E0308 00:21:24.995937 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-metrics-tls podName:4d0b9fbc-a1f8-4a98-99de-758734bd1a5b nodeName:}" failed. No retries permitted until 2026-03-08 00:21:25.49590757 +0000 UTC m=+1.808816487 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-metrics-tls") pod "ingress-operator-677db989d6-blw5x" (UID: "4d0b9fbc-a1f8-4a98-99de-758734bd1a5b") : secret "metrics-tls" not found Mar 08 00:21:24.996034 master-0 kubenswrapper[7479]: I0308 00:21:24.995979 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3fee96d7-75a7-46e4-9707-7bd292f10b84-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-m77x2\" (UID: \"3fee96d7-75a7-46e4-9707-7bd292f10b84\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2" Mar 08 00:21:24.996034 master-0 kubenswrapper[7479]: I0308 00:21:24.996013 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-log-socket\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:24.996122 master-0 kubenswrapper[7479]: I0308 00:21:24.996040 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkl4m\" (UniqueName: \"kubernetes.io/projected/af391724-079a-4bac-a89e-978ffd471763-kube-api-access-gkl4m\") pod \"network-node-identity-m7549\" (UID: \"af391724-079a-4bac-a89e-978ffd471763\") " pod="openshift-network-node-identity/network-node-identity-m7549" Mar 08 00:21:24.996122 master-0 kubenswrapper[7479]: I0308 00:21:24.996073 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-var-lib-kubelet\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:24.996174 master-0 kubenswrapper[7479]: 
I0308 00:21:24.996153 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-run-ovn-kubernetes\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:24.996219 master-0 kubenswrapper[7479]: I0308 00:21:24.996178 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/6d770808-d390-41c1-a9d9-fc12b99fa9a9-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-cxs8s\" (UID: \"6d770808-d390-41c1-a9d9-fc12b99fa9a9\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s" Mar 08 00:21:24.996219 master-0 kubenswrapper[7479]: I0308 00:21:24.996197 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxk5x\" (UniqueName: \"kubernetes.io/projected/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-kube-api-access-bxk5x\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:21:24.996270 master-0 kubenswrapper[7479]: I0308 00:21:24.996229 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e76bc134-2a88-4f92-9aa7-f6854941b98f-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-bh886\" (UID: \"e76bc134-2a88-4f92-9aa7-f6854941b98f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-bh886" Mar 08 00:21:24.996348 master-0 kubenswrapper[7479]: I0308 00:21:24.996286 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1abf904b-0b8d-4d61-8231-0e8d00933192-trusted-ca\") pod 
\"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9" Mar 08 00:21:24.996385 master-0 kubenswrapper[7479]: I0308 00:21:24.996291 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e76bc134-2a88-4f92-9aa7-f6854941b98f-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-bh886\" (UID: \"e76bc134-2a88-4f92-9aa7-f6854941b98f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-bh886" Mar 08 00:21:24.996537 master-0 kubenswrapper[7479]: I0308 00:21:24.996503 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/ec2d22f2-c260-42a6-a9da-ee0f44f42303-host-etc-kube\") pod \"network-operator-7c649bf6d4-st2sr\" (UID: \"ec2d22f2-c260-42a6-a9da-ee0f44f42303\") " pod="openshift-network-operator/network-operator-7c649bf6d4-st2sr" Mar 08 00:21:24.996537 master-0 kubenswrapper[7479]: I0308 00:21:24.996517 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e76bc134-2a88-4f92-9aa7-f6854941b98f-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-bh886\" (UID: \"e76bc134-2a88-4f92-9aa7-f6854941b98f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-bh886" Mar 08 00:21:24.996595 master-0 kubenswrapper[7479]: I0308 00:21:24.996541 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-cni-binary-copy\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:21:24.996595 master-0 kubenswrapper[7479]: I0308 
00:21:24.996549 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/6d770808-d390-41c1-a9d9-fc12b99fa9a9-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-cxs8s\" (UID: \"6d770808-d390-41c1-a9d9-fc12b99fa9a9\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s" Mar 08 00:21:24.996595 master-0 kubenswrapper[7479]: I0308 00:21:24.996563 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3fee96d7-75a7-46e4-9707-7bd292f10b84-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-m77x2\" (UID: \"3fee96d7-75a7-46e4-9707-7bd292f10b84\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2" Mar 08 00:21:24.996683 master-0 kubenswrapper[7479]: I0308 00:21:24.996570 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-mgb5v\" (UID: \"5cf5a2ef-2498-40a0-a189-0753076fd3b6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" Mar 08 00:21:24.996683 master-0 kubenswrapper[7479]: I0308 00:21:24.996618 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/ec2d22f2-c260-42a6-a9da-ee0f44f42303-host-etc-kube\") pod \"network-operator-7c649bf6d4-st2sr\" (UID: \"ec2d22f2-c260-42a6-a9da-ee0f44f42303\") " pod="openshift-network-operator/network-operator-7c649bf6d4-st2sr" Mar 08 00:21:24.996683 master-0 kubenswrapper[7479]: I0308 00:21:24.996643 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-run-multus-certs\") pod \"multus-dllkj\" 
(UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:24.996763 master-0 kubenswrapper[7479]: I0308 00:21:24.996708 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/af391724-079a-4bac-a89e-978ffd471763-env-overrides\") pod \"network-node-identity-m7549\" (UID: \"af391724-079a-4bac-a89e-978ffd471763\") " pod="openshift-network-node-identity/network-node-identity-m7549" Mar 08 00:21:24.996791 master-0 kubenswrapper[7479]: I0308 00:21:24.996758 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-system-cni-dir\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:24.996819 master-0 kubenswrapper[7479]: I0308 00:21:24.996785 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-webhook-certs\") pod \"multus-admission-controller-8d675b596-jgdmb\" (UID: \"d7a0bdcc-92f5-41e6-ab47-ee48a5788bac\") " pod="openshift-multus/multus-admission-controller-8d675b596-jgdmb" Mar 08 00:21:24.996819 master-0 kubenswrapper[7479]: I0308 00:21:24.996814 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k88m9\" (UniqueName: \"kubernetes.io/projected/5cf5a2ef-2498-40a0-a189-0753076fd3b6-kube-api-access-k88m9\") pod \"marketplace-operator-64bf9778cb-mgb5v\" (UID: \"5cf5a2ef-2498-40a0-a189-0753076fd3b6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" Mar 08 00:21:24.996870 master-0 kubenswrapper[7479]: I0308 00:21:24.996833 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" 
(UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-run-systemd\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:24.996870 master-0 kubenswrapper[7479]: I0308 00:21:24.996851 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-cni-netd\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:24.996924 master-0 kubenswrapper[7479]: I0308 00:21:24.996869 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-ovnkube-config\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:24.996924 master-0 kubenswrapper[7479]: I0308 00:21:24.996889 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs\") pod \"network-metrics-daemon-krv7c\" (UID: \"815fd565-0609-4d8f-ac05-8656f198b008\") " pod="openshift-multus/network-metrics-daemon-krv7c" Mar 08 00:21:24.996924 master-0 kubenswrapper[7479]: I0308 00:21:24.996868 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-cni-binary-copy\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:21:24.996924 master-0 kubenswrapper[7479]: I0308 00:21:24.996907 7479 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-os-release\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:21:24.996924 master-0 kubenswrapper[7479]: I0308 00:21:24.996913 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-mgb5v\" (UID: \"5cf5a2ef-2498-40a0-a189-0753076fd3b6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" Mar 08 00:21:24.997084 master-0 kubenswrapper[7479]: I0308 00:21:24.996932 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q6hn\" (UniqueName: \"kubernetes.io/projected/c1abfb79-2c86-4ccb-bf91-7c48ad8c78d8-kube-api-access-5q6hn\") pod \"csi-snapshot-controller-operator-5685fbc7d-5v8g4\" (UID: \"c1abfb79-2c86-4ccb-bf91-7c48ad8c78d8\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-5v8g4" Mar 08 00:21:24.997084 master-0 kubenswrapper[7479]: I0308 00:21:24.997029 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rfqt\" (UniqueName: \"kubernetes.io/projected/6d770808-d390-41c1-a9d9-fc12b99fa9a9-kube-api-access-6rfqt\") pod \"cluster-monitoring-operator-674cbfbd9d-cxs8s\" (UID: \"6d770808-d390-41c1-a9d9-fc12b99fa9a9\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s" Mar 08 00:21:24.997084 master-0 kubenswrapper[7479]: I0308 00:21:24.997070 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbdd4\" (UniqueName: \"kubernetes.io/projected/1abf904b-0b8d-4d61-8231-0e8d00933192-kube-api-access-dbdd4\") pod 
\"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9" Mar 08 00:21:24.997168 master-0 kubenswrapper[7479]: I0308 00:21:24.997097 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3fee96d7-75a7-46e4-9707-7bd292f10b84-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-m77x2\" (UID: \"3fee96d7-75a7-46e4-9707-7bd292f10b84\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2" Mar 08 00:21:24.997168 master-0 kubenswrapper[7479]: I0308 00:21:24.997121 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert\") pod \"cluster-version-operator-745944c6b7-dcbvq\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq" Mar 08 00:21:24.997168 master-0 kubenswrapper[7479]: I0308 00:21:24.997145 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b94acad3-cf4e-443d-80fb-5e68a4074336-srv-cert\") pod \"catalog-operator-7d9c49f57b-8jr6f\" (UID: \"b94acad3-cf4e-443d-80fb-5e68a4074336\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-8jr6f" Mar 08 00:21:24.997168 master-0 kubenswrapper[7479]: I0308 00:21:24.997160 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-ovnkube-config\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:24.997360 master-0 kubenswrapper[7479]: I0308 00:21:24.997170 7479 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/32c19760-2cb2-4690-be8e-cba3c517c60e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-dcbvq\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq" Mar 08 00:21:24.997360 master-0 kubenswrapper[7479]: I0308 00:21:24.997220 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/32c19760-2cb2-4690-be8e-cba3c517c60e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-dcbvq\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq" Mar 08 00:21:24.997360 master-0 kubenswrapper[7479]: I0308 00:21:24.997232 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh6nz\" (UniqueName: \"kubernetes.io/projected/815fd565-0609-4d8f-ac05-8656f198b008-kube-api-access-sh6nz\") pod \"network-metrics-daemon-krv7c\" (UID: \"815fd565-0609-4d8f-ac05-8656f198b008\") " pod="openshift-multus/network-metrics-daemon-krv7c" Mar 08 00:21:24.997360 master-0 kubenswrapper[7479]: I0308 00:21:24.997256 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3fee96d7-75a7-46e4-9707-7bd292f10b84-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-m77x2\" (UID: \"3fee96d7-75a7-46e4-9707-7bd292f10b84\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2" Mar 08 00:21:24.997360 master-0 kubenswrapper[7479]: I0308 00:21:24.997264 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/af391724-079a-4bac-a89e-978ffd471763-env-overrides\") pod \"network-node-identity-m7549\" (UID: \"af391724-079a-4bac-a89e-978ffd471763\") " 
pod="openshift-network-node-identity/network-node-identity-m7549" Mar 08 00:21:24.997360 master-0 kubenswrapper[7479]: E0308 00:21:24.997276 7479 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 08 00:21:24.997360 master-0 kubenswrapper[7479]: I0308 00:21:24.997259 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-multus-cni-dir\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:24.997360 master-0 kubenswrapper[7479]: E0308 00:21:24.997316 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert podName:32c19760-2cb2-4690-be8e-cba3c517c60e nodeName:}" failed. No retries permitted until 2026-03-08 00:21:25.497302083 +0000 UTC m=+1.810210990 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert") pod "cluster-version-operator-745944c6b7-dcbvq" (UID: "32c19760-2cb2-4690-be8e-cba3c517c60e") : secret "cluster-version-operator-serving-cert" not found Mar 08 00:21:24.997360 master-0 kubenswrapper[7479]: I0308 00:21:24.997336 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2ce2ea7-bd25-4294-8f3a-11ce53577830-config\") pod \"service-ca-operator-69b6fc6b88-p8hlq\" (UID: \"c2ce2ea7-bd25-4294-8f3a-11ce53577830\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-p8hlq" Mar 08 00:21:24.997593 master-0 kubenswrapper[7479]: I0308 00:21:24.997378 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/af391724-079a-4bac-a89e-978ffd471763-webhook-cert\") pod \"network-node-identity-m7549\" (UID: \"af391724-079a-4bac-a89e-978ffd471763\") " pod="openshift-network-node-identity/network-node-identity-m7549" Mar 08 00:21:24.997593 master-0 kubenswrapper[7479]: I0308 00:21:24.997416 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-cnibin\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:21:24.997593 master-0 kubenswrapper[7479]: I0308 00:21:24.997453 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-kubelet\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:24.997593 
master-0 kubenswrapper[7479]: I0308 00:21:24.997482 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-slash\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:24.997695 master-0 kubenswrapper[7479]: I0308 00:21:24.997606 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2ce2ea7-bd25-4294-8f3a-11ce53577830-config\") pod \"service-ca-operator-69b6fc6b88-p8hlq\" (UID: \"c2ce2ea7-bd25-4294-8f3a-11ce53577830\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-p8hlq" Mar 08 00:21:24.997781 master-0 kubenswrapper[7479]: I0308 00:21:24.997619 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:24.997924 master-0 kubenswrapper[7479]: I0308 00:21:24.997630 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/af391724-079a-4bac-a89e-978ffd471763-webhook-cert\") pod \"network-node-identity-m7549\" (UID: \"af391724-079a-4bac-a89e-978ffd471763\") " pod="openshift-network-node-identity/network-node-identity-m7549" Mar 08 00:21:24.998083 master-0 kubenswrapper[7479]: I0308 00:21:24.998017 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjcjb\" (UniqueName: \"kubernetes.io/projected/ac523956-c8a3-4794-a1fa-660cd14966bb-kube-api-access-wjcjb\") pod 
\"kube-storage-version-migrator-operator-7f65c457f5-st7mk\" (UID: \"ac523956-c8a3-4794-a1fa-660cd14966bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-st7mk" Mar 08 00:21:24.998250 master-0 kubenswrapper[7479]: I0308 00:21:24.998217 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/af391724-079a-4bac-a89e-978ffd471763-ovnkube-identity-cm\") pod \"network-node-identity-m7549\" (UID: \"af391724-079a-4bac-a89e-978ffd471763\") " pod="openshift-network-node-identity/network-node-identity-m7549" Mar 08 00:21:24.998348 master-0 kubenswrapper[7479]: I0308 00:21:24.998311 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-run-netns\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:24.998414 master-0 kubenswrapper[7479]: I0308 00:21:24.998392 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/db164b32-e20e-4d07-a9ae-98720321621d-operand-assets\") pod \"cluster-olm-operator-77899cf6d-r9zcq\" (UID: \"db164b32-e20e-4d07-a9ae-98720321621d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-r9zcq" Mar 08 00:21:24.998619 master-0 kubenswrapper[7479]: I0308 00:21:24.998587 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/db164b32-e20e-4d07-a9ae-98720321621d-operand-assets\") pod \"cluster-olm-operator-77899cf6d-r9zcq\" (UID: \"db164b32-e20e-4d07-a9ae-98720321621d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-r9zcq" Mar 08 00:21:24.998619 master-0 kubenswrapper[7479]: I0308 00:21:24.998605 7479 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89wj5\" (UniqueName: \"kubernetes.io/projected/db164b32-e20e-4d07-a9ae-98720321621d-kube-api-access-89wj5\") pod \"cluster-olm-operator-77899cf6d-r9zcq\" (UID: \"db164b32-e20e-4d07-a9ae-98720321621d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-r9zcq" Mar 08 00:21:24.998698 master-0 kubenswrapper[7479]: I0308 00:21:24.998591 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/af391724-079a-4bac-a89e-978ffd471763-ovnkube-identity-cm\") pod \"network-node-identity-m7549\" (UID: \"af391724-079a-4bac-a89e-978ffd471763\") " pod="openshift-network-node-identity/network-node-identity-m7549" Mar 08 00:21:24.998845 master-0 kubenswrapper[7479]: I0308 00:21:24.998814 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-phgxj\" (UID: \"8f71fd39-a16b-47d2-b781-c8ce37bcb9b2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj" Mar 08 00:21:24.998888 master-0 kubenswrapper[7479]: I0308 00:21:24.998859 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-node-log\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:24.998937 master-0 kubenswrapper[7479]: I0308 00:21:24.998919 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac523956-c8a3-4794-a1fa-660cd14966bb-serving-cert\") pod 
\"kube-storage-version-migrator-operator-7f65c457f5-st7mk\" (UID: \"ac523956-c8a3-4794-a1fa-660cd14966bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-st7mk" Mar 08 00:21:24.998974 master-0 kubenswrapper[7479]: I0308 00:21:24.998948 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/db164b32-e20e-4d07-a9ae-98720321621d-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-r9zcq\" (UID: \"db164b32-e20e-4d07-a9ae-98720321621d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-r9zcq" Mar 08 00:21:24.998974 master-0 kubenswrapper[7479]: I0308 00:21:24.998967 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-var-lib-cni-multus\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:24.999084 master-0 kubenswrapper[7479]: I0308 00:21:24.998991 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-system-cni-dir\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:21:24.999084 master-0 kubenswrapper[7479]: I0308 00:21:24.999009 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-multus-conf-dir\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:24.999084 master-0 kubenswrapper[7479]: 
I0308 00:21:24.999027 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg5d9\" (UniqueName: \"kubernetes.io/projected/7da68e85-9170-499d-8050-139ecfac4600-kube-api-access-bg5d9\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:24.999084 master-0 kubenswrapper[7479]: I0308 00:21:24.999043 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntks9\" (UniqueName: \"kubernetes.io/projected/3fee96d7-75a7-46e4-9707-7bd292f10b84-kube-api-access-ntks9\") pod \"ovnkube-control-plane-66b55d57d-m77x2\" (UID: \"3fee96d7-75a7-46e4-9707-7bd292f10b84\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2" Mar 08 00:21:24.999084 master-0 kubenswrapper[7479]: I0308 00:21:24.999066 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-ovn-node-metrics-cert\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:24.999084 master-0 kubenswrapper[7479]: I0308 00:21:24.999082 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-ovnkube-script-lib\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:24.999605 master-0 kubenswrapper[7479]: I0308 00:21:24.999100 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tml5\" (UniqueName: \"kubernetes.io/projected/b94acad3-cf4e-443d-80fb-5e68a4074336-kube-api-access-7tml5\") pod \"catalog-operator-7d9c49f57b-8jr6f\" (UID: \"b94acad3-cf4e-443d-80fb-5e68a4074336\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-8jr6f" Mar 08 00:21:24.999700 master-0 kubenswrapper[7479]: I0308 00:21:24.999652 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wrq9\" (UniqueName: \"kubernetes.io/projected/4ad37f40-c533-4a1e-882a-2e0973eff86d-kube-api-access-6wrq9\") pod \"olm-operator-d64cfc9db-8qtmf\" (UID: \"4ad37f40-c533-4a1e-882a-2e0973eff86d\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8qtmf" Mar 08 00:21:24.999700 master-0 kubenswrapper[7479]: I0308 00:21:24.999681 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9" Mar 08 00:21:24.999786 master-0 kubenswrapper[7479]: I0308 00:21:24.999712 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-multus-socket-dir-parent\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:24.999786 master-0 kubenswrapper[7479]: I0308 00:21:24.999732 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-whereabouts-configmap\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:21:24.999786 master-0 kubenswrapper[7479]: I0308 00:21:24.999749 7479 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-etc-openvswitch\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:24.999786 master-0 kubenswrapper[7479]: I0308 00:21:24.999774 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-hostroot\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:24.999975 master-0 kubenswrapper[7479]: I0308 00:21:24.999790 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3fee96d7-75a7-46e4-9707-7bd292f10b84-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-m77x2\" (UID: \"3fee96d7-75a7-46e4-9707-7bd292f10b84\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2" Mar 08 00:21:24.999975 master-0 kubenswrapper[7479]: I0308 00:21:24.999809 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-var-lib-cni-bin\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:24.999975 master-0 kubenswrapper[7479]: I0308 00:21:24.999822 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-run-ovn\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:24.999975 master-0 
kubenswrapper[7479]: I0308 00:21:24.999841 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03f4bafb-c270-428a-bacf-8a424b3d1a05-metrics-tls\") pod \"dns-operator-589895fbb7-gmvnl\" (UID: \"03f4bafb-c270-428a-bacf-8a424b3d1a05\") " pod="openshift-dns-operator/dns-operator-589895fbb7-gmvnl" Mar 08 00:21:24.999975 master-0 kubenswrapper[7479]: I0308 00:21:24.999857 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-cni-bin\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:24.999975 master-0 kubenswrapper[7479]: I0308 00:21:24.999889 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-env-overrides\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:24.999975 master-0 kubenswrapper[7479]: I0308 00:21:24.999906 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fljc9\" (UniqueName: \"kubernetes.io/projected/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-kube-api-access-fljc9\") pod \"multus-admission-controller-8d675b596-jgdmb\" (UID: \"d7a0bdcc-92f5-41e6-ab47-ee48a5788bac\") " pod="openshift-multus/multus-admission-controller-8d675b596-jgdmb" Mar 08 00:21:24.999975 master-0 kubenswrapper[7479]: I0308 00:21:24.999921 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-cnibin\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " 
pod="openshift-multus/multus-dllkj" Mar 08 00:21:24.999975 master-0 kubenswrapper[7479]: I0308 00:21:24.999938 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e76bc134-2a88-4f92-9aa7-f6854941b98f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-bh886\" (UID: \"e76bc134-2a88-4f92-9aa7-f6854941b98f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-bh886" Mar 08 00:21:24.999975 master-0 kubenswrapper[7479]: I0308 00:21:24.999960 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qpkj\" (UniqueName: \"kubernetes.io/projected/c2ce2ea7-bd25-4294-8f3a-11ce53577830-kube-api-access-9qpkj\") pod \"service-ca-operator-69b6fc6b88-p8hlq\" (UID: \"c2ce2ea7-bd25-4294-8f3a-11ce53577830\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-p8hlq" Mar 08 00:21:25.000302 master-0 kubenswrapper[7479]: I0308 00:21:24.999984 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s99rr\" (UniqueName: \"kubernetes.io/projected/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-kube-api-access-s99rr\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:25.000302 master-0 kubenswrapper[7479]: I0308 00:21:25.000007 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-etc-kubernetes\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:25.000302 master-0 kubenswrapper[7479]: I0308 00:21:25.000031 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f9kl\" (UniqueName: 
\"kubernetes.io/projected/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-kube-api-access-2f9kl\") pod \"package-server-manager-854648ff6d-phgxj\" (UID: \"8f71fd39-a16b-47d2-b781-c8ce37bcb9b2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj" Mar 08 00:21:25.000302 master-0 kubenswrapper[7479]: I0308 00:21:25.000051 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh9cz\" (UniqueName: \"kubernetes.io/projected/1f63cb2f-779f-4fde-bf92-cf0414844a77-kube-api-access-wh9cz\") pod \"network-check-target-w5fjg\" (UID: \"1f63cb2f-779f-4fde-bf92-cf0414844a77\") " pod="openshift-network-diagnostics/network-check-target-w5fjg" Mar 08 00:21:25.000302 master-0 kubenswrapper[7479]: I0308 00:21:25.000077 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac523956-c8a3-4794-a1fa-660cd14966bb-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-st7mk\" (UID: \"ac523956-c8a3-4794-a1fa-660cd14966bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-st7mk" Mar 08 00:21:25.000302 master-0 kubenswrapper[7479]: I0308 00:21:25.000094 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqkqn\" (UniqueName: \"kubernetes.io/projected/0e52cbdc-1d46-4cc9-85ee-535aa449992f-kube-api-access-xqkqn\") pod \"iptables-alerter-rfnqf\" (UID: \"0e52cbdc-1d46-4cc9-85ee-535aa449992f\") " pod="openshift-network-operator/iptables-alerter-rfnqf" Mar 08 00:21:25.000302 master-0 kubenswrapper[7479]: I0308 00:21:25.000116 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/32c19760-2cb2-4690-be8e-cba3c517c60e-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-dcbvq\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " 
pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq" Mar 08 00:21:25.000302 master-0 kubenswrapper[7479]: I0308 00:21:25.000138 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-run-netns\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:25.000302 master-0 kubenswrapper[7479]: I0308 00:21:25.000155 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6999cf38-e317-4727-98c9-d4e348e9e16a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-k7dp2\" (UID: \"6999cf38-e317-4727-98c9-d4e348e9e16a\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2" Mar 08 00:21:25.000302 master-0 kubenswrapper[7479]: I0308 00:21:25.000171 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-os-release\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:25.000302 master-0 kubenswrapper[7479]: I0308 00:21:25.000186 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:21:25.000302 master-0 kubenswrapper[7479]: I0308 00:21:25.000248 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/7da68e85-9170-499d-8050-139ecfac4600-multus-daemon-config\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:25.001771 master-0 kubenswrapper[7479]: I0308 00:21:25.000275 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:21:25.001771 master-0 kubenswrapper[7479]: I0308 00:21:25.000443 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-systemd-units\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:25.001771 master-0 kubenswrapper[7479]: I0308 00:21:25.000521 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7da68e85-9170-499d-8050-139ecfac4600-cni-binary-copy\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:25.001771 master-0 kubenswrapper[7479]: I0308 00:21:25.000527 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/db164b32-e20e-4d07-a9ae-98720321621d-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-r9zcq\" (UID: \"db164b32-e20e-4d07-a9ae-98720321621d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-r9zcq" Mar 08 00:21:25.001771 master-0 kubenswrapper[7479]: I0308 00:21:25.000543 7479 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4ad37f40-c533-4a1e-882a-2e0973eff86d-srv-cert\") pod \"olm-operator-d64cfc9db-8qtmf\" (UID: \"4ad37f40-c533-4a1e-882a-2e0973eff86d\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8qtmf" Mar 08 00:21:25.001771 master-0 kubenswrapper[7479]: I0308 00:21:25.000597 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-run-k8s-cni-cncf-io\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:25.001771 master-0 kubenswrapper[7479]: I0308 00:21:25.000657 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-mgb5v\" (UID: \"5cf5a2ef-2498-40a0-a189-0753076fd3b6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" Mar 08 00:21:25.001771 master-0 kubenswrapper[7479]: I0308 00:21:25.000696 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-env-overrides\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:25.001771 master-0 kubenswrapper[7479]: I0308 00:21:25.001001 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-whereabouts-configmap\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " 
pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:21:25.001771 master-0 kubenswrapper[7479]: E0308 00:21:25.001012 7479 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 08 00:21:25.001771 master-0 kubenswrapper[7479]: I0308 00:21:25.001058 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7da68e85-9170-499d-8050-139ecfac4600-multus-daemon-config\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:25.001771 master-0 kubenswrapper[7479]: I0308 00:21:25.001169 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/32c19760-2cb2-4690-be8e-cba3c517c60e-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-dcbvq\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq" Mar 08 00:21:25.001771 master-0 kubenswrapper[7479]: I0308 00:21:25.001182 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3fee96d7-75a7-46e4-9707-7bd292f10b84-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-m77x2\" (UID: \"3fee96d7-75a7-46e4-9707-7bd292f10b84\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2" Mar 08 00:21:25.001771 master-0 kubenswrapper[7479]: E0308 00:21:25.001233 7479 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 08 00:21:25.001771 master-0 kubenswrapper[7479]: E0308 00:21:25.001268 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6999cf38-e317-4727-98c9-d4e348e9e16a-image-registry-operator-tls podName:6999cf38-e317-4727-98c9-d4e348e9e16a 
nodeName:}" failed. No retries permitted until 2026-03-08 00:21:25.501255696 +0000 UTC m=+1.814164613 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/6999cf38-e317-4727-98c9-d4e348e9e16a-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-k7dp2" (UID: "6999cf38-e317-4727-98c9-d4e348e9e16a") : secret "image-registry-operator-tls" not found Mar 08 00:21:25.001771 master-0 kubenswrapper[7479]: E0308 00:21:25.001300 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03f4bafb-c270-428a-bacf-8a424b3d1a05-metrics-tls podName:03f4bafb-c270-428a-bacf-8a424b3d1a05 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:25.501294827 +0000 UTC m=+1.814203744 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/03f4bafb-c270-428a-bacf-8a424b3d1a05-metrics-tls") pod "dns-operator-589895fbb7-gmvnl" (UID: "03f4bafb-c270-428a-bacf-8a424b3d1a05") : secret "metrics-tls" not found Mar 08 00:21:25.001771 master-0 kubenswrapper[7479]: I0308 00:21:25.001376 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7da68e85-9170-499d-8050-139ecfac4600-cni-binary-copy\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:25.001771 master-0 kubenswrapper[7479]: I0308 00:21:25.001385 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac523956-c8a3-4794-a1fa-660cd14966bb-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-st7mk\" (UID: \"ac523956-c8a3-4794-a1fa-660cd14966bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-st7mk" Mar 08 00:21:25.001771 master-0 kubenswrapper[7479]: I0308 
00:21:25.001491 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:21:25.014440 master-0 kubenswrapper[7479]: I0308 00:21:25.014389 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac523956-c8a3-4794-a1fa-660cd14966bb-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-st7mk\" (UID: \"ac523956-c8a3-4794-a1fa-660cd14966bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-st7mk" Mar 08 00:21:25.020268 master-0 kubenswrapper[7479]: I0308 00:21:25.020086 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlzcz\" (UniqueName: \"kubernetes.io/projected/ec2d22f2-c260-42a6-a9da-ee0f44f42303-kube-api-access-xlzcz\") pod \"network-operator-7c649bf6d4-st2sr\" (UID: \"ec2d22f2-c260-42a6-a9da-ee0f44f42303\") " pod="openshift-network-operator/network-operator-7c649bf6d4-st2sr" Mar 08 00:21:25.029897 master-0 kubenswrapper[7479]: I0308 00:21:25.029329 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-bound-sa-token\") pod \"ingress-operator-677db989d6-blw5x\" (UID: \"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x" Mar 08 00:21:25.050147 master-0 kubenswrapper[7479]: I0308 00:21:25.050083 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6999cf38-e317-4727-98c9-d4e348e9e16a-bound-sa-token\") pod 
\"cluster-image-registry-operator-86d6d77c7c-k7dp2\" (UID: \"6999cf38-e317-4727-98c9-d4e348e9e16a\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2" Mar 08 00:21:25.080738 master-0 kubenswrapper[7479]: I0308 00:21:25.080477 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhckc\" (UniqueName: \"kubernetes.io/projected/58333089-2456-4a25-8ba7-6d557eefa177-kube-api-access-hhckc\") pod \"authentication-operator-7c6989d6c4-dkqc4\" (UID: \"58333089-2456-4a25-8ba7-6d557eefa177\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4" Mar 08 00:21:25.095715 master-0 kubenswrapper[7479]: I0308 00:21:25.095665 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chnhh\" (UniqueName: \"kubernetes.io/projected/2b1a69b5-c946-495d-ae02-c56f788279e8-kube-api-access-chnhh\") pod \"openshift-config-operator-64488f9d78-vnl28\" (UID: \"2b1a69b5-c946-495d-ae02-c56f788279e8\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" Mar 08 00:21:25.102002 master-0 kubenswrapper[7479]: I0308 00:21:25.101943 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-multus-socket-dir-parent\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:25.102169 master-0 kubenswrapper[7479]: I0308 00:21:25.102021 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-hostroot\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:25.102169 master-0 kubenswrapper[7479]: I0308 00:21:25.102057 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-etc-openvswitch\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:25.102169 master-0 kubenswrapper[7479]: I0308 00:21:25.102087 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-multus-socket-dir-parent\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:25.102169 master-0 kubenswrapper[7479]: I0308 00:21:25.102092 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-run-ovn\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:25.102169 master-0 kubenswrapper[7479]: I0308 00:21:25.102147 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-run-ovn\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:25.102169 master-0 kubenswrapper[7479]: I0308 00:21:25.102164 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-var-lib-cni-bin\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:25.102545 master-0 kubenswrapper[7479]: I0308 00:21:25.102189 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-cni-bin\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:25.102545 master-0 kubenswrapper[7479]: I0308 00:21:25.102252 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-etc-openvswitch\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:25.102545 master-0 kubenswrapper[7479]: I0308 00:21:25.102288 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-hostroot\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:25.102545 master-0 kubenswrapper[7479]: I0308 00:21:25.102331 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-cnibin\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:25.102545 master-0 kubenswrapper[7479]: I0308 00:21:25.102443 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-etc-kubernetes\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:25.102545 master-0 kubenswrapper[7479]: I0308 00:21:25.102489 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh9cz\" (UniqueName: \"kubernetes.io/projected/1f63cb2f-779f-4fde-bf92-cf0414844a77-kube-api-access-wh9cz\") pod 
\"network-check-target-w5fjg\" (UID: \"1f63cb2f-779f-4fde-bf92-cf0414844a77\") " pod="openshift-network-diagnostics/network-check-target-w5fjg" Mar 08 00:21:25.102545 master-0 kubenswrapper[7479]: I0308 00:21:25.102551 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-run-netns\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:25.102811 master-0 kubenswrapper[7479]: I0308 00:21:25.102598 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-os-release\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:25.102811 master-0 kubenswrapper[7479]: I0308 00:21:25.102630 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:21:25.102811 master-0 kubenswrapper[7479]: I0308 00:21:25.102659 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-systemd-units\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:25.102811 master-0 kubenswrapper[7479]: I0308 00:21:25.102690 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4ad37f40-c533-4a1e-882a-2e0973eff86d-srv-cert\") pod 
\"olm-operator-d64cfc9db-8qtmf\" (UID: \"4ad37f40-c533-4a1e-882a-2e0973eff86d\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8qtmf" Mar 08 00:21:25.102811 master-0 kubenswrapper[7479]: I0308 00:21:25.102724 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-run-k8s-cni-cncf-io\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:25.102811 master-0 kubenswrapper[7479]: I0308 00:21:25.102759 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-mgb5v\" (UID: \"5cf5a2ef-2498-40a0-a189-0753076fd3b6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" Mar 08 00:21:25.102811 master-0 kubenswrapper[7479]: I0308 00:21:25.102790 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0e52cbdc-1d46-4cc9-85ee-535aa449992f-host-slash\") pod \"iptables-alerter-rfnqf\" (UID: \"0e52cbdc-1d46-4cc9-85ee-535aa449992f\") " pod="openshift-network-operator/iptables-alerter-rfnqf" Mar 08 00:21:25.103076 master-0 kubenswrapper[7479]: I0308 00:21:25.102834 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-run-openvswitch\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:25.103076 master-0 kubenswrapper[7479]: I0308 00:21:25.102867 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/6d770808-d390-41c1-a9d9-fc12b99fa9a9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-cxs8s\" (UID: \"6d770808-d390-41c1-a9d9-fc12b99fa9a9\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s" Mar 08 00:21:25.103076 master-0 kubenswrapper[7479]: I0308 00:21:25.102899 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9" Mar 08 00:21:25.103076 master-0 kubenswrapper[7479]: I0308 00:21:25.102929 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-var-lib-openvswitch\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:25.103076 master-0 kubenswrapper[7479]: I0308 00:21:25.102970 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-log-socket\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:25.103076 master-0 kubenswrapper[7479]: I0308 00:21:25.103010 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-var-lib-kubelet\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:25.103076 master-0 kubenswrapper[7479]: I0308 
00:21:25.103040 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-run-ovn-kubernetes\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:25.103387 master-0 kubenswrapper[7479]: I0308 00:21:25.103088 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-run-multus-certs\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:25.103387 master-0 kubenswrapper[7479]: I0308 00:21:25.103122 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-system-cni-dir\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:25.103387 master-0 kubenswrapper[7479]: I0308 00:21:25.103151 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-webhook-certs\") pod \"multus-admission-controller-8d675b596-jgdmb\" (UID: \"d7a0bdcc-92f5-41e6-ab47-ee48a5788bac\") " pod="openshift-multus/multus-admission-controller-8d675b596-jgdmb" Mar 08 00:21:25.103387 master-0 kubenswrapper[7479]: I0308 00:21:25.103179 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-run-systemd\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:25.103387 master-0 kubenswrapper[7479]: 
I0308 00:21:25.103238 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-cni-netd\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:25.103387 master-0 kubenswrapper[7479]: I0308 00:21:25.103271 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs\") pod \"network-metrics-daemon-krv7c\" (UID: \"815fd565-0609-4d8f-ac05-8656f198b008\") " pod="openshift-multus/network-metrics-daemon-krv7c" Mar 08 00:21:25.103387 master-0 kubenswrapper[7479]: I0308 00:21:25.103302 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-os-release\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:21:25.103656 master-0 kubenswrapper[7479]: I0308 00:21:25.103406 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-multus-cni-dir\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:25.103656 master-0 kubenswrapper[7479]: I0308 00:21:25.103437 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b94acad3-cf4e-443d-80fb-5e68a4074336-srv-cert\") pod \"catalog-operator-7d9c49f57b-8jr6f\" (UID: \"b94acad3-cf4e-443d-80fb-5e68a4074336\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-8jr6f" Mar 08 00:21:25.103656 master-0 
kubenswrapper[7479]: I0308 00:21:25.103468 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-cnibin\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:21:25.103656 master-0 kubenswrapper[7479]: I0308 00:21:25.103497 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-slash\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:25.103656 master-0 kubenswrapper[7479]: I0308 00:21:25.103527 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:25.103656 master-0 kubenswrapper[7479]: I0308 00:21:25.103579 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-run-netns\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:25.103656 master-0 kubenswrapper[7479]: I0308 00:21:25.103608 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-kubelet\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 
00:21:25.103656 master-0 kubenswrapper[7479]: I0308 00:21:25.103648 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-phgxj\" (UID: \"8f71fd39-a16b-47d2-b781-c8ce37bcb9b2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj" Mar 08 00:21:25.104012 master-0 kubenswrapper[7479]: I0308 00:21:25.103682 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-node-log\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:25.104012 master-0 kubenswrapper[7479]: I0308 00:21:25.103724 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-var-lib-cni-multus\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:25.104012 master-0 kubenswrapper[7479]: I0308 00:21:25.103759 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-system-cni-dir\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:21:25.104012 master-0 kubenswrapper[7479]: I0308 00:21:25.103791 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-multus-conf-dir\") pod \"multus-dllkj\" (UID: 
\"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:25.104012 master-0 kubenswrapper[7479]: I0308 00:21:25.103925 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9" Mar 08 00:21:25.104257 master-0 kubenswrapper[7479]: E0308 00:21:25.104082 7479 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 08 00:21:25.104257 master-0 kubenswrapper[7479]: E0308 00:21:25.104146 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-node-tuning-operator-tls podName:1abf904b-0b8d-4d61-8231-0e8d00933192 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:25.604125014 +0000 UTC m=+1.917033961 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-9vjl9" (UID: "1abf904b-0b8d-4d61-8231-0e8d00933192") : secret "node-tuning-operator-tls" not found Mar 08 00:21:25.104257 master-0 kubenswrapper[7479]: I0308 00:21:25.104195 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-cni-bin\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:25.104399 master-0 kubenswrapper[7479]: I0308 00:21:25.104292 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-var-lib-cni-bin\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:25.104450 master-0 kubenswrapper[7479]: I0308 00:21:25.104394 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-etc-kubernetes\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:25.104513 master-0 kubenswrapper[7479]: E0308 00:21:25.104481 7479 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 08 00:21:25.104574 master-0 kubenswrapper[7479]: I0308 00:21:25.104537 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-run-netns\") pod \"ovnkube-node-2w9mf\" (UID: 
\"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:25.104574 master-0 kubenswrapper[7479]: E0308 00:21:25.104542 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ad37f40-c533-4a1e-882a-2e0973eff86d-srv-cert podName:4ad37f40-c533-4a1e-882a-2e0973eff86d nodeName:}" failed. No retries permitted until 2026-03-08 00:21:25.604524906 +0000 UTC m=+1.917433923 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/4ad37f40-c533-4a1e-882a-2e0973eff86d-srv-cert") pod "olm-operator-d64cfc9db-8qtmf" (UID: "4ad37f40-c533-4a1e-882a-2e0973eff86d") : secret "olm-operator-serving-cert" not found Mar 08 00:21:25.104574 master-0 kubenswrapper[7479]: I0308 00:21:25.104473 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0e52cbdc-1d46-4cc9-85ee-535aa449992f-host-slash\") pod \"iptables-alerter-rfnqf\" (UID: \"0e52cbdc-1d46-4cc9-85ee-535aa449992f\") " pod="openshift-network-operator/iptables-alerter-rfnqf" Mar 08 00:21:25.104574 master-0 kubenswrapper[7479]: E0308 00:21:25.104564 7479 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 08 00:21:25.104754 master-0 kubenswrapper[7479]: I0308 00:21:25.104599 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-cnibin\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:25.104754 master-0 kubenswrapper[7479]: E0308 00:21:25.104614 7479 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 08 00:21:25.104754 master-0 kubenswrapper[7479]: I0308 00:21:25.104625 
7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-log-socket\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:25.104754 master-0 kubenswrapper[7479]: E0308 00:21:25.104636 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d770808-d390-41c1-a9d9-fc12b99fa9a9-cluster-monitoring-operator-tls podName:6d770808-d390-41c1-a9d9-fc12b99fa9a9 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:25.604616199 +0000 UTC m=+1.917525116 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6d770808-d390-41c1-a9d9-fc12b99fa9a9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-cxs8s" (UID: "6d770808-d390-41c1-a9d9-fc12b99fa9a9") : secret "cluster-monitoring-operator-tls" not found Mar 08 00:21:25.104754 master-0 kubenswrapper[7479]: I0308 00:21:25.104643 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-system-cni-dir\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:25.104754 master-0 kubenswrapper[7479]: E0308 00:21:25.104673 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-webhook-certs podName:d7a0bdcc-92f5-41e6-ab47-ee48a5788bac nodeName:}" failed. No retries permitted until 2026-03-08 00:21:25.6046576 +0000 UTC m=+1.917566547 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-webhook-certs") pod "multus-admission-controller-8d675b596-jgdmb" (UID: "d7a0bdcc-92f5-41e6-ab47-ee48a5788bac") : secret "multus-admission-controller-secret" not found Mar 08 00:21:25.104754 master-0 kubenswrapper[7479]: I0308 00:21:25.104681 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-var-lib-openvswitch\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:25.104754 master-0 kubenswrapper[7479]: I0308 00:21:25.104711 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-systemd-units\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:25.104754 master-0 kubenswrapper[7479]: E0308 00:21:25.104763 7479 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 08 00:21:25.105163 master-0 kubenswrapper[7479]: E0308 00:21:25.104824 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-apiservice-cert podName:1abf904b-0b8d-4d61-8231-0e8d00933192 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:25.604785554 +0000 UTC m=+1.917694601 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-9vjl9" (UID: "1abf904b-0b8d-4d61-8231-0e8d00933192") : secret "performance-addon-operator-webhook-cert" not found Mar 08 00:21:25.105163 master-0 kubenswrapper[7479]: I0308 00:21:25.104879 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-os-release\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:21:25.105163 master-0 kubenswrapper[7479]: I0308 00:21:25.104966 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-multus-cni-dir\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:25.105163 master-0 kubenswrapper[7479]: E0308 00:21:25.105003 7479 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 08 00:21:25.105163 master-0 kubenswrapper[7479]: E0308 00:21:25.105039 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b94acad3-cf4e-443d-80fb-5e68a4074336-srv-cert podName:b94acad3-cf4e-443d-80fb-5e68a4074336 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:25.605029432 +0000 UTC m=+1.917938349 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/b94acad3-cf4e-443d-80fb-5e68a4074336-srv-cert") pod "catalog-operator-7d9c49f57b-8jr6f" (UID: "b94acad3-cf4e-443d-80fb-5e68a4074336") : secret "catalog-operator-serving-cert" not found Mar 08 00:21:25.105163 master-0 kubenswrapper[7479]: I0308 00:21:25.105077 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-cnibin\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:21:25.105163 master-0 kubenswrapper[7479]: I0308 00:21:25.105113 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-slash\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:25.105163 master-0 kubenswrapper[7479]: I0308 00:21:25.105150 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:25.105524 master-0 kubenswrapper[7479]: I0308 00:21:25.105184 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-run-netns\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:25.105524 master-0 kubenswrapper[7479]: I0308 00:21:25.105239 7479 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-kubelet\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:25.105524 master-0 kubenswrapper[7479]: E0308 00:21:25.105298 7479 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 08 00:21:25.105524 master-0 kubenswrapper[7479]: E0308 00:21:25.105328 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-package-server-manager-serving-cert podName:8f71fd39-a16b-47d2-b781-c8ce37bcb9b2 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:25.605318621 +0000 UTC m=+1.918227668 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-phgxj" (UID: "8f71fd39-a16b-47d2-b781-c8ce37bcb9b2") : secret "package-server-manager-serving-cert" not found Mar 08 00:21:25.105524 master-0 kubenswrapper[7479]: I0308 00:21:25.105358 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-node-log\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:25.105524 master-0 kubenswrapper[7479]: I0308 00:21:25.105390 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-var-lib-cni-multus\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " 
pod="openshift-multus/multus-dllkj" Mar 08 00:21:25.105524 master-0 kubenswrapper[7479]: I0308 00:21:25.105420 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-system-cni-dir\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:21:25.105524 master-0 kubenswrapper[7479]: E0308 00:21:25.105474 7479 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 08 00:21:25.105524 master-0 kubenswrapper[7479]: E0308 00:21:25.105500 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-operator-metrics podName:5cf5a2ef-2498-40a0-a189-0753076fd3b6 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:25.605491116 +0000 UTC m=+1.918400033 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-mgb5v" (UID: "5cf5a2ef-2498-40a0-a189-0753076fd3b6") : secret "marketplace-operator-metrics" not found Mar 08 00:21:25.105849 master-0 kubenswrapper[7479]: I0308 00:21:25.105621 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-os-release\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:25.105849 master-0 kubenswrapper[7479]: I0308 00:21:25.105662 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:21:25.105849 master-0 kubenswrapper[7479]: I0308 00:21:25.105688 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-run-multus-certs\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:25.105849 master-0 kubenswrapper[7479]: I0308 00:21:25.105733 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-run-openvswitch\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:25.105849 master-0 kubenswrapper[7479]: I0308 00:21:25.105749 7479 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-run-k8s-cni-cncf-io\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:25.105849 master-0 kubenswrapper[7479]: E0308 00:21:25.105786 7479 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 08 00:21:25.105849 master-0 kubenswrapper[7479]: E0308 00:21:25.105809 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs podName:815fd565-0609-4d8f-ac05-8656f198b008 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:25.605801235 +0000 UTC m=+1.918710152 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs") pod "network-metrics-daemon-krv7c" (UID: "815fd565-0609-4d8f-ac05-8656f198b008") : secret "metrics-daemon-secret" not found Mar 08 00:21:25.105849 master-0 kubenswrapper[7479]: I0308 00:21:25.105823 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-run-ovn-kubernetes\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:25.105849 master-0 kubenswrapper[7479]: I0308 00:21:25.105846 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-cni-netd\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:25.106947 master-0 kubenswrapper[7479]: I0308 00:21:25.105866 7479 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-var-lib-kubelet\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:25.106947 master-0 kubenswrapper[7479]: I0308 00:21:25.105887 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-run-systemd\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:25.106947 master-0 kubenswrapper[7479]: I0308 00:21:25.106542 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-multus-conf-dir\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:25.109410 master-0 kubenswrapper[7479]: I0308 00:21:25.109379 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fstf\" (UniqueName: \"kubernetes.io/projected/ef0a3c84-98bb-4915-9010-d66fcbeafe09-kube-api-access-8fstf\") pod \"openshift-controller-manager-operator-8565d84698-49hzm\" (UID: \"ef0a3c84-98bb-4915-9010-d66fcbeafe09\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-49hzm" Mar 08 00:21:25.134422 master-0 kubenswrapper[7479]: I0308 00:21:25.134076 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwsqr\" (UniqueName: \"kubernetes.io/projected/6999cf38-e317-4727-98c9-d4e348e9e16a-kube-api-access-pwsqr\") pod \"cluster-image-registry-operator-86d6d77c7c-k7dp2\" (UID: \"6999cf38-e317-4727-98c9-d4e348e9e16a\") " 
pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2" Mar 08 00:21:25.149277 master-0 kubenswrapper[7479]: I0308 00:21:25.149237 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5knc\" (UniqueName: \"kubernetes.io/projected/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-kube-api-access-d5knc\") pod \"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk" Mar 08 00:21:25.152042 master-0 kubenswrapper[7479]: I0308 00:21:25.152018 7479 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 00:21:25.185187 master-0 kubenswrapper[7479]: I0308 00:21:25.182103 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44jml\" (UniqueName: \"kubernetes.io/projected/3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab-kube-api-access-44jml\") pod \"openshift-apiserver-operator-799b6db4d7-rj9cl\" (UID: \"3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-rj9cl" Mar 08 00:21:25.201049 master-0 kubenswrapper[7479]: I0308 00:21:25.200959 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/365dc4ac-fbc8-4589-a799-8327b3ebd0a5-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-pfdrx\" (UID: \"365dc4ac-fbc8-4589-a799-8327b3ebd0a5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-pfdrx" Mar 08 00:21:25.219917 master-0 kubenswrapper[7479]: I0308 00:21:25.219739 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 08 00:21:25.226859 master-0 kubenswrapper[7479]: I0308 00:21:25.226815 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/0e52cbdc-1d46-4cc9-85ee-535aa449992f-iptables-alerter-script\") pod \"iptables-alerter-rfnqf\" (UID: \"0e52cbdc-1d46-4cc9-85ee-535aa449992f\") " pod="openshift-network-operator/iptables-alerter-rfnqf" Mar 08 00:21:25.238619 master-0 kubenswrapper[7479]: I0308 00:21:25.238591 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 08 00:21:25.242410 master-0 kubenswrapper[7479]: I0308 00:21:25.242381 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-ovnkube-script-lib\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:25.259378 master-0 kubenswrapper[7479]: I0308 00:21:25.259338 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 08 00:21:25.264489 master-0 kubenswrapper[7479]: I0308 00:21:25.264455 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-ovn-node-metrics-cert\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:25.280472 master-0 kubenswrapper[7479]: I0308 00:21:25.280414 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 08 00:21:25.314556 master-0 kubenswrapper[7479]: E0308 00:21:25.314516 7479 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-scheduler-master-0\" already exists" pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 08 00:21:25.324809 master-0 kubenswrapper[7479]: E0308 00:21:25.324640 7479 kubelet.go:1929] "Failed creating a mirror pod for" err="pods 
\"bootstrap-kube-controller-manager-master-0\" already exists" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:21:25.346691 master-0 kubenswrapper[7479]: E0308 00:21:25.346650 7479 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 08 00:21:25.371533 master-0 kubenswrapper[7479]: I0308 00:21:25.371505 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkl4m\" (UniqueName: \"kubernetes.io/projected/af391724-079a-4bac-a89e-978ffd471763-kube-api-access-gkl4m\") pod \"network-node-identity-m7549\" (UID: \"af391724-079a-4bac-a89e-978ffd471763\") " pod="openshift-network-node-identity/network-node-identity-m7549" Mar 08 00:21:25.389719 master-0 kubenswrapper[7479]: I0308 00:21:25.389680 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxk5x\" (UniqueName: \"kubernetes.io/projected/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-kube-api-access-bxk5x\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:21:25.411356 master-0 kubenswrapper[7479]: I0308 00:21:25.411316 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k88m9\" (UniqueName: \"kubernetes.io/projected/5cf5a2ef-2498-40a0-a189-0753076fd3b6-kube-api-access-k88m9\") pod \"marketplace-operator-64bf9778cb-mgb5v\" (UID: \"5cf5a2ef-2498-40a0-a189-0753076fd3b6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" Mar 08 00:21:25.433006 master-0 kubenswrapper[7479]: I0308 00:21:25.432944 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q6hn\" (UniqueName: \"kubernetes.io/projected/c1abfb79-2c86-4ccb-bf91-7c48ad8c78d8-kube-api-access-5q6hn\") pod 
\"csi-snapshot-controller-operator-5685fbc7d-5v8g4\" (UID: \"c1abfb79-2c86-4ccb-bf91-7c48ad8c78d8\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-5v8g4" Mar 08 00:21:25.456301 master-0 kubenswrapper[7479]: I0308 00:21:25.456265 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rfqt\" (UniqueName: \"kubernetes.io/projected/6d770808-d390-41c1-a9d9-fc12b99fa9a9-kube-api-access-6rfqt\") pod \"cluster-monitoring-operator-674cbfbd9d-cxs8s\" (UID: \"6d770808-d390-41c1-a9d9-fc12b99fa9a9\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s" Mar 08 00:21:25.467135 master-0 kubenswrapper[7479]: I0308 00:21:25.467075 7479 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 00:21:25.471922 master-0 kubenswrapper[7479]: I0308 00:21:25.471881 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbdd4\" (UniqueName: \"kubernetes.io/projected/1abf904b-0b8d-4d61-8231-0e8d00933192-kube-api-access-dbdd4\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9" Mar 08 00:21:25.476438 master-0 kubenswrapper[7479]: I0308 00:21:25.476413 7479 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 00:21:25.498272 master-0 kubenswrapper[7479]: I0308 00:21:25.498232 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh6nz\" (UniqueName: \"kubernetes.io/projected/815fd565-0609-4d8f-ac05-8656f198b008-kube-api-access-sh6nz\") pod \"network-metrics-daemon-krv7c\" (UID: \"815fd565-0609-4d8f-ac05-8656f198b008\") " pod="openshift-multus/network-metrics-daemon-krv7c" Mar 08 00:21:25.512732 master-0 
kubenswrapper[7479]: I0308 00:21:25.510588 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-metrics-tls\") pod \"ingress-operator-677db989d6-blw5x\" (UID: \"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x" Mar 08 00:21:25.512732 master-0 kubenswrapper[7479]: E0308 00:21:25.510718 7479 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 08 00:21:25.512732 master-0 kubenswrapper[7479]: I0308 00:21:25.510769 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert\") pod \"cluster-version-operator-745944c6b7-dcbvq\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq" Mar 08 00:21:25.512732 master-0 kubenswrapper[7479]: E0308 00:21:25.510784 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-metrics-tls podName:4d0b9fbc-a1f8-4a98-99de-758734bd1a5b nodeName:}" failed. No retries permitted until 2026-03-08 00:21:26.510766384 +0000 UTC m=+2.823675311 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-metrics-tls") pod "ingress-operator-677db989d6-blw5x" (UID: "4d0b9fbc-a1f8-4a98-99de-758734bd1a5b") : secret "metrics-tls" not found Mar 08 00:21:25.512732 master-0 kubenswrapper[7479]: E0308 00:21:25.510870 7479 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 08 00:21:25.512732 master-0 kubenswrapper[7479]: E0308 00:21:25.510915 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert podName:32c19760-2cb2-4690-be8e-cba3c517c60e nodeName:}" failed. No retries permitted until 2026-03-08 00:21:26.510899768 +0000 UTC m=+2.823808685 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert") pod "cluster-version-operator-745944c6b7-dcbvq" (UID: "32c19760-2cb2-4690-be8e-cba3c517c60e") : secret "cluster-version-operator-serving-cert" not found Mar 08 00:21:25.512732 master-0 kubenswrapper[7479]: I0308 00:21:25.510932 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03f4bafb-c270-428a-bacf-8a424b3d1a05-metrics-tls\") pod \"dns-operator-589895fbb7-gmvnl\" (UID: \"03f4bafb-c270-428a-bacf-8a424b3d1a05\") " pod="openshift-dns-operator/dns-operator-589895fbb7-gmvnl" Mar 08 00:21:25.512732 master-0 kubenswrapper[7479]: I0308 00:21:25.510994 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6999cf38-e317-4727-98c9-d4e348e9e16a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-k7dp2\" (UID: \"6999cf38-e317-4727-98c9-d4e348e9e16a\") " 
pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2" Mar 08 00:21:25.512732 master-0 kubenswrapper[7479]: E0308 00:21:25.511048 7479 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 08 00:21:25.512732 master-0 kubenswrapper[7479]: E0308 00:21:25.511067 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6999cf38-e317-4727-98c9-d4e348e9e16a-image-registry-operator-tls podName:6999cf38-e317-4727-98c9-d4e348e9e16a nodeName:}" failed. No retries permitted until 2026-03-08 00:21:26.511061513 +0000 UTC m=+2.823970430 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/6999cf38-e317-4727-98c9-d4e348e9e16a-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-k7dp2" (UID: "6999cf38-e317-4727-98c9-d4e348e9e16a") : secret "image-registry-operator-tls" not found Mar 08 00:21:25.512732 master-0 kubenswrapper[7479]: E0308 00:21:25.511103 7479 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 08 00:21:25.512732 master-0 kubenswrapper[7479]: E0308 00:21:25.511120 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03f4bafb-c270-428a-bacf-8a424b3d1a05-metrics-tls podName:03f4bafb-c270-428a-bacf-8a424b3d1a05 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:26.511114105 +0000 UTC m=+2.824023022 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/03f4bafb-c270-428a-bacf-8a424b3d1a05-metrics-tls") pod "dns-operator-589895fbb7-gmvnl" (UID: "03f4bafb-c270-428a-bacf-8a424b3d1a05") : secret "metrics-tls" not found Mar 08 00:21:25.514464 master-0 kubenswrapper[7479]: I0308 00:21:25.513931 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjcjb\" (UniqueName: \"kubernetes.io/projected/ac523956-c8a3-4794-a1fa-660cd14966bb-kube-api-access-wjcjb\") pod \"kube-storage-version-migrator-operator-7f65c457f5-st7mk\" (UID: \"ac523956-c8a3-4794-a1fa-660cd14966bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-st7mk" Mar 08 00:21:25.532992 master-0 kubenswrapper[7479]: I0308 00:21:25.532386 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89wj5\" (UniqueName: \"kubernetes.io/projected/db164b32-e20e-4d07-a9ae-98720321621d-kube-api-access-89wj5\") pod \"cluster-olm-operator-77899cf6d-r9zcq\" (UID: \"db164b32-e20e-4d07-a9ae-98720321621d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-r9zcq" Mar 08 00:21:25.550759 master-0 kubenswrapper[7479]: I0308 00:21:25.550715 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg5d9\" (UniqueName: \"kubernetes.io/projected/7da68e85-9170-499d-8050-139ecfac4600-kube-api-access-bg5d9\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:21:25.572017 master-0 kubenswrapper[7479]: I0308 00:21:25.571734 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntks9\" (UniqueName: \"kubernetes.io/projected/3fee96d7-75a7-46e4-9707-7bd292f10b84-kube-api-access-ntks9\") pod \"ovnkube-control-plane-66b55d57d-m77x2\" (UID: \"3fee96d7-75a7-46e4-9707-7bd292f10b84\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2" Mar 08 00:21:25.592930 master-0 kubenswrapper[7479]: I0308 00:21:25.592890 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e76bc134-2a88-4f92-9aa7-f6854941b98f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-bh886\" (UID: \"e76bc134-2a88-4f92-9aa7-f6854941b98f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-bh886" Mar 08 00:21:25.611561 master-0 kubenswrapper[7479]: I0308 00:21:25.611513 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b94acad3-cf4e-443d-80fb-5e68a4074336-srv-cert\") pod \"catalog-operator-7d9c49f57b-8jr6f\" (UID: \"b94acad3-cf4e-443d-80fb-5e68a4074336\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-8jr6f" Mar 08 00:21:25.611680 master-0 kubenswrapper[7479]: I0308 00:21:25.611600 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-phgxj\" (UID: \"8f71fd39-a16b-47d2-b781-c8ce37bcb9b2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj" Mar 08 00:21:25.611680 master-0 kubenswrapper[7479]: I0308 00:21:25.611661 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9" Mar 08 00:21:25.611772 master-0 kubenswrapper[7479]: E0308 00:21:25.611679 
7479 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 08 00:21:25.611772 master-0 kubenswrapper[7479]: E0308 00:21:25.611746 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b94acad3-cf4e-443d-80fb-5e68a4074336-srv-cert podName:b94acad3-cf4e-443d-80fb-5e68a4074336 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:26.611725313 +0000 UTC m=+2.924634290 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/b94acad3-cf4e-443d-80fb-5e68a4074336-srv-cert") pod "catalog-operator-7d9c49f57b-8jr6f" (UID: "b94acad3-cf4e-443d-80fb-5e68a4074336") : secret "catalog-operator-serving-cert" not found Mar 08 00:21:25.611871 master-0 kubenswrapper[7479]: I0308 00:21:25.611834 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4ad37f40-c533-4a1e-882a-2e0973eff86d-srv-cert\") pod \"olm-operator-d64cfc9db-8qtmf\" (UID: \"4ad37f40-c533-4a1e-882a-2e0973eff86d\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8qtmf" Mar 08 00:21:25.611871 master-0 kubenswrapper[7479]: I0308 00:21:25.611854 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-mgb5v\" (UID: \"5cf5a2ef-2498-40a0-a189-0753076fd3b6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" Mar 08 00:21:25.611953 master-0 kubenswrapper[7479]: E0308 00:21:25.611882 7479 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 08 00:21:25.611994 master-0 kubenswrapper[7479]: E0308 00:21:25.611961 
7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-package-server-manager-serving-cert podName:8f71fd39-a16b-47d2-b781-c8ce37bcb9b2 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:26.611938259 +0000 UTC m=+2.924847246 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-phgxj" (UID: "8f71fd39-a16b-47d2-b781-c8ce37bcb9b2") : secret "package-server-manager-serving-cert" not found Mar 08 00:21:25.611994 master-0 kubenswrapper[7479]: I0308 00:21:25.611878 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6d770808-d390-41c1-a9d9-fc12b99fa9a9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-cxs8s\" (UID: \"6d770808-d390-41c1-a9d9-fc12b99fa9a9\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s" Mar 08 00:21:25.612084 master-0 kubenswrapper[7479]: E0308 00:21:25.612049 7479 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 08 00:21:25.612084 master-0 kubenswrapper[7479]: E0308 00:21:25.612079 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-operator-metrics podName:5cf5a2ef-2498-40a0-a189-0753076fd3b6 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:26.612070263 +0000 UTC m=+2.924979180 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-mgb5v" (UID: "5cf5a2ef-2498-40a0-a189-0753076fd3b6") : secret "marketplace-operator-metrics" not found Mar 08 00:21:25.612191 master-0 kubenswrapper[7479]: E0308 00:21:25.612164 7479 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 08 00:21:25.612256 master-0 kubenswrapper[7479]: E0308 00:21:25.612230 7479 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 08 00:21:25.612256 master-0 kubenswrapper[7479]: E0308 00:21:25.612246 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d770808-d390-41c1-a9d9-fc12b99fa9a9-cluster-monitoring-operator-tls podName:6d770808-d390-41c1-a9d9-fc12b99fa9a9 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:26.612231048 +0000 UTC m=+2.925139965 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6d770808-d390-41c1-a9d9-fc12b99fa9a9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-cxs8s" (UID: "6d770808-d390-41c1-a9d9-fc12b99fa9a9") : secret "cluster-monitoring-operator-tls" not found Mar 08 00:21:25.612341 master-0 kubenswrapper[7479]: E0308 00:21:25.612261 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ad37f40-c533-4a1e-882a-2e0973eff86d-srv-cert podName:4ad37f40-c533-4a1e-882a-2e0973eff86d nodeName:}" failed. No retries permitted until 2026-03-08 00:21:26.612255779 +0000 UTC m=+2.925164696 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/4ad37f40-c533-4a1e-882a-2e0973eff86d-srv-cert") pod "olm-operator-d64cfc9db-8qtmf" (UID: "4ad37f40-c533-4a1e-882a-2e0973eff86d") : secret "olm-operator-serving-cert" not found Mar 08 00:21:25.612341 master-0 kubenswrapper[7479]: I0308 00:21:25.612288 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9" Mar 08 00:21:25.612341 master-0 kubenswrapper[7479]: E0308 00:21:25.612330 7479 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 08 00:21:25.612454 master-0 kubenswrapper[7479]: I0308 00:21:25.612342 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-webhook-certs\") pod \"multus-admission-controller-8d675b596-jgdmb\" (UID: \"d7a0bdcc-92f5-41e6-ab47-ee48a5788bac\") " pod="openshift-multus/multus-admission-controller-8d675b596-jgdmb" Mar 08 00:21:25.612454 master-0 kubenswrapper[7479]: E0308 00:21:25.612354 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-node-tuning-operator-tls podName:1abf904b-0b8d-4d61-8231-0e8d00933192 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:26.612348202 +0000 UTC m=+2.925257119 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-9vjl9" (UID: "1abf904b-0b8d-4d61-8231-0e8d00933192") : secret "node-tuning-operator-tls" not found Mar 08 00:21:25.612454 master-0 kubenswrapper[7479]: E0308 00:21:25.612392 7479 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 08 00:21:25.612454 master-0 kubenswrapper[7479]: E0308 00:21:25.612415 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-webhook-certs podName:d7a0bdcc-92f5-41e6-ab47-ee48a5788bac nodeName:}" failed. No retries permitted until 2026-03-08 00:21:26.612407204 +0000 UTC m=+2.925316111 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-webhook-certs") pod "multus-admission-controller-8d675b596-jgdmb" (UID: "d7a0bdcc-92f5-41e6-ab47-ee48a5788bac") : secret "multus-admission-controller-secret" not found Mar 08 00:21:25.612454 master-0 kubenswrapper[7479]: I0308 00:21:25.612390 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs\") pod \"network-metrics-daemon-krv7c\" (UID: \"815fd565-0609-4d8f-ac05-8656f198b008\") " pod="openshift-multus/network-metrics-daemon-krv7c" Mar 08 00:21:25.612454 master-0 kubenswrapper[7479]: E0308 00:21:25.612422 7479 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 08 00:21:25.612454 master-0 kubenswrapper[7479]: E0308 00:21:25.612447 7479 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs podName:815fd565-0609-4d8f-ac05-8656f198b008 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:26.612439755 +0000 UTC m=+2.925348672 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs") pod "network-metrics-daemon-krv7c" (UID: "815fd565-0609-4d8f-ac05-8656f198b008") : secret "metrics-daemon-secret" not found Mar 08 00:21:25.612709 master-0 kubenswrapper[7479]: E0308 00:21:25.612504 7479 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 08 00:21:25.612709 master-0 kubenswrapper[7479]: E0308 00:21:25.612526 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-apiservice-cert podName:1abf904b-0b8d-4d61-8231-0e8d00933192 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:26.612519367 +0000 UTC m=+2.925428284 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-9vjl9" (UID: "1abf904b-0b8d-4d61-8231-0e8d00933192") : secret "performance-addon-operator-webhook-cert" not found Mar 08 00:21:25.626367 master-0 kubenswrapper[7479]: I0308 00:21:25.626304 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wrq9\" (UniqueName: \"kubernetes.io/projected/4ad37f40-c533-4a1e-882a-2e0973eff86d-kube-api-access-6wrq9\") pod \"olm-operator-d64cfc9db-8qtmf\" (UID: \"4ad37f40-c533-4a1e-882a-2e0973eff86d\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8qtmf" Mar 08 00:21:25.633640 master-0 kubenswrapper[7479]: I0308 00:21:25.633595 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tml5\" (UniqueName: \"kubernetes.io/projected/b94acad3-cf4e-443d-80fb-5e68a4074336-kube-api-access-7tml5\") pod \"catalog-operator-7d9c49f57b-8jr6f\" (UID: \"b94acad3-cf4e-443d-80fb-5e68a4074336\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-8jr6f" Mar 08 00:21:25.651849 master-0 kubenswrapper[7479]: I0308 00:21:25.651794 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qpkj\" (UniqueName: \"kubernetes.io/projected/c2ce2ea7-bd25-4294-8f3a-11ce53577830-kube-api-access-9qpkj\") pod \"service-ca-operator-69b6fc6b88-p8hlq\" (UID: \"c2ce2ea7-bd25-4294-8f3a-11ce53577830\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-p8hlq" Mar 08 00:21:25.670727 master-0 kubenswrapper[7479]: I0308 00:21:25.670664 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s99rr\" (UniqueName: \"kubernetes.io/projected/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-kube-api-access-s99rr\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:25.704418 master-0 kubenswrapper[7479]: I0308 00:21:25.704368 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fljc9\" (UniqueName: \"kubernetes.io/projected/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-kube-api-access-fljc9\") pod \"multus-admission-controller-8d675b596-jgdmb\" (UID: \"d7a0bdcc-92f5-41e6-ab47-ee48a5788bac\") " pod="openshift-multus/multus-admission-controller-8d675b596-jgdmb" Mar 08 00:21:25.725750 master-0 kubenswrapper[7479]: I0308 00:21:25.725713 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f9kl\" (UniqueName: \"kubernetes.io/projected/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-kube-api-access-2f9kl\") pod \"package-server-manager-854648ff6d-phgxj\" (UID: \"8f71fd39-a16b-47d2-b781-c8ce37bcb9b2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj" Mar 08 00:21:25.736666 master-0 kubenswrapper[7479]: I0308 00:21:25.735420 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqkqn\" (UniqueName: \"kubernetes.io/projected/0e52cbdc-1d46-4cc9-85ee-535aa449992f-kube-api-access-xqkqn\") pod \"iptables-alerter-rfnqf\" (UID: \"0e52cbdc-1d46-4cc9-85ee-535aa449992f\") " pod="openshift-network-operator/iptables-alerter-rfnqf" Mar 08 00:21:25.750561 master-0 kubenswrapper[7479]: I0308 00:21:25.750512 7479 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Mar 08 00:21:25.756122 master-0 kubenswrapper[7479]: I0308 00:21:25.756081 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh9cz\" (UniqueName: \"kubernetes.io/projected/1f63cb2f-779f-4fde-bf92-cf0414844a77-kube-api-access-wh9cz\") pod \"network-check-target-w5fjg\" (UID: \"1f63cb2f-779f-4fde-bf92-cf0414844a77\") " pod="openshift-network-diagnostics/network-check-target-w5fjg"
Mar 08 00:21:25.791909 master-0 kubenswrapper[7479]: I0308 00:21:25.791886 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w5fjg"
Mar 08 00:21:25.948903 master-0 kubenswrapper[7479]: I0308 00:21:25.948555 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk" event={"ID":"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f","Type":"ContainerStarted","Data":"4ce369a140420a6c03e974e6eff3c092d5ec9b95e895b002c78c7a3f070c22b2"}
Mar 08 00:21:25.951411 master-0 kubenswrapper[7479]: I0308 00:21:25.951022 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4" event={"ID":"58333089-2456-4a25-8ba7-6d557eefa177","Type":"ContainerStarted","Data":"00aa20318a390dc28a1b90d9dfa760b9b264408ce2a090ec0af81099188274b0"}
Mar 08 00:21:25.952618 master-0 kubenswrapper[7479]: I0308 00:21:25.952581 7479 generic.go:334] "Generic (PLEG): container finished" podID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerID="a8112b99efb51a20fdb91fac566b95eaf004df0ff11f9408140898bfa467ea7c" exitCode=0
Mar 08 00:21:25.952720 master-0 kubenswrapper[7479]: I0308 00:21:25.952625 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" event={"ID":"2b1a69b5-c946-495d-ae02-c56f788279e8","Type":"ContainerDied","Data":"a8112b99efb51a20fdb91fac566b95eaf004df0ff11f9408140898bfa467ea7c"}
Mar 08 00:21:25.954546 master-0 kubenswrapper[7479]: I0308 00:21:25.954514 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-49hzm" event={"ID":"ef0a3c84-98bb-4915-9010-d66fcbeafe09","Type":"ContainerStarted","Data":"ba0bd870ef36ff11021b6ac2e87095fcc7b137992295cf86faa86e55d1530ce8"}
Mar 08 00:21:25.958720 master-0 kubenswrapper[7479]: I0308 00:21:25.957616 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-5v8g4" event={"ID":"c1abfb79-2c86-4ccb-bf91-7c48ad8c78d8","Type":"ContainerStarted","Data":"ba271e81a6fd420c562722e45c96eb9a2bb2cadcb564df2912b43989b4296570"}
Mar 08 00:21:26.038393 master-0 kubenswrapper[7479]: I0308 00:21:26.024518 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-w5fjg"]
Mar 08 00:21:26.529829 master-0 kubenswrapper[7479]: I0308 00:21:26.529405 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6999cf38-e317-4727-98c9-d4e348e9e16a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-k7dp2\" (UID: \"6999cf38-e317-4727-98c9-d4e348e9e16a\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2"
Mar 08 00:21:26.530022 master-0 kubenswrapper[7479]: I0308 00:21:26.529854 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-metrics-tls\") pod \"ingress-operator-677db989d6-blw5x\" (UID: \"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x"
Mar 08 00:21:26.530022 master-0 kubenswrapper[7479]: E0308 00:21:26.529788 7479 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 08 00:21:26.530022 master-0 kubenswrapper[7479]: E0308 00:21:26.529926 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6999cf38-e317-4727-98c9-d4e348e9e16a-image-registry-operator-tls podName:6999cf38-e317-4727-98c9-d4e348e9e16a nodeName:}" failed. No retries permitted until 2026-03-08 00:21:28.529913365 +0000 UTC m=+4.842822282 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/6999cf38-e317-4727-98c9-d4e348e9e16a-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-k7dp2" (UID: "6999cf38-e317-4727-98c9-d4e348e9e16a") : secret "image-registry-operator-tls" not found
Mar 08 00:21:26.530430 master-0 kubenswrapper[7479]: I0308 00:21:26.530290 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert\") pod \"cluster-version-operator-745944c6b7-dcbvq\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq"
Mar 08 00:21:26.530430 master-0 kubenswrapper[7479]: E0308 00:21:26.530331 7479 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 08 00:21:26.530534 master-0 kubenswrapper[7479]: I0308 00:21:26.530365 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03f4bafb-c270-428a-bacf-8a424b3d1a05-metrics-tls\") pod \"dns-operator-589895fbb7-gmvnl\" (UID: \"03f4bafb-c270-428a-bacf-8a424b3d1a05\") " pod="openshift-dns-operator/dns-operator-589895fbb7-gmvnl"
Mar 08 00:21:26.530534 master-0 kubenswrapper[7479]: E0308 00:21:26.530403 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-metrics-tls podName:4d0b9fbc-a1f8-4a98-99de-758734bd1a5b nodeName:}" failed. No retries permitted until 2026-03-08 00:21:28.530384119 +0000 UTC m=+4.843293036 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-metrics-tls") pod "ingress-operator-677db989d6-blw5x" (UID: "4d0b9fbc-a1f8-4a98-99de-758734bd1a5b") : secret "metrics-tls" not found
Mar 08 00:21:26.530534 master-0 kubenswrapper[7479]: E0308 00:21:26.530424 7479 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 08 00:21:26.530534 master-0 kubenswrapper[7479]: E0308 00:21:26.530508 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert podName:32c19760-2cb2-4690-be8e-cba3c517c60e nodeName:}" failed. No retries permitted until 2026-03-08 00:21:28.530499963 +0000 UTC m=+4.843408880 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert") pod "cluster-version-operator-745944c6b7-dcbvq" (UID: "32c19760-2cb2-4690-be8e-cba3c517c60e") : secret "cluster-version-operator-serving-cert" not found
Mar 08 00:21:26.530654 master-0 kubenswrapper[7479]: E0308 00:21:26.530528 7479 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Mar 08 00:21:26.530654 master-0 kubenswrapper[7479]: E0308 00:21:26.530606 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03f4bafb-c270-428a-bacf-8a424b3d1a05-metrics-tls podName:03f4bafb-c270-428a-bacf-8a424b3d1a05 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:28.530586165 +0000 UTC m=+4.843495082 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/03f4bafb-c270-428a-bacf-8a424b3d1a05-metrics-tls") pod "dns-operator-589895fbb7-gmvnl" (UID: "03f4bafb-c270-428a-bacf-8a424b3d1a05") : secret "metrics-tls" not found
Mar 08 00:21:26.547023 master-0 kubenswrapper[7479]: I0308 00:21:26.546986 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-vd52m"]
Mar 08 00:21:26.547135 master-0 kubenswrapper[7479]: E0308 00:21:26.547116 7479 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4cab26a-fe31-4cf2-a938-b280f1934d99" containerName="assisted-installer-controller"
Mar 08 00:21:26.547135 master-0 kubenswrapper[7479]: I0308 00:21:26.547130 7479 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4cab26a-fe31-4cf2-a938-b280f1934d99" containerName="assisted-installer-controller"
Mar 08 00:21:26.547249 master-0 kubenswrapper[7479]: E0308 00:21:26.547137 7479 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecfff260-be5c-421c-9158-dfd8fa382e4a" containerName="prober"
Mar 08 00:21:26.547249 master-0 kubenswrapper[7479]: I0308 00:21:26.547144 7479 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecfff260-be5c-421c-9158-dfd8fa382e4a" containerName="prober"
Mar 08 00:21:26.547249 master-0 kubenswrapper[7479]: I0308 00:21:26.547230 7479 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecfff260-be5c-421c-9158-dfd8fa382e4a" containerName="prober"
Mar 08 00:21:26.547249 master-0 kubenswrapper[7479]: I0308 00:21:26.547243 7479 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4cab26a-fe31-4cf2-a938-b280f1934d99" containerName="assisted-installer-controller"
Mar 08 00:21:26.547514 master-0 kubenswrapper[7479]: I0308 00:21:26.547495 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-vd52m"
Mar 08 00:21:26.559976 master-0 kubenswrapper[7479]: I0308 00:21:26.559943 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-vd52m"]
Mar 08 00:21:26.634219 master-0 kubenswrapper[7479]: I0308 00:21:26.631724 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6d770808-d390-41c1-a9d9-fc12b99fa9a9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-cxs8s\" (UID: \"6d770808-d390-41c1-a9d9-fc12b99fa9a9\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s"
Mar 08 00:21:26.634219 master-0 kubenswrapper[7479]: I0308 00:21:26.631769 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9"
Mar 08 00:21:26.634219 master-0 kubenswrapper[7479]: I0308 00:21:26.631796 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-webhook-certs\") pod \"multus-admission-controller-8d675b596-jgdmb\" (UID: \"d7a0bdcc-92f5-41e6-ab47-ee48a5788bac\") " pod="openshift-multus/multus-admission-controller-8d675b596-jgdmb"
Mar 08 00:21:26.634219 master-0 kubenswrapper[7479]: I0308 00:21:26.631813 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs\") pod \"network-metrics-daemon-krv7c\" (UID: \"815fd565-0609-4d8f-ac05-8656f198b008\") " pod="openshift-multus/network-metrics-daemon-krv7c"
Mar 08 00:21:26.634219 master-0 kubenswrapper[7479]: I0308 00:21:26.631838 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b94acad3-cf4e-443d-80fb-5e68a4074336-srv-cert\") pod \"catalog-operator-7d9c49f57b-8jr6f\" (UID: \"b94acad3-cf4e-443d-80fb-5e68a4074336\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-8jr6f"
Mar 08 00:21:26.634219 master-0 kubenswrapper[7479]: I0308 00:21:26.631859 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-phgxj\" (UID: \"8f71fd39-a16b-47d2-b781-c8ce37bcb9b2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj"
Mar 08 00:21:26.634219 master-0 kubenswrapper[7479]: I0308 00:21:26.631880 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9"
Mar 08 00:21:26.634219 master-0 kubenswrapper[7479]: I0308 00:21:26.631918 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4ad37f40-c533-4a1e-882a-2e0973eff86d-srv-cert\") pod \"olm-operator-d64cfc9db-8qtmf\" (UID: \"4ad37f40-c533-4a1e-882a-2e0973eff86d\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8qtmf"
Mar 08 00:21:26.634219 master-0 kubenswrapper[7479]: I0308 00:21:26.631937 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-mgb5v\" (UID: \"5cf5a2ef-2498-40a0-a189-0753076fd3b6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v"
Mar 08 00:21:26.634219 master-0 kubenswrapper[7479]: I0308 00:21:26.631959 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv9fl\" (UniqueName: \"kubernetes.io/projected/e97435ee-522e-427d-9efc-40bc3d2b0d02-kube-api-access-bv9fl\") pod \"csi-snapshot-controller-7577d6f48-vd52m\" (UID: \"e97435ee-522e-427d-9efc-40bc3d2b0d02\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-vd52m"
Mar 08 00:21:26.634219 master-0 kubenswrapper[7479]: E0308 00:21:26.632071 7479 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 08 00:21:26.634219 master-0 kubenswrapper[7479]: E0308 00:21:26.632106 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d770808-d390-41c1-a9d9-fc12b99fa9a9-cluster-monitoring-operator-tls podName:6d770808-d390-41c1-a9d9-fc12b99fa9a9 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:28.632094071 +0000 UTC m=+4.945002988 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6d770808-d390-41c1-a9d9-fc12b99fa9a9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-cxs8s" (UID: "6d770808-d390-41c1-a9d9-fc12b99fa9a9") : secret "cluster-monitoring-operator-tls" not found
Mar 08 00:21:26.634219 master-0 kubenswrapper[7479]: E0308 00:21:26.632386 7479 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 08 00:21:26.634219 master-0 kubenswrapper[7479]: E0308 00:21:26.632418 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-apiservice-cert podName:1abf904b-0b8d-4d61-8231-0e8d00933192 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:28.632410451 +0000 UTC m=+4.945319368 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-9vjl9" (UID: "1abf904b-0b8d-4d61-8231-0e8d00933192") : secret "performance-addon-operator-webhook-cert" not found
Mar 08 00:21:26.634219 master-0 kubenswrapper[7479]: E0308 00:21:26.632453 7479 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 08 00:21:26.634219 master-0 kubenswrapper[7479]: E0308 00:21:26.632472 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-webhook-certs podName:d7a0bdcc-92f5-41e6-ab47-ee48a5788bac nodeName:}" failed. No retries permitted until 2026-03-08 00:21:28.632465982 +0000 UTC m=+4.945374899 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-webhook-certs") pod "multus-admission-controller-8d675b596-jgdmb" (UID: "d7a0bdcc-92f5-41e6-ab47-ee48a5788bac") : secret "multus-admission-controller-secret" not found
Mar 08 00:21:26.634219 master-0 kubenswrapper[7479]: E0308 00:21:26.632505 7479 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Mar 08 00:21:26.634219 master-0 kubenswrapper[7479]: E0308 00:21:26.632522 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs podName:815fd565-0609-4d8f-ac05-8656f198b008 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:28.632515234 +0000 UTC m=+4.945424151 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs") pod "network-metrics-daemon-krv7c" (UID: "815fd565-0609-4d8f-ac05-8656f198b008") : secret "metrics-daemon-secret" not found
Mar 08 00:21:26.634219 master-0 kubenswrapper[7479]: E0308 00:21:26.632560 7479 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 08 00:21:26.634219 master-0 kubenswrapper[7479]: E0308 00:21:26.632576 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b94acad3-cf4e-443d-80fb-5e68a4074336-srv-cert podName:b94acad3-cf4e-443d-80fb-5e68a4074336 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:28.632570556 +0000 UTC m=+4.945479473 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/b94acad3-cf4e-443d-80fb-5e68a4074336-srv-cert") pod "catalog-operator-7d9c49f57b-8jr6f" (UID: "b94acad3-cf4e-443d-80fb-5e68a4074336") : secret "catalog-operator-serving-cert" not found
Mar 08 00:21:26.634219 master-0 kubenswrapper[7479]: E0308 00:21:26.632605 7479 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 08 00:21:26.634219 master-0 kubenswrapper[7479]: E0308 00:21:26.632621 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-package-server-manager-serving-cert podName:8f71fd39-a16b-47d2-b781-c8ce37bcb9b2 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:28.632616327 +0000 UTC m=+4.945525234 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-phgxj" (UID: "8f71fd39-a16b-47d2-b781-c8ce37bcb9b2") : secret "package-server-manager-serving-cert" not found
Mar 08 00:21:26.634219 master-0 kubenswrapper[7479]: E0308 00:21:26.632649 7479 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 08 00:21:26.634219 master-0 kubenswrapper[7479]: E0308 00:21:26.632665 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-node-tuning-operator-tls podName:1abf904b-0b8d-4d61-8231-0e8d00933192 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:28.632661008 +0000 UTC m=+4.945569925 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-9vjl9" (UID: "1abf904b-0b8d-4d61-8231-0e8d00933192") : secret "node-tuning-operator-tls" not found
Mar 08 00:21:26.634219 master-0 kubenswrapper[7479]: E0308 00:21:26.632695 7479 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 08 00:21:26.634219 master-0 kubenswrapper[7479]: E0308 00:21:26.632709 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ad37f40-c533-4a1e-882a-2e0973eff86d-srv-cert podName:4ad37f40-c533-4a1e-882a-2e0973eff86d nodeName:}" failed. No retries permitted until 2026-03-08 00:21:28.63270462 +0000 UTC m=+4.945613537 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/4ad37f40-c533-4a1e-882a-2e0973eff86d-srv-cert") pod "olm-operator-d64cfc9db-8qtmf" (UID: "4ad37f40-c533-4a1e-882a-2e0973eff86d") : secret "olm-operator-serving-cert" not found
Mar 08 00:21:26.634219 master-0 kubenswrapper[7479]: E0308 00:21:26.632738 7479 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 08 00:21:26.634219 master-0 kubenswrapper[7479]: E0308 00:21:26.632753 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-operator-metrics podName:5cf5a2ef-2498-40a0-a189-0753076fd3b6 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:28.632747981 +0000 UTC m=+4.945656898 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-mgb5v" (UID: "5cf5a2ef-2498-40a0-a189-0753076fd3b6") : secret "marketplace-operator-metrics" not found
Mar 08 00:21:26.732976 master-0 kubenswrapper[7479]: I0308 00:21:26.732824 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv9fl\" (UniqueName: \"kubernetes.io/projected/e97435ee-522e-427d-9efc-40bc3d2b0d02-kube-api-access-bv9fl\") pod \"csi-snapshot-controller-7577d6f48-vd52m\" (UID: \"e97435ee-522e-427d-9efc-40bc3d2b0d02\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-vd52m"
Mar 08 00:21:26.784627 master-0 kubenswrapper[7479]: I0308 00:21:26.784274 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv9fl\" (UniqueName: \"kubernetes.io/projected/e97435ee-522e-427d-9efc-40bc3d2b0d02-kube-api-access-bv9fl\") pod \"csi-snapshot-controller-7577d6f48-vd52m\" (UID: \"e97435ee-522e-427d-9efc-40bc3d2b0d02\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-vd52m"
Mar 08 00:21:26.862223 master-0 kubenswrapper[7479]: I0308 00:21:26.861507 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-vd52m"
Mar 08 00:21:26.994477 master-0 kubenswrapper[7479]: I0308 00:21:26.994429 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-bh886" event={"ID":"e76bc134-2a88-4f92-9aa7-f6854941b98f","Type":"ContainerStarted","Data":"ad08463ed7ab691e56f4dfe0288960876b6a58370e90937b6cc2efea5e0f4441"}
Mar 08 00:21:27.003948 master-0 kubenswrapper[7479]: I0308 00:21:27.002159 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-p8hlq" event={"ID":"c2ce2ea7-bd25-4294-8f3a-11ce53577830","Type":"ContainerStarted","Data":"8c7c5dbb2587ce1659649afce2da4e5a5c04c0ab193dda1e438bb8ca083926e4"}
Mar 08 00:21:27.012320 master-0 kubenswrapper[7479]: I0308 00:21:27.009415 7479 generic.go:334] "Generic (PLEG): container finished" podID="db164b32-e20e-4d07-a9ae-98720321621d" containerID="1b42fcb0b0ae8c854969b1967188fb3b2c0ac7365173440cfbf5c3f93e5315cf" exitCode=0
Mar 08 00:21:27.012320 master-0 kubenswrapper[7479]: I0308 00:21:27.009480 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-r9zcq" event={"ID":"db164b32-e20e-4d07-a9ae-98720321621d","Type":"ContainerDied","Data":"1b42fcb0b0ae8c854969b1967188fb3b2c0ac7365173440cfbf5c3f93e5315cf"}
Mar 08 00:21:27.012320 master-0 kubenswrapper[7479]: I0308 00:21:27.012096 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-w5fjg" event={"ID":"1f63cb2f-779f-4fde-bf92-cf0414844a77","Type":"ContainerStarted","Data":"36185d93a870a181655e4436861864047a9af33496ef86d20302731ff777317a"}
Mar 08 00:21:27.012320 master-0 kubenswrapper[7479]: I0308 00:21:27.012118 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-w5fjg" event={"ID":"1f63cb2f-779f-4fde-bf92-cf0414844a77","Type":"ContainerStarted","Data":"fd2c01cdd304d39e575ca69d83c243fee0060006da5d42ff4d10f498f54d4b60"}
Mar 08 00:21:27.013587 master-0 kubenswrapper[7479]: I0308 00:21:27.012576 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-w5fjg"
Mar 08 00:21:27.019297 master-0 kubenswrapper[7479]: I0308 00:21:27.017797 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-pfdrx" event={"ID":"365dc4ac-fbc8-4589-a799-8327b3ebd0a5","Type":"ContainerStarted","Data":"08c17f5be4c6cd32671af564801dff89f871520231b6fd523ba49a05d5c50b3c"}
Mar 08 00:21:27.023696 master-0 kubenswrapper[7479]: I0308 00:21:27.022157 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-rj9cl" event={"ID":"3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab","Type":"ContainerStarted","Data":"459a84ed9e1a3d8f522635c123baf95a666dd88b0c40648d94dbbfdfad737d00"}
Mar 08 00:21:27.028087 master-0 kubenswrapper[7479]: I0308 00:21:27.026782 7479 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 08 00:21:27.028087 master-0 kubenswrapper[7479]: I0308 00:21:27.027347 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-st7mk" event={"ID":"ac523956-c8a3-4794-a1fa-660cd14966bb","Type":"ContainerStarted","Data":"322f3ad793e93ca7f32b8558fd2506b5cf8b8be4b12165040ac02501040fbe03"}
Mar 08 00:21:27.154992 master-0 kubenswrapper[7479]: I0308 00:21:27.154400 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-vd52m"]
Mar 08 00:21:27.324869 master-0 kubenswrapper[7479]: I0308 00:21:27.324793 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 00:21:27.713277 master-0 kubenswrapper[7479]: I0308 00:21:27.713224 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6f7fd6c796-tlbts"]
Mar 08 00:21:27.715707 master-0 kubenswrapper[7479]: I0308 00:21:27.713656 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f7fd6c796-tlbts"
Mar 08 00:21:27.716285 master-0 kubenswrapper[7479]: I0308 00:21:27.716253 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 08 00:21:27.716345 master-0 kubenswrapper[7479]: I0308 00:21:27.716301 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 08 00:21:27.716396 master-0 kubenswrapper[7479]: I0308 00:21:27.716389 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 08 00:21:27.717030 master-0 kubenswrapper[7479]: I0308 00:21:27.716459 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 08 00:21:27.717030 master-0 kubenswrapper[7479]: I0308 00:21:27.716524 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 08 00:21:27.717030 master-0 kubenswrapper[7479]: I0308 00:21:27.716744 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 08 00:21:27.730358 master-0 kubenswrapper[7479]: I0308 00:21:27.730313 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6f7fd6c796-tlbts"]
Mar 08 00:21:27.812574 master-0 kubenswrapper[7479]: I0308 00:21:27.812523 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58959cd4d6-d985l"]
Mar 08 00:21:27.812971 master-0 kubenswrapper[7479]: I0308 00:21:27.812943 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-d985l"
Mar 08 00:21:27.815557 master-0 kubenswrapper[7479]: I0308 00:21:27.815515 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 08 00:21:27.815743 master-0 kubenswrapper[7479]: I0308 00:21:27.815711 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 08 00:21:27.815908 master-0 kubenswrapper[7479]: I0308 00:21:27.815892 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 08 00:21:27.816046 master-0 kubenswrapper[7479]: I0308 00:21:27.816032 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 08 00:21:27.816149 master-0 kubenswrapper[7479]: I0308 00:21:27.816136 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 08 00:21:27.826317 master-0 kubenswrapper[7479]: I0308 00:21:27.826256 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58959cd4d6-d985l"]
Mar 08 00:21:27.858052 master-0 kubenswrapper[7479]: I0308 00:21:27.857827 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/13debc51-f16e-4a27-ba79-e2da1e3ed46b-client-ca\") pod \"controller-manager-6f7fd6c796-tlbts\" (UID: \"13debc51-f16e-4a27-ba79-e2da1e3ed46b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-tlbts"
Mar 08 00:21:27.858052 master-0 kubenswrapper[7479]: I0308 00:21:27.857864 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13debc51-f16e-4a27-ba79-e2da1e3ed46b-serving-cert\") pod \"controller-manager-6f7fd6c796-tlbts\" (UID: \"13debc51-f16e-4a27-ba79-e2da1e3ed46b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-tlbts"
Mar 08 00:21:27.858052 master-0 kubenswrapper[7479]: I0308 00:21:27.857898 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07a7947d-7425-4df7-b121-2d23c7868e79-config\") pod \"route-controller-manager-58959cd4d6-d985l\" (UID: \"07a7947d-7425-4df7-b121-2d23c7868e79\") " pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-d985l"
Mar 08 00:21:27.858052 master-0 kubenswrapper[7479]: I0308 00:21:27.857918 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckf5z\" (UniqueName: \"kubernetes.io/projected/13debc51-f16e-4a27-ba79-e2da1e3ed46b-kube-api-access-ckf5z\") pod \"controller-manager-6f7fd6c796-tlbts\" (UID: \"13debc51-f16e-4a27-ba79-e2da1e3ed46b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-tlbts"
Mar 08 00:21:27.858052 master-0 kubenswrapper[7479]: I0308 00:21:27.857941 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07a7947d-7425-4df7-b121-2d23c7868e79-client-ca\") pod \"route-controller-manager-58959cd4d6-d985l\" (UID: \"07a7947d-7425-4df7-b121-2d23c7868e79\") " pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-d985l"
Mar 08 00:21:27.858052 master-0 kubenswrapper[7479]: I0308 00:21:27.858053 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13debc51-f16e-4a27-ba79-e2da1e3ed46b-config\") pod \"controller-manager-6f7fd6c796-tlbts\" (UID: \"13debc51-f16e-4a27-ba79-e2da1e3ed46b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-tlbts"
Mar 08 00:21:27.858470 master-0 kubenswrapper[7479]: I0308 00:21:27.858075 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnl6p\" (UniqueName: \"kubernetes.io/projected/07a7947d-7425-4df7-b121-2d23c7868e79-kube-api-access-mnl6p\") pod \"route-controller-manager-58959cd4d6-d985l\" (UID: \"07a7947d-7425-4df7-b121-2d23c7868e79\") " pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-d985l"
Mar 08 00:21:27.858470 master-0 kubenswrapper[7479]: I0308 00:21:27.858141 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/13debc51-f16e-4a27-ba79-e2da1e3ed46b-proxy-ca-bundles\") pod \"controller-manager-6f7fd6c796-tlbts\" (UID: \"13debc51-f16e-4a27-ba79-e2da1e3ed46b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-tlbts"
Mar 08 00:21:27.858470 master-0 kubenswrapper[7479]: I0308 00:21:27.858162 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07a7947d-7425-4df7-b121-2d23c7868e79-serving-cert\") pod \"route-controller-manager-58959cd4d6-d985l\" (UID: \"07a7947d-7425-4df7-b121-2d23c7868e79\") " pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-d985l"
Mar 08 00:21:27.959064 master-0 kubenswrapper[7479]: I0308 00:21:27.959006 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13debc51-f16e-4a27-ba79-e2da1e3ed46b-config\") pod \"controller-manager-6f7fd6c796-tlbts\" (UID: \"13debc51-f16e-4a27-ba79-e2da1e3ed46b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-tlbts"
Mar 08 00:21:27.959258 master-0 kubenswrapper[7479]: I0308 00:21:27.959078 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnl6p\" (UniqueName: \"kubernetes.io/projected/07a7947d-7425-4df7-b121-2d23c7868e79-kube-api-access-mnl6p\") pod \"route-controller-manager-58959cd4d6-d985l\" (UID: \"07a7947d-7425-4df7-b121-2d23c7868e79\") " pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-d985l"
Mar 08 00:21:27.959258 master-0 kubenswrapper[7479]: E0308 00:21:27.959181 7479 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: configmap "config" not found
Mar 08 00:21:27.959331 master-0 kubenswrapper[7479]: E0308 00:21:27.959299 7479 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: configmap "openshift-global-ca" not found
Mar 08 00:21:27.959469 master-0 kubenswrapper[7479]: E0308 00:21:27.959308 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/13debc51-f16e-4a27-ba79-e2da1e3ed46b-config podName:13debc51-f16e-4a27-ba79-e2da1e3ed46b nodeName:}" failed. No retries permitted until 2026-03-08 00:21:28.459284606 +0000 UTC m=+4.772193573 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/13debc51-f16e-4a27-ba79-e2da1e3ed46b-config") pod "controller-manager-6f7fd6c796-tlbts" (UID: "13debc51-f16e-4a27-ba79-e2da1e3ed46b") : configmap "config" not found
Mar 08 00:21:27.959469 master-0 kubenswrapper[7479]: I0308 00:21:27.959195 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/13debc51-f16e-4a27-ba79-e2da1e3ed46b-proxy-ca-bundles\") pod \"controller-manager-6f7fd6c796-tlbts\" (UID: \"13debc51-f16e-4a27-ba79-e2da1e3ed46b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-tlbts"
Mar 08 00:21:27.959469 master-0 kubenswrapper[7479]: E0308 00:21:27.959454 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/13debc51-f16e-4a27-ba79-e2da1e3ed46b-proxy-ca-bundles podName:13debc51-f16e-4a27-ba79-e2da1e3ed46b nodeName:}" failed. No retries permitted until 2026-03-08 00:21:28.459406199 +0000 UTC m=+4.772315116 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/13debc51-f16e-4a27-ba79-e2da1e3ed46b-proxy-ca-bundles") pod "controller-manager-6f7fd6c796-tlbts" (UID: "13debc51-f16e-4a27-ba79-e2da1e3ed46b") : configmap "openshift-global-ca" not found Mar 08 00:21:27.959602 master-0 kubenswrapper[7479]: I0308 00:21:27.959503 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07a7947d-7425-4df7-b121-2d23c7868e79-serving-cert\") pod \"route-controller-manager-58959cd4d6-d985l\" (UID: \"07a7947d-7425-4df7-b121-2d23c7868e79\") " pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-d985l" Mar 08 00:21:27.959602 master-0 kubenswrapper[7479]: I0308 00:21:27.959561 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/13debc51-f16e-4a27-ba79-e2da1e3ed46b-client-ca\") pod \"controller-manager-6f7fd6c796-tlbts\" (UID: \"13debc51-f16e-4a27-ba79-e2da1e3ed46b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-tlbts" Mar 08 00:21:27.959910 master-0 kubenswrapper[7479]: E0308 00:21:27.959691 7479 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 08 00:21:27.959910 master-0 kubenswrapper[7479]: I0308 00:21:27.959741 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13debc51-f16e-4a27-ba79-e2da1e3ed46b-serving-cert\") pod \"controller-manager-6f7fd6c796-tlbts\" (UID: \"13debc51-f16e-4a27-ba79-e2da1e3ed46b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-tlbts" Mar 08 00:21:27.959910 master-0 kubenswrapper[7479]: E0308 00:21:27.959774 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07a7947d-7425-4df7-b121-2d23c7868e79-serving-cert 
podName:07a7947d-7425-4df7-b121-2d23c7868e79 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:28.45975038 +0000 UTC m=+4.772659387 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/07a7947d-7425-4df7-b121-2d23c7868e79-serving-cert") pod "route-controller-manager-58959cd4d6-d985l" (UID: "07a7947d-7425-4df7-b121-2d23c7868e79") : secret "serving-cert" not found Mar 08 00:21:27.959910 master-0 kubenswrapper[7479]: E0308 00:21:27.959814 7479 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 08 00:21:27.959910 master-0 kubenswrapper[7479]: I0308 00:21:27.959823 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07a7947d-7425-4df7-b121-2d23c7868e79-config\") pod \"route-controller-manager-58959cd4d6-d985l\" (UID: \"07a7947d-7425-4df7-b121-2d23c7868e79\") " pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-d985l" Mar 08 00:21:27.959910 master-0 kubenswrapper[7479]: I0308 00:21:27.959859 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckf5z\" (UniqueName: \"kubernetes.io/projected/13debc51-f16e-4a27-ba79-e2da1e3ed46b-kube-api-access-ckf5z\") pod \"controller-manager-6f7fd6c796-tlbts\" (UID: \"13debc51-f16e-4a27-ba79-e2da1e3ed46b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-tlbts" Mar 08 00:21:27.959910 master-0 kubenswrapper[7479]: E0308 00:21:27.959866 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/13debc51-f16e-4a27-ba79-e2da1e3ed46b-client-ca podName:13debc51-f16e-4a27-ba79-e2da1e3ed46b nodeName:}" failed. No retries permitted until 2026-03-08 00:21:28.459857693 +0000 UTC m=+4.772766610 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/13debc51-f16e-4a27-ba79-e2da1e3ed46b-client-ca") pod "controller-manager-6f7fd6c796-tlbts" (UID: "13debc51-f16e-4a27-ba79-e2da1e3ed46b") : configmap "client-ca" not found Mar 08 00:21:27.959910 master-0 kubenswrapper[7479]: E0308 00:21:27.959915 7479 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 08 00:21:27.960137 master-0 kubenswrapper[7479]: E0308 00:21:27.959950 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13debc51-f16e-4a27-ba79-e2da1e3ed46b-serving-cert podName:13debc51-f16e-4a27-ba79-e2da1e3ed46b nodeName:}" failed. No retries permitted until 2026-03-08 00:21:28.459940566 +0000 UTC m=+4.772849573 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/13debc51-f16e-4a27-ba79-e2da1e3ed46b-serving-cert") pod "controller-manager-6f7fd6c796-tlbts" (UID: "13debc51-f16e-4a27-ba79-e2da1e3ed46b") : secret "serving-cert" not found Mar 08 00:21:27.960297 master-0 kubenswrapper[7479]: E0308 00:21:27.960274 7479 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/config: configmap "config" not found Mar 08 00:21:27.962834 master-0 kubenswrapper[7479]: I0308 00:21:27.960675 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07a7947d-7425-4df7-b121-2d23c7868e79-client-ca\") pod \"route-controller-manager-58959cd4d6-d985l\" (UID: \"07a7947d-7425-4df7-b121-2d23c7868e79\") " pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-d985l" Mar 08 00:21:27.962834 master-0 kubenswrapper[7479]: E0308 00:21:27.960996 7479 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 08 00:21:27.962834 master-0 kubenswrapper[7479]: E0308 
00:21:27.962804 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/07a7947d-7425-4df7-b121-2d23c7868e79-config podName:07a7947d-7425-4df7-b121-2d23c7868e79 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:28.462772114 +0000 UTC m=+4.775681111 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/07a7947d-7425-4df7-b121-2d23c7868e79-config") pod "route-controller-manager-58959cd4d6-d985l" (UID: "07a7947d-7425-4df7-b121-2d23c7868e79") : configmap "config" not found Mar 08 00:21:27.962963 master-0 kubenswrapper[7479]: E0308 00:21:27.962922 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/07a7947d-7425-4df7-b121-2d23c7868e79-client-ca podName:07a7947d-7425-4df7-b121-2d23c7868e79 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:28.462908848 +0000 UTC m=+4.775817765 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/07a7947d-7425-4df7-b121-2d23c7868e79-client-ca") pod "route-controller-manager-58959cd4d6-d985l" (UID: "07a7947d-7425-4df7-b121-2d23c7868e79") : configmap "client-ca" not found Mar 08 00:21:27.981762 master-0 kubenswrapper[7479]: I0308 00:21:27.981674 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnl6p\" (UniqueName: \"kubernetes.io/projected/07a7947d-7425-4df7-b121-2d23c7868e79-kube-api-access-mnl6p\") pod \"route-controller-manager-58959cd4d6-d985l\" (UID: \"07a7947d-7425-4df7-b121-2d23c7868e79\") " pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-d985l" Mar 08 00:21:27.986640 master-0 kubenswrapper[7479]: I0308 00:21:27.986615 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckf5z\" (UniqueName: \"kubernetes.io/projected/13debc51-f16e-4a27-ba79-e2da1e3ed46b-kube-api-access-ckf5z\") pod 
\"controller-manager-6f7fd6c796-tlbts\" (UID: \"13debc51-f16e-4a27-ba79-e2da1e3ed46b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-tlbts" Mar 08 00:21:28.034225 master-0 kubenswrapper[7479]: I0308 00:21:28.025296 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-57ccdf9b5-tbcsh"] Mar 08 00:21:28.034225 master-0 kubenswrapper[7479]: I0308 00:21:28.025790 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-tbcsh" Mar 08 00:21:28.034225 master-0 kubenswrapper[7479]: I0308 00:21:28.027642 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 08 00:21:28.034225 master-0 kubenswrapper[7479]: I0308 00:21:28.028156 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 08 00:21:28.036497 master-0 kubenswrapper[7479]: I0308 00:21:28.036278 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-57ccdf9b5-tbcsh"] Mar 08 00:21:28.052990 master-0 kubenswrapper[7479]: I0308 00:21:28.047240 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-vd52m" event={"ID":"e97435ee-522e-427d-9efc-40bc3d2b0d02","Type":"ContainerStarted","Data":"302cab9bf3dbf255daeb9370ab65a4f19b214019a7009e2da9e307530afd287e"} Mar 08 00:21:28.165321 master-0 kubenswrapper[7479]: I0308 00:21:28.165264 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d4xz\" (UniqueName: \"kubernetes.io/projected/2ac55f03-dd6f-4ead-bacc-c69aeca146dc-kube-api-access-8d4xz\") pod \"migrator-57ccdf9b5-tbcsh\" (UID: \"2ac55f03-dd6f-4ead-bacc-c69aeca146dc\") " 
pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-tbcsh" Mar 08 00:21:28.266511 master-0 kubenswrapper[7479]: I0308 00:21:28.266440 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d4xz\" (UniqueName: \"kubernetes.io/projected/2ac55f03-dd6f-4ead-bacc-c69aeca146dc-kube-api-access-8d4xz\") pod \"migrator-57ccdf9b5-tbcsh\" (UID: \"2ac55f03-dd6f-4ead-bacc-c69aeca146dc\") " pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-tbcsh" Mar 08 00:21:28.282976 master-0 kubenswrapper[7479]: I0308 00:21:28.282934 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d4xz\" (UniqueName: \"kubernetes.io/projected/2ac55f03-dd6f-4ead-bacc-c69aeca146dc-kube-api-access-8d4xz\") pod \"migrator-57ccdf9b5-tbcsh\" (UID: \"2ac55f03-dd6f-4ead-bacc-c69aeca146dc\") " pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-tbcsh" Mar 08 00:21:28.366440 master-0 kubenswrapper[7479]: I0308 00:21:28.366369 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-tbcsh" Mar 08 00:21:28.470873 master-0 kubenswrapper[7479]: I0308 00:21:28.470799 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13debc51-f16e-4a27-ba79-e2da1e3ed46b-config\") pod \"controller-manager-6f7fd6c796-tlbts\" (UID: \"13debc51-f16e-4a27-ba79-e2da1e3ed46b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-tlbts" Mar 08 00:21:28.471096 master-0 kubenswrapper[7479]: E0308 00:21:28.470938 7479 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: configmap "config" not found Mar 08 00:21:28.471096 master-0 kubenswrapper[7479]: E0308 00:21:28.471013 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/13debc51-f16e-4a27-ba79-e2da1e3ed46b-config podName:13debc51-f16e-4a27-ba79-e2da1e3ed46b nodeName:}" failed. No retries permitted until 2026-03-08 00:21:29.470996842 +0000 UTC m=+5.783905759 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/13debc51-f16e-4a27-ba79-e2da1e3ed46b-config") pod "controller-manager-6f7fd6c796-tlbts" (UID: "13debc51-f16e-4a27-ba79-e2da1e3ed46b") : configmap "config" not found Mar 08 00:21:28.471452 master-0 kubenswrapper[7479]: I0308 00:21:28.471412 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/13debc51-f16e-4a27-ba79-e2da1e3ed46b-proxy-ca-bundles\") pod \"controller-manager-6f7fd6c796-tlbts\" (UID: \"13debc51-f16e-4a27-ba79-e2da1e3ed46b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-tlbts" Mar 08 00:21:28.471452 master-0 kubenswrapper[7479]: I0308 00:21:28.471445 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07a7947d-7425-4df7-b121-2d23c7868e79-serving-cert\") pod \"route-controller-manager-58959cd4d6-d985l\" (UID: \"07a7947d-7425-4df7-b121-2d23c7868e79\") " pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-d985l" Mar 08 00:21:28.471555 master-0 kubenswrapper[7479]: I0308 00:21:28.471470 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/13debc51-f16e-4a27-ba79-e2da1e3ed46b-client-ca\") pod \"controller-manager-6f7fd6c796-tlbts\" (UID: \"13debc51-f16e-4a27-ba79-e2da1e3ed46b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-tlbts" Mar 08 00:21:28.471555 master-0 kubenswrapper[7479]: I0308 00:21:28.471496 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13debc51-f16e-4a27-ba79-e2da1e3ed46b-serving-cert\") pod \"controller-manager-6f7fd6c796-tlbts\" (UID: \"13debc51-f16e-4a27-ba79-e2da1e3ed46b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-tlbts" Mar 08 
00:21:28.471555 master-0 kubenswrapper[7479]: I0308 00:21:28.471541 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07a7947d-7425-4df7-b121-2d23c7868e79-config\") pod \"route-controller-manager-58959cd4d6-d985l\" (UID: \"07a7947d-7425-4df7-b121-2d23c7868e79\") " pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-d985l" Mar 08 00:21:28.471671 master-0 kubenswrapper[7479]: I0308 00:21:28.471570 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07a7947d-7425-4df7-b121-2d23c7868e79-client-ca\") pod \"route-controller-manager-58959cd4d6-d985l\" (UID: \"07a7947d-7425-4df7-b121-2d23c7868e79\") " pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-d985l" Mar 08 00:21:28.471671 master-0 kubenswrapper[7479]: E0308 00:21:28.471640 7479 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 08 00:21:28.471671 master-0 kubenswrapper[7479]: E0308 00:21:28.471663 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/07a7947d-7425-4df7-b121-2d23c7868e79-client-ca podName:07a7947d-7425-4df7-b121-2d23c7868e79 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:29.471656162 +0000 UTC m=+5.784565080 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/07a7947d-7425-4df7-b121-2d23c7868e79-client-ca") pod "route-controller-manager-58959cd4d6-d985l" (UID: "07a7947d-7425-4df7-b121-2d23c7868e79") : configmap "client-ca" not found Mar 08 00:21:28.471779 master-0 kubenswrapper[7479]: E0308 00:21:28.471688 7479 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: configmap "openshift-global-ca" not found Mar 08 00:21:28.471779 master-0 kubenswrapper[7479]: E0308 00:21:28.471705 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/13debc51-f16e-4a27-ba79-e2da1e3ed46b-proxy-ca-bundles podName:13debc51-f16e-4a27-ba79-e2da1e3ed46b nodeName:}" failed. No retries permitted until 2026-03-08 00:21:29.471699414 +0000 UTC m=+5.784608331 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/13debc51-f16e-4a27-ba79-e2da1e3ed46b-proxy-ca-bundles") pod "controller-manager-6f7fd6c796-tlbts" (UID: "13debc51-f16e-4a27-ba79-e2da1e3ed46b") : configmap "openshift-global-ca" not found Mar 08 00:21:28.471779 master-0 kubenswrapper[7479]: E0308 00:21:28.471759 7479 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 08 00:21:28.471779 master-0 kubenswrapper[7479]: E0308 00:21:28.471776 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07a7947d-7425-4df7-b121-2d23c7868e79-serving-cert podName:07a7947d-7425-4df7-b121-2d23c7868e79 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:29.471771396 +0000 UTC m=+5.784680313 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/07a7947d-7425-4df7-b121-2d23c7868e79-serving-cert") pod "route-controller-manager-58959cd4d6-d985l" (UID: "07a7947d-7425-4df7-b121-2d23c7868e79") : secret "serving-cert" not found Mar 08 00:21:28.471915 master-0 kubenswrapper[7479]: E0308 00:21:28.471816 7479 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 08 00:21:28.471915 master-0 kubenswrapper[7479]: E0308 00:21:28.471833 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/13debc51-f16e-4a27-ba79-e2da1e3ed46b-client-ca podName:13debc51-f16e-4a27-ba79-e2da1e3ed46b nodeName:}" failed. No retries permitted until 2026-03-08 00:21:29.471828338 +0000 UTC m=+5.784737245 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/13debc51-f16e-4a27-ba79-e2da1e3ed46b-client-ca") pod "controller-manager-6f7fd6c796-tlbts" (UID: "13debc51-f16e-4a27-ba79-e2da1e3ed46b") : configmap "client-ca" not found Mar 08 00:21:28.471915 master-0 kubenswrapper[7479]: E0308 00:21:28.471865 7479 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 08 00:21:28.471915 master-0 kubenswrapper[7479]: E0308 00:21:28.471882 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13debc51-f16e-4a27-ba79-e2da1e3ed46b-serving-cert podName:13debc51-f16e-4a27-ba79-e2da1e3ed46b nodeName:}" failed. No retries permitted until 2026-03-08 00:21:29.471876159 +0000 UTC m=+5.784785076 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/13debc51-f16e-4a27-ba79-e2da1e3ed46b-serving-cert") pod "controller-manager-6f7fd6c796-tlbts" (UID: "13debc51-f16e-4a27-ba79-e2da1e3ed46b") : secret "serving-cert" not found Mar 08 00:21:28.471915 master-0 kubenswrapper[7479]: E0308 00:21:28.471911 7479 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/config: configmap "config" not found Mar 08 00:21:28.472080 master-0 kubenswrapper[7479]: E0308 00:21:28.471926 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/07a7947d-7425-4df7-b121-2d23c7868e79-config podName:07a7947d-7425-4df7-b121-2d23c7868e79 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:29.471921221 +0000 UTC m=+5.784830138 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/07a7947d-7425-4df7-b121-2d23c7868e79-config") pod "route-controller-manager-58959cd4d6-d985l" (UID: "07a7947d-7425-4df7-b121-2d23c7868e79") : configmap "config" not found Mar 08 00:21:28.572319 master-0 kubenswrapper[7479]: I0308 00:21:28.572258 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03f4bafb-c270-428a-bacf-8a424b3d1a05-metrics-tls\") pod \"dns-operator-589895fbb7-gmvnl\" (UID: \"03f4bafb-c270-428a-bacf-8a424b3d1a05\") " pod="openshift-dns-operator/dns-operator-589895fbb7-gmvnl" Mar 08 00:21:28.572645 master-0 kubenswrapper[7479]: E0308 00:21:28.572417 7479 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 08 00:21:28.572645 master-0 kubenswrapper[7479]: E0308 00:21:28.572474 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03f4bafb-c270-428a-bacf-8a424b3d1a05-metrics-tls podName:03f4bafb-c270-428a-bacf-8a424b3d1a05 nodeName:}" failed. 
No retries permitted until 2026-03-08 00:21:32.572456946 +0000 UTC m=+8.885365863 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/03f4bafb-c270-428a-bacf-8a424b3d1a05-metrics-tls") pod "dns-operator-589895fbb7-gmvnl" (UID: "03f4bafb-c270-428a-bacf-8a424b3d1a05") : secret "metrics-tls" not found Mar 08 00:21:28.572645 master-0 kubenswrapper[7479]: I0308 00:21:28.572498 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6999cf38-e317-4727-98c9-d4e348e9e16a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-k7dp2\" (UID: \"6999cf38-e317-4727-98c9-d4e348e9e16a\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2" Mar 08 00:21:28.572645 master-0 kubenswrapper[7479]: I0308 00:21:28.572557 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-metrics-tls\") pod \"ingress-operator-677db989d6-blw5x\" (UID: \"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x" Mar 08 00:21:28.572645 master-0 kubenswrapper[7479]: I0308 00:21:28.572640 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert\") pod \"cluster-version-operator-745944c6b7-dcbvq\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq" Mar 08 00:21:28.573257 master-0 kubenswrapper[7479]: E0308 00:21:28.572648 7479 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 08 00:21:28.573257 master-0 kubenswrapper[7479]: E0308 00:21:28.572686 7479 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6999cf38-e317-4727-98c9-d4e348e9e16a-image-registry-operator-tls podName:6999cf38-e317-4727-98c9-d4e348e9e16a nodeName:}" failed. No retries permitted until 2026-03-08 00:21:32.572675513 +0000 UTC m=+8.885584430 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/6999cf38-e317-4727-98c9-d4e348e9e16a-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-k7dp2" (UID: "6999cf38-e317-4727-98c9-d4e348e9e16a") : secret "image-registry-operator-tls" not found Mar 08 00:21:28.573257 master-0 kubenswrapper[7479]: E0308 00:21:28.572848 7479 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 08 00:21:28.574685 master-0 kubenswrapper[7479]: E0308 00:21:28.572961 7479 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 08 00:21:28.574759 master-0 kubenswrapper[7479]: E0308 00:21:28.574723 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-metrics-tls podName:4d0b9fbc-a1f8-4a98-99de-758734bd1a5b nodeName:}" failed. No retries permitted until 2026-03-08 00:21:32.574683215 +0000 UTC m=+8.887592292 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-metrics-tls") pod "ingress-operator-677db989d6-blw5x" (UID: "4d0b9fbc-a1f8-4a98-99de-758734bd1a5b") : secret "metrics-tls" not found Mar 08 00:21:28.574808 master-0 kubenswrapper[7479]: E0308 00:21:28.574764 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert podName:32c19760-2cb2-4690-be8e-cba3c517c60e nodeName:}" failed. 
No retries permitted until 2026-03-08 00:21:32.574754377 +0000 UTC m=+8.887663524 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert") pod "cluster-version-operator-745944c6b7-dcbvq" (UID: "32c19760-2cb2-4690-be8e-cba3c517c60e") : secret "cluster-version-operator-serving-cert" not found Mar 08 00:21:28.673434 master-0 kubenswrapper[7479]: I0308 00:21:28.673343 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-webhook-certs\") pod \"multus-admission-controller-8d675b596-jgdmb\" (UID: \"d7a0bdcc-92f5-41e6-ab47-ee48a5788bac\") " pod="openshift-multus/multus-admission-controller-8d675b596-jgdmb" Mar 08 00:21:28.673434 master-0 kubenswrapper[7479]: I0308 00:21:28.673398 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs\") pod \"network-metrics-daemon-krv7c\" (UID: \"815fd565-0609-4d8f-ac05-8656f198b008\") " pod="openshift-multus/network-metrics-daemon-krv7c" Mar 08 00:21:28.673674 master-0 kubenswrapper[7479]: E0308 00:21:28.673550 7479 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 08 00:21:28.673674 master-0 kubenswrapper[7479]: E0308 00:21:28.673630 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-webhook-certs podName:d7a0bdcc-92f5-41e6-ab47-ee48a5788bac nodeName:}" failed. No retries permitted until 2026-03-08 00:21:32.67361145 +0000 UTC m=+8.986520367 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-webhook-certs") pod "multus-admission-controller-8d675b596-jgdmb" (UID: "d7a0bdcc-92f5-41e6-ab47-ee48a5788bac") : secret "multus-admission-controller-secret" not found Mar 08 00:21:28.674088 master-0 kubenswrapper[7479]: I0308 00:21:28.674068 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b94acad3-cf4e-443d-80fb-5e68a4074336-srv-cert\") pod \"catalog-operator-7d9c49f57b-8jr6f\" (UID: \"b94acad3-cf4e-443d-80fb-5e68a4074336\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-8jr6f" Mar 08 00:21:28.674173 master-0 kubenswrapper[7479]: I0308 00:21:28.674132 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-phgxj\" (UID: \"8f71fd39-a16b-47d2-b781-c8ce37bcb9b2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj" Mar 08 00:21:28.674262 master-0 kubenswrapper[7479]: I0308 00:21:28.674245 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9" Mar 08 00:21:28.674346 master-0 kubenswrapper[7479]: I0308 00:21:28.674332 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4ad37f40-c533-4a1e-882a-2e0973eff86d-srv-cert\") pod \"olm-operator-d64cfc9db-8qtmf\" (UID: 
\"4ad37f40-c533-4a1e-882a-2e0973eff86d\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8qtmf" Mar 08 00:21:28.674379 master-0 kubenswrapper[7479]: I0308 00:21:28.674353 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-mgb5v\" (UID: \"5cf5a2ef-2498-40a0-a189-0753076fd3b6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" Mar 08 00:21:28.674407 master-0 kubenswrapper[7479]: I0308 00:21:28.674380 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6d770808-d390-41c1-a9d9-fc12b99fa9a9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-cxs8s\" (UID: \"6d770808-d390-41c1-a9d9-fc12b99fa9a9\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s" Mar 08 00:21:28.674407 master-0 kubenswrapper[7479]: I0308 00:21:28.674396 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9" Mar 08 00:21:28.674516 master-0 kubenswrapper[7479]: E0308 00:21:28.674501 7479 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 08 00:21:28.674574 master-0 kubenswrapper[7479]: E0308 00:21:28.674562 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-apiservice-cert 
podName:1abf904b-0b8d-4d61-8231-0e8d00933192 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:32.67455313 +0000 UTC m=+8.987462047 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-9vjl9" (UID: "1abf904b-0b8d-4d61-8231-0e8d00933192") : secret "performance-addon-operator-webhook-cert" not found Mar 08 00:21:28.674626 master-0 kubenswrapper[7479]: E0308 00:21:28.674604 7479 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 08 00:21:28.674658 master-0 kubenswrapper[7479]: E0308 00:21:28.674637 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs podName:815fd565-0609-4d8f-ac05-8656f198b008 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:32.674631442 +0000 UTC m=+8.987540359 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs") pod "network-metrics-daemon-krv7c" (UID: "815fd565-0609-4d8f-ac05-8656f198b008") : secret "metrics-daemon-secret" not found Mar 08 00:21:28.674848 master-0 kubenswrapper[7479]: E0308 00:21:28.674697 7479 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 08 00:21:28.674848 master-0 kubenswrapper[7479]: E0308 00:21:28.674718 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b94acad3-cf4e-443d-80fb-5e68a4074336-srv-cert podName:b94acad3-cf4e-443d-80fb-5e68a4074336 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:32.674712525 +0000 UTC m=+8.987621442 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/b94acad3-cf4e-443d-80fb-5e68a4074336-srv-cert") pod "catalog-operator-7d9c49f57b-8jr6f" (UID: "b94acad3-cf4e-443d-80fb-5e68a4074336") : secret "catalog-operator-serving-cert" not found Mar 08 00:21:28.674848 master-0 kubenswrapper[7479]: E0308 00:21:28.674761 7479 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 08 00:21:28.674848 master-0 kubenswrapper[7479]: E0308 00:21:28.674794 7479 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 08 00:21:28.674848 master-0 kubenswrapper[7479]: E0308 00:21:28.674822 7479 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 08 00:21:28.674848 master-0 kubenswrapper[7479]: E0308 00:21:28.674839 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-node-tuning-operator-tls podName:1abf904b-0b8d-4d61-8231-0e8d00933192 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:32.674821708 +0000 UTC m=+8.987730625 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-9vjl9" (UID: "1abf904b-0b8d-4d61-8231-0e8d00933192") : secret "node-tuning-operator-tls" not found Mar 08 00:21:28.675032 master-0 kubenswrapper[7479]: E0308 00:21:28.674860 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ad37f40-c533-4a1e-882a-2e0973eff86d-srv-cert podName:4ad37f40-c533-4a1e-882a-2e0973eff86d nodeName:}" failed. 
No retries permitted until 2026-03-08 00:21:32.674853959 +0000 UTC m=+8.987762866 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/4ad37f40-c533-4a1e-882a-2e0973eff86d-srv-cert") pod "olm-operator-d64cfc9db-8qtmf" (UID: "4ad37f40-c533-4a1e-882a-2e0973eff86d") : secret "olm-operator-serving-cert" not found Mar 08 00:21:28.675032 master-0 kubenswrapper[7479]: E0308 00:21:28.674870 7479 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 08 00:21:28.675032 master-0 kubenswrapper[7479]: E0308 00:21:28.674875 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-package-server-manager-serving-cert podName:8f71fd39-a16b-47d2-b781-c8ce37bcb9b2 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:32.674867229 +0000 UTC m=+8.987776146 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-phgxj" (UID: "8f71fd39-a16b-47d2-b781-c8ce37bcb9b2") : secret "package-server-manager-serving-cert" not found Mar 08 00:21:28.675032 master-0 kubenswrapper[7479]: E0308 00:21:28.674890 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-operator-metrics podName:5cf5a2ef-2498-40a0-a189-0753076fd3b6 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:32.67488222 +0000 UTC m=+8.987791137 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-mgb5v" (UID: "5cf5a2ef-2498-40a0-a189-0753076fd3b6") : secret "marketplace-operator-metrics" not found Mar 08 00:21:28.675032 master-0 kubenswrapper[7479]: E0308 00:21:28.674988 7479 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 08 00:21:28.675161 master-0 kubenswrapper[7479]: E0308 00:21:28.675068 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d770808-d390-41c1-a9d9-fc12b99fa9a9-cluster-monitoring-operator-tls podName:6d770808-d390-41c1-a9d9-fc12b99fa9a9 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:32.675047795 +0000 UTC m=+8.987956712 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6d770808-d390-41c1-a9d9-fc12b99fa9a9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-cxs8s" (UID: "6d770808-d390-41c1-a9d9-fc12b99fa9a9") : secret "cluster-monitoring-operator-tls" not found Mar 08 00:21:28.773519 master-0 kubenswrapper[7479]: I0308 00:21:28.773478 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 00:21:28.773735 master-0 kubenswrapper[7479]: I0308 00:21:28.773598 7479 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 00:21:28.777327 master-0 kubenswrapper[7479]: I0308 00:21:28.777249 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 00:21:29.058571 master-0 kubenswrapper[7479]: I0308 00:21:29.058515 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/iptables-alerter-rfnqf" event={"ID":"0e52cbdc-1d46-4cc9-85ee-535aa449992f","Type":"ContainerStarted","Data":"0cd7d1d536e3e73fb9ed25ec4d69ad5db01a51017e617e72f4fa58f319d499f9"} Mar 08 00:21:29.197938 master-0 kubenswrapper[7479]: I0308 00:21:29.197869 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6f7fd6c796-tlbts"] Mar 08 00:21:29.198193 master-0 kubenswrapper[7479]: E0308 00:21:29.198164 7479 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config proxy-ca-bundles serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-6f7fd6c796-tlbts" podUID="13debc51-f16e-4a27-ba79-e2da1e3ed46b" Mar 08 00:21:29.217224 master-0 kubenswrapper[7479]: I0308 00:21:29.214486 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58959cd4d6-d985l"] Mar 08 00:21:29.217224 master-0 kubenswrapper[7479]: E0308 00:21:29.214780 7479 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-d985l" podUID="07a7947d-7425-4df7-b121-2d23c7868e79" Mar 08 00:21:29.489417 master-0 kubenswrapper[7479]: I0308 00:21:29.489324 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13debc51-f16e-4a27-ba79-e2da1e3ed46b-config\") pod \"controller-manager-6f7fd6c796-tlbts\" (UID: \"13debc51-f16e-4a27-ba79-e2da1e3ed46b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-tlbts" Mar 08 00:21:29.490286 master-0 kubenswrapper[7479]: I0308 00:21:29.490229 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/13debc51-f16e-4a27-ba79-e2da1e3ed46b-proxy-ca-bundles\") pod \"controller-manager-6f7fd6c796-tlbts\" (UID: \"13debc51-f16e-4a27-ba79-e2da1e3ed46b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-tlbts" Mar 08 00:21:29.490286 master-0 kubenswrapper[7479]: I0308 00:21:29.490282 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07a7947d-7425-4df7-b121-2d23c7868e79-serving-cert\") pod \"route-controller-manager-58959cd4d6-d985l\" (UID: \"07a7947d-7425-4df7-b121-2d23c7868e79\") " pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-d985l" Mar 08 00:21:29.490432 master-0 kubenswrapper[7479]: I0308 00:21:29.490396 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13debc51-f16e-4a27-ba79-e2da1e3ed46b-serving-cert\") pod \"controller-manager-6f7fd6c796-tlbts\" (UID: \"13debc51-f16e-4a27-ba79-e2da1e3ed46b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-tlbts" Mar 08 00:21:29.490432 master-0 kubenswrapper[7479]: I0308 00:21:29.490397 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13debc51-f16e-4a27-ba79-e2da1e3ed46b-config\") pod \"controller-manager-6f7fd6c796-tlbts\" (UID: \"13debc51-f16e-4a27-ba79-e2da1e3ed46b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-tlbts" Mar 08 00:21:29.490506 master-0 kubenswrapper[7479]: E0308 00:21:29.490479 7479 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 08 00:21:29.490544 master-0 kubenswrapper[7479]: E0308 00:21:29.490532 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13debc51-f16e-4a27-ba79-e2da1e3ed46b-serving-cert podName:13debc51-f16e-4a27-ba79-e2da1e3ed46b nodeName:}" 
failed. No retries permitted until 2026-03-08 00:21:31.490516594 +0000 UTC m=+7.803425521 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/13debc51-f16e-4a27-ba79-e2da1e3ed46b-serving-cert") pod "controller-manager-6f7fd6c796-tlbts" (UID: "13debc51-f16e-4a27-ba79-e2da1e3ed46b") : secret "serving-cert" not found Mar 08 00:21:29.490581 master-0 kubenswrapper[7479]: I0308 00:21:29.490555 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/13debc51-f16e-4a27-ba79-e2da1e3ed46b-client-ca\") pod \"controller-manager-6f7fd6c796-tlbts\" (UID: \"13debc51-f16e-4a27-ba79-e2da1e3ed46b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-tlbts" Mar 08 00:21:29.490610 master-0 kubenswrapper[7479]: E0308 00:21:29.490560 7479 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 08 00:21:29.490636 master-0 kubenswrapper[7479]: E0308 00:21:29.490627 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07a7947d-7425-4df7-b121-2d23c7868e79-serving-cert podName:07a7947d-7425-4df7-b121-2d23c7868e79 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:31.490618117 +0000 UTC m=+7.803527044 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/07a7947d-7425-4df7-b121-2d23c7868e79-serving-cert") pod "route-controller-manager-58959cd4d6-d985l" (UID: "07a7947d-7425-4df7-b121-2d23c7868e79") : secret "serving-cert" not found Mar 08 00:21:29.490666 master-0 kubenswrapper[7479]: E0308 00:21:29.490647 7479 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 08 00:21:29.491108 master-0 kubenswrapper[7479]: E0308 00:21:29.490697 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/13debc51-f16e-4a27-ba79-e2da1e3ed46b-client-ca podName:13debc51-f16e-4a27-ba79-e2da1e3ed46b nodeName:}" failed. No retries permitted until 2026-03-08 00:21:31.490683369 +0000 UTC m=+7.803592306 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/13debc51-f16e-4a27-ba79-e2da1e3ed46b-client-ca") pod "controller-manager-6f7fd6c796-tlbts" (UID: "13debc51-f16e-4a27-ba79-e2da1e3ed46b") : configmap "client-ca" not found Mar 08 00:21:29.491108 master-0 kubenswrapper[7479]: I0308 00:21:29.490769 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07a7947d-7425-4df7-b121-2d23c7868e79-config\") pod \"route-controller-manager-58959cd4d6-d985l\" (UID: \"07a7947d-7425-4df7-b121-2d23c7868e79\") " pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-d985l" Mar 08 00:21:29.491108 master-0 kubenswrapper[7479]: I0308 00:21:29.490967 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07a7947d-7425-4df7-b121-2d23c7868e79-client-ca\") pod \"route-controller-manager-58959cd4d6-d985l\" (UID: \"07a7947d-7425-4df7-b121-2d23c7868e79\") " pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-d985l" Mar 08 
00:21:29.491108 master-0 kubenswrapper[7479]: E0308 00:21:29.491010 7479 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 08 00:21:29.491108 master-0 kubenswrapper[7479]: E0308 00:21:29.491080 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/07a7947d-7425-4df7-b121-2d23c7868e79-client-ca podName:07a7947d-7425-4df7-b121-2d23c7868e79 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:31.491068121 +0000 UTC m=+7.803977158 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/07a7947d-7425-4df7-b121-2d23c7868e79-client-ca") pod "route-controller-manager-58959cd4d6-d985l" (UID: "07a7947d-7425-4df7-b121-2d23c7868e79") : configmap "client-ca" not found Mar 08 00:21:29.491385 master-0 kubenswrapper[7479]: I0308 00:21:29.491350 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/13debc51-f16e-4a27-ba79-e2da1e3ed46b-proxy-ca-bundles\") pod \"controller-manager-6f7fd6c796-tlbts\" (UID: \"13debc51-f16e-4a27-ba79-e2da1e3ed46b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-tlbts" Mar 08 00:21:29.491713 master-0 kubenswrapper[7479]: I0308 00:21:29.491674 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07a7947d-7425-4df7-b121-2d23c7868e79-config\") pod \"route-controller-manager-58959cd4d6-d985l\" (UID: \"07a7947d-7425-4df7-b121-2d23c7868e79\") " pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-d985l" Mar 08 00:21:30.006366 master-0 kubenswrapper[7479]: I0308 00:21:30.005383 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-84bfdbbb7f-bc2m2"] Mar 08 00:21:30.006366 master-0 kubenswrapper[7479]: I0308 00:21:30.005937 7479 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-service-ca/service-ca-84bfdbbb7f-bc2m2" Mar 08 00:21:30.008090 master-0 kubenswrapper[7479]: I0308 00:21:30.008060 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 08 00:21:30.011381 master-0 kubenswrapper[7479]: I0308 00:21:30.011272 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-84bfdbbb7f-bc2m2"] Mar 08 00:21:30.011909 master-0 kubenswrapper[7479]: I0308 00:21:30.011847 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 08 00:21:30.012118 master-0 kubenswrapper[7479]: I0308 00:21:30.012038 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 08 00:21:30.019507 master-0 kubenswrapper[7479]: I0308 00:21:30.017721 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 08 00:21:30.062120 master-0 kubenswrapper[7479]: I0308 00:21:30.061559 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-d985l" Mar 08 00:21:30.062120 master-0 kubenswrapper[7479]: I0308 00:21:30.061789 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6f7fd6c796-tlbts" Mar 08 00:21:30.097876 master-0 kubenswrapper[7479]: I0308 00:21:30.097603 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4f5539c1-fb87-42d6-b735-6de53421bb6b-signing-cabundle\") pod \"service-ca-84bfdbbb7f-bc2m2\" (UID: \"4f5539c1-fb87-42d6-b735-6de53421bb6b\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-bc2m2" Mar 08 00:21:30.097876 master-0 kubenswrapper[7479]: I0308 00:21:30.097667 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4f5539c1-fb87-42d6-b735-6de53421bb6b-signing-key\") pod \"service-ca-84bfdbbb7f-bc2m2\" (UID: \"4f5539c1-fb87-42d6-b735-6de53421bb6b\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-bc2m2" Mar 08 00:21:30.097876 master-0 kubenswrapper[7479]: I0308 00:21:30.097758 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcl7q\" (UniqueName: \"kubernetes.io/projected/4f5539c1-fb87-42d6-b735-6de53421bb6b-kube-api-access-bcl7q\") pod \"service-ca-84bfdbbb7f-bc2m2\" (UID: \"4f5539c1-fb87-42d6-b735-6de53421bb6b\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-bc2m2" Mar 08 00:21:30.139430 master-0 kubenswrapper[7479]: I0308 00:21:30.139142 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-57ccdf9b5-tbcsh"] Mar 08 00:21:30.152654 master-0 kubenswrapper[7479]: W0308 00:21:30.152605 7479 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ac55f03_dd6f_4ead_bacc_c69aeca146dc.slice/crio-0ce2140e8d5f4ac383fcfe274d59d3771538ece4764c91b8cb4e301d3fe26bbf WatchSource:0}: Error finding container 
0ce2140e8d5f4ac383fcfe274d59d3771538ece4764c91b8cb4e301d3fe26bbf: Status 404 returned error can't find the container with id 0ce2140e8d5f4ac383fcfe274d59d3771538ece4764c91b8cb4e301d3fe26bbf Mar 08 00:21:30.158329 master-0 kubenswrapper[7479]: I0308 00:21:30.158302 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-d985l" Mar 08 00:21:30.198161 master-0 kubenswrapper[7479]: I0308 00:21:30.198120 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnl6p\" (UniqueName: \"kubernetes.io/projected/07a7947d-7425-4df7-b121-2d23c7868e79-kube-api-access-mnl6p\") pod \"07a7947d-7425-4df7-b121-2d23c7868e79\" (UID: \"07a7947d-7425-4df7-b121-2d23c7868e79\") " Mar 08 00:21:30.198263 master-0 kubenswrapper[7479]: I0308 00:21:30.198191 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07a7947d-7425-4df7-b121-2d23c7868e79-config\") pod \"07a7947d-7425-4df7-b121-2d23c7868e79\" (UID: \"07a7947d-7425-4df7-b121-2d23c7868e79\") " Mar 08 00:21:30.198618 master-0 kubenswrapper[7479]: I0308 00:21:30.198558 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4f5539c1-fb87-42d6-b735-6de53421bb6b-signing-cabundle\") pod \"service-ca-84bfdbbb7f-bc2m2\" (UID: \"4f5539c1-fb87-42d6-b735-6de53421bb6b\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-bc2m2" Mar 08 00:21:30.198689 master-0 kubenswrapper[7479]: I0308 00:21:30.198663 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4f5539c1-fb87-42d6-b735-6de53421bb6b-signing-key\") pod \"service-ca-84bfdbbb7f-bc2m2\" (UID: \"4f5539c1-fb87-42d6-b735-6de53421bb6b\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-bc2m2" Mar 08 00:21:30.200802 
master-0 kubenswrapper[7479]: I0308 00:21:30.198762 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcl7q\" (UniqueName: \"kubernetes.io/projected/4f5539c1-fb87-42d6-b735-6de53421bb6b-kube-api-access-bcl7q\") pod \"service-ca-84bfdbbb7f-bc2m2\" (UID: \"4f5539c1-fb87-42d6-b735-6de53421bb6b\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-bc2m2" Mar 08 00:21:30.202873 master-0 kubenswrapper[7479]: I0308 00:21:30.202823 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07a7947d-7425-4df7-b121-2d23c7868e79-config" (OuterVolumeSpecName: "config") pod "07a7947d-7425-4df7-b121-2d23c7868e79" (UID: "07a7947d-7425-4df7-b121-2d23c7868e79"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:21:30.203442 master-0 kubenswrapper[7479]: I0308 00:21:30.203415 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4f5539c1-fb87-42d6-b735-6de53421bb6b-signing-cabundle\") pod \"service-ca-84bfdbbb7f-bc2m2\" (UID: \"4f5539c1-fb87-42d6-b735-6de53421bb6b\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-bc2m2" Mar 08 00:21:30.208361 master-0 kubenswrapper[7479]: I0308 00:21:30.208314 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4f5539c1-fb87-42d6-b735-6de53421bb6b-signing-key\") pod \"service-ca-84bfdbbb7f-bc2m2\" (UID: \"4f5539c1-fb87-42d6-b735-6de53421bb6b\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-bc2m2" Mar 08 00:21:30.209172 master-0 kubenswrapper[7479]: I0308 00:21:30.209132 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07a7947d-7425-4df7-b121-2d23c7868e79-kube-api-access-mnl6p" (OuterVolumeSpecName: "kube-api-access-mnl6p") pod "07a7947d-7425-4df7-b121-2d23c7868e79" (UID: 
"07a7947d-7425-4df7-b121-2d23c7868e79"). InnerVolumeSpecName "kube-api-access-mnl6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:21:30.222019 master-0 kubenswrapper[7479]: I0308 00:21:30.221986 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcl7q\" (UniqueName: \"kubernetes.io/projected/4f5539c1-fb87-42d6-b735-6de53421bb6b-kube-api-access-bcl7q\") pod \"service-ca-84bfdbbb7f-bc2m2\" (UID: \"4f5539c1-fb87-42d6-b735-6de53421bb6b\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-bc2m2" Mar 08 00:21:30.228805 master-0 kubenswrapper[7479]: I0308 00:21:30.227138 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f7fd6c796-tlbts" Mar 08 00:21:30.301938 master-0 kubenswrapper[7479]: I0308 00:21:30.301874 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckf5z\" (UniqueName: \"kubernetes.io/projected/13debc51-f16e-4a27-ba79-e2da1e3ed46b-kube-api-access-ckf5z\") pod \"13debc51-f16e-4a27-ba79-e2da1e3ed46b\" (UID: \"13debc51-f16e-4a27-ba79-e2da1e3ed46b\") " Mar 08 00:21:30.302102 master-0 kubenswrapper[7479]: I0308 00:21:30.301960 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/13debc51-f16e-4a27-ba79-e2da1e3ed46b-proxy-ca-bundles\") pod \"13debc51-f16e-4a27-ba79-e2da1e3ed46b\" (UID: \"13debc51-f16e-4a27-ba79-e2da1e3ed46b\") " Mar 08 00:21:30.302178 master-0 kubenswrapper[7479]: I0308 00:21:30.302140 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13debc51-f16e-4a27-ba79-e2da1e3ed46b-config\") pod \"13debc51-f16e-4a27-ba79-e2da1e3ed46b\" (UID: \"13debc51-f16e-4a27-ba79-e2da1e3ed46b\") " Mar 08 00:21:30.302490 master-0 kubenswrapper[7479]: I0308 00:21:30.302466 7479 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-mnl6p\" (UniqueName: \"kubernetes.io/projected/07a7947d-7425-4df7-b121-2d23c7868e79-kube-api-access-mnl6p\") on node \"master-0\" DevicePath \"\"" Mar 08 00:21:30.302490 master-0 kubenswrapper[7479]: I0308 00:21:30.302487 7479 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07a7947d-7425-4df7-b121-2d23c7868e79-config\") on node \"master-0\" DevicePath \"\"" Mar 08 00:21:30.302553 master-0 kubenswrapper[7479]: I0308 00:21:30.302467 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13debc51-f16e-4a27-ba79-e2da1e3ed46b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "13debc51-f16e-4a27-ba79-e2da1e3ed46b" (UID: "13debc51-f16e-4a27-ba79-e2da1e3ed46b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:21:30.302773 master-0 kubenswrapper[7479]: I0308 00:21:30.302739 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13debc51-f16e-4a27-ba79-e2da1e3ed46b-config" (OuterVolumeSpecName: "config") pod "13debc51-f16e-4a27-ba79-e2da1e3ed46b" (UID: "13debc51-f16e-4a27-ba79-e2da1e3ed46b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:21:30.307685 master-0 kubenswrapper[7479]: I0308 00:21:30.307648 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13debc51-f16e-4a27-ba79-e2da1e3ed46b-kube-api-access-ckf5z" (OuterVolumeSpecName: "kube-api-access-ckf5z") pod "13debc51-f16e-4a27-ba79-e2da1e3ed46b" (UID: "13debc51-f16e-4a27-ba79-e2da1e3ed46b"). InnerVolumeSpecName "kube-api-access-ckf5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:21:30.386938 master-0 kubenswrapper[7479]: I0308 00:21:30.386876 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-84bfdbbb7f-bc2m2" Mar 08 00:21:30.403944 master-0 kubenswrapper[7479]: I0308 00:21:30.403907 7479 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/13debc51-f16e-4a27-ba79-e2da1e3ed46b-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Mar 08 00:21:30.404016 master-0 kubenswrapper[7479]: I0308 00:21:30.403949 7479 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13debc51-f16e-4a27-ba79-e2da1e3ed46b-config\") on node \"master-0\" DevicePath \"\"" Mar 08 00:21:30.404016 master-0 kubenswrapper[7479]: I0308 00:21:30.403970 7479 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckf5z\" (UniqueName: \"kubernetes.io/projected/13debc51-f16e-4a27-ba79-e2da1e3ed46b-kube-api-access-ckf5z\") on node \"master-0\" DevicePath \"\"" Mar 08 00:21:30.688813 master-0 kubenswrapper[7479]: I0308 00:21:30.688729 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:21:30.764785 master-0 kubenswrapper[7479]: I0308 00:21:30.764745 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-84bfdbbb7f-bc2m2"] Mar 08 00:21:30.771232 master-0 kubenswrapper[7479]: W0308 00:21:30.771170 7479 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f5539c1_fb87_42d6_b735_6de53421bb6b.slice/crio-79a6fb0d44533a4c06691dbc28101325df1e65724145bd5bed4068656b402865 WatchSource:0}: Error finding container 79a6fb0d44533a4c06691dbc28101325df1e65724145bd5bed4068656b402865: Status 404 returned error can't find the container with id 79a6fb0d44533a4c06691dbc28101325df1e65724145bd5bed4068656b402865 Mar 08 00:21:31.066762 master-0 kubenswrapper[7479]: I0308 00:21:31.066620 7479 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" event={"ID":"2b1a69b5-c946-495d-ae02-c56f788279e8","Type":"ContainerStarted","Data":"155eb1b0e5550c7156a1e22df86f430ed6ad309e3fdf823622a81280cc05efe7"} Mar 08 00:21:31.067823 master-0 kubenswrapper[7479]: I0308 00:21:31.066862 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" Mar 08 00:21:31.068187 master-0 kubenswrapper[7479]: I0308 00:21:31.068144 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-vd52m" event={"ID":"e97435ee-522e-427d-9efc-40bc3d2b0d02","Type":"ContainerStarted","Data":"8ec59eab2cc718c53b1b2d5e8d3d1b9c2d696a3beee682fea6bed575feafd5ae"} Mar 08 00:21:31.068907 master-0 kubenswrapper[7479]: I0308 00:21:31.068868 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-tbcsh" event={"ID":"2ac55f03-dd6f-4ead-bacc-c69aeca146dc","Type":"ContainerStarted","Data":"0ce2140e8d5f4ac383fcfe274d59d3771538ece4764c91b8cb4e301d3fe26bbf"} Mar 08 00:21:31.070354 master-0 kubenswrapper[7479]: I0308 00:21:31.070319 7479 generic.go:334] "Generic (PLEG): container finished" podID="db164b32-e20e-4d07-a9ae-98720321621d" containerID="09b799c18c45feaba6859a57b3c549da1772578d33ab2e69691bfdb4a7740bc3" exitCode=0 Mar 08 00:21:31.070424 master-0 kubenswrapper[7479]: I0308 00:21:31.070375 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-r9zcq" event={"ID":"db164b32-e20e-4d07-a9ae-98720321621d","Type":"ContainerDied","Data":"09b799c18c45feaba6859a57b3c549da1772578d33ab2e69691bfdb4a7740bc3"} Mar 08 00:21:31.071704 master-0 kubenswrapper[7479]: I0308 00:21:31.071274 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-84bfdbbb7f-bc2m2" 
event={"ID":"4f5539c1-fb87-42d6-b735-6de53421bb6b","Type":"ContainerStarted","Data":"79a6fb0d44533a4c06691dbc28101325df1e65724145bd5bed4068656b402865"} Mar 08 00:21:31.071704 master-0 kubenswrapper[7479]: I0308 00:21:31.071308 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f7fd6c796-tlbts" Mar 08 00:21:31.071704 master-0 kubenswrapper[7479]: I0308 00:21:31.071331 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-d985l" Mar 08 00:21:31.197897 master-0 kubenswrapper[7479]: I0308 00:21:31.197824 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:31.217622 master-0 kubenswrapper[7479]: I0308 00:21:31.217567 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:31.565336 master-0 kubenswrapper[7479]: I0308 00:21:31.565261 7479 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:21:31.565514 master-0 kubenswrapper[7479]: I0308 00:21:31.565350 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07a7947d-7425-4df7-b121-2d23c7868e79-serving-cert\") pod \"route-controller-manager-58959cd4d6-d985l\" (UID: \"07a7947d-7425-4df7-b121-2d23c7868e79\") " pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-d985l" Mar 08 00:21:31.565514 master-0 kubenswrapper[7479]: I0308 00:21:31.565418 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/13debc51-f16e-4a27-ba79-e2da1e3ed46b-client-ca\") pod \"controller-manager-6f7fd6c796-tlbts\" (UID: 
\"13debc51-f16e-4a27-ba79-e2da1e3ed46b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-tlbts" Mar 08 00:21:31.565514 master-0 kubenswrapper[7479]: I0308 00:21:31.565445 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13debc51-f16e-4a27-ba79-e2da1e3ed46b-serving-cert\") pod \"controller-manager-6f7fd6c796-tlbts\" (UID: \"13debc51-f16e-4a27-ba79-e2da1e3ed46b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-tlbts" Mar 08 00:21:31.565514 master-0 kubenswrapper[7479]: E0308 00:21:31.565504 7479 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: object "openshift-route-controller-manager"/"serving-cert" not registered Mar 08 00:21:31.565660 master-0 kubenswrapper[7479]: E0308 00:21:31.565541 7479 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: object "openshift-controller-manager"/"client-ca" not registered Mar 08 00:21:31.565660 master-0 kubenswrapper[7479]: E0308 00:21:31.565556 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07a7947d-7425-4df7-b121-2d23c7868e79-serving-cert podName:07a7947d-7425-4df7-b121-2d23c7868e79 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:35.565541992 +0000 UTC m=+11.878450909 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/07a7947d-7425-4df7-b121-2d23c7868e79-serving-cert") pod "route-controller-manager-58959cd4d6-d985l" (UID: "07a7947d-7425-4df7-b121-2d23c7868e79") : object "openshift-route-controller-manager"/"serving-cert" not registered Mar 08 00:21:31.565660 master-0 kubenswrapper[7479]: E0308 00:21:31.565633 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/13debc51-f16e-4a27-ba79-e2da1e3ed46b-client-ca podName:13debc51-f16e-4a27-ba79-e2da1e3ed46b nodeName:}" failed. 
No retries permitted until 2026-03-08 00:21:35.565616614 +0000 UTC m=+11.878525531 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/13debc51-f16e-4a27-ba79-e2da1e3ed46b-client-ca") pod "controller-manager-6f7fd6c796-tlbts" (UID: "13debc51-f16e-4a27-ba79-e2da1e3ed46b") : object "openshift-controller-manager"/"client-ca" not registered Mar 08 00:21:31.565761 master-0 kubenswrapper[7479]: E0308 00:21:31.565671 7479 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: object "openshift-controller-manager"/"serving-cert" not registered Mar 08 00:21:31.566032 master-0 kubenswrapper[7479]: E0308 00:21:31.565703 7479 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: object "openshift-route-controller-manager"/"client-ca" not registered Mar 08 00:21:31.566032 master-0 kubenswrapper[7479]: I0308 00:21:31.565680 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07a7947d-7425-4df7-b121-2d23c7868e79-client-ca\") pod \"route-controller-manager-58959cd4d6-d985l\" (UID: \"07a7947d-7425-4df7-b121-2d23c7868e79\") " pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-d985l" Mar 08 00:21:31.566107 master-0 kubenswrapper[7479]: E0308 00:21:31.565712 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13debc51-f16e-4a27-ba79-e2da1e3ed46b-serving-cert podName:13debc51-f16e-4a27-ba79-e2da1e3ed46b nodeName:}" failed. No retries permitted until 2026-03-08 00:21:35.565702087 +0000 UTC m=+11.878611094 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/13debc51-f16e-4a27-ba79-e2da1e3ed46b-serving-cert") pod "controller-manager-6f7fd6c796-tlbts" (UID: "13debc51-f16e-4a27-ba79-e2da1e3ed46b") : object "openshift-controller-manager"/"serving-cert" not registered Mar 08 00:21:31.566107 master-0 kubenswrapper[7479]: E0308 00:21:31.566084 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/07a7947d-7425-4df7-b121-2d23c7868e79-client-ca podName:07a7947d-7425-4df7-b121-2d23c7868e79 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:35.566065388 +0000 UTC m=+11.878974305 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/07a7947d-7425-4df7-b121-2d23c7868e79-client-ca") pod "route-controller-manager-58959cd4d6-d985l" (UID: "07a7947d-7425-4df7-b121-2d23c7868e79") : object "openshift-route-controller-manager"/"client-ca" not registered Mar 08 00:21:31.569643 master-0 kubenswrapper[7479]: I0308 00:21:31.569615 7479 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:21:32.075999 master-0 kubenswrapper[7479]: I0308 00:21:32.075947 7479 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 00:21:32.075999 master-0 kubenswrapper[7479]: I0308 00:21:32.075972 7479 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 00:21:32.075999 master-0 kubenswrapper[7479]: I0308 00:21:32.075974 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-84bfdbbb7f-bc2m2" event={"ID":"4f5539c1-fb87-42d6-b735-6de53421bb6b","Type":"ContainerStarted","Data":"2d85b8a41ac1a5d7ec38487553cf098502219f3e61e7670ec8b3fc64cf28df17"} Mar 08 00:21:32.079636 master-0 kubenswrapper[7479]: I0308 00:21:32.079609 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:21:32.576221 master-0 kubenswrapper[7479]: I0308 00:21:32.576171 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-metrics-tls\") pod \"ingress-operator-677db989d6-blw5x\" (UID: \"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x" Mar 08 00:21:32.576454 master-0 kubenswrapper[7479]: E0308 00:21:32.576350 7479 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 08 00:21:32.576454 master-0 kubenswrapper[7479]: E0308 00:21:32.576415 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-metrics-tls podName:4d0b9fbc-a1f8-4a98-99de-758734bd1a5b nodeName:}" failed. No retries permitted until 2026-03-08 00:21:40.576390275 +0000 UTC m=+16.889299192 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-metrics-tls") pod "ingress-operator-677db989d6-blw5x" (UID: "4d0b9fbc-a1f8-4a98-99de-758734bd1a5b") : secret "metrics-tls" not found Mar 08 00:21:32.576619 master-0 kubenswrapper[7479]: I0308 00:21:32.576566 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert\") pod \"cluster-version-operator-745944c6b7-dcbvq\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq" Mar 08 00:21:32.576701 master-0 kubenswrapper[7479]: E0308 00:21:32.576679 7479 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 08 00:21:32.576807 master-0 kubenswrapper[7479]: I0308 00:21:32.576776 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03f4bafb-c270-428a-bacf-8a424b3d1a05-metrics-tls\") pod \"dns-operator-589895fbb7-gmvnl\" (UID: \"03f4bafb-c270-428a-bacf-8a424b3d1a05\") " pod="openshift-dns-operator/dns-operator-589895fbb7-gmvnl" Mar 08 00:21:32.576859 master-0 kubenswrapper[7479]: E0308 00:21:32.576795 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert podName:32c19760-2cb2-4690-be8e-cba3c517c60e nodeName:}" failed. No retries permitted until 2026-03-08 00:21:40.576784677 +0000 UTC m=+16.889693594 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert") pod "cluster-version-operator-745944c6b7-dcbvq" (UID: "32c19760-2cb2-4690-be8e-cba3c517c60e") : secret "cluster-version-operator-serving-cert" not found Mar 08 00:21:32.576911 master-0 kubenswrapper[7479]: I0308 00:21:32.576880 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6999cf38-e317-4727-98c9-d4e348e9e16a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-k7dp2\" (UID: \"6999cf38-e317-4727-98c9-d4e348e9e16a\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2" Mar 08 00:21:32.576952 master-0 kubenswrapper[7479]: E0308 00:21:32.576835 7479 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 08 00:21:32.576991 master-0 kubenswrapper[7479]: E0308 00:21:32.576955 7479 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 08 00:21:32.577035 master-0 kubenswrapper[7479]: E0308 00:21:32.576958 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03f4bafb-c270-428a-bacf-8a424b3d1a05-metrics-tls podName:03f4bafb-c270-428a-bacf-8a424b3d1a05 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:40.576945752 +0000 UTC m=+16.889854769 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/03f4bafb-c270-428a-bacf-8a424b3d1a05-metrics-tls") pod "dns-operator-589895fbb7-gmvnl" (UID: "03f4bafb-c270-428a-bacf-8a424b3d1a05") : secret "metrics-tls" not found Mar 08 00:21:32.577035 master-0 kubenswrapper[7479]: E0308 00:21:32.577011 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6999cf38-e317-4727-98c9-d4e348e9e16a-image-registry-operator-tls podName:6999cf38-e317-4727-98c9-d4e348e9e16a nodeName:}" failed. No retries permitted until 2026-03-08 00:21:40.576998284 +0000 UTC m=+16.889907271 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/6999cf38-e317-4727-98c9-d4e348e9e16a-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-k7dp2" (UID: "6999cf38-e317-4727-98c9-d4e348e9e16a") : secret "image-registry-operator-tls" not found Mar 08 00:21:32.680691 master-0 kubenswrapper[7479]: I0308 00:21:32.680629 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs\") pod \"network-metrics-daemon-krv7c\" (UID: \"815fd565-0609-4d8f-ac05-8656f198b008\") " pod="openshift-multus/network-metrics-daemon-krv7c" Mar 08 00:21:32.680691 master-0 kubenswrapper[7479]: I0308 00:21:32.680690 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b94acad3-cf4e-443d-80fb-5e68a4074336-srv-cert\") pod \"catalog-operator-7d9c49f57b-8jr6f\" (UID: \"b94acad3-cf4e-443d-80fb-5e68a4074336\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-8jr6f" Mar 08 00:21:32.680903 master-0 kubenswrapper[7479]: E0308 00:21:32.680794 7479 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not 
found Mar 08 00:21:32.680903 master-0 kubenswrapper[7479]: I0308 00:21:32.680852 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-phgxj\" (UID: \"8f71fd39-a16b-47d2-b781-c8ce37bcb9b2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj" Mar 08 00:21:32.680903 master-0 kubenswrapper[7479]: E0308 00:21:32.680865 7479 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 08 00:21:32.680983 master-0 kubenswrapper[7479]: E0308 00:21:32.680869 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs podName:815fd565-0609-4d8f-ac05-8656f198b008 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:40.680852382 +0000 UTC m=+16.993761299 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs") pod "network-metrics-daemon-krv7c" (UID: "815fd565-0609-4d8f-ac05-8656f198b008") : secret "metrics-daemon-secret" not found Mar 08 00:21:32.680983 master-0 kubenswrapper[7479]: I0308 00:21:32.680955 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9" Mar 08 00:21:32.681048 master-0 kubenswrapper[7479]: E0308 00:21:32.680984 7479 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 08 00:21:32.681048 master-0 kubenswrapper[7479]: E0308 00:21:32.681024 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b94acad3-cf4e-443d-80fb-5e68a4074336-srv-cert podName:b94acad3-cf4e-443d-80fb-5e68a4074336 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:40.681006617 +0000 UTC m=+16.993915534 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/b94acad3-cf4e-443d-80fb-5e68a4074336-srv-cert") pod "catalog-operator-7d9c49f57b-8jr6f" (UID: "b94acad3-cf4e-443d-80fb-5e68a4074336") : secret "catalog-operator-serving-cert" not found Mar 08 00:21:32.681102 master-0 kubenswrapper[7479]: E0308 00:21:32.681060 7479 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 08 00:21:32.681102 master-0 kubenswrapper[7479]: I0308 00:21:32.681072 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4ad37f40-c533-4a1e-882a-2e0973eff86d-srv-cert\") pod \"olm-operator-d64cfc9db-8qtmf\" (UID: \"4ad37f40-c533-4a1e-882a-2e0973eff86d\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8qtmf" Mar 08 00:21:32.681102 master-0 kubenswrapper[7479]: E0308 00:21:32.681093 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-node-tuning-operator-tls podName:1abf904b-0b8d-4d61-8231-0e8d00933192 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:40.681081319 +0000 UTC m=+16.993990316 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-9vjl9" (UID: "1abf904b-0b8d-4d61-8231-0e8d00933192") : secret "node-tuning-operator-tls" not found Mar 08 00:21:32.681183 master-0 kubenswrapper[7479]: I0308 00:21:32.681113 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-mgb5v\" (UID: \"5cf5a2ef-2498-40a0-a189-0753076fd3b6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" Mar 08 00:21:32.681183 master-0 kubenswrapper[7479]: E0308 00:21:32.681120 7479 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 08 00:21:32.681183 master-0 kubenswrapper[7479]: E0308 00:21:32.681129 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-package-server-manager-serving-cert podName:8f71fd39-a16b-47d2-b781-c8ce37bcb9b2 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:40.68112308 +0000 UTC m=+16.994031987 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-phgxj" (UID: "8f71fd39-a16b-47d2-b781-c8ce37bcb9b2") : secret "package-server-manager-serving-cert" not found Mar 08 00:21:32.681183 master-0 kubenswrapper[7479]: E0308 00:21:32.681166 7479 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 08 00:21:32.681315 master-0 kubenswrapper[7479]: E0308 00:21:32.681192 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-operator-metrics podName:5cf5a2ef-2498-40a0-a189-0753076fd3b6 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:40.681186162 +0000 UTC m=+16.994095079 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-mgb5v" (UID: "5cf5a2ef-2498-40a0-a189-0753076fd3b6") : secret "marketplace-operator-metrics" not found Mar 08 00:21:32.681315 master-0 kubenswrapper[7479]: I0308 00:21:32.681221 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6d770808-d390-41c1-a9d9-fc12b99fa9a9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-cxs8s\" (UID: \"6d770808-d390-41c1-a9d9-fc12b99fa9a9\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s" Mar 08 00:21:32.681315 master-0 kubenswrapper[7479]: I0308 00:21:32.681241 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9" Mar 08 00:21:32.681315 master-0 kubenswrapper[7479]: I0308 00:21:32.681269 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-webhook-certs\") pod \"multus-admission-controller-8d675b596-jgdmb\" (UID: \"d7a0bdcc-92f5-41e6-ab47-ee48a5788bac\") " pod="openshift-multus/multus-admission-controller-8d675b596-jgdmb" Mar 08 00:21:32.681315 master-0 kubenswrapper[7479]: E0308 00:21:32.681283 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ad37f40-c533-4a1e-882a-2e0973eff86d-srv-cert podName:4ad37f40-c533-4a1e-882a-2e0973eff86d nodeName:}" failed. No retries permitted until 2026-03-08 00:21:40.681267085 +0000 UTC m=+16.994176042 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/4ad37f40-c533-4a1e-882a-2e0973eff86d-srv-cert") pod "olm-operator-d64cfc9db-8qtmf" (UID: "4ad37f40-c533-4a1e-882a-2e0973eff86d") : secret "olm-operator-serving-cert" not found Mar 08 00:21:32.681315 master-0 kubenswrapper[7479]: E0308 00:21:32.681287 7479 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 08 00:21:32.681479 master-0 kubenswrapper[7479]: E0308 00:21:32.681324 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d770808-d390-41c1-a9d9-fc12b99fa9a9-cluster-monitoring-operator-tls podName:6d770808-d390-41c1-a9d9-fc12b99fa9a9 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:40.681314576 +0000 UTC m=+16.994223493 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6d770808-d390-41c1-a9d9-fc12b99fa9a9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-cxs8s" (UID: "6d770808-d390-41c1-a9d9-fc12b99fa9a9") : secret "cluster-monitoring-operator-tls" not found Mar 08 00:21:32.681479 master-0 kubenswrapper[7479]: E0308 00:21:32.681326 7479 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 08 00:21:32.681479 master-0 kubenswrapper[7479]: E0308 00:21:32.681335 7479 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 08 00:21:32.681479 master-0 kubenswrapper[7479]: E0308 00:21:32.681353 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-webhook-certs podName:d7a0bdcc-92f5-41e6-ab47-ee48a5788bac nodeName:}" failed. No retries permitted until 2026-03-08 00:21:40.681347907 +0000 UTC m=+16.994256824 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-webhook-certs") pod "multus-admission-controller-8d675b596-jgdmb" (UID: "d7a0bdcc-92f5-41e6-ab47-ee48a5788bac") : secret "multus-admission-controller-secret" not found Mar 08 00:21:32.681479 master-0 kubenswrapper[7479]: E0308 00:21:32.681369 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-apiservice-cert podName:1abf904b-0b8d-4d61-8231-0e8d00933192 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:40.681360008 +0000 UTC m=+16.994269005 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-9vjl9" (UID: "1abf904b-0b8d-4d61-8231-0e8d00933192") : secret "performance-addon-operator-webhook-cert" not found Mar 08 00:21:33.144811 master-0 kubenswrapper[7479]: I0308 00:21:33.144767 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56f6fc54fd-nwfzl"] Mar 08 00:21:33.145557 master-0 kubenswrapper[7479]: I0308 00:21:33.145221 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56f6fc54fd-nwfzl" Mar 08 00:21:33.147148 master-0 kubenswrapper[7479]: I0308 00:21:33.146698 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 08 00:21:33.147668 master-0 kubenswrapper[7479]: I0308 00:21:33.147651 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 08 00:21:33.147851 master-0 kubenswrapper[7479]: I0308 00:21:33.147837 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 08 00:21:33.147992 master-0 kubenswrapper[7479]: I0308 00:21:33.147980 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 08 00:21:33.148114 master-0 kubenswrapper[7479]: I0308 00:21:33.148103 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 08 00:21:33.287583 master-0 kubenswrapper[7479]: I0308 00:21:33.287516 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f6499204-e7f3-45b9-86b0-57fdf35b96a9-config\") pod \"route-controller-manager-56f6fc54fd-nwfzl\" (UID: \"f6499204-e7f3-45b9-86b0-57fdf35b96a9\") " pod="openshift-route-controller-manager/route-controller-manager-56f6fc54fd-nwfzl" Mar 08 00:21:33.287799 master-0 kubenswrapper[7479]: I0308 00:21:33.287673 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6499204-e7f3-45b9-86b0-57fdf35b96a9-serving-cert\") pod \"route-controller-manager-56f6fc54fd-nwfzl\" (UID: \"f6499204-e7f3-45b9-86b0-57fdf35b96a9\") " pod="openshift-route-controller-manager/route-controller-manager-56f6fc54fd-nwfzl" Mar 08 00:21:33.287799 master-0 kubenswrapper[7479]: I0308 00:21:33.287727 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f6499204-e7f3-45b9-86b0-57fdf35b96a9-client-ca\") pod \"route-controller-manager-56f6fc54fd-nwfzl\" (UID: \"f6499204-e7f3-45b9-86b0-57fdf35b96a9\") " pod="openshift-route-controller-manager/route-controller-manager-56f6fc54fd-nwfzl" Mar 08 00:21:33.288009 master-0 kubenswrapper[7479]: I0308 00:21:33.287828 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsvt7\" (UniqueName: \"kubernetes.io/projected/f6499204-e7f3-45b9-86b0-57fdf35b96a9-kube-api-access-xsvt7\") pod \"route-controller-manager-56f6fc54fd-nwfzl\" (UID: \"f6499204-e7f3-45b9-86b0-57fdf35b96a9\") " pod="openshift-route-controller-manager/route-controller-manager-56f6fc54fd-nwfzl" Mar 08 00:21:33.306409 master-0 kubenswrapper[7479]: I0308 00:21:33.306314 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58959cd4d6-d985l"] Mar 08 00:21:33.307775 master-0 kubenswrapper[7479]: I0308 00:21:33.307742 7479 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-route-controller-manager/route-controller-manager-56f6fc54fd-nwfzl"] Mar 08 00:21:33.388923 master-0 kubenswrapper[7479]: I0308 00:21:33.388858 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f6499204-e7f3-45b9-86b0-57fdf35b96a9-client-ca\") pod \"route-controller-manager-56f6fc54fd-nwfzl\" (UID: \"f6499204-e7f3-45b9-86b0-57fdf35b96a9\") " pod="openshift-route-controller-manager/route-controller-manager-56f6fc54fd-nwfzl" Mar 08 00:21:33.389123 master-0 kubenswrapper[7479]: E0308 00:21:33.388989 7479 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 08 00:21:33.389123 master-0 kubenswrapper[7479]: E0308 00:21:33.389050 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f6499204-e7f3-45b9-86b0-57fdf35b96a9-client-ca podName:f6499204-e7f3-45b9-86b0-57fdf35b96a9 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:33.889032386 +0000 UTC m=+10.201941303 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/f6499204-e7f3-45b9-86b0-57fdf35b96a9-client-ca") pod "route-controller-manager-56f6fc54fd-nwfzl" (UID: "f6499204-e7f3-45b9-86b0-57fdf35b96a9") : configmap "client-ca" not found Mar 08 00:21:33.389123 master-0 kubenswrapper[7479]: I0308 00:21:33.389096 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsvt7\" (UniqueName: \"kubernetes.io/projected/f6499204-e7f3-45b9-86b0-57fdf35b96a9-kube-api-access-xsvt7\") pod \"route-controller-manager-56f6fc54fd-nwfzl\" (UID: \"f6499204-e7f3-45b9-86b0-57fdf35b96a9\") " pod="openshift-route-controller-manager/route-controller-manager-56f6fc54fd-nwfzl" Mar 08 00:21:33.389295 master-0 kubenswrapper[7479]: I0308 00:21:33.389127 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6499204-e7f3-45b9-86b0-57fdf35b96a9-config\") pod \"route-controller-manager-56f6fc54fd-nwfzl\" (UID: \"f6499204-e7f3-45b9-86b0-57fdf35b96a9\") " pod="openshift-route-controller-manager/route-controller-manager-56f6fc54fd-nwfzl" Mar 08 00:21:33.389376 master-0 kubenswrapper[7479]: I0308 00:21:33.389341 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6499204-e7f3-45b9-86b0-57fdf35b96a9-serving-cert\") pod \"route-controller-manager-56f6fc54fd-nwfzl\" (UID: \"f6499204-e7f3-45b9-86b0-57fdf35b96a9\") " pod="openshift-route-controller-manager/route-controller-manager-56f6fc54fd-nwfzl" Mar 08 00:21:33.389486 master-0 kubenswrapper[7479]: E0308 00:21:33.389456 7479 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 08 00:21:33.389539 master-0 kubenswrapper[7479]: E0308 00:21:33.389501 7479 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/f6499204-e7f3-45b9-86b0-57fdf35b96a9-serving-cert podName:f6499204-e7f3-45b9-86b0-57fdf35b96a9 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:33.889491691 +0000 UTC m=+10.202400608 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/f6499204-e7f3-45b9-86b0-57fdf35b96a9-serving-cert") pod "route-controller-manager-56f6fc54fd-nwfzl" (UID: "f6499204-e7f3-45b9-86b0-57fdf35b96a9") : secret "serving-cert" not found Mar 08 00:21:33.390486 master-0 kubenswrapper[7479]: I0308 00:21:33.390454 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6499204-e7f3-45b9-86b0-57fdf35b96a9-config\") pod \"route-controller-manager-56f6fc54fd-nwfzl\" (UID: \"f6499204-e7f3-45b9-86b0-57fdf35b96a9\") " pod="openshift-route-controller-manager/route-controller-manager-56f6fc54fd-nwfzl" Mar 08 00:21:33.433246 master-0 kubenswrapper[7479]: I0308 00:21:33.433035 7479 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58959cd4d6-d985l"] Mar 08 00:21:33.490318 master-0 kubenswrapper[7479]: I0308 00:21:33.490269 7479 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07a7947d-7425-4df7-b121-2d23c7868e79-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 00:21:33.490318 master-0 kubenswrapper[7479]: I0308 00:21:33.490310 7479 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07a7947d-7425-4df7-b121-2d23c7868e79-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 00:21:33.815719 master-0 kubenswrapper[7479]: I0308 00:21:33.812246 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsvt7\" (UniqueName: 
\"kubernetes.io/projected/f6499204-e7f3-45b9-86b0-57fdf35b96a9-kube-api-access-xsvt7\") pod \"route-controller-manager-56f6fc54fd-nwfzl\" (UID: \"f6499204-e7f3-45b9-86b0-57fdf35b96a9\") " pod="openshift-route-controller-manager/route-controller-manager-56f6fc54fd-nwfzl" Mar 08 00:21:33.890299 master-0 kubenswrapper[7479]: I0308 00:21:33.889981 7479 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07a7947d-7425-4df7-b121-2d23c7868e79" path="/var/lib/kubelet/pods/07a7947d-7425-4df7-b121-2d23c7868e79/volumes" Mar 08 00:21:33.908800 master-0 kubenswrapper[7479]: I0308 00:21:33.908756 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6499204-e7f3-45b9-86b0-57fdf35b96a9-serving-cert\") pod \"route-controller-manager-56f6fc54fd-nwfzl\" (UID: \"f6499204-e7f3-45b9-86b0-57fdf35b96a9\") " pod="openshift-route-controller-manager/route-controller-manager-56f6fc54fd-nwfzl" Mar 08 00:21:33.908800 master-0 kubenswrapper[7479]: I0308 00:21:33.908804 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f6499204-e7f3-45b9-86b0-57fdf35b96a9-client-ca\") pod \"route-controller-manager-56f6fc54fd-nwfzl\" (UID: \"f6499204-e7f3-45b9-86b0-57fdf35b96a9\") " pod="openshift-route-controller-manager/route-controller-manager-56f6fc54fd-nwfzl" Mar 08 00:21:33.909036 master-0 kubenswrapper[7479]: E0308 00:21:33.908917 7479 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 08 00:21:33.909079 master-0 kubenswrapper[7479]: E0308 00:21:33.909046 7479 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 08 00:21:33.909115 master-0 kubenswrapper[7479]: E0308 00:21:33.909083 7479 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/f6499204-e7f3-45b9-86b0-57fdf35b96a9-client-ca podName:f6499204-e7f3-45b9-86b0-57fdf35b96a9 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:34.909069201 +0000 UTC m=+11.221978118 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/f6499204-e7f3-45b9-86b0-57fdf35b96a9-client-ca") pod "route-controller-manager-56f6fc54fd-nwfzl" (UID: "f6499204-e7f3-45b9-86b0-57fdf35b96a9") : configmap "client-ca" not found Mar 08 00:21:33.909162 master-0 kubenswrapper[7479]: E0308 00:21:33.909115 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6499204-e7f3-45b9-86b0-57fdf35b96a9-serving-cert podName:f6499204-e7f3-45b9-86b0-57fdf35b96a9 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:34.909097622 +0000 UTC m=+11.222006549 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/f6499204-e7f3-45b9-86b0-57fdf35b96a9-serving-cert") pod "route-controller-manager-56f6fc54fd-nwfzl" (UID: "f6499204-e7f3-45b9-86b0-57fdf35b96a9") : secret "serving-cert" not found Mar 08 00:21:33.953599 master-0 kubenswrapper[7479]: I0308 00:21:33.953522 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-84bfdbbb7f-bc2m2" podStartSLOduration=4.953505178 podStartE2EDuration="4.953505178s" podCreationTimestamp="2026-03-08 00:21:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:21:33.952781696 +0000 UTC m=+10.265690613" watchObservedRunningTime="2026-03-08 00:21:33.953505178 +0000 UTC m=+10.266414095" Mar 08 00:21:34.006706 master-0 kubenswrapper[7479]: I0308 00:21:34.006663 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" 
Mar 08 00:21:34.159422 master-0 kubenswrapper[7479]: I0308 00:21:34.159315 7479 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:21:34.162721 master-0 kubenswrapper[7479]: I0308 00:21:34.162698 7479 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:21:34.480234 master-0 kubenswrapper[7479]: I0308 00:21:34.480105 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6f7fd6c796-tlbts"] Mar 08 00:21:34.521788 master-0 kubenswrapper[7479]: I0308 00:21:34.520005 7479 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6f7fd6c796-tlbts"] Mar 08 00:21:34.615794 master-0 kubenswrapper[7479]: I0308 00:21:34.615761 7479 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13debc51-f16e-4a27-ba79-e2da1e3ed46b-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 00:21:34.615954 master-0 kubenswrapper[7479]: I0308 00:21:34.615807 7479 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/13debc51-f16e-4a27-ba79-e2da1e3ed46b-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 00:21:34.919104 master-0 kubenswrapper[7479]: I0308 00:21:34.919066 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f6499204-e7f3-45b9-86b0-57fdf35b96a9-client-ca\") pod \"route-controller-manager-56f6fc54fd-nwfzl\" (UID: \"f6499204-e7f3-45b9-86b0-57fdf35b96a9\") " pod="openshift-route-controller-manager/route-controller-manager-56f6fc54fd-nwfzl" Mar 08 00:21:34.919272 master-0 kubenswrapper[7479]: I0308 00:21:34.919251 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/f6499204-e7f3-45b9-86b0-57fdf35b96a9-serving-cert\") pod \"route-controller-manager-56f6fc54fd-nwfzl\" (UID: \"f6499204-e7f3-45b9-86b0-57fdf35b96a9\") " pod="openshift-route-controller-manager/route-controller-manager-56f6fc54fd-nwfzl" Mar 08 00:21:34.919392 master-0 kubenswrapper[7479]: E0308 00:21:34.919370 7479 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 08 00:21:34.919436 master-0 kubenswrapper[7479]: E0308 00:21:34.919414 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6499204-e7f3-45b9-86b0-57fdf35b96a9-serving-cert podName:f6499204-e7f3-45b9-86b0-57fdf35b96a9 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:36.919400307 +0000 UTC m=+13.232309224 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/f6499204-e7f3-45b9-86b0-57fdf35b96a9-serving-cert") pod "route-controller-manager-56f6fc54fd-nwfzl" (UID: "f6499204-e7f3-45b9-86b0-57fdf35b96a9") : secret "serving-cert" not found Mar 08 00:21:34.919484 master-0 kubenswrapper[7479]: E0308 00:21:34.919444 7479 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 08 00:21:34.919484 master-0 kubenswrapper[7479]: E0308 00:21:34.919462 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f6499204-e7f3-45b9-86b0-57fdf35b96a9-client-ca podName:f6499204-e7f3-45b9-86b0-57fdf35b96a9 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:36.919456409 +0000 UTC m=+13.232365326 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/f6499204-e7f3-45b9-86b0-57fdf35b96a9-client-ca") pod "route-controller-manager-56f6fc54fd-nwfzl" (UID: "f6499204-e7f3-45b9-86b0-57fdf35b96a9") : configmap "client-ca" not found Mar 08 00:21:35.093144 master-0 kubenswrapper[7479]: I0308 00:21:35.092806 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-tbcsh" event={"ID":"2ac55f03-dd6f-4ead-bacc-c69aeca146dc","Type":"ContainerStarted","Data":"103608f45f7694c0e9140e9dcbf75b86f00880d60c9896b112d4ee32ecef5e6c"} Mar 08 00:21:35.097005 master-0 kubenswrapper[7479]: I0308 00:21:35.096939 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:21:35.457012 master-0 kubenswrapper[7479]: I0308 00:21:35.451365 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-vd52m" podStartSLOduration=6.615045558 podStartE2EDuration="9.451339095s" podCreationTimestamp="2026-03-08 00:21:26 +0000 UTC" firstStartedPulling="2026-03-08 00:21:27.17938627 +0000 UTC m=+3.492295187" lastFinishedPulling="2026-03-08 00:21:30.015679817 +0000 UTC m=+6.328588724" observedRunningTime="2026-03-08 00:21:35.448805112 +0000 UTC m=+11.761714029" watchObservedRunningTime="2026-03-08 00:21:35.451339095 +0000 UTC m=+11.764248012" Mar 08 00:21:35.652643 master-0 kubenswrapper[7479]: I0308 00:21:35.651701 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:35.652643 master-0 kubenswrapper[7479]: I0308 00:21:35.651868 7479 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 00:21:35.652643 master-0 kubenswrapper[7479]: I0308 00:21:35.651877 7479 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" 
Mar 08 00:21:35.672002 master-0 kubenswrapper[7479]: I0308 00:21:35.671933 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:35.890223 master-0 kubenswrapper[7479]: I0308 00:21:35.890152 7479 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13debc51-f16e-4a27-ba79-e2da1e3ed46b" path="/var/lib/kubelet/pods/13debc51-f16e-4a27-ba79-e2da1e3ed46b/volumes" Mar 08 00:21:36.096223 master-0 kubenswrapper[7479]: I0308 00:21:36.096026 7479 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 00:21:36.100249 master-0 kubenswrapper[7479]: I0308 00:21:36.097032 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-tbcsh" event={"ID":"2ac55f03-dd6f-4ead-bacc-c69aeca146dc","Type":"ContainerStarted","Data":"0c63daa306d3bdb05b05608e24f1b29d4e891aa0f9db9588343aba567dfea148"} Mar 08 00:21:36.353870 master-0 kubenswrapper[7479]: I0308 00:21:36.353754 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7775b8f858-tgbrj"] Mar 08 00:21:36.354892 master-0 kubenswrapper[7479]: I0308 00:21:36.354850 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7775b8f858-tgbrj" Mar 08 00:21:36.363003 master-0 kubenswrapper[7479]: I0308 00:21:36.362970 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 00:21:36.363261 master-0 kubenswrapper[7479]: I0308 00:21:36.363241 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 00:21:36.363399 master-0 kubenswrapper[7479]: I0308 00:21:36.363382 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 08 00:21:36.363508 master-0 kubenswrapper[7479]: I0308 00:21:36.363494 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 00:21:36.363864 master-0 kubenswrapper[7479]: I0308 00:21:36.363841 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 00:21:36.369642 master-0 kubenswrapper[7479]: I0308 00:21:36.369603 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 08 00:21:36.439543 master-0 kubenswrapper[7479]: I0308 00:21:36.439466 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a-config\") pod \"controller-manager-7775b8f858-tgbrj\" (UID: \"5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a\") " pod="openshift-controller-manager/controller-manager-7775b8f858-tgbrj" Mar 08 00:21:36.439543 master-0 kubenswrapper[7479]: I0308 00:21:36.439545 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjn9j\" (UniqueName: \"kubernetes.io/projected/5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a-kube-api-access-zjn9j\") pod 
\"controller-manager-7775b8f858-tgbrj\" (UID: \"5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a\") " pod="openshift-controller-manager/controller-manager-7775b8f858-tgbrj" Mar 08 00:21:36.439804 master-0 kubenswrapper[7479]: I0308 00:21:36.439589 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a-proxy-ca-bundles\") pod \"controller-manager-7775b8f858-tgbrj\" (UID: \"5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a\") " pod="openshift-controller-manager/controller-manager-7775b8f858-tgbrj" Mar 08 00:21:36.439804 master-0 kubenswrapper[7479]: I0308 00:21:36.439630 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a-client-ca\") pod \"controller-manager-7775b8f858-tgbrj\" (UID: \"5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a\") " pod="openshift-controller-manager/controller-manager-7775b8f858-tgbrj" Mar 08 00:21:36.439935 master-0 kubenswrapper[7479]: I0308 00:21:36.439871 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a-serving-cert\") pod \"controller-manager-7775b8f858-tgbrj\" (UID: \"5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a\") " pod="openshift-controller-manager/controller-manager-7775b8f858-tgbrj" Mar 08 00:21:36.541290 master-0 kubenswrapper[7479]: I0308 00:21:36.541195 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a-config\") pod \"controller-manager-7775b8f858-tgbrj\" (UID: \"5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a\") " pod="openshift-controller-manager/controller-manager-7775b8f858-tgbrj" Mar 08 00:21:36.541290 master-0 kubenswrapper[7479]: I0308 00:21:36.541275 
7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjn9j\" (UniqueName: \"kubernetes.io/projected/5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a-kube-api-access-zjn9j\") pod \"controller-manager-7775b8f858-tgbrj\" (UID: \"5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a\") " pod="openshift-controller-manager/controller-manager-7775b8f858-tgbrj" Mar 08 00:21:36.541290 master-0 kubenswrapper[7479]: I0308 00:21:36.541300 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a-proxy-ca-bundles\") pod \"controller-manager-7775b8f858-tgbrj\" (UID: \"5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a\") " pod="openshift-controller-manager/controller-manager-7775b8f858-tgbrj" Mar 08 00:21:36.542578 master-0 kubenswrapper[7479]: I0308 00:21:36.541331 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a-client-ca\") pod \"controller-manager-7775b8f858-tgbrj\" (UID: \"5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a\") " pod="openshift-controller-manager/controller-manager-7775b8f858-tgbrj" Mar 08 00:21:36.542578 master-0 kubenswrapper[7479]: E0308 00:21:36.541594 7479 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 08 00:21:36.542578 master-0 kubenswrapper[7479]: I0308 00:21:36.541660 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a-serving-cert\") pod \"controller-manager-7775b8f858-tgbrj\" (UID: \"5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a\") " pod="openshift-controller-manager/controller-manager-7775b8f858-tgbrj" Mar 08 00:21:36.542578 master-0 kubenswrapper[7479]: E0308 00:21:36.541711 7479 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a-client-ca podName:5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a nodeName:}" failed. No retries permitted until 2026-03-08 00:21:37.041691761 +0000 UTC m=+13.354600678 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a-client-ca") pod "controller-manager-7775b8f858-tgbrj" (UID: "5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a") : configmap "client-ca" not found Mar 08 00:21:36.542952 master-0 kubenswrapper[7479]: I0308 00:21:36.542789 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a-proxy-ca-bundles\") pod \"controller-manager-7775b8f858-tgbrj\" (UID: \"5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a\") " pod="openshift-controller-manager/controller-manager-7775b8f858-tgbrj" Mar 08 00:21:36.543524 master-0 kubenswrapper[7479]: I0308 00:21:36.543459 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a-config\") pod \"controller-manager-7775b8f858-tgbrj\" (UID: \"5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a\") " pod="openshift-controller-manager/controller-manager-7775b8f858-tgbrj" Mar 08 00:21:36.546735 master-0 kubenswrapper[7479]: I0308 00:21:36.546697 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a-serving-cert\") pod \"controller-manager-7775b8f858-tgbrj\" (UID: \"5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a\") " pod="openshift-controller-manager/controller-manager-7775b8f858-tgbrj" Mar 08 00:21:36.655537 master-0 kubenswrapper[7479]: I0308 00:21:36.655415 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7775b8f858-tgbrj"] Mar 08 
00:21:36.841225 master-0 kubenswrapper[7479]: I0308 00:21:36.841147 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjn9j\" (UniqueName: \"kubernetes.io/projected/5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a-kube-api-access-zjn9j\") pod \"controller-manager-7775b8f858-tgbrj\" (UID: \"5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a\") " pod="openshift-controller-manager/controller-manager-7775b8f858-tgbrj" Mar 08 00:21:36.945861 master-0 kubenswrapper[7479]: I0308 00:21:36.945723 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6499204-e7f3-45b9-86b0-57fdf35b96a9-serving-cert\") pod \"route-controller-manager-56f6fc54fd-nwfzl\" (UID: \"f6499204-e7f3-45b9-86b0-57fdf35b96a9\") " pod="openshift-route-controller-manager/route-controller-manager-56f6fc54fd-nwfzl" Mar 08 00:21:36.946019 master-0 kubenswrapper[7479]: E0308 00:21:36.945917 7479 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 08 00:21:36.946019 master-0 kubenswrapper[7479]: I0308 00:21:36.945958 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f6499204-e7f3-45b9-86b0-57fdf35b96a9-client-ca\") pod \"route-controller-manager-56f6fc54fd-nwfzl\" (UID: \"f6499204-e7f3-45b9-86b0-57fdf35b96a9\") " pod="openshift-route-controller-manager/route-controller-manager-56f6fc54fd-nwfzl" Mar 08 00:21:36.946019 master-0 kubenswrapper[7479]: E0308 00:21:36.945989 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6499204-e7f3-45b9-86b0-57fdf35b96a9-serving-cert podName:f6499204-e7f3-45b9-86b0-57fdf35b96a9 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:40.945972691 +0000 UTC m=+17.258881608 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/f6499204-e7f3-45b9-86b0-57fdf35b96a9-serving-cert") pod "route-controller-manager-56f6fc54fd-nwfzl" (UID: "f6499204-e7f3-45b9-86b0-57fdf35b96a9") : secret "serving-cert" not found Mar 08 00:21:36.946118 master-0 kubenswrapper[7479]: E0308 00:21:36.946074 7479 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 08 00:21:36.946180 master-0 kubenswrapper[7479]: E0308 00:21:36.946150 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f6499204-e7f3-45b9-86b0-57fdf35b96a9-client-ca podName:f6499204-e7f3-45b9-86b0-57fdf35b96a9 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:40.946123416 +0000 UTC m=+17.259032363 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/f6499204-e7f3-45b9-86b0-57fdf35b96a9-client-ca") pod "route-controller-manager-56f6fc54fd-nwfzl" (UID: "f6499204-e7f3-45b9-86b0-57fdf35b96a9") : configmap "client-ca" not found Mar 08 00:21:37.047789 master-0 kubenswrapper[7479]: I0308 00:21:37.047649 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a-client-ca\") pod \"controller-manager-7775b8f858-tgbrj\" (UID: \"5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a\") " pod="openshift-controller-manager/controller-manager-7775b8f858-tgbrj" Mar 08 00:21:37.047963 master-0 kubenswrapper[7479]: E0308 00:21:37.047900 7479 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 08 00:21:37.048073 master-0 kubenswrapper[7479]: E0308 00:21:37.048036 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a-client-ca podName:5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a 
nodeName:}" failed. No retries permitted until 2026-03-08 00:21:38.048009415 +0000 UTC m=+14.360918372 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a-client-ca") pod "controller-manager-7775b8f858-tgbrj" (UID: "5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a") : configmap "client-ca" not found Mar 08 00:21:37.859006 master-0 kubenswrapper[7479]: I0308 00:21:37.858915 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-tbcsh" podStartSLOduration=5.502356096 podStartE2EDuration="9.85889044s" podCreationTimestamp="2026-03-08 00:21:28 +0000 UTC" firstStartedPulling="2026-03-08 00:21:30.15581714 +0000 UTC m=+6.468726057" lastFinishedPulling="2026-03-08 00:21:34.512351484 +0000 UTC m=+10.825260401" observedRunningTime="2026-03-08 00:21:37.343762898 +0000 UTC m=+13.656671865" watchObservedRunningTime="2026-03-08 00:21:37.85889044 +0000 UTC m=+14.171799377" Mar 08 00:21:38.060785 master-0 kubenswrapper[7479]: I0308 00:21:38.060476 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a-client-ca\") pod \"controller-manager-7775b8f858-tgbrj\" (UID: \"5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a\") " pod="openshift-controller-manager/controller-manager-7775b8f858-tgbrj" Mar 08 00:21:38.060982 master-0 kubenswrapper[7479]: E0308 00:21:38.060635 7479 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 08 00:21:38.060982 master-0 kubenswrapper[7479]: E0308 00:21:38.060851 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a-client-ca podName:5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a nodeName:}" failed. 
No retries permitted until 2026-03-08 00:21:40.060835828 +0000 UTC m=+16.373744745 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a-client-ca") pod "controller-manager-7775b8f858-tgbrj" (UID: "5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a") : configmap "client-ca" not found Mar 08 00:21:38.104716 master-0 kubenswrapper[7479]: I0308 00:21:38.104662 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-r9zcq" event={"ID":"db164b32-e20e-4d07-a9ae-98720321621d","Type":"ContainerStarted","Data":"7c4e1b361ff558ca25f7a79150dde84f1533aa652ade34de4925ff4983cee4b2"} Mar 08 00:21:40.086130 master-0 kubenswrapper[7479]: I0308 00:21:40.086077 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a-client-ca\") pod \"controller-manager-7775b8f858-tgbrj\" (UID: \"5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a\") " pod="openshift-controller-manager/controller-manager-7775b8f858-tgbrj" Mar 08 00:21:40.086797 master-0 kubenswrapper[7479]: E0308 00:21:40.086255 7479 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 08 00:21:40.086797 master-0 kubenswrapper[7479]: E0308 00:21:40.086322 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a-client-ca podName:5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a nodeName:}" failed. No retries permitted until 2026-03-08 00:21:44.086307811 +0000 UTC m=+20.399216728 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a-client-ca") pod "controller-manager-7775b8f858-tgbrj" (UID: "5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a") : configmap "client-ca" not found Mar 08 00:21:40.591518 master-0 kubenswrapper[7479]: I0308 00:21:40.591418 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6999cf38-e317-4727-98c9-d4e348e9e16a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-k7dp2\" (UID: \"6999cf38-e317-4727-98c9-d4e348e9e16a\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2" Mar 08 00:21:40.591794 master-0 kubenswrapper[7479]: I0308 00:21:40.591585 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-metrics-tls\") pod \"ingress-operator-677db989d6-blw5x\" (UID: \"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x" Mar 08 00:21:40.591794 master-0 kubenswrapper[7479]: I0308 00:21:40.591754 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert\") pod \"cluster-version-operator-745944c6b7-dcbvq\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq" Mar 08 00:21:40.591910 master-0 kubenswrapper[7479]: I0308 00:21:40.591820 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03f4bafb-c270-428a-bacf-8a424b3d1a05-metrics-tls\") pod \"dns-operator-589895fbb7-gmvnl\" (UID: \"03f4bafb-c270-428a-bacf-8a424b3d1a05\") " 
pod="openshift-dns-operator/dns-operator-589895fbb7-gmvnl" Mar 08 00:21:40.597226 master-0 kubenswrapper[7479]: I0308 00:21:40.595863 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03f4bafb-c270-428a-bacf-8a424b3d1a05-metrics-tls\") pod \"dns-operator-589895fbb7-gmvnl\" (UID: \"03f4bafb-c270-428a-bacf-8a424b3d1a05\") " pod="openshift-dns-operator/dns-operator-589895fbb7-gmvnl" Mar 08 00:21:40.601228 master-0 kubenswrapper[7479]: I0308 00:21:40.597986 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6999cf38-e317-4727-98c9-d4e348e9e16a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-k7dp2\" (UID: \"6999cf38-e317-4727-98c9-d4e348e9e16a\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2" Mar 08 00:21:40.607037 master-0 kubenswrapper[7479]: I0308 00:21:40.606969 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-metrics-tls\") pod \"ingress-operator-677db989d6-blw5x\" (UID: \"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x" Mar 08 00:21:40.607037 master-0 kubenswrapper[7479]: I0308 00:21:40.607015 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert\") pod \"cluster-version-operator-745944c6b7-dcbvq\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq" Mar 08 00:21:40.692329 master-0 kubenswrapper[7479]: I0308 00:21:40.692262 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9" Mar 08 00:21:40.692329 master-0 kubenswrapper[7479]: I0308 00:21:40.692339 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4ad37f40-c533-4a1e-882a-2e0973eff86d-srv-cert\") pod \"olm-operator-d64cfc9db-8qtmf\" (UID: \"4ad37f40-c533-4a1e-882a-2e0973eff86d\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8qtmf" Mar 08 00:21:40.692589 master-0 kubenswrapper[7479]: I0308 00:21:40.692397 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-mgb5v\" (UID: \"5cf5a2ef-2498-40a0-a189-0753076fd3b6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" Mar 08 00:21:40.692589 master-0 kubenswrapper[7479]: E0308 00:21:40.692473 7479 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 08 00:21:40.692589 master-0 kubenswrapper[7479]: E0308 00:21:40.692508 7479 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 08 00:21:40.692589 master-0 kubenswrapper[7479]: E0308 00:21:40.692555 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-operator-metrics podName:5cf5a2ef-2498-40a0-a189-0753076fd3b6 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:56.692511669 +0000 UTC m=+33.005420586 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-mgb5v" (UID: "5cf5a2ef-2498-40a0-a189-0753076fd3b6") : secret "marketplace-operator-metrics" not found Mar 08 00:21:40.692589 master-0 kubenswrapper[7479]: E0308 00:21:40.692567 7479 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 08 00:21:40.692589 master-0 kubenswrapper[7479]: I0308 00:21:40.692517 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6d770808-d390-41c1-a9d9-fc12b99fa9a9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-cxs8s\" (UID: \"6d770808-d390-41c1-a9d9-fc12b99fa9a9\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s" Mar 08 00:21:40.692839 master-0 kubenswrapper[7479]: E0308 00:21:40.692579 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ad37f40-c533-4a1e-882a-2e0973eff86d-srv-cert podName:4ad37f40-c533-4a1e-882a-2e0973eff86d nodeName:}" failed. No retries permitted until 2026-03-08 00:21:56.692568561 +0000 UTC m=+33.005477478 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/4ad37f40-c533-4a1e-882a-2e0973eff86d-srv-cert") pod "olm-operator-d64cfc9db-8qtmf" (UID: "4ad37f40-c533-4a1e-882a-2e0973eff86d") : secret "olm-operator-serving-cert" not found Mar 08 00:21:40.692839 master-0 kubenswrapper[7479]: E0308 00:21:40.692634 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d770808-d390-41c1-a9d9-fc12b99fa9a9-cluster-monitoring-operator-tls podName:6d770808-d390-41c1-a9d9-fc12b99fa9a9 nodeName:}" failed. 
No retries permitted until 2026-03-08 00:21:56.692611592 +0000 UTC m=+33.005520559 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6d770808-d390-41c1-a9d9-fc12b99fa9a9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-cxs8s" (UID: "6d770808-d390-41c1-a9d9-fc12b99fa9a9") : secret "cluster-monitoring-operator-tls" not found Mar 08 00:21:40.692839 master-0 kubenswrapper[7479]: I0308 00:21:40.692659 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9" Mar 08 00:21:40.692839 master-0 kubenswrapper[7479]: I0308 00:21:40.692718 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-webhook-certs\") pod \"multus-admission-controller-8d675b596-jgdmb\" (UID: \"d7a0bdcc-92f5-41e6-ab47-ee48a5788bac\") " pod="openshift-multus/multus-admission-controller-8d675b596-jgdmb" Mar 08 00:21:40.692839 master-0 kubenswrapper[7479]: I0308 00:21:40.692747 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs\") pod \"network-metrics-daemon-krv7c\" (UID: \"815fd565-0609-4d8f-ac05-8656f198b008\") " pod="openshift-multus/network-metrics-daemon-krv7c" Mar 08 00:21:40.693110 master-0 kubenswrapper[7479]: E0308 00:21:40.692866 7479 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 08 00:21:40.693110 master-0 
kubenswrapper[7479]: E0308 00:21:40.692897 7479 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 08 00:21:40.693110 master-0 kubenswrapper[7479]: E0308 00:21:40.692956 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-webhook-certs podName:d7a0bdcc-92f5-41e6-ab47-ee48a5788bac nodeName:}" failed. No retries permitted until 2026-03-08 00:21:56.692934743 +0000 UTC m=+33.005843670 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-webhook-certs") pod "multus-admission-controller-8d675b596-jgdmb" (UID: "d7a0bdcc-92f5-41e6-ab47-ee48a5788bac") : secret "multus-admission-controller-secret" not found Mar 08 00:21:40.693110 master-0 kubenswrapper[7479]: E0308 00:21:40.692976 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs podName:815fd565-0609-4d8f-ac05-8656f198b008 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:56.692967204 +0000 UTC m=+33.005876131 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs") pod "network-metrics-daemon-krv7c" (UID: "815fd565-0609-4d8f-ac05-8656f198b008") : secret "metrics-daemon-secret" not found Mar 08 00:21:40.693110 master-0 kubenswrapper[7479]: I0308 00:21:40.693002 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b94acad3-cf4e-443d-80fb-5e68a4074336-srv-cert\") pod \"catalog-operator-7d9c49f57b-8jr6f\" (UID: \"b94acad3-cf4e-443d-80fb-5e68a4074336\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-8jr6f" Mar 08 00:21:40.693110 master-0 kubenswrapper[7479]: I0308 00:21:40.693070 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-phgxj\" (UID: \"8f71fd39-a16b-47d2-b781-c8ce37bcb9b2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj" Mar 08 00:21:40.693364 master-0 kubenswrapper[7479]: E0308 00:21:40.693145 7479 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 08 00:21:40.693364 master-0 kubenswrapper[7479]: E0308 00:21:40.693153 7479 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 08 00:21:40.693364 master-0 kubenswrapper[7479]: E0308 00:21:40.693239 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b94acad3-cf4e-443d-80fb-5e68a4074336-srv-cert podName:b94acad3-cf4e-443d-80fb-5e68a4074336 nodeName:}" failed. 
No retries permitted until 2026-03-08 00:21:56.693196741 +0000 UTC m=+33.006105658 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/b94acad3-cf4e-443d-80fb-5e68a4074336-srv-cert") pod "catalog-operator-7d9c49f57b-8jr6f" (UID: "b94acad3-cf4e-443d-80fb-5e68a4074336") : secret "catalog-operator-serving-cert" not found Mar 08 00:21:40.693364 master-0 kubenswrapper[7479]: E0308 00:21:40.693255 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-package-server-manager-serving-cert podName:8f71fd39-a16b-47d2-b781-c8ce37bcb9b2 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:56.693249233 +0000 UTC m=+33.006158150 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-phgxj" (UID: "8f71fd39-a16b-47d2-b781-c8ce37bcb9b2") : secret "package-server-manager-serving-cert" not found Mar 08 00:21:40.695723 master-0 kubenswrapper[7479]: I0308 00:21:40.695672 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9" Mar 08 00:21:40.695996 master-0 kubenswrapper[7479]: I0308 00:21:40.695959 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " 
pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9" Mar 08 00:21:40.751179 master-0 kubenswrapper[7479]: I0308 00:21:40.751126 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x" Mar 08 00:21:40.752900 master-0 kubenswrapper[7479]: I0308 00:21:40.751888 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2" Mar 08 00:21:40.752900 master-0 kubenswrapper[7479]: I0308 00:21:40.752455 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-589895fbb7-gmvnl" Mar 08 00:21:40.754477 master-0 kubenswrapper[7479]: I0308 00:21:40.754429 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq" Mar 08 00:21:40.784997 master-0 kubenswrapper[7479]: I0308 00:21:40.784941 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9" Mar 08 00:21:40.996441 master-0 kubenswrapper[7479]: I0308 00:21:40.996391 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6499204-e7f3-45b9-86b0-57fdf35b96a9-serving-cert\") pod \"route-controller-manager-56f6fc54fd-nwfzl\" (UID: \"f6499204-e7f3-45b9-86b0-57fdf35b96a9\") " pod="openshift-route-controller-manager/route-controller-manager-56f6fc54fd-nwfzl" Mar 08 00:21:40.996691 master-0 kubenswrapper[7479]: I0308 00:21:40.996444 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f6499204-e7f3-45b9-86b0-57fdf35b96a9-client-ca\") pod \"route-controller-manager-56f6fc54fd-nwfzl\" (UID: \"f6499204-e7f3-45b9-86b0-57fdf35b96a9\") " pod="openshift-route-controller-manager/route-controller-manager-56f6fc54fd-nwfzl" Mar 08 00:21:40.996691 master-0 kubenswrapper[7479]: E0308 00:21:40.996608 7479 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 08 00:21:40.996691 master-0 kubenswrapper[7479]: E0308 00:21:40.996684 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6499204-e7f3-45b9-86b0-57fdf35b96a9-serving-cert podName:f6499204-e7f3-45b9-86b0-57fdf35b96a9 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:48.996664557 +0000 UTC m=+25.309573574 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/f6499204-e7f3-45b9-86b0-57fdf35b96a9-serving-cert") pod "route-controller-manager-56f6fc54fd-nwfzl" (UID: "f6499204-e7f3-45b9-86b0-57fdf35b96a9") : secret "serving-cert" not found Mar 08 00:21:40.996860 master-0 kubenswrapper[7479]: E0308 00:21:40.996703 7479 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 08 00:21:40.996860 master-0 kubenswrapper[7479]: E0308 00:21:40.996742 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f6499204-e7f3-45b9-86b0-57fdf35b96a9-client-ca podName:f6499204-e7f3-45b9-86b0-57fdf35b96a9 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:48.996729829 +0000 UTC m=+25.309638736 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/f6499204-e7f3-45b9-86b0-57fdf35b96a9-client-ca") pod "route-controller-manager-56f6fc54fd-nwfzl" (UID: "f6499204-e7f3-45b9-86b0-57fdf35b96a9") : configmap "client-ca" not found Mar 08 00:21:41.116886 master-0 kubenswrapper[7479]: W0308 00:21:41.116835 7479 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32c19760_2cb2_4690_be8e_cba3c517c60e.slice/crio-cbe80ab488a27b71936b88f11fbebbeb1bad4f97f15ed93df41d4a1b48940bdd WatchSource:0}: Error finding container cbe80ab488a27b71936b88f11fbebbeb1bad4f97f15ed93df41d4a1b48940bdd: Status 404 returned error can't find the container with id cbe80ab488a27b71936b88f11fbebbeb1bad4f97f15ed93df41d4a1b48940bdd Mar 08 00:21:41.350100 master-0 kubenswrapper[7479]: I0308 00:21:41.347364 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7775b8f858-tgbrj"] Mar 08 00:21:41.350100 master-0 kubenswrapper[7479]: E0308 00:21:41.347898 7479 pod_workers.go:1301] "Error syncing pod, 
skipping" err="unmounted volumes=[client-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-7775b8f858-tgbrj" podUID="5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a" Mar 08 00:21:41.474229 master-0 kubenswrapper[7479]: I0308 00:21:41.473587 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2"] Mar 08 00:21:41.595866 master-0 kubenswrapper[7479]: I0308 00:21:41.593734 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-677db989d6-blw5x"] Mar 08 00:21:41.651148 master-0 kubenswrapper[7479]: I0308 00:21:41.650255 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9"] Mar 08 00:21:41.651148 master-0 kubenswrapper[7479]: I0308 00:21:41.650530 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-589895fbb7-gmvnl"] Mar 08 00:21:41.669676 master-0 kubenswrapper[7479]: W0308 00:21:41.669640 7479 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03f4bafb_c270_428a_bacf_8a424b3d1a05.slice/crio-4297b6122cd668a28e80b28ce2f18556120772700fd7e586762ab1c6f70eea07 WatchSource:0}: Error finding container 4297b6122cd668a28e80b28ce2f18556120772700fd7e586762ab1c6f70eea07: Status 404 returned error can't find the container with id 4297b6122cd668a28e80b28ce2f18556120772700fd7e586762ab1c6f70eea07 Mar 08 00:21:41.705234 master-0 kubenswrapper[7479]: I0308 00:21:41.703311 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-65677d845c-495g9"] Mar 08 00:21:41.705234 master-0 kubenswrapper[7479]: I0308 00:21:41.704034 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:41.714232 master-0 kubenswrapper[7479]: I0308 00:21:41.711043 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-0" Mar 08 00:21:41.714232 master-0 kubenswrapper[7479]: I0308 00:21:41.711355 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 08 00:21:41.714232 master-0 kubenswrapper[7479]: I0308 00:21:41.711502 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 08 00:21:41.714232 master-0 kubenswrapper[7479]: I0308 00:21:41.711647 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 08 00:21:41.714232 master-0 kubenswrapper[7479]: I0308 00:21:41.711786 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 08 00:21:41.714232 master-0 kubenswrapper[7479]: I0308 00:21:41.711973 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 08 00:21:41.714232 master-0 kubenswrapper[7479]: I0308 00:21:41.712091 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-0" Mar 08 00:21:41.714232 master-0 kubenswrapper[7479]: I0308 00:21:41.712221 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 08 00:21:41.727228 master-0 kubenswrapper[7479]: I0308 00:21:41.717033 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 08 00:21:41.727228 master-0 kubenswrapper[7479]: I0308 00:21:41.717976 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 08 00:21:41.731978 master-0 kubenswrapper[7479]: I0308 00:21:41.731854 7479 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-65677d845c-495g9"] Mar 08 00:21:41.811133 master-0 kubenswrapper[7479]: I0308 00:21:41.807062 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0bf4911c-104d-418f-b42d-3e2db0ef25eb-etcd-serving-ca\") pod \"apiserver-65677d845c-495g9\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:41.811133 master-0 kubenswrapper[7479]: I0308 00:21:41.807120 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bf4911c-104d-418f-b42d-3e2db0ef25eb-config\") pod \"apiserver-65677d845c-495g9\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:41.811133 master-0 kubenswrapper[7479]: I0308 00:21:41.807154 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0bf4911c-104d-418f-b42d-3e2db0ef25eb-node-pullsecrets\") pod \"apiserver-65677d845c-495g9\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:41.811133 master-0 kubenswrapper[7479]: I0308 00:21:41.807331 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0bf4911c-104d-418f-b42d-3e2db0ef25eb-encryption-config\") pod \"apiserver-65677d845c-495g9\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:41.811133 master-0 kubenswrapper[7479]: I0308 00:21:41.807395 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-zsgch\" (UniqueName: \"kubernetes.io/projected/0bf4911c-104d-418f-b42d-3e2db0ef25eb-kube-api-access-zsgch\") pod \"apiserver-65677d845c-495g9\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:41.811133 master-0 kubenswrapper[7479]: I0308 00:21:41.807430 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/0bf4911c-104d-418f-b42d-3e2db0ef25eb-image-import-ca\") pod \"apiserver-65677d845c-495g9\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:41.811133 master-0 kubenswrapper[7479]: I0308 00:21:41.807453 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bf4911c-104d-418f-b42d-3e2db0ef25eb-trusted-ca-bundle\") pod \"apiserver-65677d845c-495g9\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:41.811133 master-0 kubenswrapper[7479]: I0308 00:21:41.807469 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0bf4911c-104d-418f-b42d-3e2db0ef25eb-audit-dir\") pod \"apiserver-65677d845c-495g9\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:41.811133 master-0 kubenswrapper[7479]: I0308 00:21:41.807512 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bf4911c-104d-418f-b42d-3e2db0ef25eb-serving-cert\") pod \"apiserver-65677d845c-495g9\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:41.811133 master-0 
kubenswrapper[7479]: I0308 00:21:41.807555 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0bf4911c-104d-418f-b42d-3e2db0ef25eb-etcd-client\") pod \"apiserver-65677d845c-495g9\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:41.811133 master-0 kubenswrapper[7479]: I0308 00:21:41.807579 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/0bf4911c-104d-418f-b42d-3e2db0ef25eb-audit\") pod \"apiserver-65677d845c-495g9\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:41.912530 master-0 kubenswrapper[7479]: I0308 00:21:41.909500 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0bf4911c-104d-418f-b42d-3e2db0ef25eb-node-pullsecrets\") pod \"apiserver-65677d845c-495g9\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:41.912530 master-0 kubenswrapper[7479]: I0308 00:21:41.909575 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0bf4911c-104d-418f-b42d-3e2db0ef25eb-encryption-config\") pod \"apiserver-65677d845c-495g9\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:41.912530 master-0 kubenswrapper[7479]: I0308 00:21:41.909598 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsgch\" (UniqueName: \"kubernetes.io/projected/0bf4911c-104d-418f-b42d-3e2db0ef25eb-kube-api-access-zsgch\") pod \"apiserver-65677d845c-495g9\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " 
pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:41.912530 master-0 kubenswrapper[7479]: I0308 00:21:41.909631 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/0bf4911c-104d-418f-b42d-3e2db0ef25eb-image-import-ca\") pod \"apiserver-65677d845c-495g9\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:41.912530 master-0 kubenswrapper[7479]: I0308 00:21:41.909651 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bf4911c-104d-418f-b42d-3e2db0ef25eb-trusted-ca-bundle\") pod \"apiserver-65677d845c-495g9\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:41.912530 master-0 kubenswrapper[7479]: I0308 00:21:41.909666 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0bf4911c-104d-418f-b42d-3e2db0ef25eb-audit-dir\") pod \"apiserver-65677d845c-495g9\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:41.912530 master-0 kubenswrapper[7479]: I0308 00:21:41.909690 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bf4911c-104d-418f-b42d-3e2db0ef25eb-serving-cert\") pod \"apiserver-65677d845c-495g9\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:41.912530 master-0 kubenswrapper[7479]: I0308 00:21:41.909723 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0bf4911c-104d-418f-b42d-3e2db0ef25eb-etcd-client\") pod \"apiserver-65677d845c-495g9\" (UID: 
\"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:41.912530 master-0 kubenswrapper[7479]: I0308 00:21:41.909737 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/0bf4911c-104d-418f-b42d-3e2db0ef25eb-audit\") pod \"apiserver-65677d845c-495g9\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:41.912530 master-0 kubenswrapper[7479]: I0308 00:21:41.909755 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0bf4911c-104d-418f-b42d-3e2db0ef25eb-etcd-serving-ca\") pod \"apiserver-65677d845c-495g9\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:41.912530 master-0 kubenswrapper[7479]: I0308 00:21:41.909771 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bf4911c-104d-418f-b42d-3e2db0ef25eb-config\") pod \"apiserver-65677d845c-495g9\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:41.912530 master-0 kubenswrapper[7479]: I0308 00:21:41.910551 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bf4911c-104d-418f-b42d-3e2db0ef25eb-config\") pod \"apiserver-65677d845c-495g9\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:41.912530 master-0 kubenswrapper[7479]: I0308 00:21:41.911578 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bf4911c-104d-418f-b42d-3e2db0ef25eb-trusted-ca-bundle\") pod \"apiserver-65677d845c-495g9\" (UID: 
\"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:41.912530 master-0 kubenswrapper[7479]: I0308 00:21:41.911622 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0bf4911c-104d-418f-b42d-3e2db0ef25eb-node-pullsecrets\") pod \"apiserver-65677d845c-495g9\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:41.913486 master-0 kubenswrapper[7479]: E0308 00:21:41.913457 7479 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Mar 08 00:21:41.913538 master-0 kubenswrapper[7479]: E0308 00:21:41.913515 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0bf4911c-104d-418f-b42d-3e2db0ef25eb-audit podName:0bf4911c-104d-418f-b42d-3e2db0ef25eb nodeName:}" failed. No retries permitted until 2026-03-08 00:21:42.413502285 +0000 UTC m=+18.726411202 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/0bf4911c-104d-418f-b42d-3e2db0ef25eb-audit") pod "apiserver-65677d845c-495g9" (UID: "0bf4911c-104d-418f-b42d-3e2db0ef25eb") : configmap "audit-0" not found Mar 08 00:21:41.913932 master-0 kubenswrapper[7479]: I0308 00:21:41.913905 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0bf4911c-104d-418f-b42d-3e2db0ef25eb-etcd-serving-ca\") pod \"apiserver-65677d845c-495g9\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:41.913974 master-0 kubenswrapper[7479]: I0308 00:21:41.913955 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0bf4911c-104d-418f-b42d-3e2db0ef25eb-audit-dir\") pod \"apiserver-65677d845c-495g9\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:41.914288 master-0 kubenswrapper[7479]: I0308 00:21:41.914249 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/0bf4911c-104d-418f-b42d-3e2db0ef25eb-image-import-ca\") pod \"apiserver-65677d845c-495g9\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:41.918095 master-0 kubenswrapper[7479]: E0308 00:21:41.914425 7479 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Mar 08 00:21:41.918095 master-0 kubenswrapper[7479]: E0308 00:21:41.914568 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0bf4911c-104d-418f-b42d-3e2db0ef25eb-serving-cert podName:0bf4911c-104d-418f-b42d-3e2db0ef25eb nodeName:}" failed. 
No retries permitted until 2026-03-08 00:21:42.414534289 +0000 UTC m=+18.727443206 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/0bf4911c-104d-418f-b42d-3e2db0ef25eb-serving-cert") pod "apiserver-65677d845c-495g9" (UID: "0bf4911c-104d-418f-b42d-3e2db0ef25eb") : secret "serving-cert" not found Mar 08 00:21:41.920728 master-0 kubenswrapper[7479]: I0308 00:21:41.919102 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0bf4911c-104d-418f-b42d-3e2db0ef25eb-encryption-config\") pod \"apiserver-65677d845c-495g9\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:41.920728 master-0 kubenswrapper[7479]: I0308 00:21:41.920561 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0bf4911c-104d-418f-b42d-3e2db0ef25eb-etcd-client\") pod \"apiserver-65677d845c-495g9\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:41.950421 master-0 kubenswrapper[7479]: I0308 00:21:41.949909 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsgch\" (UniqueName: \"kubernetes.io/projected/0bf4911c-104d-418f-b42d-3e2db0ef25eb-kube-api-access-zsgch\") pod \"apiserver-65677d845c-495g9\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:42.119890 master-0 kubenswrapper[7479]: I0308 00:21:42.119829 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2" event={"ID":"6999cf38-e317-4727-98c9-d4e348e9e16a","Type":"ContainerStarted","Data":"11fc2d0ea92ac8231758b019e771de66de17673da31d79a4aab6fc0b796373e6"} Mar 08 00:21:42.120825 master-0 kubenswrapper[7479]: I0308 
00:21:42.120792 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x" event={"ID":"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b","Type":"ContainerStarted","Data":"9da3ea5c4393051eef91cb7af969405949bc3c6b97f5782d6bc10af29a80c30d"} Mar 08 00:21:42.122104 master-0 kubenswrapper[7479]: I0308 00:21:42.121851 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-589895fbb7-gmvnl" event={"ID":"03f4bafb-c270-428a-bacf-8a424b3d1a05","Type":"ContainerStarted","Data":"4297b6122cd668a28e80b28ce2f18556120772700fd7e586762ab1c6f70eea07"} Mar 08 00:21:42.122960 master-0 kubenswrapper[7479]: I0308 00:21:42.122922 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq" event={"ID":"32c19760-2cb2-4690-be8e-cba3c517c60e","Type":"ContainerStarted","Data":"cbe80ab488a27b71936b88f11fbebbeb1bad4f97f15ed93df41d4a1b48940bdd"} Mar 08 00:21:42.124066 master-0 kubenswrapper[7479]: I0308 00:21:42.124030 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9" event={"ID":"1abf904b-0b8d-4d61-8231-0e8d00933192","Type":"ContainerStarted","Data":"f0660a52e90ffa7a2326892a3e2cda1d66d0d4aba0e60527ee906109c288f588"} Mar 08 00:21:42.124124 master-0 kubenswrapper[7479]: I0308 00:21:42.124083 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7775b8f858-tgbrj" Mar 08 00:21:42.130428 master-0 kubenswrapper[7479]: I0308 00:21:42.130379 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7775b8f858-tgbrj" Mar 08 00:21:42.214105 master-0 kubenswrapper[7479]: I0308 00:21:42.213470 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjn9j\" (UniqueName: \"kubernetes.io/projected/5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a-kube-api-access-zjn9j\") pod \"5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a\" (UID: \"5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a\") " Mar 08 00:21:42.214105 master-0 kubenswrapper[7479]: I0308 00:21:42.213510 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a-config\") pod \"5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a\" (UID: \"5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a\") " Mar 08 00:21:42.214105 master-0 kubenswrapper[7479]: I0308 00:21:42.213540 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a-serving-cert\") pod \"5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a\" (UID: \"5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a\") " Mar 08 00:21:42.214105 master-0 kubenswrapper[7479]: I0308 00:21:42.213570 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a-proxy-ca-bundles\") pod \"5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a\" (UID: \"5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a\") " Mar 08 00:21:42.214932 master-0 kubenswrapper[7479]: I0308 00:21:42.214614 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a" (UID: "5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:21:42.215101 master-0 kubenswrapper[7479]: I0308 00:21:42.215007 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a-config" (OuterVolumeSpecName: "config") pod "5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a" (UID: "5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:21:42.315309 master-0 kubenswrapper[7479]: I0308 00:21:42.315234 7479 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a-config\") on node \"master-0\" DevicePath \"\"" Mar 08 00:21:42.315309 master-0 kubenswrapper[7479]: I0308 00:21:42.315264 7479 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Mar 08 00:21:42.416377 master-0 kubenswrapper[7479]: I0308 00:21:42.416331 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bf4911c-104d-418f-b42d-3e2db0ef25eb-serving-cert\") pod \"apiserver-65677d845c-495g9\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:42.416461 master-0 kubenswrapper[7479]: I0308 00:21:42.416389 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/0bf4911c-104d-418f-b42d-3e2db0ef25eb-audit\") pod \"apiserver-65677d845c-495g9\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:42.416545 master-0 kubenswrapper[7479]: E0308 00:21:42.416478 7479 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap 
"audit-0" not found Mar 08 00:21:42.416545 master-0 kubenswrapper[7479]: E0308 00:21:42.416528 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0bf4911c-104d-418f-b42d-3e2db0ef25eb-audit podName:0bf4911c-104d-418f-b42d-3e2db0ef25eb nodeName:}" failed. No retries permitted until 2026-03-08 00:21:43.416512001 +0000 UTC m=+19.729420918 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/0bf4911c-104d-418f-b42d-3e2db0ef25eb-audit") pod "apiserver-65677d845c-495g9" (UID: "0bf4911c-104d-418f-b42d-3e2db0ef25eb") : configmap "audit-0" not found Mar 08 00:21:42.416901 master-0 kubenswrapper[7479]: E0308 00:21:42.416864 7479 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Mar 08 00:21:42.416901 master-0 kubenswrapper[7479]: E0308 00:21:42.416899 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0bf4911c-104d-418f-b42d-3e2db0ef25eb-serving-cert podName:0bf4911c-104d-418f-b42d-3e2db0ef25eb nodeName:}" failed. No retries permitted until 2026-03-08 00:21:43.416890653 +0000 UTC m=+19.729799680 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/0bf4911c-104d-418f-b42d-3e2db0ef25eb-serving-cert") pod "apiserver-65677d845c-495g9" (UID: "0bf4911c-104d-418f-b42d-3e2db0ef25eb") : secret "serving-cert" not found Mar 08 00:21:43.425288 master-0 kubenswrapper[7479]: I0308 00:21:43.425210 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bf4911c-104d-418f-b42d-3e2db0ef25eb-serving-cert\") pod \"apiserver-65677d845c-495g9\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:43.425288 master-0 kubenswrapper[7479]: I0308 00:21:43.425272 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/0bf4911c-104d-418f-b42d-3e2db0ef25eb-audit\") pod \"apiserver-65677d845c-495g9\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:43.426049 master-0 kubenswrapper[7479]: E0308 00:21:43.425410 7479 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Mar 08 00:21:43.426049 master-0 kubenswrapper[7479]: E0308 00:21:43.425457 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0bf4911c-104d-418f-b42d-3e2db0ef25eb-audit podName:0bf4911c-104d-418f-b42d-3e2db0ef25eb nodeName:}" failed. No retries permitted until 2026-03-08 00:21:45.425442387 +0000 UTC m=+21.738351304 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/0bf4911c-104d-418f-b42d-3e2db0ef25eb-audit") pod "apiserver-65677d845c-495g9" (UID: "0bf4911c-104d-418f-b42d-3e2db0ef25eb") : configmap "audit-0" not found Mar 08 00:21:43.426049 master-0 kubenswrapper[7479]: E0308 00:21:43.425529 7479 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Mar 08 00:21:43.426049 master-0 kubenswrapper[7479]: E0308 00:21:43.425552 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0bf4911c-104d-418f-b42d-3e2db0ef25eb-serving-cert podName:0bf4911c-104d-418f-b42d-3e2db0ef25eb nodeName:}" failed. No retries permitted until 2026-03-08 00:21:45.425543571 +0000 UTC m=+21.738452608 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/0bf4911c-104d-418f-b42d-3e2db0ef25eb-serving-cert") pod "apiserver-65677d845c-495g9" (UID: "0bf4911c-104d-418f-b42d-3e2db0ef25eb") : secret "serving-cert" not found Mar 08 00:21:43.712394 master-0 kubenswrapper[7479]: I0308 00:21:43.710908 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a" (UID: "5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:21:43.722690 master-0 kubenswrapper[7479]: I0308 00:21:43.722604 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a-kube-api-access-zjn9j" (OuterVolumeSpecName: "kube-api-access-zjn9j") pod "5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a" (UID: "5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a"). InnerVolumeSpecName "kube-api-access-zjn9j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:21:43.728615 master-0 kubenswrapper[7479]: I0308 00:21:43.728297 7479 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjn9j\" (UniqueName: \"kubernetes.io/projected/5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a-kube-api-access-zjn9j\") on node \"master-0\" DevicePath \"\"" Mar 08 00:21:43.728615 master-0 kubenswrapper[7479]: I0308 00:21:43.728334 7479 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 00:21:43.732663 master-0 kubenswrapper[7479]: I0308 00:21:43.732589 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7775b8f858-tgbrj" Mar 08 00:21:43.867118 master-0 kubenswrapper[7479]: I0308 00:21:43.865469 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-8597858f97-kb2l8"] Mar 08 00:21:43.888681 master-0 kubenswrapper[7479]: I0308 00:21:43.883027 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8597858f97-kb2l8" Mar 08 00:21:43.932555 master-0 kubenswrapper[7479]: I0308 00:21:43.932498 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 08 00:21:43.932765 master-0 kubenswrapper[7479]: I0308 00:21:43.932747 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 00:21:43.939280 master-0 kubenswrapper[7479]: I0308 00:21:43.933478 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 00:21:43.939280 master-0 kubenswrapper[7479]: I0308 00:21:43.933570 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 00:21:43.939280 master-0 kubenswrapper[7479]: I0308 00:21:43.933782 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 00:21:43.994291 master-0 kubenswrapper[7479]: I0308 00:21:43.994234 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 08 00:21:44.003122 master-0 kubenswrapper[7479]: I0308 00:21:44.002072 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7775b8f858-tgbrj"] Mar 08 00:21:44.003122 master-0 kubenswrapper[7479]: I0308 00:21:44.002122 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8597858f97-kb2l8"] Mar 08 00:21:44.003122 master-0 kubenswrapper[7479]: I0308 00:21:44.002136 7479 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7775b8f858-tgbrj"] Mar 08 00:21:44.003122 master-0 kubenswrapper[7479]: I0308 00:21:44.002150 7479 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 08 00:21:44.003122 master-0 kubenswrapper[7479]: I0308 00:21:44.002748 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 08 00:21:44.014250 master-0 kubenswrapper[7479]: I0308 00:21:44.008567 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Mar 08 00:21:44.014250 master-0 kubenswrapper[7479]: I0308 00:21:44.012550 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Mar 08 00:21:44.036099 master-0 kubenswrapper[7479]: I0308 00:21:44.036035 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5837befc-f6e9-4f74-ae39-d0aec977f0c9-serving-cert\") pod \"controller-manager-8597858f97-kb2l8\" (UID: \"5837befc-f6e9-4f74-ae39-d0aec977f0c9\") " pod="openshift-controller-manager/controller-manager-8597858f97-kb2l8" Mar 08 00:21:44.036431 master-0 kubenswrapper[7479]: I0308 00:21:44.036400 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5837befc-f6e9-4f74-ae39-d0aec977f0c9-client-ca\") pod \"controller-manager-8597858f97-kb2l8\" (UID: \"5837befc-f6e9-4f74-ae39-d0aec977f0c9\") " pod="openshift-controller-manager/controller-manager-8597858f97-kb2l8" Mar 08 00:21:44.036622 master-0 kubenswrapper[7479]: I0308 00:21:44.036441 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5837befc-f6e9-4f74-ae39-d0aec977f0c9-config\") pod \"controller-manager-8597858f97-kb2l8\" (UID: \"5837befc-f6e9-4f74-ae39-d0aec977f0c9\") " pod="openshift-controller-manager/controller-manager-8597858f97-kb2l8" Mar 08 00:21:44.036622 master-0 
kubenswrapper[7479]: I0308 00:21:44.036588 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5837befc-f6e9-4f74-ae39-d0aec977f0c9-proxy-ca-bundles\") pod \"controller-manager-8597858f97-kb2l8\" (UID: \"5837befc-f6e9-4f74-ae39-d0aec977f0c9\") " pod="openshift-controller-manager/controller-manager-8597858f97-kb2l8" Mar 08 00:21:44.036702 master-0 kubenswrapper[7479]: I0308 00:21:44.036627 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n88ts\" (UniqueName: \"kubernetes.io/projected/5837befc-f6e9-4f74-ae39-d0aec977f0c9-kube-api-access-n88ts\") pod \"controller-manager-8597858f97-kb2l8\" (UID: \"5837befc-f6e9-4f74-ae39-d0aec977f0c9\") " pod="openshift-controller-manager/controller-manager-8597858f97-kb2l8" Mar 08 00:21:44.139059 master-0 kubenswrapper[7479]: I0308 00:21:44.138983 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd9aee54-e935-4841-b135-9000e006b96e-kube-api-access\") pod \"installer-1-master-0\" (UID: \"cd9aee54-e935-4841-b135-9000e006b96e\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 08 00:21:44.139313 master-0 kubenswrapper[7479]: I0308 00:21:44.139074 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5837befc-f6e9-4f74-ae39-d0aec977f0c9-client-ca\") pod \"controller-manager-8597858f97-kb2l8\" (UID: \"5837befc-f6e9-4f74-ae39-d0aec977f0c9\") " pod="openshift-controller-manager/controller-manager-8597858f97-kb2l8" Mar 08 00:21:44.139313 master-0 kubenswrapper[7479]: I0308 00:21:44.139104 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5837befc-f6e9-4f74-ae39-d0aec977f0c9-config\") pod 
\"controller-manager-8597858f97-kb2l8\" (UID: \"5837befc-f6e9-4f74-ae39-d0aec977f0c9\") " pod="openshift-controller-manager/controller-manager-8597858f97-kb2l8" Mar 08 00:21:44.140278 master-0 kubenswrapper[7479]: I0308 00:21:44.140232 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5837befc-f6e9-4f74-ae39-d0aec977f0c9-proxy-ca-bundles\") pod \"controller-manager-8597858f97-kb2l8\" (UID: \"5837befc-f6e9-4f74-ae39-d0aec977f0c9\") " pod="openshift-controller-manager/controller-manager-8597858f97-kb2l8" Mar 08 00:21:44.140391 master-0 kubenswrapper[7479]: I0308 00:21:44.140279 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n88ts\" (UniqueName: \"kubernetes.io/projected/5837befc-f6e9-4f74-ae39-d0aec977f0c9-kube-api-access-n88ts\") pod \"controller-manager-8597858f97-kb2l8\" (UID: \"5837befc-f6e9-4f74-ae39-d0aec977f0c9\") " pod="openshift-controller-manager/controller-manager-8597858f97-kb2l8" Mar 08 00:21:44.140391 master-0 kubenswrapper[7479]: I0308 00:21:44.140301 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cd9aee54-e935-4841-b135-9000e006b96e-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"cd9aee54-e935-4841-b135-9000e006b96e\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 08 00:21:44.140464 master-0 kubenswrapper[7479]: I0308 00:21:44.140404 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cd9aee54-e935-4841-b135-9000e006b96e-var-lock\") pod \"installer-1-master-0\" (UID: \"cd9aee54-e935-4841-b135-9000e006b96e\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 08 00:21:44.140464 master-0 kubenswrapper[7479]: I0308 00:21:44.140429 7479 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5837befc-f6e9-4f74-ae39-d0aec977f0c9-serving-cert\") pod \"controller-manager-8597858f97-kb2l8\" (UID: \"5837befc-f6e9-4f74-ae39-d0aec977f0c9\") " pod="openshift-controller-manager/controller-manager-8597858f97-kb2l8" Mar 08 00:21:44.140542 master-0 kubenswrapper[7479]: I0308 00:21:44.140477 7479 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 00:21:44.141649 master-0 kubenswrapper[7479]: E0308 00:21:44.141436 7479 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 08 00:21:44.141649 master-0 kubenswrapper[7479]: E0308 00:21:44.141556 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5837befc-f6e9-4f74-ae39-d0aec977f0c9-client-ca podName:5837befc-f6e9-4f74-ae39-d0aec977f0c9 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:44.641515836 +0000 UTC m=+20.954424753 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/5837befc-f6e9-4f74-ae39-d0aec977f0c9-client-ca") pod "controller-manager-8597858f97-kb2l8" (UID: "5837befc-f6e9-4f74-ae39-d0aec977f0c9") : configmap "client-ca" not found Mar 08 00:21:44.143193 master-0 kubenswrapper[7479]: I0308 00:21:44.143134 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5837befc-f6e9-4f74-ae39-d0aec977f0c9-config\") pod \"controller-manager-8597858f97-kb2l8\" (UID: \"5837befc-f6e9-4f74-ae39-d0aec977f0c9\") " pod="openshift-controller-manager/controller-manager-8597858f97-kb2l8" Mar 08 00:21:44.143626 master-0 kubenswrapper[7479]: I0308 00:21:44.143590 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5837befc-f6e9-4f74-ae39-d0aec977f0c9-proxy-ca-bundles\") pod \"controller-manager-8597858f97-kb2l8\" (UID: \"5837befc-f6e9-4f74-ae39-d0aec977f0c9\") " pod="openshift-controller-manager/controller-manager-8597858f97-kb2l8" Mar 08 00:21:44.157964 master-0 kubenswrapper[7479]: I0308 00:21:44.157931 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5837befc-f6e9-4f74-ae39-d0aec977f0c9-serving-cert\") pod \"controller-manager-8597858f97-kb2l8\" (UID: \"5837befc-f6e9-4f74-ae39-d0aec977f0c9\") " pod="openshift-controller-manager/controller-manager-8597858f97-kb2l8" Mar 08 00:21:44.163539 master-0 kubenswrapper[7479]: I0308 00:21:44.163471 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n88ts\" (UniqueName: \"kubernetes.io/projected/5837befc-f6e9-4f74-ae39-d0aec977f0c9-kube-api-access-n88ts\") pod \"controller-manager-8597858f97-kb2l8\" (UID: \"5837befc-f6e9-4f74-ae39-d0aec977f0c9\") " pod="openshift-controller-manager/controller-manager-8597858f97-kb2l8" Mar 08 00:21:44.241828 master-0 
kubenswrapper[7479]: I0308 00:21:44.241670 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cd9aee54-e935-4841-b135-9000e006b96e-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"cd9aee54-e935-4841-b135-9000e006b96e\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 08 00:21:44.241828 master-0 kubenswrapper[7479]: I0308 00:21:44.241730 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cd9aee54-e935-4841-b135-9000e006b96e-var-lock\") pod \"installer-1-master-0\" (UID: \"cd9aee54-e935-4841-b135-9000e006b96e\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 08 00:21:44.242285 master-0 kubenswrapper[7479]: I0308 00:21:44.241827 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cd9aee54-e935-4841-b135-9000e006b96e-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"cd9aee54-e935-4841-b135-9000e006b96e\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 08 00:21:44.242285 master-0 kubenswrapper[7479]: I0308 00:21:44.242105 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd9aee54-e935-4841-b135-9000e006b96e-kube-api-access\") pod \"installer-1-master-0\" (UID: \"cd9aee54-e935-4841-b135-9000e006b96e\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 08 00:21:44.243028 master-0 kubenswrapper[7479]: I0308 00:21:44.242288 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cd9aee54-e935-4841-b135-9000e006b96e-var-lock\") pod \"installer-1-master-0\" (UID: \"cd9aee54-e935-4841-b135-9000e006b96e\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 08 00:21:44.651870 master-0 kubenswrapper[7479]: I0308 00:21:44.651556 7479 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5837befc-f6e9-4f74-ae39-d0aec977f0c9-client-ca\") pod \"controller-manager-8597858f97-kb2l8\" (UID: \"5837befc-f6e9-4f74-ae39-d0aec977f0c9\") " pod="openshift-controller-manager/controller-manager-8597858f97-kb2l8" Mar 08 00:21:44.652495 master-0 kubenswrapper[7479]: E0308 00:21:44.651683 7479 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 08 00:21:44.652495 master-0 kubenswrapper[7479]: E0308 00:21:44.652002 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5837befc-f6e9-4f74-ae39-d0aec977f0c9-client-ca podName:5837befc-f6e9-4f74-ae39-d0aec977f0c9 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:45.651984226 +0000 UTC m=+21.964893133 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/5837befc-f6e9-4f74-ae39-d0aec977f0c9-client-ca") pod "controller-manager-8597858f97-kb2l8" (UID: "5837befc-f6e9-4f74-ae39-d0aec977f0c9") : configmap "client-ca" not found Mar 08 00:21:44.738491 master-0 kubenswrapper[7479]: I0308 00:21:44.738270 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd9aee54-e935-4841-b135-9000e006b96e-kube-api-access\") pod \"installer-1-master-0\" (UID: \"cd9aee54-e935-4841-b135-9000e006b96e\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 08 00:21:44.742906 master-0 kubenswrapper[7479]: I0308 00:21:44.742872 7479 generic.go:334] "Generic (PLEG): container finished" podID="db164b32-e20e-4d07-a9ae-98720321621d" containerID="7c4e1b361ff558ca25f7a79150dde84f1533aa652ade34de4925ff4983cee4b2" exitCode=0 Mar 08 00:21:44.742996 master-0 kubenswrapper[7479]: I0308 00:21:44.742907 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-r9zcq" event={"ID":"db164b32-e20e-4d07-a9ae-98720321621d","Type":"ContainerDied","Data":"7c4e1b361ff558ca25f7a79150dde84f1533aa652ade34de4925ff4983cee4b2"} Mar 08 00:21:44.744353 master-0 kubenswrapper[7479]: I0308 00:21:44.743492 7479 scope.go:117] "RemoveContainer" containerID="7c4e1b361ff558ca25f7a79150dde84f1533aa652ade34de4925ff4983cee4b2" Mar 08 00:21:44.969006 master-0 kubenswrapper[7479]: I0308 00:21:44.968951 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Mar 08 00:21:45.466501 master-0 kubenswrapper[7479]: I0308 00:21:45.466157 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bf4911c-104d-418f-b42d-3e2db0ef25eb-serving-cert\") pod \"apiserver-65677d845c-495g9\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:45.466716 master-0 kubenswrapper[7479]: I0308 00:21:45.466521 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/0bf4911c-104d-418f-b42d-3e2db0ef25eb-audit\") pod \"apiserver-65677d845c-495g9\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:45.466716 master-0 kubenswrapper[7479]: E0308 00:21:45.466684 7479 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Mar 08 00:21:45.466807 master-0 kubenswrapper[7479]: E0308 00:21:45.466738 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0bf4911c-104d-418f-b42d-3e2db0ef25eb-audit podName:0bf4911c-104d-418f-b42d-3e2db0ef25eb nodeName:}" failed. No retries permitted until 2026-03-08 00:21:49.466718998 +0000 UTC m=+25.779627915 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/0bf4911c-104d-418f-b42d-3e2db0ef25eb-audit") pod "apiserver-65677d845c-495g9" (UID: "0bf4911c-104d-418f-b42d-3e2db0ef25eb") : configmap "audit-0" not found Mar 08 00:21:45.466858 master-0 kubenswrapper[7479]: E0308 00:21:45.466816 7479 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Mar 08 00:21:45.466858 master-0 kubenswrapper[7479]: E0308 00:21:45.466844 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0bf4911c-104d-418f-b42d-3e2db0ef25eb-serving-cert podName:0bf4911c-104d-418f-b42d-3e2db0ef25eb nodeName:}" failed. No retries permitted until 2026-03-08 00:21:49.466835381 +0000 UTC m=+25.779744298 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/0bf4911c-104d-418f-b42d-3e2db0ef25eb-serving-cert") pod "apiserver-65677d845c-495g9" (UID: "0bf4911c-104d-418f-b42d-3e2db0ef25eb") : secret "serving-cert" not found Mar 08 00:21:45.668356 master-0 kubenswrapper[7479]: I0308 00:21:45.667788 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5837befc-f6e9-4f74-ae39-d0aec977f0c9-client-ca\") pod \"controller-manager-8597858f97-kb2l8\" (UID: \"5837befc-f6e9-4f74-ae39-d0aec977f0c9\") " pod="openshift-controller-manager/controller-manager-8597858f97-kb2l8" Mar 08 00:21:45.668356 master-0 kubenswrapper[7479]: E0308 00:21:45.668015 7479 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 08 00:21:45.668356 master-0 kubenswrapper[7479]: E0308 00:21:45.668067 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5837befc-f6e9-4f74-ae39-d0aec977f0c9-client-ca podName:5837befc-f6e9-4f74-ae39-d0aec977f0c9 nodeName:}" failed. 
No retries permitted until 2026-03-08 00:21:47.668049956 +0000 UTC m=+23.980958883 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/5837befc-f6e9-4f74-ae39-d0aec977f0c9-client-ca") pod "controller-manager-8597858f97-kb2l8" (UID: "5837befc-f6e9-4f74-ae39-d0aec977f0c9") : configmap "client-ca" not found Mar 08 00:21:45.753898 master-0 kubenswrapper[7479]: I0308 00:21:45.753842 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-r9zcq" event={"ID":"db164b32-e20e-4d07-a9ae-98720321621d","Type":"ContainerStarted","Data":"dc658077d52293b3c4b33ff4dc755cf2b234d7c6150c15f85d599f2e125c3427"} Mar 08 00:21:45.890283 master-0 kubenswrapper[7479]: I0308 00:21:45.890221 7479 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a" path="/var/lib/kubelet/pods/5aacd0e3-eaa0-44b1-ba63-11e6aeb39a0a/volumes" Mar 08 00:21:46.579031 master-0 kubenswrapper[7479]: I0308 00:21:46.577267 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-1-master-0"] Mar 08 00:21:46.579031 master-0 kubenswrapper[7479]: I0308 00:21:46.577869 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 08 00:21:46.579031 master-0 kubenswrapper[7479]: I0308 00:21:46.577973 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 08 00:21:46.584062 master-0 kubenswrapper[7479]: I0308 00:21:46.584036 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd"/"kube-root-ca.crt" Mar 08 00:21:46.678222 master-0 kubenswrapper[7479]: I0308 00:21:46.678145 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4217b755-ca87-45cf-9e52-7b2681660f41-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"4217b755-ca87-45cf-9e52-7b2681660f41\") " pod="openshift-etcd/installer-1-master-0" Mar 08 00:21:46.678808 master-0 kubenswrapper[7479]: I0308 00:21:46.678234 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4217b755-ca87-45cf-9e52-7b2681660f41-kube-api-access\") pod \"installer-1-master-0\" (UID: \"4217b755-ca87-45cf-9e52-7b2681660f41\") " pod="openshift-etcd/installer-1-master-0" Mar 08 00:21:46.678808 master-0 kubenswrapper[7479]: I0308 00:21:46.678394 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4217b755-ca87-45cf-9e52-7b2681660f41-var-lock\") pod \"installer-1-master-0\" (UID: \"4217b755-ca87-45cf-9e52-7b2681660f41\") " pod="openshift-etcd/installer-1-master-0" Mar 08 00:21:46.779423 master-0 kubenswrapper[7479]: I0308 00:21:46.779326 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4217b755-ca87-45cf-9e52-7b2681660f41-var-lock\") pod \"installer-1-master-0\" (UID: \"4217b755-ca87-45cf-9e52-7b2681660f41\") " pod="openshift-etcd/installer-1-master-0" Mar 08 00:21:46.779684 master-0 kubenswrapper[7479]: I0308 00:21:46.779460 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4217b755-ca87-45cf-9e52-7b2681660f41-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"4217b755-ca87-45cf-9e52-7b2681660f41\") " pod="openshift-etcd/installer-1-master-0" Mar 08 00:21:46.779684 master-0 kubenswrapper[7479]: I0308 00:21:46.779508 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4217b755-ca87-45cf-9e52-7b2681660f41-kube-api-access\") pod \"installer-1-master-0\" (UID: \"4217b755-ca87-45cf-9e52-7b2681660f41\") " pod="openshift-etcd/installer-1-master-0" Mar 08 00:21:46.779924 master-0 kubenswrapper[7479]: I0308 00:21:46.779879 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4217b755-ca87-45cf-9e52-7b2681660f41-var-lock\") pod \"installer-1-master-0\" (UID: \"4217b755-ca87-45cf-9e52-7b2681660f41\") " pod="openshift-etcd/installer-1-master-0" Mar 08 00:21:46.779974 master-0 kubenswrapper[7479]: I0308 00:21:46.779914 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4217b755-ca87-45cf-9e52-7b2681660f41-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"4217b755-ca87-45cf-9e52-7b2681660f41\") " pod="openshift-etcd/installer-1-master-0" Mar 08 00:21:47.293229 master-0 kubenswrapper[7479]: I0308 00:21:47.291347 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-1-master-0"] Mar 08 00:21:47.690123 master-0 kubenswrapper[7479]: I0308 00:21:47.689889 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5837befc-f6e9-4f74-ae39-d0aec977f0c9-client-ca\") pod \"controller-manager-8597858f97-kb2l8\" (UID: \"5837befc-f6e9-4f74-ae39-d0aec977f0c9\") " pod="openshift-controller-manager/controller-manager-8597858f97-kb2l8" Mar 08 00:21:47.690123 master-0 
kubenswrapper[7479]: E0308 00:21:47.689979 7479 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 08 00:21:47.690123 master-0 kubenswrapper[7479]: E0308 00:21:47.690037 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5837befc-f6e9-4f74-ae39-d0aec977f0c9-client-ca podName:5837befc-f6e9-4f74-ae39-d0aec977f0c9 nodeName:}" failed. No retries permitted until 2026-03-08 00:21:51.690022645 +0000 UTC m=+28.002931552 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/5837befc-f6e9-4f74-ae39-d0aec977f0c9-client-ca") pod "controller-manager-8597858f97-kb2l8" (UID: "5837befc-f6e9-4f74-ae39-d0aec977f0c9") : configmap "client-ca" not found Mar 08 00:21:49.023134 master-0 kubenswrapper[7479]: I0308 00:21:49.023041 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6499204-e7f3-45b9-86b0-57fdf35b96a9-serving-cert\") pod \"route-controller-manager-56f6fc54fd-nwfzl\" (UID: \"f6499204-e7f3-45b9-86b0-57fdf35b96a9\") " pod="openshift-route-controller-manager/route-controller-manager-56f6fc54fd-nwfzl" Mar 08 00:21:49.024001 master-0 kubenswrapper[7479]: I0308 00:21:49.023143 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f6499204-e7f3-45b9-86b0-57fdf35b96a9-client-ca\") pod \"route-controller-manager-56f6fc54fd-nwfzl\" (UID: \"f6499204-e7f3-45b9-86b0-57fdf35b96a9\") " pod="openshift-route-controller-manager/route-controller-manager-56f6fc54fd-nwfzl" Mar 08 00:21:49.024001 master-0 kubenswrapper[7479]: E0308 00:21:49.023441 7479 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 08 00:21:49.024001 master-0 kubenswrapper[7479]: E0308 00:21:49.023565 7479 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f6499204-e7f3-45b9-86b0-57fdf35b96a9-client-ca podName:f6499204-e7f3-45b9-86b0-57fdf35b96a9 nodeName:}" failed. No retries permitted until 2026-03-08 00:22:05.023537159 +0000 UTC m=+41.336446106 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/f6499204-e7f3-45b9-86b0-57fdf35b96a9-client-ca") pod "route-controller-manager-56f6fc54fd-nwfzl" (UID: "f6499204-e7f3-45b9-86b0-57fdf35b96a9") : configmap "client-ca" not found Mar 08 00:21:49.033608 master-0 kubenswrapper[7479]: I0308 00:21:49.033546 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6499204-e7f3-45b9-86b0-57fdf35b96a9-serving-cert\") pod \"route-controller-manager-56f6fc54fd-nwfzl\" (UID: \"f6499204-e7f3-45b9-86b0-57fdf35b96a9\") " pod="openshift-route-controller-manager/route-controller-manager-56f6fc54fd-nwfzl" Mar 08 00:21:49.531029 master-0 kubenswrapper[7479]: I0308 00:21:49.530928 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bf4911c-104d-418f-b42d-3e2db0ef25eb-serving-cert\") pod \"apiserver-65677d845c-495g9\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:49.531381 master-0 kubenswrapper[7479]: I0308 00:21:49.531196 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/0bf4911c-104d-418f-b42d-3e2db0ef25eb-audit\") pod \"apiserver-65677d845c-495g9\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:49.531381 master-0 kubenswrapper[7479]: E0308 00:21:49.531297 7479 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Mar 08 
00:21:49.531381 master-0 kubenswrapper[7479]: E0308 00:21:49.531379 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0bf4911c-104d-418f-b42d-3e2db0ef25eb-audit podName:0bf4911c-104d-418f-b42d-3e2db0ef25eb nodeName:}" failed. No retries permitted until 2026-03-08 00:21:57.531356922 +0000 UTC m=+33.844265869 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/0bf4911c-104d-418f-b42d-3e2db0ef25eb-audit") pod "apiserver-65677d845c-495g9" (UID: "0bf4911c-104d-418f-b42d-3e2db0ef25eb") : configmap "audit-0" not found Mar 08 00:21:49.535451 master-0 kubenswrapper[7479]: I0308 00:21:49.535395 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bf4911c-104d-418f-b42d-3e2db0ef25eb-serving-cert\") pod \"apiserver-65677d845c-495g9\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:50.561733 master-0 kubenswrapper[7479]: I0308 00:21:50.561681 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4217b755-ca87-45cf-9e52-7b2681660f41-kube-api-access\") pod \"installer-1-master-0\" (UID: \"4217b755-ca87-45cf-9e52-7b2681660f41\") " pod="openshift-etcd/installer-1-master-0" Mar 08 00:21:50.801925 master-0 kubenswrapper[7479]: I0308 00:21:50.801840 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 08 00:21:50.955951 master-0 kubenswrapper[7479]: W0308 00:21:50.955810 7479 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podcd9aee54_e935_4841_b135_9000e006b96e.slice/crio-e0cd8a5f26c892bce582feeed10ef56c6636a20e1780173c2a48d551701ad3e7 WatchSource:0}: Error finding container e0cd8a5f26c892bce582feeed10ef56c6636a20e1780173c2a48d551701ad3e7: Status 404 returned error can't find the container with id e0cd8a5f26c892bce582feeed10ef56c6636a20e1780173c2a48d551701ad3e7 Mar 08 00:21:51.189261 master-0 kubenswrapper[7479]: I0308 00:21:51.186377 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 08 00:21:51.338857 master-0 kubenswrapper[7479]: I0308 00:21:51.338813 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-65677d845c-495g9"] Mar 08 00:21:51.339320 master-0 kubenswrapper[7479]: E0308 00:21:51.339047 7479 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[audit], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-apiserver/apiserver-65677d845c-495g9" podUID="0bf4911c-104d-418f-b42d-3e2db0ef25eb" Mar 08 00:21:51.757064 master-0 kubenswrapper[7479]: I0308 00:21:51.756920 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5837befc-f6e9-4f74-ae39-d0aec977f0c9-client-ca\") pod \"controller-manager-8597858f97-kb2l8\" (UID: \"5837befc-f6e9-4f74-ae39-d0aec977f0c9\") " pod="openshift-controller-manager/controller-manager-8597858f97-kb2l8" Mar 08 00:21:51.757974 master-0 kubenswrapper[7479]: I0308 00:21:51.757934 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5837befc-f6e9-4f74-ae39-d0aec977f0c9-client-ca\") pod 
\"controller-manager-8597858f97-kb2l8\" (UID: \"5837befc-f6e9-4f74-ae39-d0aec977f0c9\") " pod="openshift-controller-manager/controller-manager-8597858f97-kb2l8" Mar 08 00:21:51.777272 master-0 kubenswrapper[7479]: I0308 00:21:51.773335 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"cd9aee54-e935-4841-b135-9000e006b96e","Type":"ContainerStarted","Data":"e0cd8a5f26c892bce582feeed10ef56c6636a20e1780173c2a48d551701ad3e7"} Mar 08 00:21:51.777272 master-0 kubenswrapper[7479]: I0308 00:21:51.773361 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:51.779525 master-0 kubenswrapper[7479]: I0308 00:21:51.779493 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:51.793531 master-0 kubenswrapper[7479]: I0308 00:21:51.793504 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8597858f97-kb2l8" Mar 08 00:21:51.959281 master-0 kubenswrapper[7479]: I0308 00:21:51.959232 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0bf4911c-104d-418f-b42d-3e2db0ef25eb-etcd-client\") pod \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " Mar 08 00:21:51.959281 master-0 kubenswrapper[7479]: I0308 00:21:51.959273 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0bf4911c-104d-418f-b42d-3e2db0ef25eb-encryption-config\") pod \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " Mar 08 00:21:51.959513 master-0 kubenswrapper[7479]: I0308 00:21:51.959303 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsgch\" (UniqueName: \"kubernetes.io/projected/0bf4911c-104d-418f-b42d-3e2db0ef25eb-kube-api-access-zsgch\") pod \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " Mar 08 00:21:51.959513 master-0 kubenswrapper[7479]: I0308 00:21:51.959330 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0bf4911c-104d-418f-b42d-3e2db0ef25eb-audit-dir\") pod \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " Mar 08 00:21:51.959513 master-0 kubenswrapper[7479]: I0308 00:21:51.959351 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0bf4911c-104d-418f-b42d-3e2db0ef25eb-etcd-serving-ca\") pod \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " Mar 08 00:21:51.959513 master-0 kubenswrapper[7479]: I0308 
00:21:51.959415 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0bf4911c-104d-418f-b42d-3e2db0ef25eb-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "0bf4911c-104d-418f-b42d-3e2db0ef25eb" (UID: "0bf4911c-104d-418f-b42d-3e2db0ef25eb"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:21:51.959686 master-0 kubenswrapper[7479]: I0308 00:21:51.959643 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0bf4911c-104d-418f-b42d-3e2db0ef25eb-node-pullsecrets\") pod \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " Mar 08 00:21:51.959816 master-0 kubenswrapper[7479]: I0308 00:21:51.959752 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0bf4911c-104d-418f-b42d-3e2db0ef25eb-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "0bf4911c-104d-418f-b42d-3e2db0ef25eb" (UID: "0bf4911c-104d-418f-b42d-3e2db0ef25eb"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:21:51.959847 master-0 kubenswrapper[7479]: I0308 00:21:51.959814 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bf4911c-104d-418f-b42d-3e2db0ef25eb-trusted-ca-bundle\") pod \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " Mar 08 00:21:51.960061 master-0 kubenswrapper[7479]: I0308 00:21:51.960036 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bf4911c-104d-418f-b42d-3e2db0ef25eb-config\") pod \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " Mar 08 00:21:51.960098 master-0 kubenswrapper[7479]: I0308 00:21:51.960074 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bf4911c-104d-418f-b42d-3e2db0ef25eb-serving-cert\") pod \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " Mar 08 00:21:51.960270 master-0 kubenswrapper[7479]: I0308 00:21:51.960254 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/0bf4911c-104d-418f-b42d-3e2db0ef25eb-image-import-ca\") pod \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\" (UID: \"0bf4911c-104d-418f-b42d-3e2db0ef25eb\") " Mar 08 00:21:51.960311 master-0 kubenswrapper[7479]: I0308 00:21:51.960072 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bf4911c-104d-418f-b42d-3e2db0ef25eb-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "0bf4911c-104d-418f-b42d-3e2db0ef25eb" (UID: "0bf4911c-104d-418f-b42d-3e2db0ef25eb"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:21:51.960340 master-0 kubenswrapper[7479]: I0308 00:21:51.960308 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bf4911c-104d-418f-b42d-3e2db0ef25eb-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0bf4911c-104d-418f-b42d-3e2db0ef25eb" (UID: "0bf4911c-104d-418f-b42d-3e2db0ef25eb"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:21:51.960492 master-0 kubenswrapper[7479]: I0308 00:21:51.960459 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bf4911c-104d-418f-b42d-3e2db0ef25eb-config" (OuterVolumeSpecName: "config") pod "0bf4911c-104d-418f-b42d-3e2db0ef25eb" (UID: "0bf4911c-104d-418f-b42d-3e2db0ef25eb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:21:51.960688 master-0 kubenswrapper[7479]: I0308 00:21:51.960669 7479 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bf4911c-104d-418f-b42d-3e2db0ef25eb-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 00:21:51.960717 master-0 kubenswrapper[7479]: I0308 00:21:51.960690 7479 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bf4911c-104d-418f-b42d-3e2db0ef25eb-config\") on node \"master-0\" DevicePath \"\"" Mar 08 00:21:51.960717 master-0 kubenswrapper[7479]: I0308 00:21:51.960701 7479 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0bf4911c-104d-418f-b42d-3e2db0ef25eb-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 00:21:51.960717 master-0 kubenswrapper[7479]: I0308 00:21:51.960710 7479 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/0bf4911c-104d-418f-b42d-3e2db0ef25eb-etcd-serving-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 00:21:51.960794 master-0 kubenswrapper[7479]: I0308 00:21:51.960719 7479 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0bf4911c-104d-418f-b42d-3e2db0ef25eb-node-pullsecrets\") on node \"master-0\" DevicePath \"\"" Mar 08 00:21:51.960882 master-0 kubenswrapper[7479]: I0308 00:21:51.960838 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bf4911c-104d-418f-b42d-3e2db0ef25eb-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "0bf4911c-104d-418f-b42d-3e2db0ef25eb" (UID: "0bf4911c-104d-418f-b42d-3e2db0ef25eb"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:21:51.962696 master-0 kubenswrapper[7479]: I0308 00:21:51.962667 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bf4911c-104d-418f-b42d-3e2db0ef25eb-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "0bf4911c-104d-418f-b42d-3e2db0ef25eb" (UID: "0bf4911c-104d-418f-b42d-3e2db0ef25eb"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:21:51.963430 master-0 kubenswrapper[7479]: I0308 00:21:51.963396 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bf4911c-104d-418f-b42d-3e2db0ef25eb-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "0bf4911c-104d-418f-b42d-3e2db0ef25eb" (UID: "0bf4911c-104d-418f-b42d-3e2db0ef25eb"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:21:51.964172 master-0 kubenswrapper[7479]: I0308 00:21:51.964141 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bf4911c-104d-418f-b42d-3e2db0ef25eb-kube-api-access-zsgch" (OuterVolumeSpecName: "kube-api-access-zsgch") pod "0bf4911c-104d-418f-b42d-3e2db0ef25eb" (UID: "0bf4911c-104d-418f-b42d-3e2db0ef25eb"). InnerVolumeSpecName "kube-api-access-zsgch". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:21:51.968870 master-0 kubenswrapper[7479]: I0308 00:21:51.968837 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bf4911c-104d-418f-b42d-3e2db0ef25eb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0bf4911c-104d-418f-b42d-3e2db0ef25eb" (UID: "0bf4911c-104d-418f-b42d-3e2db0ef25eb"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:21:52.080662 master-0 kubenswrapper[7479]: I0308 00:21:52.080545 7479 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bf4911c-104d-418f-b42d-3e2db0ef25eb-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 00:21:52.080662 master-0 kubenswrapper[7479]: I0308 00:21:52.080583 7479 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/0bf4911c-104d-418f-b42d-3e2db0ef25eb-image-import-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 00:21:52.080662 master-0 kubenswrapper[7479]: I0308 00:21:52.080597 7479 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0bf4911c-104d-418f-b42d-3e2db0ef25eb-etcd-client\") on node \"master-0\" DevicePath \"\"" Mar 08 00:21:52.080662 master-0 kubenswrapper[7479]: I0308 00:21:52.080608 7479 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/0bf4911c-104d-418f-b42d-3e2db0ef25eb-encryption-config\") on node \"master-0\" DevicePath \"\"" Mar 08 00:21:52.080662 master-0 kubenswrapper[7479]: I0308 00:21:52.080617 7479 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsgch\" (UniqueName: \"kubernetes.io/projected/0bf4911c-104d-418f-b42d-3e2db0ef25eb-kube-api-access-zsgch\") on node \"master-0\" DevicePath \"\"" Mar 08 00:21:52.312785 master-0 kubenswrapper[7479]: I0308 00:21:52.312712 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w"] Mar 08 00:21:52.313433 master-0 kubenswrapper[7479]: I0308 00:21:52.313321 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" Mar 08 00:21:52.315598 master-0 kubenswrapper[7479]: I0308 00:21:52.315044 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 08 00:21:52.315598 master-0 kubenswrapper[7479]: I0308 00:21:52.315331 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 08 00:21:52.315760 master-0 kubenswrapper[7479]: I0308 00:21:52.315673 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 08 00:21:52.315817 master-0 kubenswrapper[7479]: I0308 00:21:52.315765 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 08 00:21:52.315892 master-0 kubenswrapper[7479]: I0308 00:21:52.315840 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 08 00:21:52.315957 master-0 kubenswrapper[7479]: I0308 00:21:52.315946 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 08 00:21:52.316228 master-0 
kubenswrapper[7479]: I0308 00:21:52.316194 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 08 00:21:52.316465 master-0 kubenswrapper[7479]: I0308 00:21:52.316429 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 08 00:21:52.337414 master-0 kubenswrapper[7479]: I0308 00:21:52.337268 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:52.337414 master-0 kubenswrapper[7479]: I0308 00:21:52.337418 7479 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 00:21:52.361378 master-0 kubenswrapper[7479]: I0308 00:21:52.361344 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:21:52.385343 master-0 kubenswrapper[7479]: I0308 00:21:52.384446 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w"] Mar 08 00:21:52.442950 master-0 kubenswrapper[7479]: I0308 00:21:52.442900 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8597858f97-kb2l8"] Mar 08 00:21:52.483895 master-0 kubenswrapper[7479]: I0308 00:21:52.483848 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crfg9\" (UniqueName: \"kubernetes.io/projected/531e9339-968c-47bf-b8ea-c44d9ceef4b3-kube-api-access-crfg9\") pod \"apiserver-74444d8fbc-g7z4w\" (UID: \"531e9339-968c-47bf-b8ea-c44d9ceef4b3\") " pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" Mar 08 00:21:52.484056 master-0 kubenswrapper[7479]: I0308 00:21:52.483931 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/531e9339-968c-47bf-b8ea-c44d9ceef4b3-serving-cert\") pod \"apiserver-74444d8fbc-g7z4w\" (UID: \"531e9339-968c-47bf-b8ea-c44d9ceef4b3\") " pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" Mar 08 00:21:52.484056 master-0 kubenswrapper[7479]: I0308 00:21:52.483961 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/531e9339-968c-47bf-b8ea-c44d9ceef4b3-etcd-serving-ca\") pod \"apiserver-74444d8fbc-g7z4w\" (UID: \"531e9339-968c-47bf-b8ea-c44d9ceef4b3\") " pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" Mar 08 00:21:52.484056 master-0 kubenswrapper[7479]: I0308 00:21:52.483977 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/531e9339-968c-47bf-b8ea-c44d9ceef4b3-audit-dir\") pod \"apiserver-74444d8fbc-g7z4w\" (UID: \"531e9339-968c-47bf-b8ea-c44d9ceef4b3\") " pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" Mar 08 00:21:52.484056 master-0 kubenswrapper[7479]: I0308 00:21:52.483995 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/531e9339-968c-47bf-b8ea-c44d9ceef4b3-trusted-ca-bundle\") pod \"apiserver-74444d8fbc-g7z4w\" (UID: \"531e9339-968c-47bf-b8ea-c44d9ceef4b3\") " pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" Mar 08 00:21:52.484056 master-0 kubenswrapper[7479]: I0308 00:21:52.484012 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/531e9339-968c-47bf-b8ea-c44d9ceef4b3-audit-policies\") pod \"apiserver-74444d8fbc-g7z4w\" (UID: \"531e9339-968c-47bf-b8ea-c44d9ceef4b3\") " pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" Mar 08 00:21:52.484254 master-0 kubenswrapper[7479]: 
I0308 00:21:52.484064 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/531e9339-968c-47bf-b8ea-c44d9ceef4b3-etcd-client\") pod \"apiserver-74444d8fbc-g7z4w\" (UID: \"531e9339-968c-47bf-b8ea-c44d9ceef4b3\") " pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" Mar 08 00:21:52.484254 master-0 kubenswrapper[7479]: I0308 00:21:52.484086 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/531e9339-968c-47bf-b8ea-c44d9ceef4b3-encryption-config\") pod \"apiserver-74444d8fbc-g7z4w\" (UID: \"531e9339-968c-47bf-b8ea-c44d9ceef4b3\") " pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" Mar 08 00:21:52.532375 master-0 kubenswrapper[7479]: I0308 00:21:52.530812 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56f6fc54fd-nwfzl"] Mar 08 00:21:52.532375 master-0 kubenswrapper[7479]: E0308 00:21:52.531272 7479 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-56f6fc54fd-nwfzl" podUID="f6499204-e7f3-45b9-86b0-57fdf35b96a9" Mar 08 00:21:52.586748 master-0 kubenswrapper[7479]: I0308 00:21:52.585027 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/531e9339-968c-47bf-b8ea-c44d9ceef4b3-serving-cert\") pod \"apiserver-74444d8fbc-g7z4w\" (UID: \"531e9339-968c-47bf-b8ea-c44d9ceef4b3\") " pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" Mar 08 00:21:52.586748 master-0 kubenswrapper[7479]: I0308 00:21:52.585093 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/531e9339-968c-47bf-b8ea-c44d9ceef4b3-etcd-serving-ca\") pod \"apiserver-74444d8fbc-g7z4w\" (UID: \"531e9339-968c-47bf-b8ea-c44d9ceef4b3\") " pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" Mar 08 00:21:52.586748 master-0 kubenswrapper[7479]: I0308 00:21:52.585110 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/531e9339-968c-47bf-b8ea-c44d9ceef4b3-audit-dir\") pod \"apiserver-74444d8fbc-g7z4w\" (UID: \"531e9339-968c-47bf-b8ea-c44d9ceef4b3\") " pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" Mar 08 00:21:52.586748 master-0 kubenswrapper[7479]: I0308 00:21:52.585125 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/531e9339-968c-47bf-b8ea-c44d9ceef4b3-trusted-ca-bundle\") pod \"apiserver-74444d8fbc-g7z4w\" (UID: \"531e9339-968c-47bf-b8ea-c44d9ceef4b3\") " pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" Mar 08 00:21:52.586748 master-0 kubenswrapper[7479]: I0308 00:21:52.585141 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/531e9339-968c-47bf-b8ea-c44d9ceef4b3-audit-policies\") pod \"apiserver-74444d8fbc-g7z4w\" (UID: \"531e9339-968c-47bf-b8ea-c44d9ceef4b3\") " pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" Mar 08 00:21:52.586748 master-0 kubenswrapper[7479]: I0308 00:21:52.585176 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/531e9339-968c-47bf-b8ea-c44d9ceef4b3-etcd-client\") pod \"apiserver-74444d8fbc-g7z4w\" (UID: \"531e9339-968c-47bf-b8ea-c44d9ceef4b3\") " pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" Mar 08 00:21:52.586748 master-0 kubenswrapper[7479]: I0308 00:21:52.585192 7479 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/531e9339-968c-47bf-b8ea-c44d9ceef4b3-encryption-config\") pod \"apiserver-74444d8fbc-g7z4w\" (UID: \"531e9339-968c-47bf-b8ea-c44d9ceef4b3\") " pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" Mar 08 00:21:52.586748 master-0 kubenswrapper[7479]: I0308 00:21:52.585253 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crfg9\" (UniqueName: \"kubernetes.io/projected/531e9339-968c-47bf-b8ea-c44d9ceef4b3-kube-api-access-crfg9\") pod \"apiserver-74444d8fbc-g7z4w\" (UID: \"531e9339-968c-47bf-b8ea-c44d9ceef4b3\") " pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" Mar 08 00:21:52.586748 master-0 kubenswrapper[7479]: I0308 00:21:52.586371 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/531e9339-968c-47bf-b8ea-c44d9ceef4b3-trusted-ca-bundle\") pod \"apiserver-74444d8fbc-g7z4w\" (UID: \"531e9339-968c-47bf-b8ea-c44d9ceef4b3\") " pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" Mar 08 00:21:52.587183 master-0 kubenswrapper[7479]: I0308 00:21:52.587045 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/531e9339-968c-47bf-b8ea-c44d9ceef4b3-etcd-serving-ca\") pod \"apiserver-74444d8fbc-g7z4w\" (UID: \"531e9339-968c-47bf-b8ea-c44d9ceef4b3\") " pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" Mar 08 00:21:52.587183 master-0 kubenswrapper[7479]: I0308 00:21:52.587099 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/531e9339-968c-47bf-b8ea-c44d9ceef4b3-audit-dir\") pod \"apiserver-74444d8fbc-g7z4w\" (UID: \"531e9339-968c-47bf-b8ea-c44d9ceef4b3\") " pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" Mar 08 00:21:52.589850 master-0 
kubenswrapper[7479]: I0308 00:21:52.589770 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/531e9339-968c-47bf-b8ea-c44d9ceef4b3-serving-cert\") pod \"apiserver-74444d8fbc-g7z4w\" (UID: \"531e9339-968c-47bf-b8ea-c44d9ceef4b3\") " pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" Mar 08 00:21:52.592179 master-0 kubenswrapper[7479]: I0308 00:21:52.592144 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/531e9339-968c-47bf-b8ea-c44d9ceef4b3-audit-policies\") pod \"apiserver-74444d8fbc-g7z4w\" (UID: \"531e9339-968c-47bf-b8ea-c44d9ceef4b3\") " pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" Mar 08 00:21:52.592671 master-0 kubenswrapper[7479]: I0308 00:21:52.592602 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/531e9339-968c-47bf-b8ea-c44d9ceef4b3-encryption-config\") pod \"apiserver-74444d8fbc-g7z4w\" (UID: \"531e9339-968c-47bf-b8ea-c44d9ceef4b3\") " pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" Mar 08 00:21:52.593117 master-0 kubenswrapper[7479]: I0308 00:21:52.593051 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/531e9339-968c-47bf-b8ea-c44d9ceef4b3-etcd-client\") pod \"apiserver-74444d8fbc-g7z4w\" (UID: \"531e9339-968c-47bf-b8ea-c44d9ceef4b3\") " pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" Mar 08 00:21:52.604565 master-0 kubenswrapper[7479]: I0308 00:21:52.604528 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crfg9\" (UniqueName: \"kubernetes.io/projected/531e9339-968c-47bf-b8ea-c44d9ceef4b3-kube-api-access-crfg9\") pod \"apiserver-74444d8fbc-g7z4w\" (UID: \"531e9339-968c-47bf-b8ea-c44d9ceef4b3\") " pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" Mar 
08 00:21:52.625405 master-0 kubenswrapper[7479]: I0308 00:21:52.625323 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" Mar 08 00:21:52.777344 master-0 kubenswrapper[7479]: I0308 00:21:52.777305 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56f6fc54fd-nwfzl" Mar 08 00:21:52.777801 master-0 kubenswrapper[7479]: I0308 00:21:52.777753 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-65677d845c-495g9" Mar 08 00:21:52.785456 master-0 kubenswrapper[7479]: I0308 00:21:52.785389 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56f6fc54fd-nwfzl" Mar 08 00:21:52.867471 master-0 kubenswrapper[7479]: I0308 00:21:52.867314 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-65677d845c-495g9"] Mar 08 00:21:52.868905 master-0 kubenswrapper[7479]: I0308 00:21:52.868838 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-85cb8cb9bb-bmx44"] Mar 08 00:21:52.870421 master-0 kubenswrapper[7479]: I0308 00:21:52.869702 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:21:52.872488 master-0 kubenswrapper[7479]: I0308 00:21:52.872440 7479 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-apiserver/apiserver-65677d845c-495g9"] Mar 08 00:21:52.872488 master-0 kubenswrapper[7479]: I0308 00:21:52.872467 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 08 00:21:52.872637 master-0 kubenswrapper[7479]: I0308 00:21:52.872539 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 08 00:21:52.873261 master-0 kubenswrapper[7479]: I0308 00:21:52.872784 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 08 00:21:52.873261 master-0 kubenswrapper[7479]: I0308 00:21:52.872882 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 08 00:21:52.873261 master-0 kubenswrapper[7479]: I0308 00:21:52.873144 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 08 00:21:52.873646 master-0 kubenswrapper[7479]: I0308 00:21:52.873361 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 08 00:21:52.873646 master-0 kubenswrapper[7479]: I0308 00:21:52.873473 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 08 00:21:52.873646 master-0 kubenswrapper[7479]: I0308 00:21:52.873517 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 08 00:21:52.873646 master-0 kubenswrapper[7479]: I0308 00:21:52.873569 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 08 00:21:52.930758 master-0 kubenswrapper[7479]: I0308 00:21:52.880116 7479 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 08 00:21:52.930758 master-0 kubenswrapper[7479]: I0308 00:21:52.912890 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsvt7\" (UniqueName: \"kubernetes.io/projected/f6499204-e7f3-45b9-86b0-57fdf35b96a9-kube-api-access-xsvt7\") pod \"f6499204-e7f3-45b9-86b0-57fdf35b96a9\" (UID: \"f6499204-e7f3-45b9-86b0-57fdf35b96a9\") " Mar 08 00:21:52.930758 master-0 kubenswrapper[7479]: I0308 00:21:52.913325 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6499204-e7f3-45b9-86b0-57fdf35b96a9-config\") pod \"f6499204-e7f3-45b9-86b0-57fdf35b96a9\" (UID: \"f6499204-e7f3-45b9-86b0-57fdf35b96a9\") " Mar 08 00:21:52.930758 master-0 kubenswrapper[7479]: I0308 00:21:52.913364 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6499204-e7f3-45b9-86b0-57fdf35b96a9-serving-cert\") pod \"f6499204-e7f3-45b9-86b0-57fdf35b96a9\" (UID: \"f6499204-e7f3-45b9-86b0-57fdf35b96a9\") " Mar 08 00:21:52.930758 master-0 kubenswrapper[7479]: I0308 00:21:52.913445 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1751db13-b792-43e2-8459-d1d4a0164dfb-encryption-config\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:21:52.930758 master-0 kubenswrapper[7479]: I0308 00:21:52.913751 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6499204-e7f3-45b9-86b0-57fdf35b96a9-config" (OuterVolumeSpecName: "config") pod "f6499204-e7f3-45b9-86b0-57fdf35b96a9" (UID: "f6499204-e7f3-45b9-86b0-57fdf35b96a9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:21:52.930758 master-0 kubenswrapper[7479]: I0308 00:21:52.913888 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1751db13-b792-43e2-8459-d1d4a0164dfb-audit-dir\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:21:52.930758 master-0 kubenswrapper[7479]: I0308 00:21:52.913912 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-config\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:21:52.930758 master-0 kubenswrapper[7479]: I0308 00:21:52.913940 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-image-import-ca\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:21:52.930758 master-0 kubenswrapper[7479]: I0308 00:21:52.913972 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-etcd-serving-ca\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:21:52.930758 master-0 kubenswrapper[7479]: I0308 00:21:52.914028 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-trusted-ca-bundle\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:21:52.930758 master-0 kubenswrapper[7479]: I0308 00:21:52.914049 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1751db13-b792-43e2-8459-d1d4a0164dfb-node-pullsecrets\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:21:52.930758 master-0 kubenswrapper[7479]: I0308 00:21:52.914078 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qshd\" (UniqueName: \"kubernetes.io/projected/1751db13-b792-43e2-8459-d1d4a0164dfb-kube-api-access-6qshd\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:21:52.930758 master-0 kubenswrapper[7479]: I0308 00:21:52.914437 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1751db13-b792-43e2-8459-d1d4a0164dfb-etcd-client\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:21:52.930758 master-0 kubenswrapper[7479]: I0308 00:21:52.914461 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1751db13-b792-43e2-8459-d1d4a0164dfb-serving-cert\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:21:52.930758 master-0 kubenswrapper[7479]: I0308 
00:21:52.914497 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-audit\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:21:52.930758 master-0 kubenswrapper[7479]: I0308 00:21:52.914720 7479 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/0bf4911c-104d-418f-b42d-3e2db0ef25eb-audit\") on node \"master-0\" DevicePath \"\"" Mar 08 00:21:52.930758 master-0 kubenswrapper[7479]: I0308 00:21:52.914744 7479 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6499204-e7f3-45b9-86b0-57fdf35b96a9-config\") on node \"master-0\" DevicePath \"\"" Mar 08 00:21:52.934298 master-0 kubenswrapper[7479]: I0308 00:21:52.934259 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-85cb8cb9bb-bmx44"] Mar 08 00:21:52.934759 master-0 kubenswrapper[7479]: I0308 00:21:52.934680 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6499204-e7f3-45b9-86b0-57fdf35b96a9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f6499204-e7f3-45b9-86b0-57fdf35b96a9" (UID: "f6499204-e7f3-45b9-86b0-57fdf35b96a9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:21:52.934869 master-0 kubenswrapper[7479]: I0308 00:21:52.934825 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6499204-e7f3-45b9-86b0-57fdf35b96a9-kube-api-access-xsvt7" (OuterVolumeSpecName: "kube-api-access-xsvt7") pod "f6499204-e7f3-45b9-86b0-57fdf35b96a9" (UID: "f6499204-e7f3-45b9-86b0-57fdf35b96a9"). InnerVolumeSpecName "kube-api-access-xsvt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:21:53.014988 master-0 kubenswrapper[7479]: I0308 00:21:53.014922 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-trusted-ca-bundle\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:21:53.015173 master-0 kubenswrapper[7479]: I0308 00:21:53.015112 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1751db13-b792-43e2-8459-d1d4a0164dfb-node-pullsecrets\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:21:53.015173 master-0 kubenswrapper[7479]: I0308 00:21:53.015140 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qshd\" (UniqueName: \"kubernetes.io/projected/1751db13-b792-43e2-8459-d1d4a0164dfb-kube-api-access-6qshd\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:21:53.015173 master-0 kubenswrapper[7479]: I0308 00:21:53.015162 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1751db13-b792-43e2-8459-d1d4a0164dfb-etcd-client\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:21:53.015300 master-0 kubenswrapper[7479]: I0308 00:21:53.015178 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1751db13-b792-43e2-8459-d1d4a0164dfb-serving-cert\") pod \"apiserver-85cb8cb9bb-bmx44\" 
(UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:21:53.016021 master-0 kubenswrapper[7479]: I0308 00:21:53.015388 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-audit\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:21:53.016021 master-0 kubenswrapper[7479]: I0308 00:21:53.015478 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1751db13-b792-43e2-8459-d1d4a0164dfb-encryption-config\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:21:53.016021 master-0 kubenswrapper[7479]: I0308 00:21:53.015529 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1751db13-b792-43e2-8459-d1d4a0164dfb-audit-dir\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:21:53.016021 master-0 kubenswrapper[7479]: I0308 00:21:53.015560 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-config\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:21:53.016021 master-0 kubenswrapper[7479]: I0308 00:21:53.015606 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-image-import-ca\") pod \"apiserver-85cb8cb9bb-bmx44\" 
(UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:21:53.016021 master-0 kubenswrapper[7479]: I0308 00:21:53.015645 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-etcd-serving-ca\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:21:53.016021 master-0 kubenswrapper[7479]: I0308 00:21:53.015724 7479 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6499204-e7f3-45b9-86b0-57fdf35b96a9-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 00:21:53.016021 master-0 kubenswrapper[7479]: I0308 00:21:53.015743 7479 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsvt7\" (UniqueName: \"kubernetes.io/projected/f6499204-e7f3-45b9-86b0-57fdf35b96a9-kube-api-access-xsvt7\") on node \"master-0\" DevicePath \"\"" Mar 08 00:21:53.016021 master-0 kubenswrapper[7479]: I0308 00:21:53.015766 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1751db13-b792-43e2-8459-d1d4a0164dfb-audit-dir\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:21:53.016021 master-0 kubenswrapper[7479]: I0308 00:21:53.015940 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-trusted-ca-bundle\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:21:53.016021 master-0 kubenswrapper[7479]: I0308 00:21:53.015939 7479 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-audit\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:21:53.016426 master-0 kubenswrapper[7479]: I0308 00:21:53.016101 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1751db13-b792-43e2-8459-d1d4a0164dfb-node-pullsecrets\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:21:53.016426 master-0 kubenswrapper[7479]: I0308 00:21:53.016324 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-config\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:21:53.016487 master-0 kubenswrapper[7479]: I0308 00:21:53.016424 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-etcd-serving-ca\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:21:53.016883 master-0 kubenswrapper[7479]: I0308 00:21:53.016803 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-image-import-ca\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:21:53.018320 master-0 kubenswrapper[7479]: I0308 00:21:53.018286 7479 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1751db13-b792-43e2-8459-d1d4a0164dfb-serving-cert\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:21:53.018766 master-0 kubenswrapper[7479]: I0308 00:21:53.018730 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1751db13-b792-43e2-8459-d1d4a0164dfb-etcd-client\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:21:53.018818 master-0 kubenswrapper[7479]: I0308 00:21:53.018777 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1751db13-b792-43e2-8459-d1d4a0164dfb-encryption-config\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:21:53.081904 master-0 kubenswrapper[7479]: I0308 00:21:53.081536 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qshd\" (UniqueName: \"kubernetes.io/projected/1751db13-b792-43e2-8459-d1d4a0164dfb-kube-api-access-6qshd\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:21:53.252553 master-0 kubenswrapper[7479]: I0308 00:21:53.252494 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:21:53.687654 master-0 kubenswrapper[7479]: I0308 00:21:53.686182 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 08 00:21:53.687654 master-0 kubenswrapper[7479]: I0308 00:21:53.686791 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Mar 08 00:21:53.699097 master-0 kubenswrapper[7479]: I0308 00:21:53.699058 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 08 00:21:53.779583 master-0 kubenswrapper[7479]: I0308 00:21:53.779532 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56f6fc54fd-nwfzl" Mar 08 00:21:53.824406 master-0 kubenswrapper[7479]: I0308 00:21:53.823912 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c1157264-0054-491e-bc65-daf626fc041a-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"c1157264-0054-491e-bc65-daf626fc041a\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 08 00:21:53.824406 master-0 kubenswrapper[7479]: I0308 00:21:53.823994 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c1157264-0054-491e-bc65-daf626fc041a-var-lock\") pod \"installer-2-master-0\" (UID: \"c1157264-0054-491e-bc65-daf626fc041a\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 08 00:21:53.824406 master-0 kubenswrapper[7479]: I0308 00:21:53.824100 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c1157264-0054-491e-bc65-daf626fc041a-kube-api-access\") pod \"installer-2-master-0\" (UID: \"c1157264-0054-491e-bc65-daf626fc041a\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 08 00:21:53.891011 master-0 kubenswrapper[7479]: I0308 00:21:53.890775 7479 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bf4911c-104d-418f-b42d-3e2db0ef25eb" path="/var/lib/kubelet/pods/0bf4911c-104d-418f-b42d-3e2db0ef25eb/volumes" 
Mar 08 00:21:53.924880 master-0 kubenswrapper[7479]: I0308 00:21:53.924798 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c1157264-0054-491e-bc65-daf626fc041a-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"c1157264-0054-491e-bc65-daf626fc041a\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 08 00:21:53.925087 master-0 kubenswrapper[7479]: I0308 00:21:53.924905 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c1157264-0054-491e-bc65-daf626fc041a-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"c1157264-0054-491e-bc65-daf626fc041a\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 08 00:21:53.925087 master-0 kubenswrapper[7479]: I0308 00:21:53.924965 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c1157264-0054-491e-bc65-daf626fc041a-var-lock\") pod \"installer-2-master-0\" (UID: \"c1157264-0054-491e-bc65-daf626fc041a\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 08 00:21:53.925087 master-0 kubenswrapper[7479]: I0308 00:21:53.925023 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c1157264-0054-491e-bc65-daf626fc041a-kube-api-access\") pod \"installer-2-master-0\" (UID: \"c1157264-0054-491e-bc65-daf626fc041a\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 08 00:21:53.925296 master-0 kubenswrapper[7479]: I0308 00:21:53.925256 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c1157264-0054-491e-bc65-daf626fc041a-var-lock\") pod \"installer-2-master-0\" (UID: \"c1157264-0054-491e-bc65-daf626fc041a\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 08 00:21:54.251708 master-0 
kubenswrapper[7479]: I0308 00:21:54.251608 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c1157264-0054-491e-bc65-daf626fc041a-kube-api-access\") pod \"installer-2-master-0\" (UID: \"c1157264-0054-491e-bc65-daf626fc041a\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 08 00:21:54.307327 master-0 kubenswrapper[7479]: I0308 00:21:54.307244 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Mar 08 00:21:54.532375 master-0 kubenswrapper[7479]: I0308 00:21:54.532170 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56f6fc54fd-nwfzl"] Mar 08 00:21:54.542910 master-0 kubenswrapper[7479]: I0308 00:21:54.542841 7479 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56f6fc54fd-nwfzl"] Mar 08 00:21:54.639452 master-0 kubenswrapper[7479]: I0308 00:21:54.639403 7479 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f6499204-e7f3-45b9-86b0-57fdf35b96a9-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 00:21:55.658565 master-0 kubenswrapper[7479]: I0308 00:21:55.652787 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8597858f97-kb2l8"] Mar 08 00:21:55.686559 master-0 kubenswrapper[7479]: I0308 00:21:55.686530 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-1-master-0"] Mar 08 00:21:55.704061 master-0 kubenswrapper[7479]: W0308 00:21:55.704031 7479 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4217b755_ca87_45cf_9e52_7b2681660f41.slice/crio-dc0f970c88c1737a47be41b249ed6c2014805b33e5ea7b0be6fb9cb719bf9d5b WatchSource:0}: Error finding container 
dc0f970c88c1737a47be41b249ed6c2014805b33e5ea7b0be6fb9cb719bf9d5b: Status 404 returned error can't find the container with id dc0f970c88c1737a47be41b249ed6c2014805b33e5ea7b0be6fb9cb719bf9d5b Mar 08 00:21:55.733828 master-0 kubenswrapper[7479]: I0308 00:21:55.733788 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 08 00:21:55.803364 master-0 kubenswrapper[7479]: I0308 00:21:55.801061 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"c1157264-0054-491e-bc65-daf626fc041a","Type":"ContainerStarted","Data":"ed2624edd1b0bdd5361a9bb140b7d43812ceeacf4dbac6ce6049853d2e2f0be3"} Mar 08 00:21:55.811567 master-0 kubenswrapper[7479]: I0308 00:21:55.811453 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d7d75cbb9-lf8cw"] Mar 08 00:21:55.812130 master-0 kubenswrapper[7479]: I0308 00:21:55.812036 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d7d75cbb9-lf8cw" Mar 08 00:21:55.836558 master-0 kubenswrapper[7479]: I0308 00:21:55.831145 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 08 00:21:55.836558 master-0 kubenswrapper[7479]: I0308 00:21:55.831308 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 08 00:21:55.836558 master-0 kubenswrapper[7479]: I0308 00:21:55.831351 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 08 00:21:55.836558 master-0 kubenswrapper[7479]: I0308 00:21:55.831448 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 08 00:21:55.836558 master-0 kubenswrapper[7479]: I0308 00:21:55.831636 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 08 00:21:55.836558 master-0 kubenswrapper[7479]: I0308 00:21:55.833945 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d7d75cbb9-lf8cw"] Mar 08 00:21:55.836558 master-0 kubenswrapper[7479]: I0308 00:21:55.833980 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9" event={"ID":"1abf904b-0b8d-4d61-8231-0e8d00933192","Type":"ContainerStarted","Data":"2c5b4b85a0fefd0a92422962c5268cff179cebaa65136415c92edb7d7a36490e"} Mar 08 00:21:55.836558 master-0 kubenswrapper[7479]: I0308 00:21:55.834172 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-85cb8cb9bb-bmx44"] Mar 08 00:21:55.843966 master-0 kubenswrapper[7479]: I0308 00:21:55.843750 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2" event={"ID":"6999cf38-e317-4727-98c9-d4e348e9e16a","Type":"ContainerStarted","Data":"4b93ca0ef506b0c02846ca33f17d63f5a824052f00f7d19371fbf7e2b8abc456"} Mar 08 00:21:55.857800 master-0 kubenswrapper[7479]: I0308 00:21:55.856963 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8597858f97-kb2l8" event={"ID":"5837befc-f6e9-4f74-ae39-d0aec977f0c9","Type":"ContainerStarted","Data":"4e7d3332a4fd54ae4a295945e29cf38ecb93886fa54a4b5efb0807b00cced883"} Mar 08 00:21:55.857800 master-0 kubenswrapper[7479]: I0308 00:21:55.857106 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w"] Mar 08 00:21:55.860617 master-0 kubenswrapper[7479]: I0308 00:21:55.858239 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqkjm\" (UniqueName: \"kubernetes.io/projected/426e2fcf-dfb6-4193-91ae-c6daef6e50b1-kube-api-access-zqkjm\") pod \"route-controller-manager-5d7d75cbb9-lf8cw\" (UID: \"426e2fcf-dfb6-4193-91ae-c6daef6e50b1\") " pod="openshift-route-controller-manager/route-controller-manager-5d7d75cbb9-lf8cw" Mar 08 00:21:55.860617 master-0 kubenswrapper[7479]: I0308 00:21:55.858287 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/426e2fcf-dfb6-4193-91ae-c6daef6e50b1-config\") pod \"route-controller-manager-5d7d75cbb9-lf8cw\" (UID: \"426e2fcf-dfb6-4193-91ae-c6daef6e50b1\") " pod="openshift-route-controller-manager/route-controller-manager-5d7d75cbb9-lf8cw" Mar 08 00:21:55.860617 master-0 kubenswrapper[7479]: I0308 00:21:55.858338 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/426e2fcf-dfb6-4193-91ae-c6daef6e50b1-client-ca\") pod 
\"route-controller-manager-5d7d75cbb9-lf8cw\" (UID: \"426e2fcf-dfb6-4193-91ae-c6daef6e50b1\") " pod="openshift-route-controller-manager/route-controller-manager-5d7d75cbb9-lf8cw" Mar 08 00:21:55.860617 master-0 kubenswrapper[7479]: I0308 00:21:55.858495 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/426e2fcf-dfb6-4193-91ae-c6daef6e50b1-serving-cert\") pod \"route-controller-manager-5d7d75cbb9-lf8cw\" (UID: \"426e2fcf-dfb6-4193-91ae-c6daef6e50b1\") " pod="openshift-route-controller-manager/route-controller-manager-5d7d75cbb9-lf8cw" Mar 08 00:21:55.860617 master-0 kubenswrapper[7479]: I0308 00:21:55.859339 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq" event={"ID":"32c19760-2cb2-4690-be8e-cba3c517c60e","Type":"ContainerStarted","Data":"34b1b37983d46a115f3d0ba8af513faf4d5ef94e2619ccb2e671c100b6fff876"} Mar 08 00:21:55.921801 master-0 kubenswrapper[7479]: I0308 00:21:55.920897 7479 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6499204-e7f3-45b9-86b0-57fdf35b96a9" path="/var/lib/kubelet/pods/f6499204-e7f3-45b9-86b0-57fdf35b96a9/volumes" Mar 08 00:21:55.921801 master-0 kubenswrapper[7479]: I0308 00:21:55.921138 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"4217b755-ca87-45cf-9e52-7b2681660f41","Type":"ContainerStarted","Data":"dc0f970c88c1737a47be41b249ed6c2014805b33e5ea7b0be6fb9cb719bf9d5b"} Mar 08 00:21:55.961660 master-0 kubenswrapper[7479]: I0308 00:21:55.960240 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqkjm\" (UniqueName: \"kubernetes.io/projected/426e2fcf-dfb6-4193-91ae-c6daef6e50b1-kube-api-access-zqkjm\") pod \"route-controller-manager-5d7d75cbb9-lf8cw\" (UID: \"426e2fcf-dfb6-4193-91ae-c6daef6e50b1\") " 
pod="openshift-route-controller-manager/route-controller-manager-5d7d75cbb9-lf8cw" Mar 08 00:21:55.961660 master-0 kubenswrapper[7479]: I0308 00:21:55.960281 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/426e2fcf-dfb6-4193-91ae-c6daef6e50b1-config\") pod \"route-controller-manager-5d7d75cbb9-lf8cw\" (UID: \"426e2fcf-dfb6-4193-91ae-c6daef6e50b1\") " pod="openshift-route-controller-manager/route-controller-manager-5d7d75cbb9-lf8cw" Mar 08 00:21:55.961660 master-0 kubenswrapper[7479]: I0308 00:21:55.960314 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/426e2fcf-dfb6-4193-91ae-c6daef6e50b1-client-ca\") pod \"route-controller-manager-5d7d75cbb9-lf8cw\" (UID: \"426e2fcf-dfb6-4193-91ae-c6daef6e50b1\") " pod="openshift-route-controller-manager/route-controller-manager-5d7d75cbb9-lf8cw" Mar 08 00:21:55.961660 master-0 kubenswrapper[7479]: I0308 00:21:55.960423 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/426e2fcf-dfb6-4193-91ae-c6daef6e50b1-serving-cert\") pod \"route-controller-manager-5d7d75cbb9-lf8cw\" (UID: \"426e2fcf-dfb6-4193-91ae-c6daef6e50b1\") " pod="openshift-route-controller-manager/route-controller-manager-5d7d75cbb9-lf8cw" Mar 08 00:21:55.964253 master-0 kubenswrapper[7479]: I0308 00:21:55.962696 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/426e2fcf-dfb6-4193-91ae-c6daef6e50b1-config\") pod \"route-controller-manager-5d7d75cbb9-lf8cw\" (UID: \"426e2fcf-dfb6-4193-91ae-c6daef6e50b1\") " pod="openshift-route-controller-manager/route-controller-manager-5d7d75cbb9-lf8cw" Mar 08 00:21:55.964253 master-0 kubenswrapper[7479]: I0308 00:21:55.963185 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/426e2fcf-dfb6-4193-91ae-c6daef6e50b1-client-ca\") pod \"route-controller-manager-5d7d75cbb9-lf8cw\" (UID: \"426e2fcf-dfb6-4193-91ae-c6daef6e50b1\") " pod="openshift-route-controller-manager/route-controller-manager-5d7d75cbb9-lf8cw" Mar 08 00:21:55.978194 master-0 kubenswrapper[7479]: I0308 00:21:55.977643 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/426e2fcf-dfb6-4193-91ae-c6daef6e50b1-serving-cert\") pod \"route-controller-manager-5d7d75cbb9-lf8cw\" (UID: \"426e2fcf-dfb6-4193-91ae-c6daef6e50b1\") " pod="openshift-route-controller-manager/route-controller-manager-5d7d75cbb9-lf8cw" Mar 08 00:21:56.034991 master-0 kubenswrapper[7479]: I0308 00:21:56.034949 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqkjm\" (UniqueName: \"kubernetes.io/projected/426e2fcf-dfb6-4193-91ae-c6daef6e50b1-kube-api-access-zqkjm\") pod \"route-controller-manager-5d7d75cbb9-lf8cw\" (UID: \"426e2fcf-dfb6-4193-91ae-c6daef6e50b1\") " pod="openshift-route-controller-manager/route-controller-manager-5d7d75cbb9-lf8cw" Mar 08 00:21:56.131313 master-0 kubenswrapper[7479]: I0308 00:21:56.130297 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-67jx5"] Mar 08 00:21:56.131313 master-0 kubenswrapper[7479]: I0308 00:21:56.130809 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.170224 master-0 kubenswrapper[7479]: I0308 00:21:56.164735 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-sysctl-d\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.170224 master-0 kubenswrapper[7479]: I0308 00:21:56.164792 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-host\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.170224 master-0 kubenswrapper[7479]: I0308 00:21:56.164814 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcqn9\" (UniqueName: \"kubernetes.io/projected/401bbef2-684c-4f55-b2c7-e6184c789e40-kube-api-access-mcqn9\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.170224 master-0 kubenswrapper[7479]: I0308 00:21:56.164864 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-systemd\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.170224 master-0 kubenswrapper[7479]: I0308 00:21:56.164891 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-lib-modules\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.170224 master-0 kubenswrapper[7479]: I0308 00:21:56.164938 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-run\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.170224 master-0 kubenswrapper[7479]: I0308 00:21:56.164986 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-var-lib-kubelet\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.170224 master-0 kubenswrapper[7479]: I0308 00:21:56.165109 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-tuned\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.170224 master-0 kubenswrapper[7479]: I0308 00:21:56.165151 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-sysconfig\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.170224 master-0 kubenswrapper[7479]: I0308 00:21:56.165177 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/401bbef2-684c-4f55-b2c7-e6184c789e40-tmp\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.170224 master-0 kubenswrapper[7479]: I0308 00:21:56.165192 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-sysctl-conf\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.170224 master-0 kubenswrapper[7479]: I0308 00:21:56.165270 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-kubernetes\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.170224 master-0 kubenswrapper[7479]: I0308 00:21:56.165303 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-modprobe-d\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.170224 master-0 kubenswrapper[7479]: I0308 00:21:56.165322 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-sys\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.170224 master-0 kubenswrapper[7479]: I0308 00:21:56.169740 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d7d75cbb9-lf8cw" Mar 08 00:21:56.266222 master-0 kubenswrapper[7479]: I0308 00:21:56.266157 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-sysctl-d\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.266222 master-0 kubenswrapper[7479]: I0308 00:21:56.266215 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-host\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.266401 master-0 kubenswrapper[7479]: I0308 00:21:56.266237 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcqn9\" (UniqueName: \"kubernetes.io/projected/401bbef2-684c-4f55-b2c7-e6184c789e40-kube-api-access-mcqn9\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.266401 master-0 kubenswrapper[7479]: I0308 00:21:56.266265 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-systemd\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.266401 master-0 kubenswrapper[7479]: I0308 00:21:56.266288 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-lib-modules\") pod \"tuned-67jx5\" (UID: 
\"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.266401 master-0 kubenswrapper[7479]: I0308 00:21:56.266343 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-systemd\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.266401 master-0 kubenswrapper[7479]: I0308 00:21:56.266388 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-sysctl-d\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.266551 master-0 kubenswrapper[7479]: I0308 00:21:56.266486 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-run\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.267528 master-0 kubenswrapper[7479]: I0308 00:21:56.266587 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-lib-modules\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.267528 master-0 kubenswrapper[7479]: I0308 00:21:56.266612 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-var-lib-kubelet\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " 
pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.267528 master-0 kubenswrapper[7479]: I0308 00:21:56.266830 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-host\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.267528 master-0 kubenswrapper[7479]: I0308 00:21:56.266927 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-var-lib-kubelet\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.267528 master-0 kubenswrapper[7479]: I0308 00:21:56.266987 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-tuned\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.267528 master-0 kubenswrapper[7479]: I0308 00:21:56.267018 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-sysconfig\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.267528 master-0 kubenswrapper[7479]: I0308 00:21:56.267050 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-sysctl-conf\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 
00:21:56.267528 master-0 kubenswrapper[7479]: I0308 00:21:56.267164 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-sysctl-conf\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.267528 master-0 kubenswrapper[7479]: I0308 00:21:56.267262 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-sysconfig\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.267528 master-0 kubenswrapper[7479]: I0308 00:21:56.267285 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/401bbef2-684c-4f55-b2c7-e6184c789e40-tmp\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.267528 master-0 kubenswrapper[7479]: I0308 00:21:56.267308 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-modprobe-d\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.267528 master-0 kubenswrapper[7479]: I0308 00:21:56.267323 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-kubernetes\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.267528 master-0 kubenswrapper[7479]: I0308 
00:21:56.267344 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-sys\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.267528 master-0 kubenswrapper[7479]: I0308 00:21:56.267414 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-modprobe-d\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.267528 master-0 kubenswrapper[7479]: I0308 00:21:56.267469 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-sys\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.267528 master-0 kubenswrapper[7479]: I0308 00:21:56.267492 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-kubernetes\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.268363 master-0 kubenswrapper[7479]: I0308 00:21:56.268314 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-run\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.270265 master-0 kubenswrapper[7479]: I0308 00:21:56.270236 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-tuned\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.271217 master-0 kubenswrapper[7479]: I0308 00:21:56.271157 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/401bbef2-684c-4f55-b2c7-e6184c789e40-tmp\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.284307 master-0 kubenswrapper[7479]: I0308 00:21:56.284268 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcqn9\" (UniqueName: \"kubernetes.io/projected/401bbef2-684c-4f55-b2c7-e6184c789e40-kube-api-access-mcqn9\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.424844 master-0 kubenswrapper[7479]: I0308 00:21:56.424482 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d7d75cbb9-lf8cw"] Mar 08 00:21:56.455474 master-0 kubenswrapper[7479]: I0308 00:21:56.455423 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:21:56.665457 master-0 kubenswrapper[7479]: I0308 00:21:56.662581 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-jfjzg"] Mar 08 00:21:56.665457 master-0 kubenswrapper[7479]: I0308 00:21:56.663227 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-jfjzg" Mar 08 00:21:56.729759 master-0 kubenswrapper[7479]: I0308 00:21:56.729510 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 08 00:21:56.729898 master-0 kubenswrapper[7479]: I0308 00:21:56.729788 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-mgb5v\" (UID: \"5cf5a2ef-2498-40a0-a189-0753076fd3b6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" Mar 08 00:21:56.729898 master-0 kubenswrapper[7479]: I0308 00:21:56.729837 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6d770808-d390-41c1-a9d9-fc12b99fa9a9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-cxs8s\" (UID: \"6d770808-d390-41c1-a9d9-fc12b99fa9a9\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s" Mar 08 00:21:56.729898 master-0 kubenswrapper[7479]: I0308 00:21:56.729871 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-webhook-certs\") pod \"multus-admission-controller-8d675b596-jgdmb\" (UID: \"d7a0bdcc-92f5-41e6-ab47-ee48a5788bac\") " pod="openshift-multus/multus-admission-controller-8d675b596-jgdmb" Mar 08 00:21:56.729898 master-0 kubenswrapper[7479]: I0308 00:21:56.729888 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs\") pod \"network-metrics-daemon-krv7c\" (UID: \"815fd565-0609-4d8f-ac05-8656f198b008\") " 
pod="openshift-multus/network-metrics-daemon-krv7c"
Mar 08 00:21:56.730015 master-0 kubenswrapper[7479]: I0308 00:21:56.729905 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b94acad3-cf4e-443d-80fb-5e68a4074336-srv-cert\") pod \"catalog-operator-7d9c49f57b-8jr6f\" (UID: \"b94acad3-cf4e-443d-80fb-5e68a4074336\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-8jr6f"
Mar 08 00:21:56.730015 master-0 kubenswrapper[7479]: I0308 00:21:56.729927 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-phgxj\" (UID: \"8f71fd39-a16b-47d2-b781-c8ce37bcb9b2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj"
Mar 08 00:21:56.730015 master-0 kubenswrapper[7479]: I0308 00:21:56.729957 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 08 00:21:56.744243 master-0 kubenswrapper[7479]: I0308 00:21:56.730141 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 08 00:21:56.744243 master-0 kubenswrapper[7479]: I0308 00:21:56.730379 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 08 00:21:56.744243 master-0 kubenswrapper[7479]: I0308 00:21:56.729964 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4ad37f40-c533-4a1e-882a-2e0973eff86d-srv-cert\") pod \"olm-operator-d64cfc9db-8qtmf\" (UID: \"4ad37f40-c533-4a1e-882a-2e0973eff86d\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8qtmf"
Mar 08 00:21:56.744243 master-0 kubenswrapper[7479]: I0308 00:21:56.742926 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-mgb5v\" (UID: \"5cf5a2ef-2498-40a0-a189-0753076fd3b6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v"
Mar 08 00:21:56.744858 master-0 kubenswrapper[7479]: I0308 00:21:56.744627 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6d770808-d390-41c1-a9d9-fc12b99fa9a9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-cxs8s\" (UID: \"6d770808-d390-41c1-a9d9-fc12b99fa9a9\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s"
Mar 08 00:21:56.744858 master-0 kubenswrapper[7479]: I0308 00:21:56.744672 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jfjzg"]
Mar 08 00:21:56.752224 master-0 kubenswrapper[7479]: I0308 00:21:56.750870 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs\") pod \"network-metrics-daemon-krv7c\" (UID: \"815fd565-0609-4d8f-ac05-8656f198b008\") " pod="openshift-multus/network-metrics-daemon-krv7c"
Mar 08 00:21:56.752224 master-0 kubenswrapper[7479]: I0308 00:21:56.751652 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4ad37f40-c533-4a1e-882a-2e0973eff86d-srv-cert\") pod \"olm-operator-d64cfc9db-8qtmf\" (UID: \"4ad37f40-c533-4a1e-882a-2e0973eff86d\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8qtmf"
Mar 08 00:21:56.759169 master-0 kubenswrapper[7479]: I0308 00:21:56.758364 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-phgxj\" (UID: \"8f71fd39-a16b-47d2-b781-c8ce37bcb9b2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj"
Mar 08 00:21:56.759169 master-0 kubenswrapper[7479]: I0308 00:21:56.758900 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b94acad3-cf4e-443d-80fb-5e68a4074336-srv-cert\") pod \"catalog-operator-7d9c49f57b-8jr6f\" (UID: \"b94acad3-cf4e-443d-80fb-5e68a4074336\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-8jr6f"
Mar 08 00:21:56.762081 master-0 kubenswrapper[7479]: I0308 00:21:56.761985 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-webhook-certs\") pod \"multus-admission-controller-8d675b596-jgdmb\" (UID: \"d7a0bdcc-92f5-41e6-ab47-ee48a5788bac\") " pod="openshift-multus/multus-admission-controller-8d675b596-jgdmb"
Mar 08 00:21:56.832637 master-0 kubenswrapper[7479]: I0308 00:21:56.832461 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e302bc0b-7560-4f84-813f-d966c2dbe47c-metrics-tls\") pod \"dns-default-jfjzg\" (UID: \"e302bc0b-7560-4f84-813f-d966c2dbe47c\") " pod="openshift-dns/dns-default-jfjzg"
Mar 08 00:21:56.832637 master-0 kubenswrapper[7479]: I0308 00:21:56.832538 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bmgb\" (UniqueName: \"kubernetes.io/projected/e302bc0b-7560-4f84-813f-d966c2dbe47c-kube-api-access-9bmgb\") pod \"dns-default-jfjzg\" (UID: \"e302bc0b-7560-4f84-813f-d966c2dbe47c\") " pod="openshift-dns/dns-default-jfjzg"
Mar 08 00:21:56.832637 master-0 kubenswrapper[7479]: I0308 00:21:56.832559 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e302bc0b-7560-4f84-813f-d966c2dbe47c-config-volume\") pod \"dns-default-jfjzg\" (UID: \"e302bc0b-7560-4f84-813f-d966c2dbe47c\") " pod="openshift-dns/dns-default-jfjzg"
Mar 08 00:21:56.864603 master-0 kubenswrapper[7479]: I0308 00:21:56.863864 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-w5fjg"
Mar 08 00:21:56.902857 master-0 kubenswrapper[7479]: I0308 00:21:56.902187 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_cd9aee54-e935-4841-b135-9000e006b96e/installer/0.log"
Mar 08 00:21:56.903054 master-0 kubenswrapper[7479]: I0308 00:21:56.902852 7479 generic.go:334] "Generic (PLEG): container finished" podID="cd9aee54-e935-4841-b135-9000e006b96e" containerID="9cedbb7466d0d74dc0915aa19c3ce7a37130fdeeffc699adec01959c08687cf7" exitCode=1
Mar 08 00:21:56.905442 master-0 kubenswrapper[7479]: I0308 00:21:56.905285 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"cd9aee54-e935-4841-b135-9000e006b96e","Type":"ContainerDied","Data":"9cedbb7466d0d74dc0915aa19c3ce7a37130fdeeffc699adec01959c08687cf7"}
Mar 08 00:21:56.927720 master-0 kubenswrapper[7479]: I0308 00:21:56.927642 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" event={"ID":"531e9339-968c-47bf-b8ea-c44d9ceef4b3","Type":"ContainerStarted","Data":"e21ecaa295b51fd30f3e30feccdaaffb5d26d81a05305635fb9f903bb9b8a90e"}
Mar 08 00:21:56.944378 master-0 kubenswrapper[7479]: I0308 00:21:56.941849 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bmgb\" (UniqueName: \"kubernetes.io/projected/e302bc0b-7560-4f84-813f-d966c2dbe47c-kube-api-access-9bmgb\") pod \"dns-default-jfjzg\" (UID: \"e302bc0b-7560-4f84-813f-d966c2dbe47c\") " pod="openshift-dns/dns-default-jfjzg"
Mar 08 00:21:56.944378 master-0 kubenswrapper[7479]: I0308 00:21:56.941895 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e302bc0b-7560-4f84-813f-d966c2dbe47c-config-volume\") pod \"dns-default-jfjzg\" (UID: \"e302bc0b-7560-4f84-813f-d966c2dbe47c\") " pod="openshift-dns/dns-default-jfjzg"
Mar 08 00:21:56.944378 master-0 kubenswrapper[7479]: I0308 00:21:56.941960 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e302bc0b-7560-4f84-813f-d966c2dbe47c-metrics-tls\") pod \"dns-default-jfjzg\" (UID: \"e302bc0b-7560-4f84-813f-d966c2dbe47c\") " pod="openshift-dns/dns-default-jfjzg"
Mar 08 00:21:56.944378 master-0 kubenswrapper[7479]: E0308 00:21:56.942131 7479 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 08 00:21:56.944378 master-0 kubenswrapper[7479]: I0308 00:21:56.944257 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e302bc0b-7560-4f84-813f-d966c2dbe47c-config-volume\") pod \"dns-default-jfjzg\" (UID: \"e302bc0b-7560-4f84-813f-d966c2dbe47c\") " pod="openshift-dns/dns-default-jfjzg"
Mar 08 00:21:56.944798 master-0 kubenswrapper[7479]: I0308 00:21:56.944391 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"c1157264-0054-491e-bc65-daf626fc041a","Type":"ContainerStarted","Data":"09ab7ef94ac35a92c2c10eadb681e5e8b177bf4cd4b8eb49914c79b43cd59144"}
Mar 08 00:21:56.945508 master-0 kubenswrapper[7479]: E0308 00:21:56.945303 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e302bc0b-7560-4f84-813f-d966c2dbe47c-metrics-tls podName:e302bc0b-7560-4f84-813f-d966c2dbe47c nodeName:}" failed. No retries permitted until 2026-03-08 00:21:57.442168643 +0000 UTC m=+33.755077560 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e302bc0b-7560-4f84-813f-d966c2dbe47c-metrics-tls") pod "dns-default-jfjzg" (UID: "e302bc0b-7560-4f84-813f-d966c2dbe47c") : secret "dns-default-metrics-tls" not found
Mar 08 00:21:56.959696 master-0 kubenswrapper[7479]: I0308 00:21:56.959656 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d7d75cbb9-lf8cw" event={"ID":"426e2fcf-dfb6-4193-91ae-c6daef6e50b1","Type":"ContainerStarted","Data":"1820ddcea1c4ba97ee92f4393008b3454d248224a8b8de608f40514e7782286d"}
Mar 08 00:21:56.961024 master-0 kubenswrapper[7479]: I0308 00:21:56.961008 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-67jx5" event={"ID":"401bbef2-684c-4f55-b2c7-e6184c789e40","Type":"ContainerStarted","Data":"5c2b1421622aa51b1e3f3309e1cecee04d47b8ec5a2290e918d8137ddcf8b78c"}
Mar 08 00:21:56.961110 master-0 kubenswrapper[7479]: I0308 00:21:56.961098 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-67jx5" event={"ID":"401bbef2-684c-4f55-b2c7-e6184c789e40","Type":"ContainerStarted","Data":"62a62c397b340be942f32a53629ca1820e5ed2199aae4350c1b9148fffbcc52d"}
Mar 08 00:21:56.963076 master-0 kubenswrapper[7479]: I0308 00:21:56.963058 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-589895fbb7-gmvnl" event={"ID":"03f4bafb-c270-428a-bacf-8a424b3d1a05","Type":"ContainerStarted","Data":"f54577a28417110d9e7f61afc4cf54e4382b8b583a37c474d5a4196b61d34559"}
Mar 08 00:21:56.963229 master-0 kubenswrapper[7479]: I0308 00:21:56.963190 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-589895fbb7-gmvnl" event={"ID":"03f4bafb-c270-428a-bacf-8a424b3d1a05","Type":"ContainerStarted","Data":"82710a6421f9ccb619f042e68b9675e392f987444180b3a6a9731863e8381221"}
Mar 08 00:21:56.967601 master-0 kubenswrapper[7479]: I0308 00:21:56.967550 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" event={"ID":"1751db13-b792-43e2-8459-d1d4a0164dfb","Type":"ContainerStarted","Data":"5993f0db8eb571541ffd45db324c8f25d80729c838e2d7b2910b9b88c3eb3de6"}
Mar 08 00:21:56.968430 master-0 kubenswrapper[7479]: I0308 00:21:56.967821 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bmgb\" (UniqueName: \"kubernetes.io/projected/e302bc0b-7560-4f84-813f-d966c2dbe47c-kube-api-access-9bmgb\") pod \"dns-default-jfjzg\" (UID: \"e302bc0b-7560-4f84-813f-d966c2dbe47c\") " pod="openshift-dns/dns-default-jfjzg"
Mar 08 00:21:56.968707 master-0 kubenswrapper[7479]: I0308 00:21:56.968682 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8qtmf"
Mar 08 00:21:56.968853 master-0 kubenswrapper[7479]: I0308 00:21:56.968777 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-2-master-0" podStartSLOduration=3.968767155 podStartE2EDuration="3.968767155s" podCreationTimestamp="2026-03-08 00:21:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:21:56.967307597 +0000 UTC m=+33.280216514" watchObservedRunningTime="2026-03-08 00:21:56.968767155 +0000 UTC m=+33.281676072"
Mar 08 00:21:56.973377 master-0 kubenswrapper[7479]: I0308 00:21:56.973282 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"4217b755-ca87-45cf-9e52-7b2681660f41","Type":"ContainerStarted","Data":"6c847624822fb2ae11b6027b5155999eb848a04181b2d105ba183b9e9a68d9b4"}
Mar 08 00:21:56.973829 master-0 kubenswrapper[7479]: I0308 00:21:56.973812 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v"
Mar 08 00:21:56.974233 master-0 kubenswrapper[7479]: I0308 00:21:56.974193 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-8jr6f"
Mar 08 00:21:56.975419 master-0 kubenswrapper[7479]: I0308 00:21:56.975390 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s"
Mar 08 00:21:56.983842 master-0 kubenswrapper[7479]: I0308 00:21:56.983816 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj"
Mar 08 00:21:56.998854 master-0 kubenswrapper[7479]: I0308 00:21:56.998796 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x" event={"ID":"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b","Type":"ContainerStarted","Data":"01f4711968edd90a03ce566521bccad3babf877143c30f69324972ce8a8bc2ae"}
Mar 08 00:21:56.998854 master-0 kubenswrapper[7479]: I0308 00:21:56.998848 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x" event={"ID":"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b","Type":"ContainerStarted","Data":"2c10faa546580f627c778e91e9b7663017d55077528cad866312878aae39b47a"}
Mar 08 00:21:56.999595 master-0 kubenswrapper[7479]: I0308 00:21:56.999577 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-krv7c"
Mar 08 00:21:56.999912 master-0 kubenswrapper[7479]: I0308 00:21:56.999899 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-8d675b596-jgdmb"
Mar 08 00:21:57.036968 master-0 kubenswrapper[7479]: I0308 00:21:57.035492 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-67jx5" podStartSLOduration=1.035472261 podStartE2EDuration="1.035472261s" podCreationTimestamp="2026-03-08 00:21:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:21:57.034570642 +0000 UTC m=+33.347479559" watchObservedRunningTime="2026-03-08 00:21:57.035472261 +0000 UTC m=+33.348381178"
Mar 08 00:21:57.079257 master-0 kubenswrapper[7479]: I0308 00:21:57.076584 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/installer-1-master-0" podStartSLOduration=12.076570818 podStartE2EDuration="12.076570818s" podCreationTimestamp="2026-03-08 00:21:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:21:57.076114933 +0000 UTC m=+33.389023850" watchObservedRunningTime="2026-03-08 00:21:57.076570818 +0000 UTC m=+33.389479735"
Mar 08 00:21:57.160779 master-0 kubenswrapper[7479]: I0308 00:21:57.147786 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-l9pkr"]
Mar 08 00:21:57.160779 master-0 kubenswrapper[7479]: I0308 00:21:57.148343 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-l9pkr"
Mar 08 00:21:57.249284 master-0 kubenswrapper[7479]: I0308 00:21:57.247837 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1bad9e63-1aa2-44a7-aaf8-a0e82f33ad6e-hosts-file\") pod \"node-resolver-l9pkr\" (UID: \"1bad9e63-1aa2-44a7-aaf8-a0e82f33ad6e\") " pod="openshift-dns/node-resolver-l9pkr"
Mar 08 00:21:57.249284 master-0 kubenswrapper[7479]: I0308 00:21:57.247883 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkh52\" (UniqueName: \"kubernetes.io/projected/1bad9e63-1aa2-44a7-aaf8-a0e82f33ad6e-kube-api-access-gkh52\") pod \"node-resolver-l9pkr\" (UID: \"1bad9e63-1aa2-44a7-aaf8-a0e82f33ad6e\") " pod="openshift-dns/node-resolver-l9pkr"
Mar 08 00:21:57.255119 master-0 kubenswrapper[7479]: I0308 00:21:57.253006 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_cd9aee54-e935-4841-b135-9000e006b96e/installer/0.log"
Mar 08 00:21:57.255119 master-0 kubenswrapper[7479]: I0308 00:21:57.253069 7479 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0"
Mar 08 00:21:57.351858 master-0 kubenswrapper[7479]: I0308 00:21:57.348982 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd9aee54-e935-4841-b135-9000e006b96e-kube-api-access\") pod \"cd9aee54-e935-4841-b135-9000e006b96e\" (UID: \"cd9aee54-e935-4841-b135-9000e006b96e\") "
Mar 08 00:21:57.351858 master-0 kubenswrapper[7479]: I0308 00:21:57.351775 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cd9aee54-e935-4841-b135-9000e006b96e-var-lock\") pod \"cd9aee54-e935-4841-b135-9000e006b96e\" (UID: \"cd9aee54-e935-4841-b135-9000e006b96e\") "
Mar 08 00:21:57.351858 master-0 kubenswrapper[7479]: I0308 00:21:57.351812 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cd9aee54-e935-4841-b135-9000e006b96e-kubelet-dir\") pod \"cd9aee54-e935-4841-b135-9000e006b96e\" (UID: \"cd9aee54-e935-4841-b135-9000e006b96e\") "
Mar 08 00:21:57.354642 master-0 kubenswrapper[7479]: I0308 00:21:57.353218 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cd9aee54-e935-4841-b135-9000e006b96e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "cd9aee54-e935-4841-b135-9000e006b96e" (UID: "cd9aee54-e935-4841-b135-9000e006b96e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:21:57.354642 master-0 kubenswrapper[7479]: I0308 00:21:57.353499 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1bad9e63-1aa2-44a7-aaf8-a0e82f33ad6e-hosts-file\") pod \"node-resolver-l9pkr\" (UID: \"1bad9e63-1aa2-44a7-aaf8-a0e82f33ad6e\") " pod="openshift-dns/node-resolver-l9pkr"
Mar 08 00:21:57.354642 master-0 kubenswrapper[7479]: I0308 00:21:57.353554 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkh52\" (UniqueName: \"kubernetes.io/projected/1bad9e63-1aa2-44a7-aaf8-a0e82f33ad6e-kube-api-access-gkh52\") pod \"node-resolver-l9pkr\" (UID: \"1bad9e63-1aa2-44a7-aaf8-a0e82f33ad6e\") " pod="openshift-dns/node-resolver-l9pkr"
Mar 08 00:21:57.354642 master-0 kubenswrapper[7479]: I0308 00:21:57.353618 7479 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cd9aee54-e935-4841-b135-9000e006b96e-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 08 00:21:57.354642 master-0 kubenswrapper[7479]: I0308 00:21:57.354380 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8qtmf"]
Mar 08 00:21:57.355339 master-0 kubenswrapper[7479]: I0308 00:21:57.355293 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cd9aee54-e935-4841-b135-9000e006b96e-var-lock" (OuterVolumeSpecName: "var-lock") pod "cd9aee54-e935-4841-b135-9000e006b96e" (UID: "cd9aee54-e935-4841-b135-9000e006b96e"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:21:57.356987 master-0 kubenswrapper[7479]: I0308 00:21:57.355874 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1bad9e63-1aa2-44a7-aaf8-a0e82f33ad6e-hosts-file\") pod \"node-resolver-l9pkr\" (UID: \"1bad9e63-1aa2-44a7-aaf8-a0e82f33ad6e\") " pod="openshift-dns/node-resolver-l9pkr"
Mar 08 00:21:57.372121 master-0 kubenswrapper[7479]: I0308 00:21:57.369368 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-8jr6f"]
Mar 08 00:21:57.372643 master-0 kubenswrapper[7479]: I0308 00:21:57.372572 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkh52\" (UniqueName: \"kubernetes.io/projected/1bad9e63-1aa2-44a7-aaf8-a0e82f33ad6e-kube-api-access-gkh52\") pod \"node-resolver-l9pkr\" (UID: \"1bad9e63-1aa2-44a7-aaf8-a0e82f33ad6e\") " pod="openshift-dns/node-resolver-l9pkr"
Mar 08 00:21:57.400097 master-0 kubenswrapper[7479]: I0308 00:21:57.396755 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd9aee54-e935-4841-b135-9000e006b96e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "cd9aee54-e935-4841-b135-9000e006b96e" (UID: "cd9aee54-e935-4841-b135-9000e006b96e"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:21:57.447024 master-0 kubenswrapper[7479]: I0308 00:21:57.446989 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s"]
Mar 08 00:21:57.449266 master-0 kubenswrapper[7479]: I0308 00:21:57.448347 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj"]
Mar 08 00:21:57.455177 master-0 kubenswrapper[7479]: I0308 00:21:57.455139 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e302bc0b-7560-4f84-813f-d966c2dbe47c-metrics-tls\") pod \"dns-default-jfjzg\" (UID: \"e302bc0b-7560-4f84-813f-d966c2dbe47c\") " pod="openshift-dns/dns-default-jfjzg"
Mar 08 00:21:57.456869 master-0 kubenswrapper[7479]: I0308 00:21:57.456846 7479 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd9aee54-e935-4841-b135-9000e006b96e-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 08 00:21:57.456869 master-0 kubenswrapper[7479]: I0308 00:21:57.456869 7479 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cd9aee54-e935-4841-b135-9000e006b96e-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 08 00:21:57.465244 master-0 kubenswrapper[7479]: I0308 00:21:57.465113 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e302bc0b-7560-4f84-813f-d966c2dbe47c-metrics-tls\") pod \"dns-default-jfjzg\" (UID: \"e302bc0b-7560-4f84-813f-d966c2dbe47c\") " pod="openshift-dns/dns-default-jfjzg"
Mar 08 00:21:57.499661 master-0 kubenswrapper[7479]: I0308 00:21:57.488017 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-jgdmb"]
Mar 08 00:21:57.521095 master-0 kubenswrapper[7479]: I0308 00:21:57.519553 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-l9pkr"
Mar 08 00:21:57.521095 master-0 kubenswrapper[7479]: I0308 00:21:57.521002 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v"]
Mar 08 00:21:57.531538 master-0 kubenswrapper[7479]: W0308 00:21:57.530330 7479 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7a0bdcc_92f5_41e6_ab47_ee48a5788bac.slice/crio-85f5347214316bafcae54d5f353ea4dd103edcad8e44bd59e26d7ef740d7221a WatchSource:0}: Error finding container 85f5347214316bafcae54d5f353ea4dd103edcad8e44bd59e26d7ef740d7221a: Status 404 returned error can't find the container with id 85f5347214316bafcae54d5f353ea4dd103edcad8e44bd59e26d7ef740d7221a
Mar 08 00:21:57.549662 master-0 kubenswrapper[7479]: I0308 00:21:57.549562 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-krv7c"]
Mar 08 00:21:57.573142 master-0 kubenswrapper[7479]: W0308 00:21:57.573112 7479 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bad9e63_1aa2_44a7_aaf8_a0e82f33ad6e.slice/crio-f8120e57311950fccd1253a23002276e099126c35ade58bd1fc3115f27615d8d WatchSource:0}: Error finding container f8120e57311950fccd1253a23002276e099126c35ade58bd1fc3115f27615d8d: Status 404 returned error can't find the container with id f8120e57311950fccd1253a23002276e099126c35ade58bd1fc3115f27615d8d
Mar 08 00:21:57.671647 master-0 kubenswrapper[7479]: I0308 00:21:57.671606 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jfjzg"
Mar 08 00:21:57.902572 master-0 kubenswrapper[7479]: I0308 00:21:57.902269 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jfjzg"]
Mar 08 00:21:57.925388 master-0 kubenswrapper[7479]: W0308 00:21:57.925335 7479 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode302bc0b_7560_4f84_813f_d966c2dbe47c.slice/crio-4e0af367cee5aa7ace0374f562c3ebde99ff63afaf075a5612625be33276de36 WatchSource:0}: Error finding container 4e0af367cee5aa7ace0374f562c3ebde99ff63afaf075a5612625be33276de36: Status 404 returned error can't find the container with id 4e0af367cee5aa7ace0374f562c3ebde99ff63afaf075a5612625be33276de36
Mar 08 00:21:58.014307 master-0 kubenswrapper[7479]: I0308 00:21:58.008281 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-8jr6f" event={"ID":"b94acad3-cf4e-443d-80fb-5e68a4074336","Type":"ContainerStarted","Data":"cd06e32b994481471c1008a22765ea8fb7d4c0eac4c1085f974725068e466db7"}
Mar 08 00:21:58.014307 master-0 kubenswrapper[7479]: I0308 00:21:58.011669 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8qtmf" event={"ID":"4ad37f40-c533-4a1e-882a-2e0973eff86d","Type":"ContainerStarted","Data":"036c8d5e00b57ec77b752ae2bc46eb3d7ff2904d9ebc488665656ab787ecd5a5"}
Mar 08 00:21:58.014307 master-0 kubenswrapper[7479]: I0308 00:21:58.012844 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jfjzg" event={"ID":"e302bc0b-7560-4f84-813f-d966c2dbe47c","Type":"ContainerStarted","Data":"4e0af367cee5aa7ace0374f562c3ebde99ff63afaf075a5612625be33276de36"}
Mar 08 00:21:58.023226 master-0 kubenswrapper[7479]: I0308 00:21:58.016538 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_cd9aee54-e935-4841-b135-9000e006b96e/installer/0.log"
Mar 08 00:21:58.023226 master-0 kubenswrapper[7479]: I0308 00:21:58.016626 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"cd9aee54-e935-4841-b135-9000e006b96e","Type":"ContainerDied","Data":"e0cd8a5f26c892bce582feeed10ef56c6636a20e1780173c2a48d551701ad3e7"}
Mar 08 00:21:58.023226 master-0 kubenswrapper[7479]: I0308 00:21:58.016682 7479 scope.go:117] "RemoveContainer" containerID="9cedbb7466d0d74dc0915aa19c3ce7a37130fdeeffc699adec01959c08687cf7"
Mar 08 00:21:58.023226 master-0 kubenswrapper[7479]: I0308 00:21:58.016786 7479 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0"
Mar 08 00:21:58.027433 master-0 kubenswrapper[7479]: I0308 00:21:58.023628 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-jgdmb" event={"ID":"d7a0bdcc-92f5-41e6-ab47-ee48a5788bac","Type":"ContainerStarted","Data":"85f5347214316bafcae54d5f353ea4dd103edcad8e44bd59e26d7ef740d7221a"}
Mar 08 00:21:58.027433 master-0 kubenswrapper[7479]: I0308 00:21:58.024966 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s" event={"ID":"6d770808-d390-41c1-a9d9-fc12b99fa9a9","Type":"ContainerStarted","Data":"c0511cfa10b44562c51d17ac29eccf8315f318be9fcd77f37c978f1bbeeb8000"}
Mar 08 00:21:58.027433 master-0 kubenswrapper[7479]: I0308 00:21:58.026862 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" event={"ID":"5cf5a2ef-2498-40a0-a189-0753076fd3b6","Type":"ContainerStarted","Data":"a2af0127ad556015336cd256817276cc9d6a8a08dbbf295a1bf7821d7309d19c"}
Mar 08 00:21:58.027829 master-0 kubenswrapper[7479]: I0308 00:21:58.027797 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-krv7c" event={"ID":"815fd565-0609-4d8f-ac05-8656f198b008","Type":"ContainerStarted","Data":"874da80b3858b9b5a8a2258c3b83f19f5f0c80010ec82d07a7dc18d61c4292fa"}
Mar 08 00:21:58.030628 master-0 kubenswrapper[7479]: I0308 00:21:58.030586 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj" event={"ID":"8f71fd39-a16b-47d2-b781-c8ce37bcb9b2","Type":"ContainerStarted","Data":"624e0a9861955168af6025f0fb5bf70d719c984169b8149f4ff044bbd9836cbd"}
Mar 08 00:21:58.030628 master-0 kubenswrapper[7479]: I0308 00:21:58.030618 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj" event={"ID":"8f71fd39-a16b-47d2-b781-c8ce37bcb9b2","Type":"ContainerStarted","Data":"2e47d8d2ffbca29135c63c0ec58db9d105e81fa73da896958637e9f0815629eb"}
Mar 08 00:21:58.033470 master-0 kubenswrapper[7479]: I0308 00:21:58.033418 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-l9pkr" event={"ID":"1bad9e63-1aa2-44a7-aaf8-a0e82f33ad6e","Type":"ContainerStarted","Data":"3605042a4617c2c40734abc105f258640f2a7e54de619be892818b141fe2a62d"}
Mar 08 00:21:58.033569 master-0 kubenswrapper[7479]: I0308 00:21:58.033484 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-l9pkr" event={"ID":"1bad9e63-1aa2-44a7-aaf8-a0e82f33ad6e","Type":"ContainerStarted","Data":"f8120e57311950fccd1253a23002276e099126c35ade58bd1fc3115f27615d8d"}
Mar 08 00:21:58.045814 master-0 kubenswrapper[7479]: I0308 00:21:58.045778 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"]
Mar 08 00:21:58.051507 master-0 kubenswrapper[7479]: I0308 00:21:58.051480 7479 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"]
Mar 08 00:21:58.064280 master-0 kubenswrapper[7479]: I0308 00:21:58.061504 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-l9pkr" podStartSLOduration=1.061494278 podStartE2EDuration="1.061494278s" podCreationTimestamp="2026-03-08 00:21:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:21:58.061001312 +0000 UTC m=+34.373910229" watchObservedRunningTime="2026-03-08 00:21:58.061494278 +0000 UTC m=+34.374403215"
Mar 08 00:21:59.896320 master-0 kubenswrapper[7479]: I0308 00:21:59.895450 7479 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd9aee54-e935-4841-b135-9000e006b96e" path="/var/lib/kubelet/pods/cd9aee54-e935-4841-b135-9000e006b96e/volumes"
Mar 08 00:21:59.917562 master-0 kubenswrapper[7479]: I0308 00:21:59.917492 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"]
Mar 08 00:21:59.917881 master-0 kubenswrapper[7479]: I0308 00:21:59.917679 7479 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-2-master-0" podUID="c1157264-0054-491e-bc65-daf626fc041a" containerName="installer" containerID="cri-o://09ab7ef94ac35a92c2c10eadb681e5e8b177bf4cd4b8eb49914c79b43cd59144" gracePeriod=30
Mar 08 00:22:02.849417 master-0 kubenswrapper[7479]: I0308 00:22:02.849323 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"]
Mar 08 00:22:02.849977 master-0 kubenswrapper[7479]: E0308 00:22:02.849526 7479 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd9aee54-e935-4841-b135-9000e006b96e" containerName="installer"
Mar 08 00:22:02.849977 master-0 kubenswrapper[7479]: I0308 00:22:02.849538 7479 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd9aee54-e935-4841-b135-9000e006b96e" containerName="installer"
Mar 08 00:22:02.849977 master-0 kubenswrapper[7479]: I0308 00:22:02.849598 7479 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd9aee54-e935-4841-b135-9000e006b96e" containerName="installer"
Mar 08 00:22:02.849977 master-0 kubenswrapper[7479]: I0308 00:22:02.849866 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0"
Mar 08 00:22:02.895275 master-0 kubenswrapper[7479]: I0308 00:22:02.892968 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bb5babcb-5fef-44dc-b9e1-87e09a2b31c4-var-lock\") pod \"installer-3-master-0\" (UID: \"bb5babcb-5fef-44dc-b9e1-87e09a2b31c4\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 08 00:22:02.895275 master-0 kubenswrapper[7479]: I0308 00:22:02.893017 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bb5babcb-5fef-44dc-b9e1-87e09a2b31c4-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"bb5babcb-5fef-44dc-b9e1-87e09a2b31c4\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 08 00:22:02.895275 master-0 kubenswrapper[7479]: I0308 00:22:02.893050 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb5babcb-5fef-44dc-b9e1-87e09a2b31c4-kube-api-access\") pod \"installer-3-master-0\" (UID: \"bb5babcb-5fef-44dc-b9e1-87e09a2b31c4\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 08 00:22:02.932313 master-0 kubenswrapper[7479]: I0308 00:22:02.932264 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"]
Mar 08 00:22:02.993682 master-0 kubenswrapper[7479]: I0308 00:22:02.993606 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bb5babcb-5fef-44dc-b9e1-87e09a2b31c4-var-lock\") pod \"installer-3-master-0\" (UID: \"bb5babcb-5fef-44dc-b9e1-87e09a2b31c4\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 08 00:22:02.993682 master-0 kubenswrapper[7479]: I0308 00:22:02.993653 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bb5babcb-5fef-44dc-b9e1-87e09a2b31c4-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"bb5babcb-5fef-44dc-b9e1-87e09a2b31c4\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 08 00:22:02.993682 master-0 kubenswrapper[7479]: I0308 00:22:02.993704 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb5babcb-5fef-44dc-b9e1-87e09a2b31c4-kube-api-access\") pod \"installer-3-master-0\" (UID: \"bb5babcb-5fef-44dc-b9e1-87e09a2b31c4\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 08 00:22:02.993952 master-0 kubenswrapper[7479]: I0308 00:22:02.993745 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bb5babcb-5fef-44dc-b9e1-87e09a2b31c4-var-lock\") pod \"installer-3-master-0\" (UID: \"bb5babcb-5fef-44dc-b9e1-87e09a2b31c4\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 08 00:22:02.993952 master-0 kubenswrapper[7479]: I0308 00:22:02.993809 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bb5babcb-5fef-44dc-b9e1-87e09a2b31c4-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"bb5babcb-5fef-44dc-b9e1-87e09a2b31c4\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 08 00:22:03.123656 master-0 kubenswrapper[7479]: I0308 00:22:03.122799 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs"]
Mar 08 00:22:03.123656 master-0 kubenswrapper[7479]: I0308 00:22:03.123555 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs"
Mar 08 00:22:03.125453 master-0 kubenswrapper[7479]: I0308 00:22:03.125414 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt"
Mar 08 00:22:03.126915 master-0 kubenswrapper[7479]: I0308 00:22:03.126880 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt"
Mar 08 00:22:03.130613 master-0 kubenswrapper[7479]: I0308 00:22:03.130576 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle"
Mar 08 00:22:03.196594 master-0 kubenswrapper[7479]: I0308 00:22:03.196537 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/1bb8fea7-71ca-43a3-839d-9c1459bf8dfa-etc-docker\") pod \"operator-controller-controller-manager-6598bfb6c4-7nhvs\" (UID: \"1bb8fea7-71ca-43a3-839d-9c1459bf8dfa\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs"
Mar 08 00:22:03.196758 master-0 kubenswrapper[7479]: I0308 00:22:03.196600 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh2h6\" (UniqueName: \"kubernetes.io/projected/1bb8fea7-71ca-43a3-839d-9c1459bf8dfa-kube-api-access-gh2h6\") pod \"operator-controller-controller-manager-6598bfb6c4-7nhvs\" (UID: \"1bb8fea7-71ca-43a3-839d-9c1459bf8dfa\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs"
Mar 08 00:22:03.196758 master-0 kubenswrapper[7479]: I0308 00:22:03.196666 7479 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/1bb8fea7-71ca-43a3-839d-9c1459bf8dfa-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-7nhvs\" (UID: \"1bb8fea7-71ca-43a3-839d-9c1459bf8dfa\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" Mar 08 00:22:03.196758 master-0 kubenswrapper[7479]: I0308 00:22:03.196693 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1bb8fea7-71ca-43a3-839d-9c1459bf8dfa-cache\") pod \"operator-controller-controller-manager-6598bfb6c4-7nhvs\" (UID: \"1bb8fea7-71ca-43a3-839d-9c1459bf8dfa\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" Mar 08 00:22:03.196886 master-0 kubenswrapper[7479]: I0308 00:22:03.196758 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/1bb8fea7-71ca-43a3-839d-9c1459bf8dfa-ca-certs\") pod \"operator-controller-controller-manager-6598bfb6c4-7nhvs\" (UID: \"1bb8fea7-71ca-43a3-839d-9c1459bf8dfa\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" Mar 08 00:22:03.304906 master-0 kubenswrapper[7479]: I0308 00:22:03.304848 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh2h6\" (UniqueName: \"kubernetes.io/projected/1bb8fea7-71ca-43a3-839d-9c1459bf8dfa-kube-api-access-gh2h6\") pod \"operator-controller-controller-manager-6598bfb6c4-7nhvs\" (UID: \"1bb8fea7-71ca-43a3-839d-9c1459bf8dfa\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" Mar 08 00:22:03.305134 master-0 kubenswrapper[7479]: I0308 00:22:03.304918 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-containers\" (UniqueName: \"kubernetes.io/host-path/1bb8fea7-71ca-43a3-839d-9c1459bf8dfa-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-7nhvs\" (UID: \"1bb8fea7-71ca-43a3-839d-9c1459bf8dfa\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" Mar 08 00:22:03.305134 master-0 kubenswrapper[7479]: I0308 00:22:03.304941 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1bb8fea7-71ca-43a3-839d-9c1459bf8dfa-cache\") pod \"operator-controller-controller-manager-6598bfb6c4-7nhvs\" (UID: \"1bb8fea7-71ca-43a3-839d-9c1459bf8dfa\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" Mar 08 00:22:03.305134 master-0 kubenswrapper[7479]: I0308 00:22:03.304958 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/1bb8fea7-71ca-43a3-839d-9c1459bf8dfa-ca-certs\") pod \"operator-controller-controller-manager-6598bfb6c4-7nhvs\" (UID: \"1bb8fea7-71ca-43a3-839d-9c1459bf8dfa\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" Mar 08 00:22:03.305134 master-0 kubenswrapper[7479]: I0308 00:22:03.304998 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/1bb8fea7-71ca-43a3-839d-9c1459bf8dfa-etc-docker\") pod \"operator-controller-controller-manager-6598bfb6c4-7nhvs\" (UID: \"1bb8fea7-71ca-43a3-839d-9c1459bf8dfa\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" Mar 08 00:22:03.305406 master-0 kubenswrapper[7479]: I0308 00:22:03.305140 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/1bb8fea7-71ca-43a3-839d-9c1459bf8dfa-etc-docker\") pod 
\"operator-controller-controller-manager-6598bfb6c4-7nhvs\" (UID: \"1bb8fea7-71ca-43a3-839d-9c1459bf8dfa\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" Mar 08 00:22:03.305809 master-0 kubenswrapper[7479]: I0308 00:22:03.305646 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/1bb8fea7-71ca-43a3-839d-9c1459bf8dfa-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-7nhvs\" (UID: \"1bb8fea7-71ca-43a3-839d-9c1459bf8dfa\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" Mar 08 00:22:03.306122 master-0 kubenswrapper[7479]: I0308 00:22:03.305932 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1bb8fea7-71ca-43a3-839d-9c1459bf8dfa-cache\") pod \"operator-controller-controller-manager-6598bfb6c4-7nhvs\" (UID: \"1bb8fea7-71ca-43a3-839d-9c1459bf8dfa\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" Mar 08 00:22:03.309932 master-0 kubenswrapper[7479]: I0308 00:22:03.309564 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs"] Mar 08 00:22:03.311365 master-0 kubenswrapper[7479]: I0308 00:22:03.310864 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/1bb8fea7-71ca-43a3-839d-9c1459bf8dfa-ca-certs\") pod \"operator-controller-controller-manager-6598bfb6c4-7nhvs\" (UID: \"1bb8fea7-71ca-43a3-839d-9c1459bf8dfa\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" Mar 08 00:22:03.402086 master-0 kubenswrapper[7479]: I0308 00:22:03.401699 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q"] Mar 
08 00:22:03.404238 master-0 kubenswrapper[7479]: I0308 00:22:03.402714 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb5babcb-5fef-44dc-b9e1-87e09a2b31c4-kube-api-access\") pod \"installer-3-master-0\" (UID: \"bb5babcb-5fef-44dc-b9e1-87e09a2b31c4\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 08 00:22:03.404238 master-0 kubenswrapper[7479]: I0308 00:22:03.403225 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" Mar 08 00:22:03.408224 master-0 kubenswrapper[7479]: I0308 00:22:03.405521 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Mar 08 00:22:03.408224 master-0 kubenswrapper[7479]: I0308 00:22:03.406764 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/d01c21a1-6c2c-49a7-9d85-254662851838-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-w2q2q\" (UID: \"d01c21a1-6c2c-49a7-9d85-254662851838\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" Mar 08 00:22:03.408224 master-0 kubenswrapper[7479]: I0308 00:22:03.406800 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/d01c21a1-6c2c-49a7-9d85-254662851838-ca-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-w2q2q\" (UID: \"d01c21a1-6c2c-49a7-9d85-254662851838\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" Mar 08 00:22:03.408224 master-0 kubenswrapper[7479]: I0308 00:22:03.406831 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/d01c21a1-6c2c-49a7-9d85-254662851838-catalogserver-certs\") pod 
\"catalogd-controller-manager-7f8b8b6f4c-w2q2q\" (UID: \"d01c21a1-6c2c-49a7-9d85-254662851838\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" Mar 08 00:22:03.408224 master-0 kubenswrapper[7479]: I0308 00:22:03.406860 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d01c21a1-6c2c-49a7-9d85-254662851838-cache\") pod \"catalogd-controller-manager-7f8b8b6f4c-w2q2q\" (UID: \"d01c21a1-6c2c-49a7-9d85-254662851838\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" Mar 08 00:22:03.408224 master-0 kubenswrapper[7479]: I0308 00:22:03.406890 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/d01c21a1-6c2c-49a7-9d85-254662851838-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-w2q2q\" (UID: \"d01c21a1-6c2c-49a7-9d85-254662851838\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" Mar 08 00:22:03.408224 master-0 kubenswrapper[7479]: I0308 00:22:03.406919 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt9pm\" (UniqueName: \"kubernetes.io/projected/d01c21a1-6c2c-49a7-9d85-254662851838-kube-api-access-rt9pm\") pod \"catalogd-controller-manager-7f8b8b6f4c-w2q2q\" (UID: \"d01c21a1-6c2c-49a7-9d85-254662851838\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" Mar 08 00:22:03.408224 master-0 kubenswrapper[7479]: I0308 00:22:03.407974 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Mar 08 00:22:03.408608 master-0 kubenswrapper[7479]: I0308 00:22:03.408498 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Mar 08 00:22:03.419187 master-0 kubenswrapper[7479]: I0308 00:22:03.410927 
7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q"] Mar 08 00:22:03.419187 master-0 kubenswrapper[7479]: I0308 00:22:03.412027 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh2h6\" (UniqueName: \"kubernetes.io/projected/1bb8fea7-71ca-43a3-839d-9c1459bf8dfa-kube-api-access-gh2h6\") pod \"operator-controller-controller-manager-6598bfb6c4-7nhvs\" (UID: \"1bb8fea7-71ca-43a3-839d-9c1459bf8dfa\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" Mar 08 00:22:03.422103 master-0 kubenswrapper[7479]: I0308 00:22:03.422071 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle" Mar 08 00:22:03.437649 master-0 kubenswrapper[7479]: I0308 00:22:03.437609 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" Mar 08 00:22:03.485222 master-0 kubenswrapper[7479]: I0308 00:22:03.485137 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Mar 08 00:22:03.508371 master-0 kubenswrapper[7479]: I0308 00:22:03.508314 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/d01c21a1-6c2c-49a7-9d85-254662851838-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-w2q2q\" (UID: \"d01c21a1-6c2c-49a7-9d85-254662851838\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" Mar 08 00:22:03.508371 master-0 kubenswrapper[7479]: I0308 00:22:03.508372 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/d01c21a1-6c2c-49a7-9d85-254662851838-ca-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-w2q2q\" (UID: \"d01c21a1-6c2c-49a7-9d85-254662851838\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" Mar 08 00:22:03.508575 master-0 kubenswrapper[7479]: I0308 00:22:03.508458 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/d01c21a1-6c2c-49a7-9d85-254662851838-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-w2q2q\" (UID: \"d01c21a1-6c2c-49a7-9d85-254662851838\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" Mar 08 00:22:03.508575 master-0 kubenswrapper[7479]: I0308 00:22:03.508527 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/d01c21a1-6c2c-49a7-9d85-254662851838-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-w2q2q\" (UID: \"d01c21a1-6c2c-49a7-9d85-254662851838\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" Mar 08 00:22:03.508575 master-0 kubenswrapper[7479]: I0308 00:22:03.508552 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/d01c21a1-6c2c-49a7-9d85-254662851838-cache\") pod \"catalogd-controller-manager-7f8b8b6f4c-w2q2q\" (UID: \"d01c21a1-6c2c-49a7-9d85-254662851838\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" Mar 08 00:22:03.508575 master-0 kubenswrapper[7479]: I0308 00:22:03.508572 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/d01c21a1-6c2c-49a7-9d85-254662851838-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-w2q2q\" (UID: \"d01c21a1-6c2c-49a7-9d85-254662851838\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" Mar 08 00:22:03.508676 master-0 kubenswrapper[7479]: I0308 00:22:03.508588 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt9pm\" (UniqueName: \"kubernetes.io/projected/d01c21a1-6c2c-49a7-9d85-254662851838-kube-api-access-rt9pm\") pod \"catalogd-controller-manager-7f8b8b6f4c-w2q2q\" (UID: \"d01c21a1-6c2c-49a7-9d85-254662851838\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" Mar 08 00:22:03.508941 master-0 kubenswrapper[7479]: I0308 00:22:03.508924 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/d01c21a1-6c2c-49a7-9d85-254662851838-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-w2q2q\" (UID: \"d01c21a1-6c2c-49a7-9d85-254662851838\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" Mar 08 00:22:03.509278 master-0 kubenswrapper[7479]: I0308 00:22:03.509259 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d01c21a1-6c2c-49a7-9d85-254662851838-cache\") pod \"catalogd-controller-manager-7f8b8b6f4c-w2q2q\" (UID: \"d01c21a1-6c2c-49a7-9d85-254662851838\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" Mar 08 
00:22:03.515024 master-0 kubenswrapper[7479]: I0308 00:22:03.514986 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/d01c21a1-6c2c-49a7-9d85-254662851838-ca-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-w2q2q\" (UID: \"d01c21a1-6c2c-49a7-9d85-254662851838\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" Mar 08 00:22:03.515891 master-0 kubenswrapper[7479]: I0308 00:22:03.515858 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/d01c21a1-6c2c-49a7-9d85-254662851838-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-w2q2q\" (UID: \"d01c21a1-6c2c-49a7-9d85-254662851838\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" Mar 08 00:22:03.523764 master-0 kubenswrapper[7479]: I0308 00:22:03.523721 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt9pm\" (UniqueName: \"kubernetes.io/projected/d01c21a1-6c2c-49a7-9d85-254662851838-kube-api-access-rt9pm\") pod \"catalogd-controller-manager-7f8b8b6f4c-w2q2q\" (UID: \"d01c21a1-6c2c-49a7-9d85-254662851838\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" Mar 08 00:22:03.762211 master-0 kubenswrapper[7479]: I0308 00:22:03.762171 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" Mar 08 00:22:06.051766 master-0 kubenswrapper[7479]: I0308 00:22:06.051726 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 08 00:22:06.052430 master-0 kubenswrapper[7479]: I0308 00:22:06.052342 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 08 00:22:06.055058 master-0 kubenswrapper[7479]: I0308 00:22:06.054992 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 08 00:22:06.063426 master-0 kubenswrapper[7479]: I0308 00:22:06.063365 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 08 00:22:06.078415 master-0 kubenswrapper[7479]: I0308 00:22:06.078379 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_c1157264-0054-491e-bc65-daf626fc041a/installer/0.log" Mar 08 00:22:06.078555 master-0 kubenswrapper[7479]: I0308 00:22:06.078421 7479 generic.go:334] "Generic (PLEG): container finished" podID="c1157264-0054-491e-bc65-daf626fc041a" containerID="09ab7ef94ac35a92c2c10eadb681e5e8b177bf4cd4b8eb49914c79b43cd59144" exitCode=1 Mar 08 00:22:06.078555 master-0 kubenswrapper[7479]: I0308 00:22:06.078447 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"c1157264-0054-491e-bc65-daf626fc041a","Type":"ContainerDied","Data":"09ab7ef94ac35a92c2c10eadb681e5e8b177bf4cd4b8eb49914c79b43cd59144"} Mar 08 00:22:06.243262 master-0 kubenswrapper[7479]: I0308 00:22:06.243220 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ada20442-bff5-477c-989e-3d921f5ede5e-kube-api-access\") pod \"installer-1-master-0\" (UID: \"ada20442-bff5-477c-989e-3d921f5ede5e\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 08 00:22:06.243435 master-0 kubenswrapper[7479]: I0308 00:22:06.243279 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/ada20442-bff5-477c-989e-3d921f5ede5e-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"ada20442-bff5-477c-989e-3d921f5ede5e\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 08 00:22:06.243435 master-0 kubenswrapper[7479]: I0308 00:22:06.243350 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ada20442-bff5-477c-989e-3d921f5ede5e-var-lock\") pod \"installer-1-master-0\" (UID: \"ada20442-bff5-477c-989e-3d921f5ede5e\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 08 00:22:06.344346 master-0 kubenswrapper[7479]: I0308 00:22:06.344230 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ada20442-bff5-477c-989e-3d921f5ede5e-kube-api-access\") pod \"installer-1-master-0\" (UID: \"ada20442-bff5-477c-989e-3d921f5ede5e\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 08 00:22:06.344346 master-0 kubenswrapper[7479]: I0308 00:22:06.344292 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ada20442-bff5-477c-989e-3d921f5ede5e-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"ada20442-bff5-477c-989e-3d921f5ede5e\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 08 00:22:06.344564 master-0 kubenswrapper[7479]: I0308 00:22:06.344441 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ada20442-bff5-477c-989e-3d921f5ede5e-var-lock\") pod \"installer-1-master-0\" (UID: \"ada20442-bff5-477c-989e-3d921f5ede5e\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 08 00:22:06.344564 master-0 kubenswrapper[7479]: I0308 00:22:06.344518 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-lock\" (UniqueName: \"kubernetes.io/host-path/ada20442-bff5-477c-989e-3d921f5ede5e-var-lock\") pod \"installer-1-master-0\" (UID: \"ada20442-bff5-477c-989e-3d921f5ede5e\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 08 00:22:06.344564 master-0 kubenswrapper[7479]: I0308 00:22:06.344555 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ada20442-bff5-477c-989e-3d921f5ede5e-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"ada20442-bff5-477c-989e-3d921f5ede5e\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 08 00:22:06.363799 master-0 kubenswrapper[7479]: I0308 00:22:06.363699 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ada20442-bff5-477c-989e-3d921f5ede5e-kube-api-access\") pod \"installer-1-master-0\" (UID: \"ada20442-bff5-477c-989e-3d921f5ede5e\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 08 00:22:06.381103 master-0 kubenswrapper[7479]: I0308 00:22:06.378470 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 08 00:22:06.829925 master-0 kubenswrapper[7479]: I0308 00:22:06.829666 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d7d75cbb9-lf8cw"] Mar 08 00:22:08.678321 master-0 kubenswrapper[7479]: I0308 00:22:08.676124 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Mar 08 00:22:08.678321 master-0 kubenswrapper[7479]: I0308 00:22:08.676738 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 08 00:22:08.680533 master-0 kubenswrapper[7479]: I0308 00:22:08.680508 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 08 00:22:08.690600 master-0 kubenswrapper[7479]: I0308 00:22:08.690554 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Mar 08 00:22:08.875110 master-0 kubenswrapper[7479]: I0308 00:22:08.875050 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/55216a56-677a-4f28-a530-77d44bded8a2-var-lock\") pod \"installer-1-master-0\" (UID: \"55216a56-677a-4f28-a530-77d44bded8a2\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 08 00:22:08.875110 master-0 kubenswrapper[7479]: I0308 00:22:08.875116 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/55216a56-677a-4f28-a530-77d44bded8a2-kube-api-access\") pod \"installer-1-master-0\" (UID: \"55216a56-677a-4f28-a530-77d44bded8a2\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 08 00:22:08.875390 master-0 kubenswrapper[7479]: I0308 00:22:08.875139 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/55216a56-677a-4f28-a530-77d44bded8a2-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"55216a56-677a-4f28-a530-77d44bded8a2\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 08 00:22:08.976227 master-0 kubenswrapper[7479]: I0308 00:22:08.976093 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/55216a56-677a-4f28-a530-77d44bded8a2-var-lock\") pod \"installer-1-master-0\" (UID: 
\"55216a56-677a-4f28-a530-77d44bded8a2\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 08 00:22:08.976227 master-0 kubenswrapper[7479]: I0308 00:22:08.976143 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/55216a56-677a-4f28-a530-77d44bded8a2-kube-api-access\") pod \"installer-1-master-0\" (UID: \"55216a56-677a-4f28-a530-77d44bded8a2\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 08 00:22:08.976459 master-0 kubenswrapper[7479]: I0308 00:22:08.976246 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/55216a56-677a-4f28-a530-77d44bded8a2-var-lock\") pod \"installer-1-master-0\" (UID: \"55216a56-677a-4f28-a530-77d44bded8a2\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 08 00:22:08.976459 master-0 kubenswrapper[7479]: I0308 00:22:08.976337 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/55216a56-677a-4f28-a530-77d44bded8a2-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"55216a56-677a-4f28-a530-77d44bded8a2\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 08 00:22:08.976521 master-0 kubenswrapper[7479]: I0308 00:22:08.976493 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/55216a56-677a-4f28-a530-77d44bded8a2-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"55216a56-677a-4f28-a530-77d44bded8a2\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 08 00:22:09.020918 master-0 kubenswrapper[7479]: I0308 00:22:09.020862 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/55216a56-677a-4f28-a530-77d44bded8a2-kube-api-access\") pod \"installer-1-master-0\" (UID: \"55216a56-677a-4f28-a530-77d44bded8a2\") " 
pod="openshift-kube-apiserver/installer-1-master-0" Mar 08 00:22:09.296861 master-0 kubenswrapper[7479]: I0308 00:22:09.296810 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 08 00:22:09.940818 master-0 kubenswrapper[7479]: I0308 00:22:09.940709 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Mar 08 00:22:10.501621 master-0 kubenswrapper[7479]: I0308 00:22:10.501556 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq"] Mar 08 00:22:10.502006 master-0 kubenswrapper[7479]: I0308 00:22:10.501900 7479 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq" podUID="32c19760-2cb2-4690-be8e-cba3c517c60e" containerName="cluster-version-operator" containerID="cri-o://34b1b37983d46a115f3d0ba8af513faf4d5ef94e2619ccb2e671c100b6fff876" gracePeriod=130 Mar 08 00:22:10.741399 master-0 kubenswrapper[7479]: I0308 00:22:10.740985 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_c1157264-0054-491e-bc65-daf626fc041a/installer/0.log" Mar 08 00:22:10.741399 master-0 kubenswrapper[7479]: I0308 00:22:10.741055 7479 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Mar 08 00:22:10.906996 master-0 kubenswrapper[7479]: I0308 00:22:10.899282 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c1157264-0054-491e-bc65-daf626fc041a-kubelet-dir\") pod \"c1157264-0054-491e-bc65-daf626fc041a\" (UID: \"c1157264-0054-491e-bc65-daf626fc041a\") " Mar 08 00:22:10.906996 master-0 kubenswrapper[7479]: I0308 00:22:10.899363 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c1157264-0054-491e-bc65-daf626fc041a-kube-api-access\") pod \"c1157264-0054-491e-bc65-daf626fc041a\" (UID: \"c1157264-0054-491e-bc65-daf626fc041a\") " Mar 08 00:22:10.906996 master-0 kubenswrapper[7479]: I0308 00:22:10.899436 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c1157264-0054-491e-bc65-daf626fc041a-var-lock\") pod \"c1157264-0054-491e-bc65-daf626fc041a\" (UID: \"c1157264-0054-491e-bc65-daf626fc041a\") " Mar 08 00:22:10.906996 master-0 kubenswrapper[7479]: I0308 00:22:10.899612 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c1157264-0054-491e-bc65-daf626fc041a-var-lock" (OuterVolumeSpecName: "var-lock") pod "c1157264-0054-491e-bc65-daf626fc041a" (UID: "c1157264-0054-491e-bc65-daf626fc041a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:22:10.906996 master-0 kubenswrapper[7479]: I0308 00:22:10.899639 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c1157264-0054-491e-bc65-daf626fc041a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c1157264-0054-491e-bc65-daf626fc041a" (UID: "c1157264-0054-491e-bc65-daf626fc041a"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:22:10.938368 master-0 kubenswrapper[7479]: I0308 00:22:10.937815 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1157264-0054-491e-bc65-daf626fc041a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c1157264-0054-491e-bc65-daf626fc041a" (UID: "c1157264-0054-491e-bc65-daf626fc041a"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:22:11.000834 master-0 kubenswrapper[7479]: I0308 00:22:11.000805 7479 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c1157264-0054-491e-bc65-daf626fc041a-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 00:22:11.001159 master-0 kubenswrapper[7479]: I0308 00:22:11.000834 7479 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c1157264-0054-491e-bc65-daf626fc041a-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 00:22:11.001159 master-0 kubenswrapper[7479]: I0308 00:22:11.000846 7479 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c1157264-0054-491e-bc65-daf626fc041a-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 00:22:11.116915 master-0 kubenswrapper[7479]: I0308 00:22:11.116871 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" event={"ID":"5cf5a2ef-2498-40a0-a189-0753076fd3b6","Type":"ContainerStarted","Data":"04817105ab63ed3d02352e545fc19277b913254d7947d42a71d84846748fcfc3"} Mar 08 00:22:11.117321 master-0 kubenswrapper[7479]: I0308 00:22:11.117290 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" Mar 08 00:22:11.121261 master-0 kubenswrapper[7479]: I0308 00:22:11.121231 7479 
patch_prober.go:28] interesting pod/marketplace-operator-64bf9778cb-mgb5v container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.13:8080/healthz\": dial tcp 10.128.0.13:8080: connect: connection refused" start-of-body= Mar 08 00:22:11.121320 master-0 kubenswrapper[7479]: I0308 00:22:11.121269 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" podUID="5cf5a2ef-2498-40a0-a189-0753076fd3b6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.13:8080/healthz\": dial tcp 10.128.0.13:8080: connect: connection refused" Mar 08 00:22:11.122653 master-0 kubenswrapper[7479]: I0308 00:22:11.122610 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" event={"ID":"531e9339-968c-47bf-b8ea-c44d9ceef4b3","Type":"ContainerStarted","Data":"829e088d3beb6bbaa940412e9e43d8b3ba4f7b2b62947bd685d43db99e68005b"} Mar 08 00:22:11.129180 master-0 kubenswrapper[7479]: I0308 00:22:11.129155 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_c1157264-0054-491e-bc65-daf626fc041a/installer/0.log" Mar 08 00:22:11.129575 master-0 kubenswrapper[7479]: I0308 00:22:11.129287 7479 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Mar 08 00:22:11.129575 master-0 kubenswrapper[7479]: I0308 00:22:11.129300 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"c1157264-0054-491e-bc65-daf626fc041a","Type":"ContainerDied","Data":"ed2624edd1b0bdd5361a9bb140b7d43812ceeacf4dbac6ce6049853d2e2f0be3"} Mar 08 00:22:11.129575 master-0 kubenswrapper[7479]: I0308 00:22:11.129358 7479 scope.go:117] "RemoveContainer" containerID="09ab7ef94ac35a92c2c10eadb681e5e8b177bf4cd4b8eb49914c79b43cd59144" Mar 08 00:22:11.132546 master-0 kubenswrapper[7479]: I0308 00:22:11.132517 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d7d75cbb9-lf8cw" event={"ID":"426e2fcf-dfb6-4193-91ae-c6daef6e50b1","Type":"ContainerStarted","Data":"88b0965f951a8a52564a30eba8c14747ed138a014fd7b9568fb87178f15d41e2"} Mar 08 00:22:11.133129 master-0 kubenswrapper[7479]: I0308 00:22:11.133100 7479 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5d7d75cbb9-lf8cw" podUID="426e2fcf-dfb6-4193-91ae-c6daef6e50b1" containerName="route-controller-manager" containerID="cri-o://88b0965f951a8a52564a30eba8c14747ed138a014fd7b9568fb87178f15d41e2" gracePeriod=30 Mar 08 00:22:11.133761 master-0 kubenswrapper[7479]: I0308 00:22:11.133412 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5d7d75cbb9-lf8cw" Mar 08 00:22:11.135101 master-0 kubenswrapper[7479]: I0308 00:22:11.135060 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs"] Mar 08 00:22:11.137680 master-0 kubenswrapper[7479]: I0308 00:22:11.137653 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s" event={"ID":"6d770808-d390-41c1-a9d9-fc12b99fa9a9","Type":"ContainerStarted","Data":"14725fd0b5b18b46ce9bdb373030cfe8e6d0b6e93e752dd6c68eaa4f70173138"} Mar 08 00:22:11.178921 master-0 kubenswrapper[7479]: I0308 00:22:11.178850 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5d7d75cbb9-lf8cw" podStartSLOduration=4.91245188 podStartE2EDuration="19.178830514s" podCreationTimestamp="2026-03-08 00:21:52 +0000 UTC" firstStartedPulling="2026-03-08 00:21:56.445535707 +0000 UTC m=+32.758444624" lastFinishedPulling="2026-03-08 00:22:10.711914341 +0000 UTC m=+47.024823258" observedRunningTime="2026-03-08 00:22:11.173430547 +0000 UTC m=+47.486339464" watchObservedRunningTime="2026-03-08 00:22:11.178830514 +0000 UTC m=+47.491739431" Mar 08 00:22:11.202694 master-0 kubenswrapper[7479]: W0308 00:22:11.202640 7479 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bb8fea7_71ca_43a3_839d_9c1459bf8dfa.slice/crio-773f19015576d673121563aa615f577b8c93848d40403e9cc4d2c3a87bec1183 WatchSource:0}: Error finding container 773f19015576d673121563aa615f577b8c93848d40403e9cc4d2c3a87bec1183: Status 404 returned error can't find the container with id 773f19015576d673121563aa615f577b8c93848d40403e9cc4d2c3a87bec1183 Mar 08 00:22:11.298740 master-0 kubenswrapper[7479]: I0308 00:22:11.298187 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q"] Mar 08 00:22:11.320332 master-0 kubenswrapper[7479]: I0308 00:22:11.318799 7479 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq" Mar 08 00:22:11.324316 master-0 kubenswrapper[7479]: I0308 00:22:11.324272 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Mar 08 00:22:11.328514 master-0 kubenswrapper[7479]: I0308 00:22:11.328478 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 08 00:22:11.340409 master-0 kubenswrapper[7479]: I0308 00:22:11.339994 7479 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 08 00:22:11.358171 master-0 kubenswrapper[7479]: I0308 00:22:11.355586 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 08 00:22:11.415245 master-0 kubenswrapper[7479]: I0308 00:22:11.409690 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert\") pod \"32c19760-2cb2-4690-be8e-cba3c517c60e\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " Mar 08 00:22:11.415245 master-0 kubenswrapper[7479]: I0308 00:22:11.409744 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/32c19760-2cb2-4690-be8e-cba3c517c60e-etc-cvo-updatepayloads\") pod \"32c19760-2cb2-4690-be8e-cba3c517c60e\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " Mar 08 00:22:11.415245 master-0 kubenswrapper[7479]: I0308 00:22:11.409764 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/32c19760-2cb2-4690-be8e-cba3c517c60e-kube-api-access\") pod \"32c19760-2cb2-4690-be8e-cba3c517c60e\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " Mar 08 00:22:11.415245 master-0 
kubenswrapper[7479]: I0308 00:22:11.409797 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/32c19760-2cb2-4690-be8e-cba3c517c60e-service-ca\") pod \"32c19760-2cb2-4690-be8e-cba3c517c60e\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " Mar 08 00:22:11.415245 master-0 kubenswrapper[7479]: I0308 00:22:11.409816 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/32c19760-2cb2-4690-be8e-cba3c517c60e-etc-ssl-certs\") pod \"32c19760-2cb2-4690-be8e-cba3c517c60e\" (UID: \"32c19760-2cb2-4690-be8e-cba3c517c60e\") " Mar 08 00:22:11.415245 master-0 kubenswrapper[7479]: I0308 00:22:11.409947 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/32c19760-2cb2-4690-be8e-cba3c517c60e-etc-ssl-certs" (OuterVolumeSpecName: "etc-ssl-certs") pod "32c19760-2cb2-4690-be8e-cba3c517c60e" (UID: "32c19760-2cb2-4690-be8e-cba3c517c60e"). InnerVolumeSpecName "etc-ssl-certs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:22:11.415245 master-0 kubenswrapper[7479]: I0308 00:22:11.411477 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32c19760-2cb2-4690-be8e-cba3c517c60e-service-ca" (OuterVolumeSpecName: "service-ca") pod "32c19760-2cb2-4690-be8e-cba3c517c60e" (UID: "32c19760-2cb2-4690-be8e-cba3c517c60e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:22:11.415245 master-0 kubenswrapper[7479]: I0308 00:22:11.411534 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/32c19760-2cb2-4690-be8e-cba3c517c60e-etc-cvo-updatepayloads" (OuterVolumeSpecName: "etc-cvo-updatepayloads") pod "32c19760-2cb2-4690-be8e-cba3c517c60e" (UID: "32c19760-2cb2-4690-be8e-cba3c517c60e"). 
InnerVolumeSpecName "etc-cvo-updatepayloads". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:22:11.427183 master-0 kubenswrapper[7479]: I0308 00:22:11.427120 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Mar 08 00:22:11.436098 master-0 kubenswrapper[7479]: W0308 00:22:11.435955 7479 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod55216a56_677a_4f28_a530_77d44bded8a2.slice/crio-2d79f79d79186c94eacd319b18a19e02c3739e81bc2d84288b2f6f2697c49ad7 WatchSource:0}: Error finding container 2d79f79d79186c94eacd319b18a19e02c3739e81bc2d84288b2f6f2697c49ad7: Status 404 returned error can't find the container with id 2d79f79d79186c94eacd319b18a19e02c3739e81bc2d84288b2f6f2697c49ad7 Mar 08 00:22:11.444396 master-0 kubenswrapper[7479]: I0308 00:22:11.444355 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "32c19760-2cb2-4690-be8e-cba3c517c60e" (UID: "32c19760-2cb2-4690-be8e-cba3c517c60e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:22:11.444638 master-0 kubenswrapper[7479]: I0308 00:22:11.444589 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32c19760-2cb2-4690-be8e-cba3c517c60e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "32c19760-2cb2-4690-be8e-cba3c517c60e" (UID: "32c19760-2cb2-4690-be8e-cba3c517c60e"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:22:11.489021 master-0 kubenswrapper[7479]: I0308 00:22:11.488418 7479 patch_prober.go:28] interesting pod/route-controller-manager-5d7d75cbb9-lf8cw container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.41:8443/healthz\": read tcp 10.128.0.2:59716->10.128.0.41:8443: read: connection reset by peer" start-of-body= Mar 08 00:22:11.489021 master-0 kubenswrapper[7479]: I0308 00:22:11.488466 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5d7d75cbb9-lf8cw" podUID="426e2fcf-dfb6-4193-91ae-c6daef6e50b1" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.41:8443/healthz\": read tcp 10.128.0.2:59716->10.128.0.41:8443: read: connection reset by peer" Mar 08 00:22:11.511324 master-0 kubenswrapper[7479]: I0308 00:22:11.511286 7479 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/32c19760-2cb2-4690-be8e-cba3c517c60e-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 00:22:11.511324 master-0 kubenswrapper[7479]: I0308 00:22:11.511319 7479 reconciler_common.go:293] "Volume detached for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/32c19760-2cb2-4690-be8e-cba3c517c60e-etc-ssl-certs\") on node \"master-0\" DevicePath \"\"" Mar 08 00:22:11.511324 master-0 kubenswrapper[7479]: I0308 00:22:11.511330 7479 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32c19760-2cb2-4690-be8e-cba3c517c60e-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 00:22:11.511525 master-0 kubenswrapper[7479]: I0308 00:22:11.511340 7479 reconciler_common.go:293] "Volume detached for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/32c19760-2cb2-4690-be8e-cba3c517c60e-etc-cvo-updatepayloads\") on node \"master-0\" DevicePath \"\"" Mar 08 00:22:11.511525 master-0 kubenswrapper[7479]: I0308 00:22:11.511351 7479 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/32c19760-2cb2-4690-be8e-cba3c517c60e-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 00:22:11.935227 master-0 kubenswrapper[7479]: I0308 00:22:11.934438 7479 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1157264-0054-491e-bc65-daf626fc041a" path="/var/lib/kubelet/pods/c1157264-0054-491e-bc65-daf626fc041a/volumes" Mar 08 00:22:11.953527 master-0 kubenswrapper[7479]: I0308 00:22:11.943859 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-5d7d75cbb9-lf8cw_426e2fcf-dfb6-4193-91ae-c6daef6e50b1/route-controller-manager/0.log" Mar 08 00:22:11.953527 master-0 kubenswrapper[7479]: I0308 00:22:11.943926 7479 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d7d75cbb9-lf8cw" Mar 08 00:22:12.126378 master-0 kubenswrapper[7479]: I0308 00:22:12.126336 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/426e2fcf-dfb6-4193-91ae-c6daef6e50b1-config\") pod \"426e2fcf-dfb6-4193-91ae-c6daef6e50b1\" (UID: \"426e2fcf-dfb6-4193-91ae-c6daef6e50b1\") " Mar 08 00:22:12.126782 master-0 kubenswrapper[7479]: I0308 00:22:12.126399 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/426e2fcf-dfb6-4193-91ae-c6daef6e50b1-client-ca\") pod \"426e2fcf-dfb6-4193-91ae-c6daef6e50b1\" (UID: \"426e2fcf-dfb6-4193-91ae-c6daef6e50b1\") " Mar 08 00:22:12.126782 master-0 kubenswrapper[7479]: I0308 00:22:12.126432 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/426e2fcf-dfb6-4193-91ae-c6daef6e50b1-serving-cert\") pod \"426e2fcf-dfb6-4193-91ae-c6daef6e50b1\" (UID: \"426e2fcf-dfb6-4193-91ae-c6daef6e50b1\") " Mar 08 00:22:12.126782 master-0 kubenswrapper[7479]: I0308 00:22:12.126457 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqkjm\" (UniqueName: \"kubernetes.io/projected/426e2fcf-dfb6-4193-91ae-c6daef6e50b1-kube-api-access-zqkjm\") pod \"426e2fcf-dfb6-4193-91ae-c6daef6e50b1\" (UID: \"426e2fcf-dfb6-4193-91ae-c6daef6e50b1\") " Mar 08 00:22:12.129001 master-0 kubenswrapper[7479]: I0308 00:22:12.127279 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/426e2fcf-dfb6-4193-91ae-c6daef6e50b1-config" (OuterVolumeSpecName: "config") pod "426e2fcf-dfb6-4193-91ae-c6daef6e50b1" (UID: "426e2fcf-dfb6-4193-91ae-c6daef6e50b1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:22:12.129001 master-0 kubenswrapper[7479]: I0308 00:22:12.127294 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/426e2fcf-dfb6-4193-91ae-c6daef6e50b1-client-ca" (OuterVolumeSpecName: "client-ca") pod "426e2fcf-dfb6-4193-91ae-c6daef6e50b1" (UID: "426e2fcf-dfb6-4193-91ae-c6daef6e50b1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:22:12.143274 master-0 kubenswrapper[7479]: I0308 00:22:12.142607 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/426e2fcf-dfb6-4193-91ae-c6daef6e50b1-kube-api-access-zqkjm" (OuterVolumeSpecName: "kube-api-access-zqkjm") pod "426e2fcf-dfb6-4193-91ae-c6daef6e50b1" (UID: "426e2fcf-dfb6-4193-91ae-c6daef6e50b1"). InnerVolumeSpecName "kube-api-access-zqkjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:22:12.157833 master-0 kubenswrapper[7479]: I0308 00:22:12.157674 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/426e2fcf-dfb6-4193-91ae-c6daef6e50b1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "426e2fcf-dfb6-4193-91ae-c6daef6e50b1" (UID: "426e2fcf-dfb6-4193-91ae-c6daef6e50b1"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:22:12.175863 master-0 kubenswrapper[7479]: I0308 00:22:12.175819 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" event={"ID":"d01c21a1-6c2c-49a7-9d85-254662851838","Type":"ContainerStarted","Data":"ef1557fdf295530164fddc6e32be204cb91e899b1392304c5810a0afd29e77ff"} Mar 08 00:22:12.175961 master-0 kubenswrapper[7479]: I0308 00:22:12.175920 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" event={"ID":"d01c21a1-6c2c-49a7-9d85-254662851838","Type":"ContainerStarted","Data":"bb8dfd749824585a5971cc6ceb0409c06052a233c71d6156a9b5d20725022dcf"} Mar 08 00:22:12.179925 master-0 kubenswrapper[7479]: I0308 00:22:12.179888 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"bb5babcb-5fef-44dc-b9e1-87e09a2b31c4","Type":"ContainerStarted","Data":"1a283172b2fc3bdc1ae37b0164a7c4527637bda97f460038813ba490be7f7c61"} Mar 08 00:22:12.180000 master-0 kubenswrapper[7479]: I0308 00:22:12.179931 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"bb5babcb-5fef-44dc-b9e1-87e09a2b31c4","Type":"ContainerStarted","Data":"87f0590efacfbdd11880d4aaab66640abded535b4b553b6b3da74e7ad35cedda"} Mar 08 00:22:12.180080 master-0 kubenswrapper[7479]: I0308 00:22:12.180043 7479 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-3-master-0" podUID="bb5babcb-5fef-44dc-b9e1-87e09a2b31c4" containerName="installer" containerID="cri-o://1a283172b2fc3bdc1ae37b0164a7c4527637bda97f460038813ba490be7f7c61" gracePeriod=30 Mar 08 00:22:12.182828 master-0 kubenswrapper[7479]: I0308 00:22:12.182797 7479 generic.go:334] "Generic (PLEG): container finished" podID="1751db13-b792-43e2-8459-d1d4a0164dfb" 
containerID="8e5eb8c3a997190fe55fe0f74af3ee5e0a5480af9438a723ead360bc861186ec" exitCode=0 Mar 08 00:22:12.182881 master-0 kubenswrapper[7479]: I0308 00:22:12.182863 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" event={"ID":"1751db13-b792-43e2-8459-d1d4a0164dfb","Type":"ContainerDied","Data":"8e5eb8c3a997190fe55fe0f74af3ee5e0a5480af9438a723ead360bc861186ec"} Mar 08 00:22:12.185818 master-0 kubenswrapper[7479]: I0308 00:22:12.185786 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-8jr6f" event={"ID":"b94acad3-cf4e-443d-80fb-5e68a4074336","Type":"ContainerStarted","Data":"95fc7c4c4a487643b9831f1cedf5dda283cc70c5afdd39d20b4d5ea8bc0108bd"} Mar 08 00:22:12.190155 master-0 kubenswrapper[7479]: I0308 00:22:12.190083 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-8jr6f" Mar 08 00:22:12.194885 master-0 kubenswrapper[7479]: I0308 00:22:12.194852 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" event={"ID":"1bb8fea7-71ca-43a3-839d-9c1459bf8dfa","Type":"ContainerStarted","Data":"1a894ff93f34b75d7c364cee700320b9938207036c1164fc914fd25a46ac6869"} Mar 08 00:22:12.194957 master-0 kubenswrapper[7479]: I0308 00:22:12.194890 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" event={"ID":"1bb8fea7-71ca-43a3-839d-9c1459bf8dfa","Type":"ContainerStarted","Data":"773f19015576d673121563aa615f577b8c93848d40403e9cc4d2c3a87bec1183"} Mar 08 00:22:12.196370 master-0 kubenswrapper[7479]: I0308 00:22:12.196291 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-8jr6f" Mar 08 00:22:12.205247 
master-0 kubenswrapper[7479]: I0308 00:22:12.205214 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8597858f97-kb2l8" event={"ID":"5837befc-f6e9-4f74-ae39-d0aec977f0c9","Type":"ContainerStarted","Data":"9623ed4c8f230e3a1aaaefab21bf5eb1497c6318c5eb21f31af3f6cfca0b9a66"} Mar 08 00:22:12.208484 master-0 kubenswrapper[7479]: I0308 00:22:12.208447 7479 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-8597858f97-kb2l8" podUID="5837befc-f6e9-4f74-ae39-d0aec977f0c9" containerName="controller-manager" containerID="cri-o://9623ed4c8f230e3a1aaaefab21bf5eb1497c6318c5eb21f31af3f6cfca0b9a66" gracePeriod=30 Mar 08 00:22:12.208838 master-0 kubenswrapper[7479]: I0308 00:22:12.208809 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-8597858f97-kb2l8" Mar 08 00:22:12.217184 master-0 kubenswrapper[7479]: I0308 00:22:12.213295 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-3-master-0" podStartSLOduration=10.213279377 podStartE2EDuration="10.213279377s" podCreationTimestamp="2026-03-08 00:22:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:22:12.211761028 +0000 UTC m=+48.524669945" watchObservedRunningTime="2026-03-08 00:22:12.213279377 +0000 UTC m=+48.526188294" Mar 08 00:22:12.217184 master-0 kubenswrapper[7479]: I0308 00:22:12.215054 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-8597858f97-kb2l8" Mar 08 00:22:12.220613 master-0 kubenswrapper[7479]: I0308 00:22:12.220563 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8qtmf" 
event={"ID":"4ad37f40-c533-4a1e-882a-2e0973eff86d","Type":"ContainerStarted","Data":"fec761ba111693d32c9163242c81a699413cc2198220381020f06b4d5f0d4c4e"} Mar 08 00:22:12.220613 master-0 kubenswrapper[7479]: I0308 00:22:12.220601 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8qtmf" Mar 08 00:22:12.222753 master-0 kubenswrapper[7479]: I0308 00:22:12.222705 7479 generic.go:334] "Generic (PLEG): container finished" podID="32c19760-2cb2-4690-be8e-cba3c517c60e" containerID="34b1b37983d46a115f3d0ba8af513faf4d5ef94e2619ccb2e671c100b6fff876" exitCode=0 Mar 08 00:22:12.223365 master-0 kubenswrapper[7479]: I0308 00:22:12.222856 7479 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq" Mar 08 00:22:12.223365 master-0 kubenswrapper[7479]: I0308 00:22:12.223358 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq" event={"ID":"32c19760-2cb2-4690-be8e-cba3c517c60e","Type":"ContainerDied","Data":"34b1b37983d46a115f3d0ba8af513faf4d5ef94e2619ccb2e671c100b6fff876"} Mar 08 00:22:12.223474 master-0 kubenswrapper[7479]: I0308 00:22:12.223384 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq" event={"ID":"32c19760-2cb2-4690-be8e-cba3c517c60e","Type":"ContainerDied","Data":"cbe80ab488a27b71936b88f11fbebbeb1bad4f97f15ed93df41d4a1b48940bdd"} Mar 08 00:22:12.223474 master-0 kubenswrapper[7479]: I0308 00:22:12.223401 7479 scope.go:117] "RemoveContainer" containerID="34b1b37983d46a115f3d0ba8af513faf4d5ef94e2619ccb2e671c100b6fff876" Mar 08 00:22:12.231576 master-0 kubenswrapper[7479]: I0308 00:22:12.226577 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8qtmf" Mar 08 
00:22:12.231576 master-0 kubenswrapper[7479]: I0308 00:22:12.229231 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-8597858f97-kb2l8" podStartSLOduration=16.12141898 podStartE2EDuration="31.229193549s" podCreationTimestamp="2026-03-08 00:21:41 +0000 UTC" firstStartedPulling="2026-03-08 00:21:55.668881284 +0000 UTC m=+31.981790201" lastFinishedPulling="2026-03-08 00:22:10.776655843 +0000 UTC m=+47.089564770" observedRunningTime="2026-03-08 00:22:12.226904764 +0000 UTC m=+48.539813681" watchObservedRunningTime="2026-03-08 00:22:12.229193549 +0000 UTC m=+48.542102476"
Mar 08 00:22:12.231576 master-0 kubenswrapper[7479]: I0308 00:22:12.230401 7479 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/426e2fcf-dfb6-4193-91ae-c6daef6e50b1-config\") on node \"master-0\" DevicePath \"\""
Mar 08 00:22:12.231576 master-0 kubenswrapper[7479]: I0308 00:22:12.230427 7479 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/426e2fcf-dfb6-4193-91ae-c6daef6e50b1-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 08 00:22:12.231576 master-0 kubenswrapper[7479]: I0308 00:22:12.230440 7479 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqkjm\" (UniqueName: \"kubernetes.io/projected/426e2fcf-dfb6-4193-91ae-c6daef6e50b1-kube-api-access-zqkjm\") on node \"master-0\" DevicePath \"\""
Mar 08 00:22:12.231576 master-0 kubenswrapper[7479]: I0308 00:22:12.230450 7479 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/426e2fcf-dfb6-4193-91ae-c6daef6e50b1-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 08 00:22:12.244469 master-0 kubenswrapper[7479]: I0308 00:22:12.241951 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"ada20442-bff5-477c-989e-3d921f5ede5e","Type":"ContainerStarted","Data":"15d74f4f21139026e1f17a65ff4323887705b24a5c623f5887aaa69f1485ac9b"}
Mar 08 00:22:12.244469 master-0 kubenswrapper[7479]: I0308 00:22:12.241998 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"ada20442-bff5-477c-989e-3d921f5ede5e","Type":"ContainerStarted","Data":"690784e0df6abe7f6bb7d7b1b2637aaaba8b482bcfbe8880715b7a6a2d707f93"}
Mar 08 00:22:12.249743 master-0 kubenswrapper[7479]: I0308 00:22:12.248464 7479 scope.go:117] "RemoveContainer" containerID="34b1b37983d46a115f3d0ba8af513faf4d5ef94e2619ccb2e671c100b6fff876"
Mar 08 00:22:12.249743 master-0 kubenswrapper[7479]: E0308 00:22:12.248841 7479 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34b1b37983d46a115f3d0ba8af513faf4d5ef94e2619ccb2e671c100b6fff876\": container with ID starting with 34b1b37983d46a115f3d0ba8af513faf4d5ef94e2619ccb2e671c100b6fff876 not found: ID does not exist" containerID="34b1b37983d46a115f3d0ba8af513faf4d5ef94e2619ccb2e671c100b6fff876"
Mar 08 00:22:12.249743 master-0 kubenswrapper[7479]: I0308 00:22:12.248885 7479 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34b1b37983d46a115f3d0ba8af513faf4d5ef94e2619ccb2e671c100b6fff876"} err="failed to get container status \"34b1b37983d46a115f3d0ba8af513faf4d5ef94e2619ccb2e671c100b6fff876\": rpc error: code = NotFound desc = could not find container \"34b1b37983d46a115f3d0ba8af513faf4d5ef94e2619ccb2e671c100b6fff876\": container with ID starting with 34b1b37983d46a115f3d0ba8af513faf4d5ef94e2619ccb2e671c100b6fff876 not found: ID does not exist"
Mar 08 00:22:12.249743 master-0 kubenswrapper[7479]: I0308 00:22:12.249486 7479 generic.go:334] "Generic (PLEG): container finished" podID="531e9339-968c-47bf-b8ea-c44d9ceef4b3" containerID="829e088d3beb6bbaa940412e9e43d8b3ba4f7b2b62947bd685d43db99e68005b" exitCode=0
Mar 08 00:22:12.249743 master-0 kubenswrapper[7479]: I0308 00:22:12.249528 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" event={"ID":"531e9339-968c-47bf-b8ea-c44d9ceef4b3","Type":"ContainerDied","Data":"829e088d3beb6bbaa940412e9e43d8b3ba4f7b2b62947bd685d43db99e68005b"}
Mar 08 00:22:12.278113 master-0 kubenswrapper[7479]: I0308 00:22:12.276705 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-5d7d75cbb9-lf8cw_426e2fcf-dfb6-4193-91ae-c6daef6e50b1/route-controller-manager/0.log"
Mar 08 00:22:12.278113 master-0 kubenswrapper[7479]: I0308 00:22:12.276766 7479 generic.go:334] "Generic (PLEG): container finished" podID="426e2fcf-dfb6-4193-91ae-c6daef6e50b1" containerID="88b0965f951a8a52564a30eba8c14747ed138a014fd7b9568fb87178f15d41e2" exitCode=255
Mar 08 00:22:12.278113 master-0 kubenswrapper[7479]: I0308 00:22:12.276823 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d7d75cbb9-lf8cw" event={"ID":"426e2fcf-dfb6-4193-91ae-c6daef6e50b1","Type":"ContainerDied","Data":"88b0965f951a8a52564a30eba8c14747ed138a014fd7b9568fb87178f15d41e2"}
Mar 08 00:22:12.278113 master-0 kubenswrapper[7479]: I0308 00:22:12.276853 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d7d75cbb9-lf8cw" event={"ID":"426e2fcf-dfb6-4193-91ae-c6daef6e50b1","Type":"ContainerDied","Data":"1820ddcea1c4ba97ee92f4393008b3454d248224a8b8de608f40514e7782286d"}
Mar 08 00:22:12.278113 master-0 kubenswrapper[7479]: I0308 00:22:12.276869 7479 scope.go:117] "RemoveContainer" containerID="88b0965f951a8a52564a30eba8c14747ed138a014fd7b9568fb87178f15d41e2"
Mar 08 00:22:12.278113 master-0 kubenswrapper[7479]: I0308 00:22:12.277230 7479 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d7d75cbb9-lf8cw"
Mar 08 00:22:12.301664 master-0 kubenswrapper[7479]: I0308 00:22:12.301420 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-krv7c" event={"ID":"815fd565-0609-4d8f-ac05-8656f198b008","Type":"ContainerStarted","Data":"044eaa58832f79354645bb27892aef22a346fbf4bfe737dea79901ffa64d2090"}
Mar 08 00:22:12.319904 master-0 kubenswrapper[7479]: I0308 00:22:12.319862 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jfjzg" event={"ID":"e302bc0b-7560-4f84-813f-d966c2dbe47c","Type":"ContainerStarted","Data":"52be315580333e096a5c394dfb3b50ff79852b6010007ad83ccb2074b85db43b"}
Mar 08 00:22:12.319994 master-0 kubenswrapper[7479]: I0308 00:22:12.319910 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jfjzg" event={"ID":"e302bc0b-7560-4f84-813f-d966c2dbe47c","Type":"ContainerStarted","Data":"f4026f3f82c087e6f1133285b0314080fd77636b3f28a79b0a59695dc64ab709"}
Mar 08 00:22:12.320510 master-0 kubenswrapper[7479]: I0308 00:22:12.320490 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-jfjzg"
Mar 08 00:22:12.332797 master-0 kubenswrapper[7479]: I0308 00:22:12.330456 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-jgdmb" event={"ID":"d7a0bdcc-92f5-41e6-ab47-ee48a5788bac","Type":"ContainerStarted","Data":"c54ec75e7b215135d97163ba8f315624435a019aae1bb5d4becc779b33de3782"}
Mar 08 00:22:12.341668 master-0 kubenswrapper[7479]: I0308 00:22:12.341513 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj" event={"ID":"8f71fd39-a16b-47d2-b781-c8ce37bcb9b2","Type":"ContainerStarted","Data":"7b9f0eb1c41cef5d8230e9e1038d90bce9d1d6ac13eb84abd28591cfa2cf66a5"}
Mar 08 00:22:12.342302 master-0 kubenswrapper[7479]: I0308 00:22:12.341796 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj"
Mar 08 00:22:12.349726 master-0 kubenswrapper[7479]: I0308 00:22:12.346549 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"55216a56-677a-4f28-a530-77d44bded8a2","Type":"ContainerStarted","Data":"1a0afc6f5f43ae0c03dad4b66580da08dbfc175218d88b6ca2b45fa8794895ad"}
Mar 08 00:22:12.349726 master-0 kubenswrapper[7479]: I0308 00:22:12.346581 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"55216a56-677a-4f28-a530-77d44bded8a2","Type":"ContainerStarted","Data":"2d79f79d79186c94eacd319b18a19e02c3739e81bc2d84288b2f6f2697c49ad7"}
Mar 08 00:22:12.380342 master-0 kubenswrapper[7479]: I0308 00:22:12.376681 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v"
Mar 08 00:22:12.383042 master-0 kubenswrapper[7479]: I0308 00:22:12.382672 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-1-master-0" podStartSLOduration=6.382660839 podStartE2EDuration="6.382660839s" podCreationTimestamp="2026-03-08 00:22:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:22:12.377003703 +0000 UTC m=+48.689912610" watchObservedRunningTime="2026-03-08 00:22:12.382660839 +0000 UTC m=+48.695569756"
Mar 08 00:22:12.417455 master-0 kubenswrapper[7479]: I0308 00:22:12.413578 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d7d75cbb9-lf8cw"]
Mar 08 00:22:12.417455 master-0 kubenswrapper[7479]: I0308 00:22:12.414008 7479 scope.go:117] "RemoveContainer" containerID="88b0965f951a8a52564a30eba8c14747ed138a014fd7b9568fb87178f15d41e2"
Mar 08 00:22:12.417455 master-0 kubenswrapper[7479]: I0308 00:22:12.415799 7479 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d7d75cbb9-lf8cw"]
Mar 08 00:22:12.417455 master-0 kubenswrapper[7479]: E0308 00:22:12.415861 7479 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88b0965f951a8a52564a30eba8c14747ed138a014fd7b9568fb87178f15d41e2\": container with ID starting with 88b0965f951a8a52564a30eba8c14747ed138a014fd7b9568fb87178f15d41e2 not found: ID does not exist" containerID="88b0965f951a8a52564a30eba8c14747ed138a014fd7b9568fb87178f15d41e2"
Mar 08 00:22:12.417455 master-0 kubenswrapper[7479]: I0308 00:22:12.415885 7479 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88b0965f951a8a52564a30eba8c14747ed138a014fd7b9568fb87178f15d41e2"} err="failed to get container status \"88b0965f951a8a52564a30eba8c14747ed138a014fd7b9568fb87178f15d41e2\": rpc error: code = NotFound desc = could not find container \"88b0965f951a8a52564a30eba8c14747ed138a014fd7b9568fb87178f15d41e2\": container with ID starting with 88b0965f951a8a52564a30eba8c14747ed138a014fd7b9568fb87178f15d41e2 not found: ID does not exist"
Mar 08 00:22:12.433169 master-0 kubenswrapper[7479]: I0308 00:22:12.433104 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-jfjzg" podStartSLOduration=3.5735086640000002 podStartE2EDuration="16.433089401s" podCreationTimestamp="2026-03-08 00:21:56 +0000 UTC" firstStartedPulling="2026-03-08 00:21:57.927762896 +0000 UTC m=+34.240671813" lastFinishedPulling="2026-03-08 00:22:10.787343633 +0000 UTC m=+47.100252550" observedRunningTime="2026-03-08 00:22:12.43274889 +0000 UTC m=+48.745657807" watchObservedRunningTime="2026-03-08 00:22:12.433089401 +0000 UTC m=+48.745998318"
Mar 08 00:22:12.496182 master-0 kubenswrapper[7479]: I0308 00:22:12.493798 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-1-master-0" podStartSLOduration=4.4937807 podStartE2EDuration="4.4937807s" podCreationTimestamp="2026-03-08 00:22:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:22:12.490180063 +0000 UTC m=+48.803088980" watchObservedRunningTime="2026-03-08 00:22:12.4937807 +0000 UTC m=+48.806689617"
Mar 08 00:22:12.557236 master-0 kubenswrapper[7479]: I0308 00:22:12.557181 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq"]
Mar 08 00:22:12.572525 master-0 kubenswrapper[7479]: I0308 00:22:12.572415 7479 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cluster-version/cluster-version-operator-745944c6b7-dcbvq"]
Mar 08 00:22:12.637275 master-0 kubenswrapper[7479]: I0308 00:22:12.634509 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-8c9c967c7-vm7rj"]
Mar 08 00:22:12.637275 master-0 kubenswrapper[7479]: E0308 00:22:12.634730 7479 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1157264-0054-491e-bc65-daf626fc041a" containerName="installer"
Mar 08 00:22:12.637275 master-0 kubenswrapper[7479]: I0308 00:22:12.634747 7479 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1157264-0054-491e-bc65-daf626fc041a" containerName="installer"
Mar 08 00:22:12.637275 master-0 kubenswrapper[7479]: E0308 00:22:12.634761 7479 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32c19760-2cb2-4690-be8e-cba3c517c60e" containerName="cluster-version-operator"
Mar 08 00:22:12.637275 master-0 kubenswrapper[7479]: I0308 00:22:12.634769 7479 state_mem.go:107] "Deleted CPUSet assignment" podUID="32c19760-2cb2-4690-be8e-cba3c517c60e" containerName="cluster-version-operator"
Mar 08 00:22:12.637275 master-0 kubenswrapper[7479]: E0308 00:22:12.634779 7479 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="426e2fcf-dfb6-4193-91ae-c6daef6e50b1" containerName="route-controller-manager"
Mar 08 00:22:12.637275 master-0 kubenswrapper[7479]: I0308 00:22:12.634789 7479 state_mem.go:107] "Deleted CPUSet assignment" podUID="426e2fcf-dfb6-4193-91ae-c6daef6e50b1" containerName="route-controller-manager"
Mar 08 00:22:12.637275 master-0 kubenswrapper[7479]: I0308 00:22:12.634893 7479 memory_manager.go:354] "RemoveStaleState removing state" podUID="32c19760-2cb2-4690-be8e-cba3c517c60e" containerName="cluster-version-operator"
Mar 08 00:22:12.637275 master-0 kubenswrapper[7479]: I0308 00:22:12.634909 7479 memory_manager.go:354] "RemoveStaleState removing state" podUID="426e2fcf-dfb6-4193-91ae-c6daef6e50b1" containerName="route-controller-manager"
Mar 08 00:22:12.637275 master-0 kubenswrapper[7479]: I0308 00:22:12.634920 7479 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1157264-0054-491e-bc65-daf626fc041a" containerName="installer"
Mar 08 00:22:12.642006 master-0 kubenswrapper[7479]: I0308 00:22:12.641633 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lqc4n"]
Mar 08 00:22:12.645023 master-0 kubenswrapper[7479]: I0308 00:22:12.642553 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ms5vp"]
Mar 08 00:22:12.645023 master-0 kubenswrapper[7479]: I0308 00:22:12.643484 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ms5vp"
Mar 08 00:22:12.645023 master-0 kubenswrapper[7479]: I0308 00:22:12.643961 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-vm7rj"
Mar 08 00:22:12.645235 master-0 kubenswrapper[7479]: I0308 00:22:12.645114 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lqc4n"
Mar 08 00:22:12.648026 master-0 kubenswrapper[7479]: I0308 00:22:12.647999 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 08 00:22:12.648245 master-0 kubenswrapper[7479]: I0308 00:22:12.648213 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 08 00:22:12.648380 master-0 kubenswrapper[7479]: I0308 00:22:12.648362 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 08 00:22:12.659463 master-0 kubenswrapper[7479]: I0308 00:22:12.652758 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lqc4n"]
Mar 08 00:22:12.659463 master-0 kubenswrapper[7479]: I0308 00:22:12.658529 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ms5vp"]
Mar 08 00:22:12.737402 master-0 kubenswrapper[7479]: I0308 00:22:12.736628 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlmg8\" (UniqueName: \"kubernetes.io/projected/8b94e1ca-5aef-49ae-928e-29cc0ce81d61-kube-api-access-hlmg8\") pod \"certified-operators-lqc4n\" (UID: \"8b94e1ca-5aef-49ae-928e-29cc0ce81d61\") " pod="openshift-marketplace/certified-operators-lqc4n"
Mar 08 00:22:12.737402 master-0 kubenswrapper[7479]: I0308 00:22:12.736665 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b94e1ca-5aef-49ae-928e-29cc0ce81d61-utilities\") pod \"certified-operators-lqc4n\" (UID: \"8b94e1ca-5aef-49ae-928e-29cc0ce81d61\") " pod="openshift-marketplace/certified-operators-lqc4n"
Mar 08 00:22:12.737402 master-0 kubenswrapper[7479]: I0308 00:22:12.736684 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/5a229b84-65bd-493b-90dd-b8194f842dc8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-8c9c967c7-vm7rj\" (UID: \"5a229b84-65bd-493b-90dd-b8194f842dc8\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-vm7rj"
Mar 08 00:22:12.737402 master-0 kubenswrapper[7479]: I0308 00:22:12.736709 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/668ffbde-4771-43e1-8f0e-d4b5d17ff693-catalog-content\") pod \"community-operators-ms5vp\" (UID: \"668ffbde-4771-43e1-8f0e-d4b5d17ff693\") " pod="openshift-marketplace/community-operators-ms5vp"
Mar 08 00:22:12.737402 master-0 kubenswrapper[7479]: I0308 00:22:12.736735 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs7rj\" (UniqueName: \"kubernetes.io/projected/668ffbde-4771-43e1-8f0e-d4b5d17ff693-kube-api-access-xs7rj\") pod \"community-operators-ms5vp\" (UID: \"668ffbde-4771-43e1-8f0e-d4b5d17ff693\") " pod="openshift-marketplace/community-operators-ms5vp"
Mar 08 00:22:12.737402 master-0 kubenswrapper[7479]: I0308 00:22:12.736757 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a229b84-65bd-493b-90dd-b8194f842dc8-kube-api-access\") pod \"cluster-version-operator-8c9c967c7-vm7rj\" (UID: \"5a229b84-65bd-493b-90dd-b8194f842dc8\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-vm7rj"
Mar 08 00:22:12.737402 master-0 kubenswrapper[7479]: I0308 00:22:12.736783 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/5a229b84-65bd-493b-90dd-b8194f842dc8-etc-ssl-certs\") pod \"cluster-version-operator-8c9c967c7-vm7rj\" (UID: \"5a229b84-65bd-493b-90dd-b8194f842dc8\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-vm7rj"
Mar 08 00:22:12.737402 master-0 kubenswrapper[7479]: I0308 00:22:12.736803 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a229b84-65bd-493b-90dd-b8194f842dc8-serving-cert\") pod \"cluster-version-operator-8c9c967c7-vm7rj\" (UID: \"5a229b84-65bd-493b-90dd-b8194f842dc8\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-vm7rj"
Mar 08 00:22:12.737402 master-0 kubenswrapper[7479]: I0308 00:22:12.736819 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/668ffbde-4771-43e1-8f0e-d4b5d17ff693-utilities\") pod \"community-operators-ms5vp\" (UID: \"668ffbde-4771-43e1-8f0e-d4b5d17ff693\") " pod="openshift-marketplace/community-operators-ms5vp"
Mar 08 00:22:12.737402 master-0 kubenswrapper[7479]: I0308 00:22:12.736835 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b94e1ca-5aef-49ae-928e-29cc0ce81d61-catalog-content\") pod \"certified-operators-lqc4n\" (UID: \"8b94e1ca-5aef-49ae-928e-29cc0ce81d61\") " pod="openshift-marketplace/certified-operators-lqc4n"
Mar 08 00:22:12.737402 master-0 kubenswrapper[7479]: I0308 00:22:12.736852 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5a229b84-65bd-493b-90dd-b8194f842dc8-service-ca\") pod \"cluster-version-operator-8c9c967c7-vm7rj\" (UID: \"5a229b84-65bd-493b-90dd-b8194f842dc8\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-vm7rj"
Mar 08 00:22:12.763882 master-0 kubenswrapper[7479]: I0308 00:22:12.763837 7479 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8597858f97-kb2l8"
Mar 08 00:22:12.765068 master-0 kubenswrapper[7479]: I0308 00:22:12.765046 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-3-master-0_bb5babcb-5fef-44dc-b9e1-87e09a2b31c4/installer/0.log"
Mar 08 00:22:12.765127 master-0 kubenswrapper[7479]: I0308 00:22:12.765089 7479 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0"
Mar 08 00:22:12.810325 master-0 kubenswrapper[7479]: I0308 00:22:12.810284 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs"]
Mar 08 00:22:12.810481 master-0 kubenswrapper[7479]: E0308 00:22:12.810437 7479 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5837befc-f6e9-4f74-ae39-d0aec977f0c9" containerName="controller-manager"
Mar 08 00:22:12.810481 master-0 kubenswrapper[7479]: I0308 00:22:12.810447 7479 state_mem.go:107] "Deleted CPUSet assignment" podUID="5837befc-f6e9-4f74-ae39-d0aec977f0c9" containerName="controller-manager"
Mar 08 00:22:12.810481 master-0 kubenswrapper[7479]: E0308 00:22:12.810458 7479 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb5babcb-5fef-44dc-b9e1-87e09a2b31c4" containerName="installer"
Mar 08 00:22:12.810481 master-0 kubenswrapper[7479]: I0308 00:22:12.810464 7479 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb5babcb-5fef-44dc-b9e1-87e09a2b31c4" containerName="installer"
Mar 08 00:22:12.810606 master-0 kubenswrapper[7479]: I0308 00:22:12.810536 7479 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb5babcb-5fef-44dc-b9e1-87e09a2b31c4" containerName="installer"
Mar 08 00:22:12.810606 master-0 kubenswrapper[7479]: I0308 00:22:12.810550 7479 memory_manager.go:354] "RemoveStaleState removing state" podUID="5837befc-f6e9-4f74-ae39-d0aec977f0c9" containerName="controller-manager"
Mar 08 00:22:12.810785 master-0 kubenswrapper[7479]: I0308 00:22:12.810772 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh"]
Mar 08 00:22:12.811227 master-0 kubenswrapper[7479]: I0308 00:22:12.811211 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh"
Mar 08 00:22:12.811536 master-0 kubenswrapper[7479]: I0308 00:22:12.811511 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs"
Mar 08 00:22:12.814003 master-0 kubenswrapper[7479]: I0308 00:22:12.813984 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 08 00:22:12.814138 master-0 kubenswrapper[7479]: I0308 00:22:12.814124 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 08 00:22:12.814391 master-0 kubenswrapper[7479]: I0308 00:22:12.814375 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 08 00:22:12.814491 master-0 kubenswrapper[7479]: I0308 00:22:12.814477 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 08 00:22:12.814622 master-0 kubenswrapper[7479]: I0308 00:22:12.814583 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 08 00:22:12.823705 master-0 kubenswrapper[7479]: I0308 00:22:12.823671 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs"]
Mar 08 00:22:12.838850 master-0 kubenswrapper[7479]: I0308 00:22:12.838784 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n88ts\" (UniqueName: \"kubernetes.io/projected/5837befc-f6e9-4f74-ae39-d0aec977f0c9-kube-api-access-n88ts\") pod \"5837befc-f6e9-4f74-ae39-d0aec977f0c9\" (UID: \"5837befc-f6e9-4f74-ae39-d0aec977f0c9\") "
Mar 08 00:22:12.839004 master-0 kubenswrapper[7479]: I0308 00:22:12.838874 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5837befc-f6e9-4f74-ae39-d0aec977f0c9-proxy-ca-bundles\") pod \"5837befc-f6e9-4f74-ae39-d0aec977f0c9\" (UID: \"5837befc-f6e9-4f74-ae39-d0aec977f0c9\") "
Mar 08 00:22:12.839004 master-0 kubenswrapper[7479]: I0308 00:22:12.838932 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5837befc-f6e9-4f74-ae39-d0aec977f0c9-config\") pod \"5837befc-f6e9-4f74-ae39-d0aec977f0c9\" (UID: \"5837befc-f6e9-4f74-ae39-d0aec977f0c9\") "
Mar 08 00:22:12.839004 master-0 kubenswrapper[7479]: I0308 00:22:12.838958 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5837befc-f6e9-4f74-ae39-d0aec977f0c9-client-ca\") pod \"5837befc-f6e9-4f74-ae39-d0aec977f0c9\" (UID: \"5837befc-f6e9-4f74-ae39-d0aec977f0c9\") "
Mar 08 00:22:12.839004 master-0 kubenswrapper[7479]: I0308 00:22:12.838977 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5837befc-f6e9-4f74-ae39-d0aec977f0c9-serving-cert\") pod \"5837befc-f6e9-4f74-ae39-d0aec977f0c9\" (UID: \"5837befc-f6e9-4f74-ae39-d0aec977f0c9\") "
Mar 08 00:22:12.839273 master-0 kubenswrapper[7479]: I0308 00:22:12.839092 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/668ffbde-4771-43e1-8f0e-d4b5d17ff693-utilities\") pod \"community-operators-ms5vp\" (UID: \"668ffbde-4771-43e1-8f0e-d4b5d17ff693\") " pod="openshift-marketplace/community-operators-ms5vp"
Mar 08 00:22:12.839273 master-0 kubenswrapper[7479]: I0308 00:22:12.839116 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b94e1ca-5aef-49ae-928e-29cc0ce81d61-catalog-content\") pod \"certified-operators-lqc4n\" (UID: \"8b94e1ca-5aef-49ae-928e-29cc0ce81d61\") " pod="openshift-marketplace/certified-operators-lqc4n"
Mar 08 00:22:12.839273 master-0 kubenswrapper[7479]: I0308 00:22:12.839134 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5a229b84-65bd-493b-90dd-b8194f842dc8-service-ca\") pod \"cluster-version-operator-8c9c967c7-vm7rj\" (UID: \"5a229b84-65bd-493b-90dd-b8194f842dc8\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-vm7rj"
Mar 08 00:22:12.839273 master-0 kubenswrapper[7479]: I0308 00:22:12.839155 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlmg8\" (UniqueName: \"kubernetes.io/projected/8b94e1ca-5aef-49ae-928e-29cc0ce81d61-kube-api-access-hlmg8\") pod \"certified-operators-lqc4n\" (UID: \"8b94e1ca-5aef-49ae-928e-29cc0ce81d61\") " pod="openshift-marketplace/certified-operators-lqc4n"
Mar 08 00:22:12.839273 master-0 kubenswrapper[7479]: I0308 00:22:12.839172 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b94e1ca-5aef-49ae-928e-29cc0ce81d61-utilities\") pod \"certified-operators-lqc4n\" (UID: \"8b94e1ca-5aef-49ae-928e-29cc0ce81d61\") " pod="openshift-marketplace/certified-operators-lqc4n"
Mar 08 00:22:12.839273 master-0 kubenswrapper[7479]: I0308 00:22:12.839188 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/5a229b84-65bd-493b-90dd-b8194f842dc8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-8c9c967c7-vm7rj\" (UID: \"5a229b84-65bd-493b-90dd-b8194f842dc8\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-vm7rj"
Mar 08 00:22:12.839888 master-0 kubenswrapper[7479]: I0308 00:22:12.839591 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/668ffbde-4771-43e1-8f0e-d4b5d17ff693-catalog-content\") pod \"community-operators-ms5vp\" (UID: \"668ffbde-4771-43e1-8f0e-d4b5d17ff693\") " pod="openshift-marketplace/community-operators-ms5vp"
Mar 08 00:22:12.839888 master-0 kubenswrapper[7479]: I0308 00:22:12.839615 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/5a229b84-65bd-493b-90dd-b8194f842dc8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-8c9c967c7-vm7rj\" (UID: \"5a229b84-65bd-493b-90dd-b8194f842dc8\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-vm7rj"
Mar 08 00:22:12.840272 master-0 kubenswrapper[7479]: I0308 00:22:12.840249 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/668ffbde-4771-43e1-8f0e-d4b5d17ff693-catalog-content\") pod \"community-operators-ms5vp\" (UID: \"668ffbde-4771-43e1-8f0e-d4b5d17ff693\") " pod="openshift-marketplace/community-operators-ms5vp"
Mar 08 00:22:12.840356 master-0 kubenswrapper[7479]: I0308 00:22:12.840327 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/668ffbde-4771-43e1-8f0e-d4b5d17ff693-utilities\") pod \"community-operators-ms5vp\" (UID: \"668ffbde-4771-43e1-8f0e-d4b5d17ff693\") " pod="openshift-marketplace/community-operators-ms5vp"
Mar 08 00:22:12.840590 master-0 kubenswrapper[7479]: I0308 00:22:12.840546 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5a229b84-65bd-493b-90dd-b8194f842dc8-service-ca\") pod \"cluster-version-operator-8c9c967c7-vm7rj\" (UID: \"5a229b84-65bd-493b-90dd-b8194f842dc8\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-vm7rj"
Mar 08 00:22:12.840680 master-0 kubenswrapper[7479]: I0308 00:22:12.840657 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b94e1ca-5aef-49ae-928e-29cc0ce81d61-catalog-content\") pod \"certified-operators-lqc4n\" (UID: \"8b94e1ca-5aef-49ae-928e-29cc0ce81d61\") " pod="openshift-marketplace/certified-operators-lqc4n"
Mar 08 00:22:12.840883 master-0 kubenswrapper[7479]: I0308 00:22:12.839626 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs7rj\" (UniqueName: \"kubernetes.io/projected/668ffbde-4771-43e1-8f0e-d4b5d17ff693-kube-api-access-xs7rj\") pod \"community-operators-ms5vp\" (UID: \"668ffbde-4771-43e1-8f0e-d4b5d17ff693\") " pod="openshift-marketplace/community-operators-ms5vp"
Mar 08 00:22:12.841289 master-0 kubenswrapper[7479]: I0308 00:22:12.841256 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5837befc-f6e9-4f74-ae39-d0aec977f0c9-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "5837befc-f6e9-4f74-ae39-d0aec977f0c9" (UID: "5837befc-f6e9-4f74-ae39-d0aec977f0c9"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:22:12.841354 master-0 kubenswrapper[7479]: I0308 00:22:12.841306 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b94e1ca-5aef-49ae-928e-29cc0ce81d61-utilities\") pod \"certified-operators-lqc4n\" (UID: \"8b94e1ca-5aef-49ae-928e-29cc0ce81d61\") " pod="openshift-marketplace/certified-operators-lqc4n"
Mar 08 00:22:12.841392 master-0 kubenswrapper[7479]: I0308 00:22:12.841377 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a229b84-65bd-493b-90dd-b8194f842dc8-kube-api-access\") pod \"cluster-version-operator-8c9c967c7-vm7rj\" (UID: \"5a229b84-65bd-493b-90dd-b8194f842dc8\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-vm7rj"
Mar 08 00:22:12.841441 master-0 kubenswrapper[7479]: I0308 00:22:12.841418 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/5a229b84-65bd-493b-90dd-b8194f842dc8-etc-ssl-certs\") pod \"cluster-version-operator-8c9c967c7-vm7rj\" (UID: \"5a229b84-65bd-493b-90dd-b8194f842dc8\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-vm7rj"
Mar 08 00:22:12.841482 master-0 kubenswrapper[7479]: I0308 00:22:12.841447 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a229b84-65bd-493b-90dd-b8194f842dc8-serving-cert\") pod \"cluster-version-operator-8c9c967c7-vm7rj\" (UID: \"5a229b84-65bd-493b-90dd-b8194f842dc8\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-vm7rj"
Mar 08 00:22:12.841544 master-0 kubenswrapper[7479]: I0308 00:22:12.841475 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5837befc-f6e9-4f74-ae39-d0aec977f0c9-config" (OuterVolumeSpecName: "config") pod "5837befc-f6e9-4f74-ae39-d0aec977f0c9" (UID: "5837befc-f6e9-4f74-ae39-d0aec977f0c9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:22:12.841544 master-0 kubenswrapper[7479]: I0308 00:22:12.841484 7479 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5837befc-f6e9-4f74-ae39-d0aec977f0c9-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\""
Mar 08 00:22:12.841668 master-0 kubenswrapper[7479]: I0308 00:22:12.841645 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/5a229b84-65bd-493b-90dd-b8194f842dc8-etc-ssl-certs\") pod \"cluster-version-operator-8c9c967c7-vm7rj\" (UID: \"5a229b84-65bd-493b-90dd-b8194f842dc8\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-vm7rj"
Mar 08 00:22:12.844010 master-0 kubenswrapper[7479]: I0308 00:22:12.843928 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5837befc-f6e9-4f74-ae39-d0aec977f0c9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5837befc-f6e9-4f74-ae39-d0aec977f0c9" (UID: "5837befc-f6e9-4f74-ae39-d0aec977f0c9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:22:12.844949 master-0 kubenswrapper[7479]: I0308 00:22:12.844181 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5837befc-f6e9-4f74-ae39-d0aec977f0c9-kube-api-access-n88ts" (OuterVolumeSpecName: "kube-api-access-n88ts") pod "5837befc-f6e9-4f74-ae39-d0aec977f0c9" (UID: "5837befc-f6e9-4f74-ae39-d0aec977f0c9"). InnerVolumeSpecName "kube-api-access-n88ts". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:22:12.844949 master-0 kubenswrapper[7479]: I0308 00:22:12.844502 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5837befc-f6e9-4f74-ae39-d0aec977f0c9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5837befc-f6e9-4f74-ae39-d0aec977f0c9" (UID: "5837befc-f6e9-4f74-ae39-d0aec977f0c9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:22:12.850272 master-0 kubenswrapper[7479]: I0308 00:22:12.846071 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a229b84-65bd-493b-90dd-b8194f842dc8-serving-cert\") pod \"cluster-version-operator-8c9c967c7-vm7rj\" (UID: \"5a229b84-65bd-493b-90dd-b8194f842dc8\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-vm7rj"
Mar 08 00:22:12.850272 master-0 kubenswrapper[7479]: I0308 00:22:12.848573 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh"]
Mar 08 00:22:12.862851 master-0 kubenswrapper[7479]: I0308 00:22:12.862824 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a229b84-65bd-493b-90dd-b8194f842dc8-kube-api-access\") pod \"cluster-version-operator-8c9c967c7-vm7rj\" (UID: \"5a229b84-65bd-493b-90dd-b8194f842dc8\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-vm7rj"
Mar 08 00:22:12.862981 master-0 kubenswrapper[7479]: I0308 00:22:12.862957 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlmg8\" (UniqueName: \"kubernetes.io/projected/8b94e1ca-5aef-49ae-928e-29cc0ce81d61-kube-api-access-hlmg8\") pod \"certified-operators-lqc4n\" (UID: \"8b94e1ca-5aef-49ae-928e-29cc0ce81d61\") " pod="openshift-marketplace/certified-operators-lqc4n"
Mar 08 00:22:12.866760 master-0 kubenswrapper[7479]: I0308 00:22:12.864431 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs7rj\" (UniqueName: \"kubernetes.io/projected/668ffbde-4771-43e1-8f0e-d4b5d17ff693-kube-api-access-xs7rj\") pod \"community-operators-ms5vp\" (UID: \"668ffbde-4771-43e1-8f0e-d4b5d17ff693\") " pod="openshift-marketplace/community-operators-ms5vp"
Mar 08 00:22:12.891930 master-0 kubenswrapper[7479]: I0308 00:22:12.890818 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"]
Mar 08 00:22:12.891930 master-0 kubenswrapper[7479]: I0308 00:22:12.891660 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0"
Mar 08 00:22:12.902081 master-0 kubenswrapper[7479]: I0308 00:22:12.901079 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"]
Mar 08 00:22:12.942762 master-0 kubenswrapper[7479]: I0308 00:22:12.942732 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bb5babcb-5fef-44dc-b9e1-87e09a2b31c4-kubelet-dir\") pod \"bb5babcb-5fef-44dc-b9e1-87e09a2b31c4\" (UID: \"bb5babcb-5fef-44dc-b9e1-87e09a2b31c4\") "
Mar 08 00:22:12.942863 master-0 kubenswrapper[7479]: I0308 00:22:12.942808 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb5babcb-5fef-44dc-b9e1-87e09a2b31c4-kube-api-access\") pod \"bb5babcb-5fef-44dc-b9e1-87e09a2b31c4\" (UID: \"bb5babcb-5fef-44dc-b9e1-87e09a2b31c4\") "
Mar 08 00:22:12.942909 master-0 kubenswrapper[7479]: I0308 00:22:12.942868 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bb5babcb-5fef-44dc-b9e1-87e09a2b31c4-var-lock\") pod
\"bb5babcb-5fef-44dc-b9e1-87e09a2b31c4\" (UID: \"bb5babcb-5fef-44dc-b9e1-87e09a2b31c4\") " Mar 08 00:22:12.943029 master-0 kubenswrapper[7479]: I0308 00:22:12.943007 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bb5babcb-5fef-44dc-b9e1-87e09a2b31c4-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "bb5babcb-5fef-44dc-b9e1-87e09a2b31c4" (UID: "bb5babcb-5fef-44dc-b9e1-87e09a2b31c4"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:22:12.943113 master-0 kubenswrapper[7479]: I0308 00:22:12.943090 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/70892c23-554d-466c-a526-90a799439fe0-client-ca\") pod \"route-controller-manager-544c885f6d-dr4gh\" (UID: \"70892c23-554d-466c-a526-90a799439fe0\") " pod="openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh" Mar 08 00:22:12.943163 master-0 kubenswrapper[7479]: I0308 00:22:12.943139 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cbcb0196-be5c-44a4-9749-5df9fbeaa718-serving-cert\") pod \"controller-manager-5b4bdf67b6-8rdjs\" (UID: \"cbcb0196-be5c-44a4-9749-5df9fbeaa718\") " pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" Mar 08 00:22:12.943223 master-0 kubenswrapper[7479]: I0308 00:22:12.943163 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cbcb0196-be5c-44a4-9749-5df9fbeaa718-proxy-ca-bundles\") pod \"controller-manager-5b4bdf67b6-8rdjs\" (UID: \"cbcb0196-be5c-44a4-9749-5df9fbeaa718\") " pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" Mar 08 00:22:12.943223 master-0 kubenswrapper[7479]: I0308 00:22:12.943191 7479 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbcb0196-be5c-44a4-9749-5df9fbeaa718-config\") pod \"controller-manager-5b4bdf67b6-8rdjs\" (UID: \"cbcb0196-be5c-44a4-9749-5df9fbeaa718\") " pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" Mar 08 00:22:12.943316 master-0 kubenswrapper[7479]: I0308 00:22:12.943231 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t8np\" (UniqueName: \"kubernetes.io/projected/cbcb0196-be5c-44a4-9749-5df9fbeaa718-kube-api-access-4t8np\") pod \"controller-manager-5b4bdf67b6-8rdjs\" (UID: \"cbcb0196-be5c-44a4-9749-5df9fbeaa718\") " pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" Mar 08 00:22:12.943316 master-0 kubenswrapper[7479]: I0308 00:22:12.943267 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cbcb0196-be5c-44a4-9749-5df9fbeaa718-client-ca\") pod \"controller-manager-5b4bdf67b6-8rdjs\" (UID: \"cbcb0196-be5c-44a4-9749-5df9fbeaa718\") " pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" Mar 08 00:22:12.943316 master-0 kubenswrapper[7479]: I0308 00:22:12.943289 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70892c23-554d-466c-a526-90a799439fe0-serving-cert\") pod \"route-controller-manager-544c885f6d-dr4gh\" (UID: \"70892c23-554d-466c-a526-90a799439fe0\") " pod="openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh" Mar 08 00:22:12.943430 master-0 kubenswrapper[7479]: I0308 00:22:12.943321 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/70892c23-554d-466c-a526-90a799439fe0-config\") pod \"route-controller-manager-544c885f6d-dr4gh\" (UID: \"70892c23-554d-466c-a526-90a799439fe0\") " pod="openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh" Mar 08 00:22:12.943430 master-0 kubenswrapper[7479]: I0308 00:22:12.943347 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqjt7\" (UniqueName: \"kubernetes.io/projected/70892c23-554d-466c-a526-90a799439fe0-kube-api-access-kqjt7\") pod \"route-controller-manager-544c885f6d-dr4gh\" (UID: \"70892c23-554d-466c-a526-90a799439fe0\") " pod="openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh" Mar 08 00:22:12.943430 master-0 kubenswrapper[7479]: I0308 00:22:12.943388 7479 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5837befc-f6e9-4f74-ae39-d0aec977f0c9-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 00:22:12.943430 master-0 kubenswrapper[7479]: I0308 00:22:12.943402 7479 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5837befc-f6e9-4f74-ae39-d0aec977f0c9-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 00:22:12.943430 master-0 kubenswrapper[7479]: I0308 00:22:12.943413 7479 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bb5babcb-5fef-44dc-b9e1-87e09a2b31c4-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 00:22:12.943430 master-0 kubenswrapper[7479]: I0308 00:22:12.943426 7479 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n88ts\" (UniqueName: \"kubernetes.io/projected/5837befc-f6e9-4f74-ae39-d0aec977f0c9-kube-api-access-n88ts\") on node \"master-0\" DevicePath \"\"" Mar 08 00:22:12.943634 master-0 kubenswrapper[7479]: I0308 00:22:12.943438 7479 reconciler_common.go:293] "Volume detached for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/5837befc-f6e9-4f74-ae39-d0aec977f0c9-config\") on node \"master-0\" DevicePath \"\"" Mar 08 00:22:12.943634 master-0 kubenswrapper[7479]: I0308 00:22:12.943520 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bb5babcb-5fef-44dc-b9e1-87e09a2b31c4-var-lock" (OuterVolumeSpecName: "var-lock") pod "bb5babcb-5fef-44dc-b9e1-87e09a2b31c4" (UID: "bb5babcb-5fef-44dc-b9e1-87e09a2b31c4"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:22:12.946933 master-0 kubenswrapper[7479]: I0308 00:22:12.946883 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb5babcb-5fef-44dc-b9e1-87e09a2b31c4-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "bb5babcb-5fef-44dc-b9e1-87e09a2b31c4" (UID: "bb5babcb-5fef-44dc-b9e1-87e09a2b31c4"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:22:12.965639 master-0 kubenswrapper[7479]: I0308 00:22:12.965041 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ms5vp" Mar 08 00:22:12.985952 master-0 kubenswrapper[7479]: I0308 00:22:12.985923 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-vm7rj" Mar 08 00:22:13.000034 master-0 kubenswrapper[7479]: W0308 00:22:12.999831 7479 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a229b84_65bd_493b_90dd_b8194f842dc8.slice/crio-f5e085e04fcec71a7384a042b53e9f6db9dd0fc0eed95804aa4550ea011dc40a WatchSource:0}: Error finding container f5e085e04fcec71a7384a042b53e9f6db9dd0fc0eed95804aa4550ea011dc40a: Status 404 returned error can't find the container with id f5e085e04fcec71a7384a042b53e9f6db9dd0fc0eed95804aa4550ea011dc40a Mar 08 00:22:13.044651 master-0 kubenswrapper[7479]: I0308 00:22:13.044606 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/70892c23-554d-466c-a526-90a799439fe0-client-ca\") pod \"route-controller-manager-544c885f6d-dr4gh\" (UID: \"70892c23-554d-466c-a526-90a799439fe0\") " pod="openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh" Mar 08 00:22:13.044842 master-0 kubenswrapper[7479]: I0308 00:22:13.044664 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5df57519-dc14-4d18-8c24-cf2e6e122cff-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"5df57519-dc14-4d18-8c24-cf2e6e122cff\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 08 00:22:13.044842 master-0 kubenswrapper[7479]: I0308 00:22:13.044686 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cbcb0196-be5c-44a4-9749-5df9fbeaa718-serving-cert\") pod \"controller-manager-5b4bdf67b6-8rdjs\" (UID: \"cbcb0196-be5c-44a4-9749-5df9fbeaa718\") " pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" Mar 08 00:22:13.044842 master-0 kubenswrapper[7479]: I0308 
00:22:13.044704 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5df57519-dc14-4d18-8c24-cf2e6e122cff-var-lock\") pod \"installer-4-master-0\" (UID: \"5df57519-dc14-4d18-8c24-cf2e6e122cff\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 08 00:22:13.044842 master-0 kubenswrapper[7479]: I0308 00:22:13.044719 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cbcb0196-be5c-44a4-9749-5df9fbeaa718-proxy-ca-bundles\") pod \"controller-manager-5b4bdf67b6-8rdjs\" (UID: \"cbcb0196-be5c-44a4-9749-5df9fbeaa718\") " pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" Mar 08 00:22:13.044842 master-0 kubenswrapper[7479]: I0308 00:22:13.044739 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbcb0196-be5c-44a4-9749-5df9fbeaa718-config\") pod \"controller-manager-5b4bdf67b6-8rdjs\" (UID: \"cbcb0196-be5c-44a4-9749-5df9fbeaa718\") " pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" Mar 08 00:22:13.044842 master-0 kubenswrapper[7479]: I0308 00:22:13.044757 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t8np\" (UniqueName: \"kubernetes.io/projected/cbcb0196-be5c-44a4-9749-5df9fbeaa718-kube-api-access-4t8np\") pod \"controller-manager-5b4bdf67b6-8rdjs\" (UID: \"cbcb0196-be5c-44a4-9749-5df9fbeaa718\") " pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" Mar 08 00:22:13.044842 master-0 kubenswrapper[7479]: I0308 00:22:13.044781 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cbcb0196-be5c-44a4-9749-5df9fbeaa718-client-ca\") pod \"controller-manager-5b4bdf67b6-8rdjs\" (UID: 
\"cbcb0196-be5c-44a4-9749-5df9fbeaa718\") " pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" Mar 08 00:22:13.044842 master-0 kubenswrapper[7479]: I0308 00:22:13.044797 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70892c23-554d-466c-a526-90a799439fe0-serving-cert\") pod \"route-controller-manager-544c885f6d-dr4gh\" (UID: \"70892c23-554d-466c-a526-90a799439fe0\") " pod="openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh" Mar 08 00:22:13.044842 master-0 kubenswrapper[7479]: I0308 00:22:13.044817 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70892c23-554d-466c-a526-90a799439fe0-config\") pod \"route-controller-manager-544c885f6d-dr4gh\" (UID: \"70892c23-554d-466c-a526-90a799439fe0\") " pod="openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh" Mar 08 00:22:13.044842 master-0 kubenswrapper[7479]: I0308 00:22:13.044835 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqjt7\" (UniqueName: \"kubernetes.io/projected/70892c23-554d-466c-a526-90a799439fe0-kube-api-access-kqjt7\") pod \"route-controller-manager-544c885f6d-dr4gh\" (UID: \"70892c23-554d-466c-a526-90a799439fe0\") " pod="openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh" Mar 08 00:22:13.045098 master-0 kubenswrapper[7479]: I0308 00:22:13.044854 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5df57519-dc14-4d18-8c24-cf2e6e122cff-kube-api-access\") pod \"installer-4-master-0\" (UID: \"5df57519-dc14-4d18-8c24-cf2e6e122cff\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 08 00:22:13.045098 master-0 kubenswrapper[7479]: I0308 00:22:13.044888 7479 reconciler_common.go:293] "Volume 
detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bb5babcb-5fef-44dc-b9e1-87e09a2b31c4-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 00:22:13.045098 master-0 kubenswrapper[7479]: I0308 00:22:13.044899 7479 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb5babcb-5fef-44dc-b9e1-87e09a2b31c4-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 00:22:13.045616 master-0 kubenswrapper[7479]: I0308 00:22:13.045596 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/70892c23-554d-466c-a526-90a799439fe0-client-ca\") pod \"route-controller-manager-544c885f6d-dr4gh\" (UID: \"70892c23-554d-466c-a526-90a799439fe0\") " pod="openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh" Mar 08 00:22:13.047113 master-0 kubenswrapper[7479]: I0308 00:22:13.046750 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lqc4n" Mar 08 00:22:13.047113 master-0 kubenswrapper[7479]: I0308 00:22:13.047031 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cbcb0196-be5c-44a4-9749-5df9fbeaa718-client-ca\") pod \"controller-manager-5b4bdf67b6-8rdjs\" (UID: \"cbcb0196-be5c-44a4-9749-5df9fbeaa718\") " pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" Mar 08 00:22:13.048951 master-0 kubenswrapper[7479]: I0308 00:22:13.048285 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbcb0196-be5c-44a4-9749-5df9fbeaa718-config\") pod \"controller-manager-5b4bdf67b6-8rdjs\" (UID: \"cbcb0196-be5c-44a4-9749-5df9fbeaa718\") " pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" Mar 08 00:22:13.049265 master-0 kubenswrapper[7479]: I0308 00:22:13.049098 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cbcb0196-be5c-44a4-9749-5df9fbeaa718-proxy-ca-bundles\") pod \"controller-manager-5b4bdf67b6-8rdjs\" (UID: \"cbcb0196-be5c-44a4-9749-5df9fbeaa718\") " pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" Mar 08 00:22:13.050309 master-0 kubenswrapper[7479]: I0308 00:22:13.049985 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cbcb0196-be5c-44a4-9749-5df9fbeaa718-serving-cert\") pod \"controller-manager-5b4bdf67b6-8rdjs\" (UID: \"cbcb0196-be5c-44a4-9749-5df9fbeaa718\") " pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" Mar 08 00:22:13.053154 master-0 kubenswrapper[7479]: I0308 00:22:13.050586 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70892c23-554d-466c-a526-90a799439fe0-config\") 
pod \"route-controller-manager-544c885f6d-dr4gh\" (UID: \"70892c23-554d-466c-a526-90a799439fe0\") " pod="openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh" Mar 08 00:22:13.053154 master-0 kubenswrapper[7479]: I0308 00:22:13.052835 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70892c23-554d-466c-a526-90a799439fe0-serving-cert\") pod \"route-controller-manager-544c885f6d-dr4gh\" (UID: \"70892c23-554d-466c-a526-90a799439fe0\") " pod="openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh" Mar 08 00:22:13.070645 master-0 kubenswrapper[7479]: I0308 00:22:13.070607 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t8np\" (UniqueName: \"kubernetes.io/projected/cbcb0196-be5c-44a4-9749-5df9fbeaa718-kube-api-access-4t8np\") pod \"controller-manager-5b4bdf67b6-8rdjs\" (UID: \"cbcb0196-be5c-44a4-9749-5df9fbeaa718\") " pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" Mar 08 00:22:13.077608 master-0 kubenswrapper[7479]: I0308 00:22:13.077576 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqjt7\" (UniqueName: \"kubernetes.io/projected/70892c23-554d-466c-a526-90a799439fe0-kube-api-access-kqjt7\") pod \"route-controller-manager-544c885f6d-dr4gh\" (UID: \"70892c23-554d-466c-a526-90a799439fe0\") " pod="openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh" Mar 08 00:22:13.138499 master-0 kubenswrapper[7479]: I0308 00:22:13.138442 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh" Mar 08 00:22:13.146156 master-0 kubenswrapper[7479]: I0308 00:22:13.145047 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" Mar 08 00:22:13.146156 master-0 kubenswrapper[7479]: I0308 00:22:13.145519 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5df57519-dc14-4d18-8c24-cf2e6e122cff-kube-api-access\") pod \"installer-4-master-0\" (UID: \"5df57519-dc14-4d18-8c24-cf2e6e122cff\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 08 00:22:13.146156 master-0 kubenswrapper[7479]: I0308 00:22:13.145563 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5df57519-dc14-4d18-8c24-cf2e6e122cff-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"5df57519-dc14-4d18-8c24-cf2e6e122cff\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 08 00:22:13.146156 master-0 kubenswrapper[7479]: I0308 00:22:13.145654 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5df57519-dc14-4d18-8c24-cf2e6e122cff-var-lock\") pod \"installer-4-master-0\" (UID: \"5df57519-dc14-4d18-8c24-cf2e6e122cff\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 08 00:22:13.146156 master-0 kubenswrapper[7479]: I0308 00:22:13.145770 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5df57519-dc14-4d18-8c24-cf2e6e122cff-var-lock\") pod \"installer-4-master-0\" (UID: \"5df57519-dc14-4d18-8c24-cf2e6e122cff\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 08 00:22:13.146156 master-0 kubenswrapper[7479]: I0308 00:22:13.146128 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5df57519-dc14-4d18-8c24-cf2e6e122cff-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"5df57519-dc14-4d18-8c24-cf2e6e122cff\") " 
pod="openshift-kube-scheduler/installer-4-master-0" Mar 08 00:22:13.168171 master-0 kubenswrapper[7479]: I0308 00:22:13.168133 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5df57519-dc14-4d18-8c24-cf2e6e122cff-kube-api-access\") pod \"installer-4-master-0\" (UID: \"5df57519-dc14-4d18-8c24-cf2e6e122cff\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 08 00:22:13.249850 master-0 kubenswrapper[7479]: I0308 00:22:13.249749 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 08 00:22:13.280832 master-0 kubenswrapper[7479]: I0308 00:22:13.280783 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ms5vp"] Mar 08 00:22:13.360109 master-0 kubenswrapper[7479]: I0308 00:22:13.359421 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" event={"ID":"531e9339-968c-47bf-b8ea-c44d9ceef4b3","Type":"ContainerStarted","Data":"1e770f05b7d4f3abd180562cc940e1a0486ee998d3fc21227af26eb82314570e"} Mar 08 00:22:13.362377 master-0 kubenswrapper[7479]: I0308 00:22:13.361163 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-3-master-0_bb5babcb-5fef-44dc-b9e1-87e09a2b31c4/installer/0.log" Mar 08 00:22:13.362377 master-0 kubenswrapper[7479]: I0308 00:22:13.361189 7479 generic.go:334] "Generic (PLEG): container finished" podID="bb5babcb-5fef-44dc-b9e1-87e09a2b31c4" containerID="1a283172b2fc3bdc1ae37b0164a7c4527637bda97f460038813ba490be7f7c61" exitCode=2 Mar 08 00:22:13.362377 master-0 kubenswrapper[7479]: I0308 00:22:13.361231 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" 
event={"ID":"bb5babcb-5fef-44dc-b9e1-87e09a2b31c4","Type":"ContainerDied","Data":"1a283172b2fc3bdc1ae37b0164a7c4527637bda97f460038813ba490be7f7c61"} Mar 08 00:22:13.362377 master-0 kubenswrapper[7479]: I0308 00:22:13.361246 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"bb5babcb-5fef-44dc-b9e1-87e09a2b31c4","Type":"ContainerDied","Data":"87f0590efacfbdd11880d4aaab66640abded535b4b553b6b3da74e7ad35cedda"} Mar 08 00:22:13.362377 master-0 kubenswrapper[7479]: I0308 00:22:13.361263 7479 scope.go:117] "RemoveContainer" containerID="1a283172b2fc3bdc1ae37b0164a7c4527637bda97f460038813ba490be7f7c61" Mar 08 00:22:13.362377 master-0 kubenswrapper[7479]: I0308 00:22:13.361343 7479 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Mar 08 00:22:13.374277 master-0 kubenswrapper[7479]: I0308 00:22:13.374166 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-krv7c" event={"ID":"815fd565-0609-4d8f-ac05-8656f198b008","Type":"ContainerStarted","Data":"b08200bbfa16b2def7e8e435dbba2b2fcca8a8d3de5ace290d9e40ef68f64f02"} Mar 08 00:22:13.375896 master-0 kubenswrapper[7479]: I0308 00:22:13.375864 7479 generic.go:334] "Generic (PLEG): container finished" podID="5837befc-f6e9-4f74-ae39-d0aec977f0c9" containerID="9623ed4c8f230e3a1aaaefab21bf5eb1497c6318c5eb21f31af3f6cfca0b9a66" exitCode=0 Mar 08 00:22:13.375973 master-0 kubenswrapper[7479]: I0308 00:22:13.375946 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8597858f97-kb2l8" event={"ID":"5837befc-f6e9-4f74-ae39-d0aec977f0c9","Type":"ContainerDied","Data":"9623ed4c8f230e3a1aaaefab21bf5eb1497c6318c5eb21f31af3f6cfca0b9a66"} Mar 08 00:22:13.376010 master-0 kubenswrapper[7479]: I0308 00:22:13.375982 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-8597858f97-kb2l8" event={"ID":"5837befc-f6e9-4f74-ae39-d0aec977f0c9","Type":"ContainerDied","Data":"4e7d3332a4fd54ae4a295945e29cf38ecb93886fa54a4b5efb0807b00cced883"} Mar 08 00:22:13.376089 master-0 kubenswrapper[7479]: I0308 00:22:13.376074 7479 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8597858f97-kb2l8" Mar 08 00:22:13.379128 master-0 kubenswrapper[7479]: I0308 00:22:13.379085 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" event={"ID":"d01c21a1-6c2c-49a7-9d85-254662851838","Type":"ContainerStarted","Data":"f272f0c8300d99d74de3b6533eb08fc6f13727844131b874ef0ec089cec086c7"} Mar 08 00:22:13.380015 master-0 kubenswrapper[7479]: I0308 00:22:13.379985 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" Mar 08 00:22:13.387927 master-0 kubenswrapper[7479]: I0308 00:22:13.387658 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-jgdmb" event={"ID":"d7a0bdcc-92f5-41e6-ab47-ee48a5788bac","Type":"ContainerStarted","Data":"b268ecbf5509c4d57c3cfb99540508683cf8b0aa47cb26e063002abde0b68768"} Mar 08 00:22:13.421258 master-0 kubenswrapper[7479]: I0308 00:22:13.393016 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-vm7rj" event={"ID":"5a229b84-65bd-493b-90dd-b8194f842dc8","Type":"ContainerStarted","Data":"40763ecf359c193fdc57eccfc3f99287edfc631f03df7363e0563b373121c528"} Mar 08 00:22:13.421258 master-0 kubenswrapper[7479]: I0308 00:22:13.393064 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-vm7rj" 
event={"ID":"5a229b84-65bd-493b-90dd-b8194f842dc8","Type":"ContainerStarted","Data":"f5e085e04fcec71a7384a042b53e9f6db9dd0fc0eed95804aa4550ea011dc40a"}
Mar 08 00:22:13.421258 master-0 kubenswrapper[7479]: I0308 00:22:13.396583 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lqc4n"]
Mar 08 00:22:13.421258 master-0 kubenswrapper[7479]: I0308 00:22:13.399755 7479 scope.go:117] "RemoveContainer" containerID="1a283172b2fc3bdc1ae37b0164a7c4527637bda97f460038813ba490be7f7c61"
Mar 08 00:22:13.421258 master-0 kubenswrapper[7479]: E0308 00:22:13.412474 7479 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a283172b2fc3bdc1ae37b0164a7c4527637bda97f460038813ba490be7f7c61\": container with ID starting with 1a283172b2fc3bdc1ae37b0164a7c4527637bda97f460038813ba490be7f7c61 not found: ID does not exist" containerID="1a283172b2fc3bdc1ae37b0164a7c4527637bda97f460038813ba490be7f7c61"
Mar 08 00:22:13.421258 master-0 kubenswrapper[7479]: I0308 00:22:13.412525 7479 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a283172b2fc3bdc1ae37b0164a7c4527637bda97f460038813ba490be7f7c61"} err="failed to get container status \"1a283172b2fc3bdc1ae37b0164a7c4527637bda97f460038813ba490be7f7c61\": rpc error: code = NotFound desc = could not find container \"1a283172b2fc3bdc1ae37b0164a7c4527637bda97f460038813ba490be7f7c61\": container with ID starting with 1a283172b2fc3bdc1ae37b0164a7c4527637bda97f460038813ba490be7f7c61 not found: ID does not exist"
Mar 08 00:22:13.421258 master-0 kubenswrapper[7479]: I0308 00:22:13.412558 7479 scope.go:117] "RemoveContainer" containerID="9623ed4c8f230e3a1aaaefab21bf5eb1497c6318c5eb21f31af3f6cfca0b9a66"
Mar 08 00:22:13.421258 master-0 kubenswrapper[7479]: I0308 00:22:13.414046 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" podStartSLOduration=6.649984046 podStartE2EDuration="21.414036401s" podCreationTimestamp="2026-03-08 00:21:52 +0000 UTC" firstStartedPulling="2026-03-08 00:21:55.921960977 +0000 UTC m=+32.234869894" lastFinishedPulling="2026-03-08 00:22:10.686013342 +0000 UTC m=+46.998922249" observedRunningTime="2026-03-08 00:22:13.412870113 +0000 UTC m=+49.725779030" watchObservedRunningTime="2026-03-08 00:22:13.414036401 +0000 UTC m=+49.726945318"
Mar 08 00:22:13.421258 master-0 kubenswrapper[7479]: I0308 00:22:13.415282 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" event={"ID":"1bb8fea7-71ca-43a3-839d-9c1459bf8dfa","Type":"ContainerStarted","Data":"288a9f605fdc9bb30bb45ee47783409a88bbd8f20083c4f59dc94085a87e4e3b"}
Mar 08 00:22:13.421258 master-0 kubenswrapper[7479]: I0308 00:22:13.415476 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs"
Mar 08 00:22:13.425462 master-0 kubenswrapper[7479]: I0308 00:22:13.422782 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ms5vp" event={"ID":"668ffbde-4771-43e1-8f0e-d4b5d17ff693","Type":"ContainerStarted","Data":"053ec9ee75c18a0fbe26d2f98131f6f6b38d1545596ef812b5dd85b824a65cfd"}
Mar 08 00:22:13.443482 master-0 kubenswrapper[7479]: W0308 00:22:13.439824 7479 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b94e1ca_5aef_49ae_928e_29cc0ce81d61.slice/crio-fbbedadab3e325405c3103b757378d37ed57beb86fa4dc9dfbd4a453372d9d42 WatchSource:0}: Error finding container fbbedadab3e325405c3103b757378d37ed57beb86fa4dc9dfbd4a453372d9d42: Status 404 returned error can't find the container with id fbbedadab3e325405c3103b757378d37ed57beb86fa4dc9dfbd4a453372d9d42
Mar 08 00:22:13.457668 master-0 kubenswrapper[7479]: I0308 00:22:13.451674 7479 scope.go:117] "RemoveContainer" containerID="9623ed4c8f230e3a1aaaefab21bf5eb1497c6318c5eb21f31af3f6cfca0b9a66"
Mar 08 00:22:13.457841 master-0 kubenswrapper[7479]: I0308 00:22:13.457647 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"]
Mar 08 00:22:13.457841 master-0 kubenswrapper[7479]: E0308 00:22:13.457783 7479 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9623ed4c8f230e3a1aaaefab21bf5eb1497c6318c5eb21f31af3f6cfca0b9a66\": container with ID starting with 9623ed4c8f230e3a1aaaefab21bf5eb1497c6318c5eb21f31af3f6cfca0b9a66 not found: ID does not exist" containerID="9623ed4c8f230e3a1aaaefab21bf5eb1497c6318c5eb21f31af3f6cfca0b9a66"
Mar 08 00:22:13.457841 master-0 kubenswrapper[7479]: I0308 00:22:13.457810 7479 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9623ed4c8f230e3a1aaaefab21bf5eb1497c6318c5eb21f31af3f6cfca0b9a66"} err="failed to get container status \"9623ed4c8f230e3a1aaaefab21bf5eb1497c6318c5eb21f31af3f6cfca0b9a66\": rpc error: code = NotFound desc = could not find container \"9623ed4c8f230e3a1aaaefab21bf5eb1497c6318c5eb21f31af3f6cfca0b9a66\": container with ID starting with 9623ed4c8f230e3a1aaaefab21bf5eb1497c6318c5eb21f31af3f6cfca0b9a66 not found: ID does not exist"
Mar 08 00:22:13.459910 master-0 kubenswrapper[7479]: I0308 00:22:13.459874 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" event={"ID":"1751db13-b792-43e2-8459-d1d4a0164dfb","Type":"ContainerStarted","Data":"b2496d08ba7d24c47b88064d6a60a25e9b169662cfe39cc7b5569d25f4f5e236"}
Mar 08 00:22:13.459972 master-0 kubenswrapper[7479]: I0308 00:22:13.459919 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" event={"ID":"1751db13-b792-43e2-8459-d1d4a0164dfb","Type":"ContainerStarted","Data":"6eaa4eebadf626880d254857e0b5071188feb8436fd6122d3cce0a00f572ec73"}
Mar 08 00:22:13.470608 master-0 kubenswrapper[7479]: I0308 00:22:13.470578 7479 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"]
Mar 08 00:22:13.494822 master-0 kubenswrapper[7479]: I0308 00:22:13.494761 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-vm7rj" podStartSLOduration=1.494747656 podStartE2EDuration="1.494747656s" podCreationTimestamp="2026-03-08 00:22:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:22:13.493593299 +0000 UTC m=+49.806502226" watchObservedRunningTime="2026-03-08 00:22:13.494747656 +0000 UTC m=+49.807656573"
Mar 08 00:22:13.522251 master-0 kubenswrapper[7479]: I0308 00:22:13.521653 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" podStartSLOduration=10.521635568 podStartE2EDuration="10.521635568s" podCreationTimestamp="2026-03-08 00:22:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:22:13.520539102 +0000 UTC m=+49.833448019" watchObservedRunningTime="2026-03-08 00:22:13.521635568 +0000 UTC m=+49.834544485"
Mar 08 00:22:13.570045 master-0 kubenswrapper[7479]: I0308 00:22:13.570007 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8597858f97-kb2l8"]
Mar 08 00:22:13.570273 master-0 kubenswrapper[7479]: I0308 00:22:13.570261 7479 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-8597858f97-kb2l8"]
Mar 08 00:22:13.586583 master-0 kubenswrapper[7479]: I0308 00:22:13.586534 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" podStartSLOduration=11.586498884000001 podStartE2EDuration="11.586498884s" podCreationTimestamp="2026-03-08 00:22:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:22:13.578494271 +0000 UTC m=+49.891403188" watchObservedRunningTime="2026-03-08 00:22:13.586498884 +0000 UTC m=+49.899407801"
Mar 08 00:22:13.600687 master-0 kubenswrapper[7479]: I0308 00:22:13.599749 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" podStartSLOduration=7.681786379 podStartE2EDuration="22.599732657s" podCreationTimestamp="2026-03-08 00:21:51 +0000 UTC" firstStartedPulling="2026-03-08 00:21:55.869102836 +0000 UTC m=+32.182011753" lastFinishedPulling="2026-03-08 00:22:10.787049114 +0000 UTC m=+47.099958031" observedRunningTime="2026-03-08 00:22:13.597774513 +0000 UTC m=+49.910683430" watchObservedRunningTime="2026-03-08 00:22:13.599732657 +0000 UTC m=+49.912641574"
Mar 08 00:22:13.615905 master-0 kubenswrapper[7479]: I0308 00:22:13.615864 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh"]
Mar 08 00:22:13.725852 master-0 kubenswrapper[7479]: I0308 00:22:13.725795 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs"]
Mar 08 00:22:13.731977 master-0 kubenswrapper[7479]: W0308 00:22:13.731919 7479 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbcb0196_be5c_44a4_9749_5df9fbeaa718.slice/crio-ddbc9d4d3c5ffe04f1f188d461103a088e60e8f552f5a7337527098fe0216d97 WatchSource:0}: Error finding container ddbc9d4d3c5ffe04f1f188d461103a088e60e8f552f5a7337527098fe0216d97: Status 404 returned error can't find the container with id ddbc9d4d3c5ffe04f1f188d461103a088e60e8f552f5a7337527098fe0216d97
Mar 08 00:22:13.820562 master-0 kubenswrapper[7479]: I0308 00:22:13.820518 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"]
Mar 08 00:22:13.834652 master-0 kubenswrapper[7479]: W0308 00:22:13.834006 7479 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5df57519_dc14_4d18_8c24_cf2e6e122cff.slice/crio-44241d792c3cd70ffcac7a4439189c04cf4aca10694e440b3450c0a02c69f625 WatchSource:0}: Error finding container 44241d792c3cd70ffcac7a4439189c04cf4aca10694e440b3450c0a02c69f625: Status 404 returned error can't find the container with id 44241d792c3cd70ffcac7a4439189c04cf4aca10694e440b3450c0a02c69f625
Mar 08 00:22:13.916189 master-0 kubenswrapper[7479]: I0308 00:22:13.916131 7479 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32c19760-2cb2-4690-be8e-cba3c517c60e" path="/var/lib/kubelet/pods/32c19760-2cb2-4690-be8e-cba3c517c60e/volumes"
Mar 08 00:22:13.916704 master-0 kubenswrapper[7479]: I0308 00:22:13.916676 7479 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="426e2fcf-dfb6-4193-91ae-c6daef6e50b1" path="/var/lib/kubelet/pods/426e2fcf-dfb6-4193-91ae-c6daef6e50b1/volumes"
Mar 08 00:22:13.917224 master-0 kubenswrapper[7479]: I0308 00:22:13.917175 7479 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5837befc-f6e9-4f74-ae39-d0aec977f0c9" path="/var/lib/kubelet/pods/5837befc-f6e9-4f74-ae39-d0aec977f0c9/volumes"
Mar 08 00:22:13.918036 master-0 kubenswrapper[7479]: I0308 00:22:13.918007 7479 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb5babcb-5fef-44dc-b9e1-87e09a2b31c4" path="/var/lib/kubelet/pods/bb5babcb-5fef-44dc-b9e1-87e09a2b31c4/volumes"
Mar 08 00:22:14.156611 master-0 kubenswrapper[7479]: I0308 00:22:14.156492 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4r9ht"]
Mar 08 00:22:14.157323 master-0 kubenswrapper[7479]: I0308 00:22:14.157302 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4r9ht"
Mar 08 00:22:14.218151 master-0 kubenswrapper[7479]: I0308 00:22:14.218100 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4r9ht"]
Mar 08 00:22:14.267221 master-0 kubenswrapper[7479]: I0308 00:22:14.262859 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c644b9b-a551-48d2-8f16-e1a6da7d98c9-catalog-content\") pod \"redhat-marketplace-4r9ht\" (UID: \"6c644b9b-a551-48d2-8f16-e1a6da7d98c9\") " pod="openshift-marketplace/redhat-marketplace-4r9ht"
Mar 08 00:22:14.267221 master-0 kubenswrapper[7479]: I0308 00:22:14.263023 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c644b9b-a551-48d2-8f16-e1a6da7d98c9-utilities\") pod \"redhat-marketplace-4r9ht\" (UID: \"6c644b9b-a551-48d2-8f16-e1a6da7d98c9\") " pod="openshift-marketplace/redhat-marketplace-4r9ht"
Mar 08 00:22:14.267221 master-0 kubenswrapper[7479]: I0308 00:22:14.263126 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhnf7\" (UniqueName: \"kubernetes.io/projected/6c644b9b-a551-48d2-8f16-e1a6da7d98c9-kube-api-access-rhnf7\") pod \"redhat-marketplace-4r9ht\" (UID: \"6c644b9b-a551-48d2-8f16-e1a6da7d98c9\") " pod="openshift-marketplace/redhat-marketplace-4r9ht"
Mar 08 00:22:14.363930 master-0 kubenswrapper[7479]: I0308 00:22:14.363866 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c644b9b-a551-48d2-8f16-e1a6da7d98c9-catalog-content\") pod \"redhat-marketplace-4r9ht\" (UID: \"6c644b9b-a551-48d2-8f16-e1a6da7d98c9\") " pod="openshift-marketplace/redhat-marketplace-4r9ht"
Mar 08 00:22:14.364116 master-0 kubenswrapper[7479]: I0308 00:22:14.363945 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c644b9b-a551-48d2-8f16-e1a6da7d98c9-utilities\") pod \"redhat-marketplace-4r9ht\" (UID: \"6c644b9b-a551-48d2-8f16-e1a6da7d98c9\") " pod="openshift-marketplace/redhat-marketplace-4r9ht"
Mar 08 00:22:14.364116 master-0 kubenswrapper[7479]: I0308 00:22:14.363982 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhnf7\" (UniqueName: \"kubernetes.io/projected/6c644b9b-a551-48d2-8f16-e1a6da7d98c9-kube-api-access-rhnf7\") pod \"redhat-marketplace-4r9ht\" (UID: \"6c644b9b-a551-48d2-8f16-e1a6da7d98c9\") " pod="openshift-marketplace/redhat-marketplace-4r9ht"
Mar 08 00:22:14.364540 master-0 kubenswrapper[7479]: I0308 00:22:14.364416 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c644b9b-a551-48d2-8f16-e1a6da7d98c9-catalog-content\") pod \"redhat-marketplace-4r9ht\" (UID: \"6c644b9b-a551-48d2-8f16-e1a6da7d98c9\") " pod="openshift-marketplace/redhat-marketplace-4r9ht"
Mar 08 00:22:14.364711 master-0 kubenswrapper[7479]: I0308 00:22:14.364674 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c644b9b-a551-48d2-8f16-e1a6da7d98c9-utilities\") pod \"redhat-marketplace-4r9ht\" (UID: \"6c644b9b-a551-48d2-8f16-e1a6da7d98c9\") " pod="openshift-marketplace/redhat-marketplace-4r9ht"
Mar 08 00:22:14.379842 master-0 kubenswrapper[7479]: I0308 00:22:14.379800 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhnf7\" (UniqueName: \"kubernetes.io/projected/6c644b9b-a551-48d2-8f16-e1a6da7d98c9-kube-api-access-rhnf7\") pod \"redhat-marketplace-4r9ht\" (UID: \"6c644b9b-a551-48d2-8f16-e1a6da7d98c9\") " pod="openshift-marketplace/redhat-marketplace-4r9ht"
Mar 08 00:22:14.466530 master-0 kubenswrapper[7479]: I0308 00:22:14.466419 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"5df57519-dc14-4d18-8c24-cf2e6e122cff","Type":"ContainerStarted","Data":"8d418bba96a10317f3ab381a296f7b477288b766d8262911dbed8676ce28b625"}
Mar 08 00:22:14.466530 master-0 kubenswrapper[7479]: I0308 00:22:14.466471 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"5df57519-dc14-4d18-8c24-cf2e6e122cff","Type":"ContainerStarted","Data":"44241d792c3cd70ffcac7a4439189c04cf4aca10694e440b3450c0a02c69f625"}
Mar 08 00:22:14.469116 master-0 kubenswrapper[7479]: I0308 00:22:14.468906 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4r9ht"
Mar 08 00:22:14.469276 master-0 kubenswrapper[7479]: I0308 00:22:14.469243 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh" event={"ID":"70892c23-554d-466c-a526-90a799439fe0","Type":"ContainerStarted","Data":"ae7bb35d674e364ba7abb3d8a4e36b86062b8a56cb462417c0258160c034b1cd"}
Mar 08 00:22:14.469318 master-0 kubenswrapper[7479]: I0308 00:22:14.469281 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh" event={"ID":"70892c23-554d-466c-a526-90a799439fe0","Type":"ContainerStarted","Data":"1647ce1acf481d17be37f6cfd515be4f74eaddbda6620f025db77860f5acbd00"}
Mar 08 00:22:14.469830 master-0 kubenswrapper[7479]: I0308 00:22:14.469812 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh"
Mar 08 00:22:14.476119 master-0 kubenswrapper[7479]: I0308 00:22:14.476083 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh"
Mar 08 00:22:14.486494 master-0 kubenswrapper[7479]: I0308 00:22:14.486440 7479 generic.go:334] "Generic (PLEG): container finished" podID="8b94e1ca-5aef-49ae-928e-29cc0ce81d61" containerID="3e0f29e2f9929bf5e65bf1550df17f585847678dff9a47a9ff623356d95fb3eb" exitCode=0
Mar 08 00:22:14.486630 master-0 kubenswrapper[7479]: I0308 00:22:14.486530 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lqc4n" event={"ID":"8b94e1ca-5aef-49ae-928e-29cc0ce81d61","Type":"ContainerDied","Data":"3e0f29e2f9929bf5e65bf1550df17f585847678dff9a47a9ff623356d95fb3eb"}
Mar 08 00:22:14.486630 master-0 kubenswrapper[7479]: I0308 00:22:14.486575 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lqc4n" event={"ID":"8b94e1ca-5aef-49ae-928e-29cc0ce81d61","Type":"ContainerStarted","Data":"fbbedadab3e325405c3103b757378d37ed57beb86fa4dc9dfbd4a453372d9d42"}
Mar 08 00:22:14.489104 master-0 kubenswrapper[7479]: I0308 00:22:14.489036 7479 generic.go:334] "Generic (PLEG): container finished" podID="668ffbde-4771-43e1-8f0e-d4b5d17ff693" containerID="e639c8e4390f3d9bd210ac1cb787b51a40fdf81916ce52641e0a03a8306a158e" exitCode=0
Mar 08 00:22:14.489275 master-0 kubenswrapper[7479]: I0308 00:22:14.489232 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ms5vp" event={"ID":"668ffbde-4771-43e1-8f0e-d4b5d17ff693","Type":"ContainerDied","Data":"e639c8e4390f3d9bd210ac1cb787b51a40fdf81916ce52641e0a03a8306a158e"}
Mar 08 00:22:14.501321 master-0 kubenswrapper[7479]: I0308 00:22:14.499367 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" event={"ID":"cbcb0196-be5c-44a4-9749-5df9fbeaa718","Type":"ContainerStarted","Data":"92c985a5a70112d59265249efbf6fce7869432625027fbf9a567a14e08ff9807"}
Mar 08 00:22:14.501321 master-0 kubenswrapper[7479]: I0308 00:22:14.499415 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" event={"ID":"cbcb0196-be5c-44a4-9749-5df9fbeaa718","Type":"ContainerStarted","Data":"ddbc9d4d3c5ffe04f1f188d461103a088e60e8f552f5a7337527098fe0216d97"}
Mar 08 00:22:14.501751 master-0 kubenswrapper[7479]: I0308 00:22:14.501733 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs"
Mar 08 00:22:14.502073 master-0 kubenswrapper[7479]: I0308 00:22:14.501820 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-4-master-0" podStartSLOduration=2.50178721 podStartE2EDuration="2.50178721s" podCreationTimestamp="2026-03-08 00:22:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:22:14.494532443 +0000 UTC m=+50.807441370" watchObservedRunningTime="2026-03-08 00:22:14.50178721 +0000 UTC m=+50.814696167"
Mar 08 00:22:14.529225 master-0 kubenswrapper[7479]: I0308 00:22:14.511655 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs"
Mar 08 00:22:14.595222 master-0 kubenswrapper[7479]: I0308 00:22:14.593799 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh" podStartSLOduration=8.593779385 podStartE2EDuration="8.593779385s" podCreationTimestamp="2026-03-08 00:22:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:22:14.535446894 +0000 UTC m=+50.848355851" watchObservedRunningTime="2026-03-08 00:22:14.593779385 +0000 UTC m=+50.906688302"
Mar 08 00:22:14.651886 master-0 kubenswrapper[7479]: I0308 00:22:14.651634 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" podStartSLOduration=8.651619011 podStartE2EDuration="8.651619011s" podCreationTimestamp="2026-03-08 00:22:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:22:14.648381745 +0000 UTC m=+50.961290662" watchObservedRunningTime="2026-03-08 00:22:14.651619011 +0000 UTC m=+50.964527928"
Mar 08 00:22:14.982501 master-0 kubenswrapper[7479]: I0308 00:22:14.982445 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4r9ht"]
Mar 08 00:22:15.219520 master-0 kubenswrapper[7479]: I0308 00:22:15.219436 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mr22p"]
Mar 08 00:22:15.220369 master-0 kubenswrapper[7479]: I0308 00:22:15.220314 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mr22p"
Mar 08 00:22:15.256133 master-0 kubenswrapper[7479]: I0308 00:22:15.256066 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mr22p"]
Mar 08 00:22:15.274993 master-0 kubenswrapper[7479]: I0308 00:22:15.274936 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07f9c188-df80-4606-9a21-72228cffa706-utilities\") pod \"redhat-operators-mr22p\" (UID: \"07f9c188-df80-4606-9a21-72228cffa706\") " pod="openshift-marketplace/redhat-operators-mr22p"
Mar 08 00:22:15.275167 master-0 kubenswrapper[7479]: I0308 00:22:15.275070 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07f9c188-df80-4606-9a21-72228cffa706-catalog-content\") pod \"redhat-operators-mr22p\" (UID: \"07f9c188-df80-4606-9a21-72228cffa706\") " pod="openshift-marketplace/redhat-operators-mr22p"
Mar 08 00:22:15.275167 master-0 kubenswrapper[7479]: I0308 00:22:15.275161 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t44t4\" (UniqueName: \"kubernetes.io/projected/07f9c188-df80-4606-9a21-72228cffa706-kube-api-access-t44t4\") pod \"redhat-operators-mr22p\" (UID: \"07f9c188-df80-4606-9a21-72228cffa706\") " pod="openshift-marketplace/redhat-operators-mr22p"
Mar 08 00:22:15.376465 master-0 kubenswrapper[7479]: I0308 00:22:15.376394 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07f9c188-df80-4606-9a21-72228cffa706-utilities\") pod \"redhat-operators-mr22p\" (UID: \"07f9c188-df80-4606-9a21-72228cffa706\") " pod="openshift-marketplace/redhat-operators-mr22p"
Mar 08 00:22:15.376730 master-0 kubenswrapper[7479]: I0308 00:22:15.376680 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07f9c188-df80-4606-9a21-72228cffa706-catalog-content\") pod \"redhat-operators-mr22p\" (UID: \"07f9c188-df80-4606-9a21-72228cffa706\") " pod="openshift-marketplace/redhat-operators-mr22p"
Mar 08 00:22:15.376851 master-0 kubenswrapper[7479]: I0308 00:22:15.376818 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t44t4\" (UniqueName: \"kubernetes.io/projected/07f9c188-df80-4606-9a21-72228cffa706-kube-api-access-t44t4\") pod \"redhat-operators-mr22p\" (UID: \"07f9c188-df80-4606-9a21-72228cffa706\") " pod="openshift-marketplace/redhat-operators-mr22p"
Mar 08 00:22:15.377008 master-0 kubenswrapper[7479]: I0308 00:22:15.376980 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07f9c188-df80-4606-9a21-72228cffa706-utilities\") pod \"redhat-operators-mr22p\" (UID: \"07f9c188-df80-4606-9a21-72228cffa706\") " pod="openshift-marketplace/redhat-operators-mr22p"
Mar 08 00:22:15.377513 master-0 kubenswrapper[7479]: I0308 00:22:15.377480 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07f9c188-df80-4606-9a21-72228cffa706-catalog-content\") pod \"redhat-operators-mr22p\" (UID: \"07f9c188-df80-4606-9a21-72228cffa706\") " pod="openshift-marketplace/redhat-operators-mr22p"
Mar 08 00:22:15.511391 master-0 kubenswrapper[7479]: I0308 00:22:15.511344 7479 generic.go:334] "Generic (PLEG): container finished" podID="6c644b9b-a551-48d2-8f16-e1a6da7d98c9" containerID="989c0be29898f604cd52cd2114aa3064cf0c55ea5a9ce0b189962fd1f75c107c" exitCode=0
Mar 08 00:22:15.511603 master-0 kubenswrapper[7479]: I0308 00:22:15.511445 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4r9ht" event={"ID":"6c644b9b-a551-48d2-8f16-e1a6da7d98c9","Type":"ContainerDied","Data":"989c0be29898f604cd52cd2114aa3064cf0c55ea5a9ce0b189962fd1f75c107c"}
Mar 08 00:22:15.511603 master-0 kubenswrapper[7479]: I0308 00:22:15.511486 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4r9ht" event={"ID":"6c644b9b-a551-48d2-8f16-e1a6da7d98c9","Type":"ContainerStarted","Data":"6522e09e0271dba6e7e1bcdc92fb3a4714286d0628b2288932b8a0a7d3281419"}
Mar 08 00:22:16.578586 master-0 kubenswrapper[7479]: I0308 00:22:16.578542 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t44t4\" (UniqueName: \"kubernetes.io/projected/07f9c188-df80-4606-9a21-72228cffa706-kube-api-access-t44t4\") pod \"redhat-operators-mr22p\" (UID: \"07f9c188-df80-4606-9a21-72228cffa706\") " pod="openshift-marketplace/redhat-operators-mr22p"
Mar 08 00:22:16.750136 master-0 kubenswrapper[7479]: I0308 00:22:16.750073 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mr22p"
Mar 08 00:22:17.626688 master-0 kubenswrapper[7479]: I0308 00:22:17.626156 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w"
Mar 08 00:22:17.626688 master-0 kubenswrapper[7479]: I0308 00:22:17.626629 7479 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w"
Mar 08 00:22:18.253642 master-0 kubenswrapper[7479]: I0308 00:22:18.253574 7479 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44"
Mar 08 00:22:18.253642 master-0 kubenswrapper[7479]: I0308 00:22:18.253629 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44"
Mar 08 00:22:21.675950 master-0 kubenswrapper[7479]: I0308 00:22:21.675892 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-jfjzg"
Mar 08 00:22:22.010485 master-0 kubenswrapper[7479]: I0308 00:22:21.987665 7479 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w"
Mar 08 00:22:22.010485 master-0 kubenswrapper[7479]: I0308 00:22:21.993391 7479 patch_prober.go:28] interesting pod/apiserver-85cb8cb9bb-bmx44 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Mar 08 00:22:22.010485 master-0 kubenswrapper[7479]: [+]log ok
Mar 08 00:22:22.010485 master-0 kubenswrapper[7479]: [+]etcd ok
Mar 08 00:22:22.010485 master-0 kubenswrapper[7479]: [+]poststarthook/start-apiserver-admission-initializer ok
Mar 08 00:22:22.010485 master-0 kubenswrapper[7479]: [+]poststarthook/generic-apiserver-start-informers ok
Mar 08 00:22:22.010485 master-0 kubenswrapper[7479]: [+]poststarthook/max-in-flight-filter ok
Mar 08 00:22:22.010485 master-0 kubenswrapper[7479]: [+]poststarthook/storage-object-count-tracker-hook ok
Mar 08 00:22:22.010485 master-0 kubenswrapper[7479]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Mar 08 00:22:22.010485 master-0 kubenswrapper[7479]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Mar 08 00:22:22.010485 master-0 kubenswrapper[7479]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok
Mar 08 00:22:22.010485 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectcache ok
Mar 08 00:22:22.010485 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Mar 08 00:22:22.010485 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-startinformers ok
Mar 08 00:22:22.010485 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-restmapperupdater ok
Mar 08 00:22:22.010485 master-0 kubenswrapper[7479]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Mar 08 00:22:22.010485 master-0 kubenswrapper[7479]: livez check failed
Mar 08 00:22:22.010485 master-0 kubenswrapper[7479]: I0308 00:22:21.993443 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" podUID="1751db13-b792-43e2-8459-d1d4a0164dfb" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:22:22.010485 master-0 kubenswrapper[7479]: I0308 00:22:21.998173 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w"
Mar 08 00:22:23.614342 master-0 kubenswrapper[7479]: I0308 00:22:23.614302 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs"
Mar 08 00:22:24.761872 master-0 kubenswrapper[7479]: I0308 00:22:24.757729 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q"
Mar 08 00:22:25.143859 master-0 kubenswrapper[7479]: I0308 00:22:25.140616 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mr22p"]
Mar 08 00:22:25.157749 master-0 kubenswrapper[7479]: W0308 00:22:25.157705 7479 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07f9c188_df80_4606_9a21_72228cffa706.slice/crio-9813fb2b0913beedd59707dab5262a0c2df306a822641a8265719695a9f73624 WatchSource:0}: Error finding container 9813fb2b0913beedd59707dab5262a0c2df306a822641a8265719695a9f73624: Status 404 returned error can't find the container with id 9813fb2b0913beedd59707dab5262a0c2df306a822641a8265719695a9f73624
Mar 08 00:22:25.777902 master-0 kubenswrapper[7479]: I0308 00:22:25.777842 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mr22p" event={"ID":"07f9c188-df80-4606-9a21-72228cffa706","Type":"ContainerStarted","Data":"9813fb2b0913beedd59707dab5262a0c2df306a822641a8265719695a9f73624"}
Mar 08 00:22:25.931817 master-0 kubenswrapper[7479]: I0308 00:22:25.931692 7479 patch_prober.go:28] interesting pod/apiserver-85cb8cb9bb-bmx44 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Mar 08 00:22:25.931817 master-0 kubenswrapper[7479]: [+]log ok
Mar 08 00:22:25.931817 master-0 kubenswrapper[7479]: [+]etcd ok
Mar 08 00:22:25.931817 master-0 kubenswrapper[7479]: [+]poststarthook/start-apiserver-admission-initializer ok
Mar 08 00:22:25.931817 master-0 kubenswrapper[7479]: [+]poststarthook/generic-apiserver-start-informers ok
Mar 08 00:22:25.931817 master-0 kubenswrapper[7479]: [+]poststarthook/max-in-flight-filter ok
Mar 08 00:22:25.931817 master-0 kubenswrapper[7479]: [+]poststarthook/storage-object-count-tracker-hook ok
Mar 08 00:22:25.931817 master-0 kubenswrapper[7479]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Mar 08 00:22:25.931817 master-0 kubenswrapper[7479]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Mar 08 00:22:25.931817 master-0 kubenswrapper[7479]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok
Mar 08 00:22:25.931817 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectcache ok
Mar 08 00:22:25.931817 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Mar 08 00:22:25.931817 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-startinformers ok
Mar 08 00:22:25.931817 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-restmapperupdater ok
Mar 08 00:22:25.931817 master-0 kubenswrapper[7479]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Mar 08 00:22:25.931817 master-0 kubenswrapper[7479]: livez check failed
Mar 08 00:22:25.931817 master-0 kubenswrapper[7479]: I0308 00:22:25.931763 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" podUID="1751db13-b792-43e2-8459-d1d4a0164dfb" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:22:26.784605 master-0 kubenswrapper[7479]: I0308 00:22:26.784559 7479 generic.go:334] "Generic (PLEG): container finished" podID="07f9c188-df80-4606-9a21-72228cffa706" containerID="422de388af3948465d14ca0009d8b790d44fde8b69f70afddfa992982d03c967" exitCode=0
Mar 08 00:22:26.784605 master-0 kubenswrapper[7479]: I0308 00:22:26.784605 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mr22p" event={"ID":"07f9c188-df80-4606-9a21-72228cffa706","Type":"ContainerDied","Data":"422de388af3948465d14ca0009d8b790d44fde8b69f70afddfa992982d03c967"}
Mar 08 00:22:28.352830 master-0 kubenswrapper[7479]: I0308 00:22:28.352436 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"]
Mar 08 00:22:28.352830 master-0 kubenswrapper[7479]: I0308 00:22:28.352680 7479 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-4-master-0" podUID="5df57519-dc14-4d18-8c24-cf2e6e122cff" containerName="installer" containerID="cri-o://8d418bba96a10317f3ab381a296f7b477288b766d8262911dbed8676ce28b625" gracePeriod=30
Mar 08 00:22:28.353434 master-0 kubenswrapper[7479]: I0308 00:22:28.353299 7479 patch_prober.go:28] interesting pod/apiserver-85cb8cb9bb-bmx44 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Mar 08 00:22:28.353434 master-0 kubenswrapper[7479]: [+]log ok
Mar 08 00:22:28.353434 master-0 kubenswrapper[7479]: [+]etcd ok
Mar 08 00:22:28.353434 master-0 kubenswrapper[7479]: [+]poststarthook/start-apiserver-admission-initializer ok
Mar 08 00:22:28.353434 master-0 kubenswrapper[7479]: [+]poststarthook/generic-apiserver-start-informers ok
Mar 08 00:22:28.353434 master-0 kubenswrapper[7479]: [+]poststarthook/max-in-flight-filter ok
Mar 08 00:22:28.353434 master-0 kubenswrapper[7479]: [+]poststarthook/storage-object-count-tracker-hook ok
Mar 08 00:22:28.353434 master-0 kubenswrapper[7479]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Mar 08 00:22:28.353434 master-0 kubenswrapper[7479]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Mar 08 00:22:28.353434 master-0 kubenswrapper[7479]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok
Mar 08 00:22:28.353434 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectcache ok
Mar 08 00:22:28.353434 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Mar 08 00:22:28.353434 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-startinformers ok
Mar 08 00:22:28.353434 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-restmapperupdater ok
Mar 08 00:22:28.353434 master-0 kubenswrapper[7479]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Mar 08 00:22:28.353434 master-0 kubenswrapper[7479]: livez check failed
Mar 08 00:22:28.353434 master-0 kubenswrapper[7479]: I0308 00:22:28.353354 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" podUID="1751db13-b792-43e2-8459-d1d4a0164dfb" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:22:28.408180 master-0 kubenswrapper[7479]: I0308 00:22:28.408132 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"]
Mar 08 00:22:28.408660 master-0 kubenswrapper[7479]: I0308 00:22:28.408608 7479 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/installer-1-master-0" podUID="ada20442-bff5-477c-989e-3d921f5ede5e" containerName="installer" containerID="cri-o://15d74f4f21139026e1f17a65ff4323887705b24a5c623f5887aaa69f1485ac9b" gracePeriod=30
Mar 08 00:22:28.581512 master-0 kubenswrapper[7479]: I0308 00:22:28.579267 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"]
Mar 08 00:22:28.581512 master-0 kubenswrapper[7479]: I0308 00:22:28.581451 7479 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Mar 08 00:22:28.623280 master-0 kubenswrapper[7479]: I0308 00:22:28.622741 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"] Mar 08 00:22:28.713038 master-0 kubenswrapper[7479]: I0308 00:22:28.712779 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21dd42b1-2628-4a24-97e7-6759888ed316-kube-api-access\") pod \"installer-5-master-0\" (UID: \"21dd42b1-2628-4a24-97e7-6759888ed316\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 08 00:22:28.713038 master-0 kubenswrapper[7479]: I0308 00:22:28.712846 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/21dd42b1-2628-4a24-97e7-6759888ed316-var-lock\") pod \"installer-5-master-0\" (UID: \"21dd42b1-2628-4a24-97e7-6759888ed316\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 08 00:22:28.713038 master-0 kubenswrapper[7479]: I0308 00:22:28.712929 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/21dd42b1-2628-4a24-97e7-6759888ed316-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"21dd42b1-2628-4a24-97e7-6759888ed316\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 08 00:22:28.818057 master-0 kubenswrapper[7479]: I0308 00:22:28.817800 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/21dd42b1-2628-4a24-97e7-6759888ed316-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"21dd42b1-2628-4a24-97e7-6759888ed316\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 08 00:22:28.818057 master-0 kubenswrapper[7479]: I0308 00:22:28.817855 7479 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21dd42b1-2628-4a24-97e7-6759888ed316-kube-api-access\") pod \"installer-5-master-0\" (UID: \"21dd42b1-2628-4a24-97e7-6759888ed316\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 08 00:22:28.818057 master-0 kubenswrapper[7479]: I0308 00:22:28.817880 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/21dd42b1-2628-4a24-97e7-6759888ed316-var-lock\") pod \"installer-5-master-0\" (UID: \"21dd42b1-2628-4a24-97e7-6759888ed316\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 08 00:22:28.818057 master-0 kubenswrapper[7479]: I0308 00:22:28.817974 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/21dd42b1-2628-4a24-97e7-6759888ed316-var-lock\") pod \"installer-5-master-0\" (UID: \"21dd42b1-2628-4a24-97e7-6759888ed316\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 08 00:22:28.818057 master-0 kubenswrapper[7479]: I0308 00:22:28.818009 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/21dd42b1-2628-4a24-97e7-6759888ed316-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"21dd42b1-2628-4a24-97e7-6759888ed316\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 08 00:22:28.918904 master-0 kubenswrapper[7479]: I0308 00:22:28.918161 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21dd42b1-2628-4a24-97e7-6759888ed316-kube-api-access\") pod \"installer-5-master-0\" (UID: \"21dd42b1-2628-4a24-97e7-6759888ed316\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 08 00:22:28.929324 master-0 kubenswrapper[7479]: I0308 00:22:28.929280 7479 kubelet.go:2431] "SyncLoop REMOVE" source="file" 
pods=["openshift-etcd/etcd-master-0-master-0"] Mar 08 00:22:28.929564 master-0 kubenswrapper[7479]: I0308 00:22:28.929537 7479 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0-master-0" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcd" containerID="cri-o://b999c6f84ef35141ea9d9157df896d14bb08340f5b7476591f3ed6362f2a6196" gracePeriod=30 Mar 08 00:22:28.929564 master-0 kubenswrapper[7479]: I0308 00:22:28.929528 7479 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0-master-0" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcdctl" containerID="cri-o://da60beba23659d143e9020dc0409825d88a4d10b35b445c12b13ae8fc1310bdf" gracePeriod=30 Mar 08 00:22:28.931058 master-0 kubenswrapper[7479]: I0308 00:22:28.931038 7479 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-master-0"] Mar 08 00:22:28.931379 master-0 kubenswrapper[7479]: E0308 00:22:28.931364 7479 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcd" Mar 08 00:22:28.931463 master-0 kubenswrapper[7479]: I0308 00:22:28.931452 7479 state_mem.go:107] "Deleted CPUSet assignment" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcd" Mar 08 00:22:28.931552 master-0 kubenswrapper[7479]: E0308 00:22:28.931541 7479 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcdctl" Mar 08 00:22:28.931619 master-0 kubenswrapper[7479]: I0308 00:22:28.931609 7479 state_mem.go:107] "Deleted CPUSet assignment" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcdctl" Mar 08 00:22:28.931839 master-0 kubenswrapper[7479]: I0308 00:22:28.931827 7479 memory_manager.go:354] "RemoveStaleState removing state" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcdctl" Mar 08 00:22:28.931909 master-0 kubenswrapper[7479]: I0308 00:22:28.931900 
7479 memory_manager.go:354] "RemoveStaleState removing state" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcd" Mar 08 00:22:28.939636 master-0 kubenswrapper[7479]: I0308 00:22:28.939599 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 08 00:22:29.122135 master-0 kubenswrapper[7479]: I0308 00:22:29.122065 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:22:29.122135 master-0 kubenswrapper[7479]: I0308 00:22:29.122124 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:22:29.122408 master-0 kubenswrapper[7479]: I0308 00:22:29.122148 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:22:29.122408 master-0 kubenswrapper[7479]: I0308 00:22:29.122174 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:22:29.122408 master-0 kubenswrapper[7479]: I0308 00:22:29.122268 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:22:29.122514 master-0 kubenswrapper[7479]: I0308 00:22:29.122416 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:22:29.210756 master-0 kubenswrapper[7479]: I0308 00:22:29.210635 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Mar 08 00:22:29.223740 master-0 kubenswrapper[7479]: I0308 00:22:29.223710 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:22:29.223807 master-0 kubenswrapper[7479]: I0308 00:22:29.223742 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:22:29.223807 master-0 kubenswrapper[7479]: I0308 00:22:29.223760 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:22:29.223807 master-0 kubenswrapper[7479]: I0308 00:22:29.223777 7479 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:22:29.223807 master-0 kubenswrapper[7479]: I0308 00:22:29.223793 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:22:29.223923 master-0 kubenswrapper[7479]: I0308 00:22:29.223819 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:22:29.223923 master-0 kubenswrapper[7479]: I0308 00:22:29.223889 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:22:29.223923 master-0 kubenswrapper[7479]: I0308 00:22:29.223923 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:22:29.224011 master-0 kubenswrapper[7479]: I0308 00:22:29.223944 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " 
pod="openshift-etcd/etcd-master-0" Mar 08 00:22:29.224011 master-0 kubenswrapper[7479]: I0308 00:22:29.223963 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:22:29.224011 master-0 kubenswrapper[7479]: I0308 00:22:29.223983 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:22:29.224011 master-0 kubenswrapper[7479]: I0308 00:22:29.224003 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:22:31.806964 master-0 kubenswrapper[7479]: I0308 00:22:31.806889 7479 generic.go:334] "Generic (PLEG): container finished" podID="4217b755-ca87-45cf-9e52-7b2681660f41" containerID="6c847624822fb2ae11b6027b5155999eb848a04181b2d105ba183b9e9a68d9b4" exitCode=0 Mar 08 00:22:31.806964 master-0 kubenswrapper[7479]: I0308 00:22:31.806933 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"4217b755-ca87-45cf-9e52-7b2681660f41","Type":"ContainerDied","Data":"6c847624822fb2ae11b6027b5155999eb848a04181b2d105ba183b9e9a68d9b4"} Mar 08 00:22:33.716120 master-0 kubenswrapper[7479]: I0308 00:22:33.716053 7479 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 08 00:22:33.815508 master-0 kubenswrapper[7479]: I0308 00:22:33.815289 7479 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 08 00:22:33.815508 master-0 kubenswrapper[7479]: I0308 00:22:33.815186 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"4217b755-ca87-45cf-9e52-7b2681660f41","Type":"ContainerDied","Data":"dc0f970c88c1737a47be41b249ed6c2014805b33e5ea7b0be6fb9cb719bf9d5b"} Mar 08 00:22:33.815508 master-0 kubenswrapper[7479]: I0308 00:22:33.815451 7479 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc0f970c88c1737a47be41b249ed6c2014805b33e5ea7b0be6fb9cb719bf9d5b" Mar 08 00:22:33.860299 master-0 kubenswrapper[7479]: I0308 00:22:33.860235 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4217b755-ca87-45cf-9e52-7b2681660f41-var-lock\") pod \"4217b755-ca87-45cf-9e52-7b2681660f41\" (UID: \"4217b755-ca87-45cf-9e52-7b2681660f41\") " Mar 08 00:22:33.860497 master-0 kubenswrapper[7479]: I0308 00:22:33.860366 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4217b755-ca87-45cf-9e52-7b2681660f41-var-lock" (OuterVolumeSpecName: "var-lock") pod "4217b755-ca87-45cf-9e52-7b2681660f41" (UID: "4217b755-ca87-45cf-9e52-7b2681660f41"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:22:33.860497 master-0 kubenswrapper[7479]: I0308 00:22:33.860397 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4217b755-ca87-45cf-9e52-7b2681660f41-kubelet-dir\") pod \"4217b755-ca87-45cf-9e52-7b2681660f41\" (UID: \"4217b755-ca87-45cf-9e52-7b2681660f41\") " Mar 08 00:22:33.860497 master-0 kubenswrapper[7479]: I0308 00:22:33.860448 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4217b755-ca87-45cf-9e52-7b2681660f41-kube-api-access\") pod \"4217b755-ca87-45cf-9e52-7b2681660f41\" (UID: \"4217b755-ca87-45cf-9e52-7b2681660f41\") " Mar 08 00:22:33.860702 master-0 kubenswrapper[7479]: I0308 00:22:33.860514 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4217b755-ca87-45cf-9e52-7b2681660f41-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4217b755-ca87-45cf-9e52-7b2681660f41" (UID: "4217b755-ca87-45cf-9e52-7b2681660f41"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:22:33.860794 master-0 kubenswrapper[7479]: I0308 00:22:33.860767 7479 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4217b755-ca87-45cf-9e52-7b2681660f41-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 00:22:33.860794 master-0 kubenswrapper[7479]: I0308 00:22:33.860783 7479 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4217b755-ca87-45cf-9e52-7b2681660f41-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 00:22:33.866682 master-0 kubenswrapper[7479]: I0308 00:22:33.866638 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4217b755-ca87-45cf-9e52-7b2681660f41-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4217b755-ca87-45cf-9e52-7b2681660f41" (UID: "4217b755-ca87-45cf-9e52-7b2681660f41"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:22:33.962079 master-0 kubenswrapper[7479]: I0308 00:22:33.961969 7479 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4217b755-ca87-45cf-9e52-7b2681660f41-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 00:22:38.837673 master-0 kubenswrapper[7479]: I0308 00:22:38.837626 7479 generic.go:334] "Generic (PLEG): container finished" podID="8b94e1ca-5aef-49ae-928e-29cc0ce81d61" containerID="490b7ecde993a5c1e64ebdba9e4f11aa9028720e17a25cf6dd0957ad32e5b9a3" exitCode=0 Mar 08 00:22:38.838174 master-0 kubenswrapper[7479]: I0308 00:22:38.837671 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lqc4n" event={"ID":"8b94e1ca-5aef-49ae-928e-29cc0ce81d61","Type":"ContainerDied","Data":"490b7ecde993a5c1e64ebdba9e4f11aa9028720e17a25cf6dd0957ad32e5b9a3"} Mar 08 00:22:38.839982 master-0 kubenswrapper[7479]: I0308 00:22:38.839331 7479 generic.go:334] "Generic (PLEG): container finished" podID="6c644b9b-a551-48d2-8f16-e1a6da7d98c9" containerID="d6ae963fd70f7061dfef7c8b6ee26bdbd4f75ddaaff7d7835ce22ba22a0fa9c1" exitCode=0 Mar 08 00:22:38.839982 master-0 kubenswrapper[7479]: I0308 00:22:38.839409 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4r9ht" event={"ID":"6c644b9b-a551-48d2-8f16-e1a6da7d98c9","Type":"ContainerDied","Data":"d6ae963fd70f7061dfef7c8b6ee26bdbd4f75ddaaff7d7835ce22ba22a0fa9c1"} Mar 08 00:22:38.841409 master-0 kubenswrapper[7479]: I0308 00:22:38.841383 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8565d84698-49hzm_ef0a3c84-98bb-4915-9010-d66fcbeafe09/openshift-controller-manager-operator/0.log" Mar 08 00:22:38.841461 master-0 kubenswrapper[7479]: I0308 00:22:38.841423 7479 generic.go:334] "Generic (PLEG): container finished" 
podID="ef0a3c84-98bb-4915-9010-d66fcbeafe09" containerID="ba0bd870ef36ff11021b6ac2e87095fcc7b137992295cf86faa86e55d1530ce8" exitCode=1 Mar 08 00:22:38.841492 master-0 kubenswrapper[7479]: I0308 00:22:38.841476 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-49hzm" event={"ID":"ef0a3c84-98bb-4915-9010-d66fcbeafe09","Type":"ContainerDied","Data":"ba0bd870ef36ff11021b6ac2e87095fcc7b137992295cf86faa86e55d1530ce8"} Mar 08 00:22:38.841718 master-0 kubenswrapper[7479]: I0308 00:22:38.841705 7479 scope.go:117] "RemoveContainer" containerID="ba0bd870ef36ff11021b6ac2e87095fcc7b137992295cf86faa86e55d1530ce8" Mar 08 00:22:38.844476 master-0 kubenswrapper[7479]: I0308 00:22:38.843032 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mr22p" event={"ID":"07f9c188-df80-4606-9a21-72228cffa706","Type":"ContainerStarted","Data":"345069ea2b561934cab76bb302d8a23a6e9e55551f6cb176569e5310c259abec"} Mar 08 00:22:38.846482 master-0 kubenswrapper[7479]: I0308 00:22:38.846451 7479 generic.go:334] "Generic (PLEG): container finished" podID="668ffbde-4771-43e1-8f0e-d4b5d17ff693" containerID="83487d45dedbfe9b5fe7bba1c70e4990a428d9f9c9fb3cb86a8a3aa56bb1ac0b" exitCode=0 Mar 08 00:22:38.846549 master-0 kubenswrapper[7479]: I0308 00:22:38.846481 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ms5vp" event={"ID":"668ffbde-4771-43e1-8f0e-d4b5d17ff693","Type":"ContainerDied","Data":"83487d45dedbfe9b5fe7bba1c70e4990a428d9f9c9fb3cb86a8a3aa56bb1ac0b"} Mar 08 00:22:39.855850 master-0 kubenswrapper[7479]: I0308 00:22:39.855751 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8565d84698-49hzm_ef0a3c84-98bb-4915-9010-d66fcbeafe09/openshift-controller-manager-operator/0.log" Mar 08 00:22:39.856658 master-0 
kubenswrapper[7479]: I0308 00:22:39.855898 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-49hzm" event={"ID":"ef0a3c84-98bb-4915-9010-d66fcbeafe09","Type":"ContainerStarted","Data":"5aac2b21c945fd8c5f04ccb41b60633f9bb7e3c9d3e901a7648d97792b4bc569"} Mar 08 00:22:39.858354 master-0 kubenswrapper[7479]: I0308 00:22:39.858318 7479 generic.go:334] "Generic (PLEG): container finished" podID="07f9c188-df80-4606-9a21-72228cffa706" containerID="345069ea2b561934cab76bb302d8a23a6e9e55551f6cb176569e5310c259abec" exitCode=0 Mar 08 00:22:39.858430 master-0 kubenswrapper[7479]: I0308 00:22:39.858371 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mr22p" event={"ID":"07f9c188-df80-4606-9a21-72228cffa706","Type":"ContainerDied","Data":"345069ea2b561934cab76bb302d8a23a6e9e55551f6cb176569e5310c259abec"} Mar 08 00:22:40.722793 master-0 kubenswrapper[7479]: I0308 00:22:40.722734 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body= Mar 08 00:22:40.722974 master-0 kubenswrapper[7479]: I0308 00:22:40.722791 7479 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" Mar 08 00:22:41.962912 master-0 kubenswrapper[7479]: E0308 00:22:41.962819 7479 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" 
pod="openshift-etcd/etcd-master-0" Mar 08 00:22:41.963579 master-0 kubenswrapper[7479]: I0308 00:22:41.963530 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 08 00:22:42.260923 master-0 kubenswrapper[7479]: I0308 00:22:42.260866 7479 patch_prober.go:28] interesting pod/apiserver-85cb8cb9bb-bmx44 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 08 00:22:42.260923 master-0 kubenswrapper[7479]: [+]log ok Mar 08 00:22:42.260923 master-0 kubenswrapper[7479]: [-]etcd failed: reason withheld Mar 08 00:22:42.260923 master-0 kubenswrapper[7479]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 08 00:22:42.260923 master-0 kubenswrapper[7479]: [+]poststarthook/generic-apiserver-start-informers ok Mar 08 00:22:42.260923 master-0 kubenswrapper[7479]: [+]poststarthook/max-in-flight-filter ok Mar 08 00:22:42.260923 master-0 kubenswrapper[7479]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 08 00:22:42.260923 master-0 kubenswrapper[7479]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 08 00:22:42.260923 master-0 kubenswrapper[7479]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 08 00:22:42.260923 master-0 kubenswrapper[7479]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Mar 08 00:22:42.260923 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectcache ok Mar 08 00:22:42.260923 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 08 00:22:42.260923 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-startinformers ok Mar 08 00:22:42.260923 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 08 00:22:42.260923 master-0 kubenswrapper[7479]: 
[+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 08 00:22:42.260923 master-0 kubenswrapper[7479]: livez check failed Mar 08 00:22:42.260923 master-0 kubenswrapper[7479]: I0308 00:22:42.260920 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" podUID="1751db13-b792-43e2-8459-d1d4a0164dfb" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:22:42.876302 master-0 kubenswrapper[7479]: I0308 00:22:42.876073 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lqc4n" event={"ID":"8b94e1ca-5aef-49ae-928e-29cc0ce81d61","Type":"ContainerStarted","Data":"792385f3b070b6699aa94569fbbc4236ccf69daea01ea51c61866317c4985b03"} Mar 08 00:22:42.879991 master-0 kubenswrapper[7479]: I0308 00:22:42.879858 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_ada20442-bff5-477c-989e-3d921f5ede5e/installer/0.log" Mar 08 00:22:42.879991 master-0 kubenswrapper[7479]: I0308 00:22:42.879901 7479 generic.go:334] "Generic (PLEG): container finished" podID="ada20442-bff5-477c-989e-3d921f5ede5e" containerID="15d74f4f21139026e1f17a65ff4323887705b24a5c623f5887aaa69f1485ac9b" exitCode=1 Mar 08 00:22:42.879991 master-0 kubenswrapper[7479]: I0308 00:22:42.879946 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"ada20442-bff5-477c-989e-3d921f5ede5e","Type":"ContainerDied","Data":"15d74f4f21139026e1f17a65ff4323887705b24a5c623f5887aaa69f1485ac9b"} Mar 08 00:22:42.882836 master-0 kubenswrapper[7479]: I0308 00:22:42.882311 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4r9ht" event={"ID":"6c644b9b-a551-48d2-8f16-e1a6da7d98c9","Type":"ContainerStarted","Data":"936db645ff2b40de0fcbea2669720f0e2d16e56c3a9987fff0ee1a1cff12a3c2"} Mar 08 
00:22:42.890239 master-0 kubenswrapper[7479]: I0308 00:22:42.889683 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mr22p" event={"ID":"07f9c188-df80-4606-9a21-72228cffa706","Type":"ContainerStarted","Data":"0a1a00b75e133f489a7a0acfadc6ee256a844ce0902dd263ed7d688506f410c0"} Mar 08 00:22:42.894236 master-0 kubenswrapper[7479]: I0308 00:22:42.891616 7479 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="182e67e6b82b83c4d47d4c01d3dcbdede2056c9bcdcf8367c8a6959d0eeac8ea" exitCode=0 Mar 08 00:22:42.894236 master-0 kubenswrapper[7479]: I0308 00:22:42.891675 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerDied","Data":"182e67e6b82b83c4d47d4c01d3dcbdede2056c9bcdcf8367c8a6959d0eeac8ea"} Mar 08 00:22:42.894236 master-0 kubenswrapper[7479]: I0308 00:22:42.891696 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"ce67cd1e37e90c976b5eb1d98a8adbdd3c36380a0d4d75edb38584db8eeda1f5"} Mar 08 00:22:42.894236 master-0 kubenswrapper[7479]: I0308 00:22:42.894106 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ms5vp" event={"ID":"668ffbde-4771-43e1-8f0e-d4b5d17ff693","Type":"ContainerStarted","Data":"637e374cbb2d700466609d264cbc2ba0c4e3852a252708b6a9d14095bf02d269"} Mar 08 00:22:42.970443 master-0 kubenswrapper[7479]: I0308 00:22:42.965381 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ms5vp" Mar 08 00:22:42.977246 master-0 kubenswrapper[7479]: I0308 00:22:42.971567 7479 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ms5vp" Mar 08 00:22:43.005289 master-0 
kubenswrapper[7479]: I0308 00:22:43.005194 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body= Mar 08 00:22:43.005289 master-0 kubenswrapper[7479]: I0308 00:22:43.005283 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" Mar 08 00:22:43.032527 master-0 kubenswrapper[7479]: I0308 00:22:43.032395 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_ada20442-bff5-477c-989e-3d921f5ede5e/installer/0.log" Mar 08 00:22:43.032527 master-0 kubenswrapper[7479]: I0308 00:22:43.032460 7479 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 08 00:22:43.047696 master-0 kubenswrapper[7479]: I0308 00:22:43.047624 7479 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lqc4n" Mar 08 00:22:43.047696 master-0 kubenswrapper[7479]: I0308 00:22:43.047661 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lqc4n" Mar 08 00:22:43.173401 master-0 kubenswrapper[7479]: I0308 00:22:43.173008 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ada20442-bff5-477c-989e-3d921f5ede5e-var-lock\") pod \"ada20442-bff5-477c-989e-3d921f5ede5e\" (UID: \"ada20442-bff5-477c-989e-3d921f5ede5e\") " Mar 08 00:22:43.173401 master-0 kubenswrapper[7479]: I0308 00:22:43.173046 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ada20442-bff5-477c-989e-3d921f5ede5e-kubelet-dir\") pod \"ada20442-bff5-477c-989e-3d921f5ede5e\" (UID: \"ada20442-bff5-477c-989e-3d921f5ede5e\") " Mar 08 00:22:43.173401 master-0 kubenswrapper[7479]: I0308 00:22:43.173089 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ada20442-bff5-477c-989e-3d921f5ede5e-kube-api-access\") pod \"ada20442-bff5-477c-989e-3d921f5ede5e\" (UID: \"ada20442-bff5-477c-989e-3d921f5ede5e\") " Mar 08 00:22:43.173401 master-0 kubenswrapper[7479]: I0308 00:22:43.173143 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ada20442-bff5-477c-989e-3d921f5ede5e-var-lock" (OuterVolumeSpecName: "var-lock") pod "ada20442-bff5-477c-989e-3d921f5ede5e" (UID: "ada20442-bff5-477c-989e-3d921f5ede5e"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:22:43.173401 master-0 kubenswrapper[7479]: I0308 00:22:43.173232 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ada20442-bff5-477c-989e-3d921f5ede5e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ada20442-bff5-477c-989e-3d921f5ede5e" (UID: "ada20442-bff5-477c-989e-3d921f5ede5e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:22:43.173401 master-0 kubenswrapper[7479]: I0308 00:22:43.173311 7479 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ada20442-bff5-477c-989e-3d921f5ede5e-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 00:22:43.173401 master-0 kubenswrapper[7479]: I0308 00:22:43.173327 7479 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ada20442-bff5-477c-989e-3d921f5ede5e-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 00:22:43.175678 master-0 kubenswrapper[7479]: I0308 00:22:43.175650 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ada20442-bff5-477c-989e-3d921f5ede5e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ada20442-bff5-477c-989e-3d921f5ede5e" (UID: "ada20442-bff5-477c-989e-3d921f5ede5e"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:22:43.274780 master-0 kubenswrapper[7479]: I0308 00:22:43.274717 7479 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ada20442-bff5-477c-989e-3d921f5ede5e-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 00:22:43.722491 master-0 kubenswrapper[7479]: I0308 00:22:43.722441 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body= Mar 08 00:22:43.722710 master-0 kubenswrapper[7479]: I0308 00:22:43.722493 7479 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" Mar 08 00:22:43.899376 master-0 kubenswrapper[7479]: I0308 00:22:43.899337 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_ada20442-bff5-477c-989e-3d921f5ede5e/installer/0.log" Mar 08 00:22:43.899540 master-0 kubenswrapper[7479]: I0308 00:22:43.899511 7479 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 08 00:22:43.899590 master-0 kubenswrapper[7479]: I0308 00:22:43.899535 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"ada20442-bff5-477c-989e-3d921f5ede5e","Type":"ContainerDied","Data":"690784e0df6abe7f6bb7d7b1b2637aaaba8b482bcfbe8880715b7a6a2d707f93"} Mar 08 00:22:43.899621 master-0 kubenswrapper[7479]: I0308 00:22:43.899596 7479 scope.go:117] "RemoveContainer" containerID="15d74f4f21139026e1f17a65ff4323887705b24a5c623f5887aaa69f1485ac9b" Mar 08 00:22:44.023676 master-0 kubenswrapper[7479]: I0308 00:22:44.023622 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-ms5vp" podUID="668ffbde-4771-43e1-8f0e-d4b5d17ff693" containerName="registry-server" probeResult="failure" output=< Mar 08 00:22:44.023676 master-0 kubenswrapper[7479]: timeout: failed to connect service ":50051" within 1s Mar 08 00:22:44.023676 master-0 kubenswrapper[7479]: > Mar 08 00:22:44.085897 master-0 kubenswrapper[7479]: I0308 00:22:44.085841 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-lqc4n" podUID="8b94e1ca-5aef-49ae-928e-29cc0ce81d61" containerName="registry-server" probeResult="failure" output=< Mar 08 00:22:44.085897 master-0 kubenswrapper[7479]: timeout: failed to connect service ":50051" within 1s Mar 08 00:22:44.085897 master-0 kubenswrapper[7479]: > Mar 08 00:22:44.470155 master-0 kubenswrapper[7479]: I0308 00:22:44.470094 7479 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4r9ht" Mar 08 00:22:44.470155 master-0 kubenswrapper[7479]: I0308 00:22:44.470157 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4r9ht" Mar 08 00:22:44.501363 master-0 kubenswrapper[7479]: I0308 
00:22:44.501317 7479 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4r9ht" Mar 08 00:22:44.645517 master-0 kubenswrapper[7479]: I0308 00:22:44.645427 7479 prober.go:107] "Probe failed" probeType="Liveness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Mar 08 00:22:44.911179 master-0 kubenswrapper[7479]: I0308 00:22:44.911036 7479 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="65f78e69463513d95a1d7e0bffe5e5d1bf7a6e5e4e7e1d096d77f2d24eb8e8b4" exitCode=1 Mar 08 00:22:44.911179 master-0 kubenswrapper[7479]: I0308 00:22:44.911100 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerDied","Data":"65f78e69463513d95a1d7e0bffe5e5d1bf7a6e5e4e7e1d096d77f2d24eb8e8b4"} Mar 08 00:22:44.912024 master-0 kubenswrapper[7479]: I0308 00:22:44.911972 7479 scope.go:117] "RemoveContainer" containerID="65f78e69463513d95a1d7e0bffe5e5d1bf7a6e5e4e7e1d096d77f2d24eb8e8b4" Mar 08 00:22:44.915114 master-0 kubenswrapper[7479]: I0308 00:22:44.915089 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_5df57519-dc14-4d18-8c24-cf2e6e122cff/installer/0.log" Mar 08 00:22:44.915308 master-0 kubenswrapper[7479]: I0308 00:22:44.915281 7479 generic.go:334] "Generic (PLEG): container finished" podID="5df57519-dc14-4d18-8c24-cf2e6e122cff" containerID="8d418bba96a10317f3ab381a296f7b477288b766d8262911dbed8676ce28b625" exitCode=1 Mar 08 00:22:44.915877 master-0 kubenswrapper[7479]: I0308 00:22:44.915393 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"5df57519-dc14-4d18-8c24-cf2e6e122cff","Type":"ContainerDied","Data":"8d418bba96a10317f3ab381a296f7b477288b766d8262911dbed8676ce28b625"} Mar 08 00:22:45.344767 master-0 kubenswrapper[7479]: E0308 00:22:45.344645 7479 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:22:35Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:22:35Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:22:35Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:22:35Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f121f9d021a9843b9458f9f222c40f292f2c21dcfcf00f05daacaca8a949c0\\\"],\\\"sizeBytes\\\":1637445817},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:381e96959e3c3b08a3e2715e6024697ae14af31bd0378b49f583e984b3b9a192\\\"],\\\"sizeBytes\\\":1238047254},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c9330c756dd6ab107e9a4b671bc52742c90d5be11a8380d8b710e2bd4e0ed43c\\\"],\\\"sizeBytes\\\":992610645},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\\\"],\\\"sizeBytes\\\":943837171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e207c762b7802ee0e54507d21ed1f25b19eddc511a4b824934c16c163193be6a\\\"],\\\"sizeBytes\\\":876146500},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:41dbd66e9a886c1fd7a99752f358c6125a209e83c0dd37b35730baae58d82ee8\\\"],\\\"sizeBytes\\\":862
633255},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bfcd8017eede3fb66fa3f5b47c27508b787d38455689154461f0e6a5dc303ff\\\"],\\\"sizeBytes\\\":772939850},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9c946fdc5a4cd16ff998c17844780e7efc38f7f38b97a8a40d75cd77b318ddef\\\"],\\\"sizeBytes\\\":687947017},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c03cb25dc6f6a865529ebc979e8d7d08492b28fd3fb93beddf30e1cb06f1245\\\"],\\\"sizeBytes\\\":683169303},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3f34dc492c80a3dee4643cc2291044750ac51e6e919b973de8723fa8b70bde70\\\"],\\\"sizeBytes\\\":677929075},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3\\\"],\\\"sizeBytes\\\":621647686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1ec9d3dbcc6f9817c0f6d09f64c0d98c91b03afbb1fcb3c1e1718aca900754b\\\"],\\\"sizeBytes\\\":589379637},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1575be013a898f153cbf012aeaf28ce720022f934dc05bdffbe479e30999d460\\\"],\\\"sizeBytes\\\":582153879},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eb82e437a701ce83b70e56be8477d987da67578714dda3d9fa6628804b1b56f5\\\"],\\\"sizeBytes\\\":558210153},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:28f33d62fd0b94c5ea0ebcd7a4216848c8dd671a38d901ce98f4c399b700e1c7\\\"],\\\"sizeBytes\\\":548751793},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b\\\"],\\\"sizeBytes\\\":529324693},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ac6f0695d3386e6d601f4ae507940981352fa3ad884b0fed6fb25698c5e6f916\\\"],\\\"sizeBytes\\\":528946249},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6088910bdc1583b
275fab261e3234c0b63b4cc16d01bcea697b6a7f6db13bdf3\\\"],\\\"sizeBytes\\\":518384455},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:14bd3c04daa885009785d48f4973e2890751a7ec116cc14d17627245cda54d7b\\\"],\\\"sizeBytes\\\":517997625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\\\"],\\\"sizeBytes\\\":514980169},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd818e37e1f9dbe5393c557b89e81010d68171408e0e4157a3d92ae0ca1c953\\\"],\\\"sizeBytes\\\":513220825},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d601c8437b4d8bbe2da0f3b08f1bd8693f5a4ef6d835377ec029c79d9dca5dab\\\"],\\\"sizeBytes\\\":512273539},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b47d2b146e833bc1612a652136f43afcf1ba30f32cbd0a2f06ca9fc80d969f0\\\"],\\\"sizeBytes\\\":511226810},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:834063dd26fb3d2489e193489198a0d5fbe9c775a0e30173e5fcef6994fbf0f6\\\"],\\\"sizeBytes\\\":511164376},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee46e13e26156c904e5784e2d64511021ed0974a169ccd6476b05bff1c44ec56\\\"],\\\"sizeBytes\\\":508888174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7220d16ea511c0f0410cf45db45aaafcc64847c9cb5732ad1eff39ceb482cdba\\\"],\\\"sizeBytes\\\":508544235},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:526c5c02a8fa86a2fa83a7087d4a5c4b1c4072c0f3906163494cc3b3c1295e9b\\\"],\\\"sizeBytes\\\":507967997},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4010a8f9d932615336227e2fd43325d4fa9025dca4bebe032106efea733fcfc3\\\"],\\\"sizeBytes\\\":506479655},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76b719f5bd541eb1a8bae124d650896b533e7bc3107be536e598b3ab4e135282\\\"],\\\"sizeBytes\\\":506394574},{\\\"names\\\":[\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5de69354d08184ecd6144facc1461777674674e8304971216d4cf1a5025472b9\\\"],\\\"sizeBytes\\\":505344964},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a324f47cf789c0480fa4bcb0812152abc3cd844318bab193108fe4349eed609\\\"],\\\"sizeBytes\\\":505242594},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d11f13e867f4df046ca6789bb7273da5d0c08895b3dea00949c8a5458f9e22f9\\\"],\\\"sizeBytes\\\":504623546},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76bdc35338c4d0f5e5b9448fb73e3578656f908a962286692e12a0372ec721d5\\\"],\\\"sizeBytes\\\":495994161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff2db11ce277288befab25ddb86177e832842d2edb5607a2da8f252a030e1cfc\\\"],\\\"sizeBytes\\\":495064829},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2fe5144b1f72bdcf5d5a52130f02ed86fbec3875cc4ac108ead00eaac1659e06\\\"],\\\"sizeBytes\\\":487090672},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4a4c3e6ca0cd26f7eb5270cfafbcf423cf2986d152bf5b9fc6469d40599e104e\\\"],\\\"sizeBytes\\\":484450382},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c54c3f7cffe057ae0bdf26163d5e46744685083ae16fc97112e32beacd2d8955\\\"],\\\"sizeBytes\\\":484175664},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9b8bc43bac294be3c7669cde049e388ad9d8751242051ba40f83e1c401eceda\\\"],\\\"sizeBytes\\\":468263999},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\\\"],\\\"sizeBytes\\\":465086330},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a85dab5856916220df6f05ce9d6aa10cd4fa0234093b55355246690bba05ad1\\\"],\\\"sizeBytes\\\":463700811},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b714a7ada1e295b599b432f32e1fd5b74c8cdbe6f
e51e95306322b25cb873914\\\"],\\\"sizeBytes\\\":458126424},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5230462066ab36e3025524e948dd33fa6f51ee29a4f91fa469bfc268568b5fd9\\\"],\\\"sizeBytes\\\":456575686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:89cb093f319eaa04acfe9431b8697bffbc71ab670546f7ed257daa332165c626\\\"],\\\"sizeBytes\\\":448828105},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c680fcc9fd6b66099ca4c0f512521b6f8e0bc29273ddb9405730bc54bacb6783\\\"],\\\"sizeBytes\\\":448041621},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cf9670d0f269f8d49fd9ef4981999be195f6624a4146aa93d9201eb8acc81053\\\"],\\\"sizeBytes\\\":443271011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8ceca1efee55b9fd5089428476bbc401fe73db7c0b0f5e16d4ad28ed0f0f9d43\\\"],\\\"sizeBytes\\\":438654375},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ace4dcd008420277d915fe983b07bbb50fb3ab0673f28d0166424a75bc2137e7\\\"],\\\"sizeBytes\\\":411585608},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8f0fda36e9a2040dbe0537361dcd73658df4e669d846f8101a8f9f29f0be9a7\\\"],\\\"sizeBytes\\\":407347126},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1d605384f31a8085f78a96145c2c3dc51afe22721144196140a2699b7c07ebe3\\\"],\\\"sizeBytes\\\":396521759}]}}\" for node \"master-0\": the server was unable to return a response in the time allotted, but may still be processing the request (patch nodes master-0)" Mar 08 00:22:45.792229 master-0 kubenswrapper[7479]: I0308 00:22:45.792182 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_5df57519-dc14-4d18-8c24-cf2e6e122cff/installer/0.log" Mar 08 00:22:45.792370 master-0 kubenswrapper[7479]: I0308 00:22:45.792274 7479 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 08 00:22:45.808951 master-0 kubenswrapper[7479]: I0308 00:22:45.808918 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5df57519-dc14-4d18-8c24-cf2e6e122cff-kubelet-dir\") pod \"5df57519-dc14-4d18-8c24-cf2e6e122cff\" (UID: \"5df57519-dc14-4d18-8c24-cf2e6e122cff\") " Mar 08 00:22:45.808951 master-0 kubenswrapper[7479]: I0308 00:22:45.808951 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5df57519-dc14-4d18-8c24-cf2e6e122cff-var-lock\") pod \"5df57519-dc14-4d18-8c24-cf2e6e122cff\" (UID: \"5df57519-dc14-4d18-8c24-cf2e6e122cff\") " Mar 08 00:22:45.809134 master-0 kubenswrapper[7479]: I0308 00:22:45.808996 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5df57519-dc14-4d18-8c24-cf2e6e122cff-kube-api-access\") pod \"5df57519-dc14-4d18-8c24-cf2e6e122cff\" (UID: \"5df57519-dc14-4d18-8c24-cf2e6e122cff\") " Mar 08 00:22:45.809134 master-0 kubenswrapper[7479]: I0308 00:22:45.809085 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5df57519-dc14-4d18-8c24-cf2e6e122cff-var-lock" (OuterVolumeSpecName: "var-lock") pod "5df57519-dc14-4d18-8c24-cf2e6e122cff" (UID: "5df57519-dc14-4d18-8c24-cf2e6e122cff"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:22:45.809253 master-0 kubenswrapper[7479]: I0308 00:22:45.809232 7479 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5df57519-dc14-4d18-8c24-cf2e6e122cff-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 00:22:45.809390 master-0 kubenswrapper[7479]: I0308 00:22:45.809344 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5df57519-dc14-4d18-8c24-cf2e6e122cff-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5df57519-dc14-4d18-8c24-cf2e6e122cff" (UID: "5df57519-dc14-4d18-8c24-cf2e6e122cff"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:22:45.812016 master-0 kubenswrapper[7479]: I0308 00:22:45.811982 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5df57519-dc14-4d18-8c24-cf2e6e122cff-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5df57519-dc14-4d18-8c24-cf2e6e122cff" (UID: "5df57519-dc14-4d18-8c24-cf2e6e122cff"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:22:45.910509 master-0 kubenswrapper[7479]: I0308 00:22:45.910466 7479 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5df57519-dc14-4d18-8c24-cf2e6e122cff-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 00:22:45.910509 master-0 kubenswrapper[7479]: I0308 00:22:45.910495 7479 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5df57519-dc14-4d18-8c24-cf2e6e122cff-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 00:22:45.921696 master-0 kubenswrapper[7479]: I0308 00:22:45.921667 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_5df57519-dc14-4d18-8c24-cf2e6e122cff/installer/0.log" Mar 08 00:22:45.921825 master-0 kubenswrapper[7479]: I0308 00:22:45.921747 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"5df57519-dc14-4d18-8c24-cf2e6e122cff","Type":"ContainerDied","Data":"44241d792c3cd70ffcac7a4439189c04cf4aca10694e440b3450c0a02c69f625"} Mar 08 00:22:45.921825 master-0 kubenswrapper[7479]: I0308 00:22:45.921784 7479 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 08 00:22:45.921915 master-0 kubenswrapper[7479]: I0308 00:22:45.921791 7479 scope.go:117] "RemoveContainer" containerID="8d418bba96a10317f3ab381a296f7b477288b766d8262911dbed8676ce28b625" Mar 08 00:22:45.923746 master-0 kubenswrapper[7479]: I0308 00:22:45.923718 7479 generic.go:334] "Generic (PLEG): container finished" podID="a1a56802af72ce1aac6b5077f1695ac0" containerID="88fd43c8fda6129c4f06b24e2a215771ea123f05c39828ad062d2af5324239c2" exitCode=1 Mar 08 00:22:45.923822 master-0 kubenswrapper[7479]: I0308 00:22:45.923770 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerDied","Data":"88fd43c8fda6129c4f06b24e2a215771ea123f05c39828ad062d2af5324239c2"} Mar 08 00:22:45.924117 master-0 kubenswrapper[7479]: I0308 00:22:45.924088 7479 scope.go:117] "RemoveContainer" containerID="88fd43c8fda6129c4f06b24e2a215771ea123f05c39828ad062d2af5324239c2" Mar 08 00:22:45.927036 master-0 kubenswrapper[7479]: I0308 00:22:45.927002 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"e58959bf4fb7686cb173d693e7cd0607617c802ee64cc2a69626a79e65982a9e"} Mar 08 00:22:46.002447 master-0 kubenswrapper[7479]: I0308 00:22:46.002398 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body= Mar 08 00:22:46.002788 master-0 kubenswrapper[7479]: I0308 00:22:46.002455 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" 
podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" Mar 08 00:22:46.722990 master-0 kubenswrapper[7479]: I0308 00:22:46.722915 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body= Mar 08 00:22:46.722990 master-0 kubenswrapper[7479]: I0308 00:22:46.722978 7479 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" Mar 08 00:22:46.723867 master-0 kubenswrapper[7479]: I0308 00:22:46.723030 7479 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" Mar 08 00:22:46.723867 master-0 kubenswrapper[7479]: I0308 00:22:46.723550 7479 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="openshift-config-operator" containerStatusID={"Type":"cri-o","ID":"155eb1b0e5550c7156a1e22df86f430ed6ad309e3fdf823622a81280cc05efe7"} pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" containerMessage="Container openshift-config-operator failed liveness probe, will be restarted" Mar 08 00:22:46.723867 master-0 kubenswrapper[7479]: I0308 00:22:46.723594 7479 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" 
containerName="openshift-config-operator" containerID="cri-o://155eb1b0e5550c7156a1e22df86f430ed6ad309e3fdf823622a81280cc05efe7" gracePeriod=30 Mar 08 00:22:46.750411 master-0 kubenswrapper[7479]: I0308 00:22:46.750357 7479 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mr22p" Mar 08 00:22:46.750534 master-0 kubenswrapper[7479]: I0308 00:22:46.750449 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mr22p" Mar 08 00:22:46.933527 master-0 kubenswrapper[7479]: I0308 00:22:46.933423 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"0fe11e31bc3fff8b9610286a4d61bcdc774b24a696a35e7bd68af0798051cd1f"} Mar 08 00:22:46.990480 master-0 kubenswrapper[7479]: I0308 00:22:46.990421 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj" Mar 08 00:22:47.324910 master-0 kubenswrapper[7479]: I0308 00:22:47.324848 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:22:47.798769 master-0 kubenswrapper[7479]: I0308 00:22:47.798705 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mr22p" podUID="07f9c188-df80-4606-9a21-72228cffa706" containerName="registry-server" probeResult="failure" output=< Mar 08 00:22:47.798769 master-0 kubenswrapper[7479]: timeout: failed to connect service ":50051" within 1s Mar 08 00:22:47.798769 master-0 kubenswrapper[7479]: > Mar 08 00:22:47.919848 master-0 kubenswrapper[7479]: E0308 00:22:47.919773 7479 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 00:22:49.002865 master-0 kubenswrapper[7479]: I0308 00:22:49.002810 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body= Mar 08 00:22:49.003430 master-0 kubenswrapper[7479]: I0308 00:22:49.002888 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" Mar 08 00:22:49.003430 master-0 kubenswrapper[7479]: I0308 00:22:49.002995 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" Mar 08 00:22:51.266350 master-0 kubenswrapper[7479]: I0308 00:22:51.266278 7479 patch_prober.go:28] interesting pod/apiserver-85cb8cb9bb-bmx44 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 08 00:22:51.266350 master-0 kubenswrapper[7479]: [+]log ok Mar 08 00:22:51.266350 master-0 kubenswrapper[7479]: [-]etcd failed: reason withheld Mar 08 00:22:51.266350 master-0 kubenswrapper[7479]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 08 00:22:51.266350 master-0 kubenswrapper[7479]: [+]poststarthook/generic-apiserver-start-informers ok Mar 08 00:22:51.266350 master-0 kubenswrapper[7479]: [+]poststarthook/max-in-flight-filter ok Mar 
08 00:22:51.266350 master-0 kubenswrapper[7479]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 08 00:22:51.266350 master-0 kubenswrapper[7479]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 08 00:22:51.266350 master-0 kubenswrapper[7479]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 08 00:22:51.266350 master-0 kubenswrapper[7479]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Mar 08 00:22:51.266350 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectcache ok Mar 08 00:22:51.266350 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 08 00:22:51.266350 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-startinformers ok Mar 08 00:22:51.266350 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 08 00:22:51.266350 master-0 kubenswrapper[7479]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 08 00:22:51.266350 master-0 kubenswrapper[7479]: livez check failed Mar 08 00:22:51.267308 master-0 kubenswrapper[7479]: I0308 00:22:51.266369 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" podUID="1751db13-b792-43e2-8459-d1d4a0164dfb" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:22:52.002868 master-0 kubenswrapper[7479]: I0308 00:22:52.002811 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body= Mar 08 00:22:52.003111 master-0 kubenswrapper[7479]: I0308 00:22:52.002883 7479 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" Mar 08 00:22:53.022399 master-0 kubenswrapper[7479]: I0308 00:22:53.022358 7479 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ms5vp" Mar 08 00:22:53.061574 master-0 kubenswrapper[7479]: I0308 00:22:53.061539 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ms5vp" Mar 08 00:22:53.101733 master-0 kubenswrapper[7479]: I0308 00:22:53.101690 7479 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lqc4n" Mar 08 00:22:53.134701 master-0 kubenswrapper[7479]: I0308 00:22:53.134660 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lqc4n" Mar 08 00:22:54.159627 master-0 kubenswrapper[7479]: I0308 00:22:54.159552 7479 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:22:54.509212 master-0 kubenswrapper[7479]: I0308 00:22:54.508975 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4r9ht" Mar 08 00:22:55.002034 master-0 kubenswrapper[7479]: I0308 00:22:55.001950 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body= Mar 08 00:22:55.002304 master-0 kubenswrapper[7479]: I0308 00:22:55.002035 7479 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" Mar 08 00:22:55.345828 master-0 kubenswrapper[7479]: E0308 00:22:55.345711 7479 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": the server was unable to return a response in the time allotted, but may still be processing the request (get nodes master-0)" Mar 08 00:22:55.896044 master-0 kubenswrapper[7479]: E0308 00:22:55.896005 7479 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 08 00:22:56.801579 master-0 kubenswrapper[7479]: I0308 00:22:56.801521 7479 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mr22p" Mar 08 00:22:56.851286 master-0 kubenswrapper[7479]: I0308 00:22:56.851242 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mr22p" Mar 08 00:22:56.981740 master-0 kubenswrapper[7479]: I0308 00:22:56.981626 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_55216a56-677a-4f28-a530-77d44bded8a2/installer/0.log" Mar 08 00:22:56.981740 master-0 kubenswrapper[7479]: I0308 00:22:56.981682 7479 generic.go:334] "Generic (PLEG): container finished" podID="55216a56-677a-4f28-a530-77d44bded8a2" containerID="1a0afc6f5f43ae0c03dad4b66580da08dbfc175218d88b6ca2b45fa8794895ad" exitCode=1 Mar 08 00:22:56.981958 master-0 kubenswrapper[7479]: I0308 00:22:56.981745 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" 
event={"ID":"55216a56-677a-4f28-a530-77d44bded8a2","Type":"ContainerDied","Data":"1a0afc6f5f43ae0c03dad4b66580da08dbfc175218d88b6ca2b45fa8794895ad"} Mar 08 00:22:56.983333 master-0 kubenswrapper[7479]: I0308 00:22:56.983298 7479 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="c8de3ced39581b8ad5acd40157b9e893206291d5fd34e7516c2c1b0358ea17a6" exitCode=0 Mar 08 00:22:56.983333 master-0 kubenswrapper[7479]: I0308 00:22:56.983326 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerDied","Data":"c8de3ced39581b8ad5acd40157b9e893206291d5fd34e7516c2c1b0358ea17a6"} Mar 08 00:22:56.985851 master-0 kubenswrapper[7479]: I0308 00:22:56.985284 7479 generic.go:334] "Generic (PLEG): container finished" podID="354f29997baa583b6238f7de9108ee10" containerID="b999c6f84ef35141ea9d9157df896d14bb08340f5b7476591f3ed6362f2a6196" exitCode=0 Mar 08 00:22:57.160008 master-0 kubenswrapper[7479]: I0308 00:22:57.159872 7479 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 00:22:57.920601 master-0 kubenswrapper[7479]: E0308 00:22:57.920357 7479 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 00:22:58.002757 master-0 kubenswrapper[7479]: I0308 00:22:58.002671 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator 
namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body= Mar 08 00:22:58.002923 master-0 kubenswrapper[7479]: I0308 00:22:58.002775 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" Mar 08 00:22:58.314424 master-0 kubenswrapper[7479]: I0308 00:22:58.314361 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_55216a56-677a-4f28-a530-77d44bded8a2/installer/0.log" Mar 08 00:22:58.314424 master-0 kubenswrapper[7479]: I0308 00:22:58.314429 7479 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 08 00:22:58.449485 master-0 kubenswrapper[7479]: I0308 00:22:58.449407 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/55216a56-677a-4f28-a530-77d44bded8a2-kubelet-dir\") pod \"55216a56-677a-4f28-a530-77d44bded8a2\" (UID: \"55216a56-677a-4f28-a530-77d44bded8a2\") " Mar 08 00:22:58.449669 master-0 kubenswrapper[7479]: I0308 00:22:58.449525 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55216a56-677a-4f28-a530-77d44bded8a2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "55216a56-677a-4f28-a530-77d44bded8a2" (UID: "55216a56-677a-4f28-a530-77d44bded8a2"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:22:58.449669 master-0 kubenswrapper[7479]: I0308 00:22:58.449527 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/55216a56-677a-4f28-a530-77d44bded8a2-kube-api-access\") pod \"55216a56-677a-4f28-a530-77d44bded8a2\" (UID: \"55216a56-677a-4f28-a530-77d44bded8a2\") " Mar 08 00:22:58.449669 master-0 kubenswrapper[7479]: I0308 00:22:58.449608 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/55216a56-677a-4f28-a530-77d44bded8a2-var-lock\") pod \"55216a56-677a-4f28-a530-77d44bded8a2\" (UID: \"55216a56-677a-4f28-a530-77d44bded8a2\") " Mar 08 00:22:58.449863 master-0 kubenswrapper[7479]: I0308 00:22:58.449841 7479 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/55216a56-677a-4f28-a530-77d44bded8a2-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 00:22:58.449900 master-0 kubenswrapper[7479]: I0308 00:22:58.449867 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55216a56-677a-4f28-a530-77d44bded8a2-var-lock" (OuterVolumeSpecName: "var-lock") pod "55216a56-677a-4f28-a530-77d44bded8a2" (UID: "55216a56-677a-4f28-a530-77d44bded8a2"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:22:58.452912 master-0 kubenswrapper[7479]: I0308 00:22:58.452859 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55216a56-677a-4f28-a530-77d44bded8a2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "55216a56-677a-4f28-a530-77d44bded8a2" (UID: "55216a56-677a-4f28-a530-77d44bded8a2"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:22:58.551013 master-0 kubenswrapper[7479]: I0308 00:22:58.550903 7479 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/55216a56-677a-4f28-a530-77d44bded8a2-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 00:22:58.551013 master-0 kubenswrapper[7479]: I0308 00:22:58.550941 7479 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/55216a56-677a-4f28-a530-77d44bded8a2-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 00:22:58.997190 master-0 kubenswrapper[7479]: I0308 00:22:58.997117 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0-master-0_354f29997baa583b6238f7de9108ee10/etcdctl/0.log" Mar 08 00:22:58.997190 master-0 kubenswrapper[7479]: I0308 00:22:58.997163 7479 generic.go:334] "Generic (PLEG): container finished" podID="354f29997baa583b6238f7de9108ee10" containerID="da60beba23659d143e9020dc0409825d88a4d10b35b445c12b13ae8fc1310bdf" exitCode=137 Mar 08 00:22:58.998564 master-0 kubenswrapper[7479]: I0308 00:22:58.998517 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_55216a56-677a-4f28-a530-77d44bded8a2/installer/0.log" Mar 08 00:22:58.998564 master-0 kubenswrapper[7479]: I0308 00:22:58.998556 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"55216a56-677a-4f28-a530-77d44bded8a2","Type":"ContainerDied","Data":"2d79f79d79186c94eacd319b18a19e02c3739e81bc2d84288b2f6f2697c49ad7"} Mar 08 00:22:58.998797 master-0 kubenswrapper[7479]: I0308 00:22:58.998578 7479 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d79f79d79186c94eacd319b18a19e02c3739e81bc2d84288b2f6f2697c49ad7" Mar 08 00:22:58.998797 master-0 kubenswrapper[7479]: I0308 00:22:58.998684 7479 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 08 00:22:59.520560 master-0 kubenswrapper[7479]: I0308 00:22:59.520416 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0-master-0_354f29997baa583b6238f7de9108ee10/etcdctl/0.log" Mar 08 00:22:59.520831 master-0 kubenswrapper[7479]: I0308 00:22:59.520644 7479 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Mar 08 00:22:59.664641 master-0 kubenswrapper[7479]: I0308 00:22:59.664568 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"354f29997baa583b6238f7de9108ee10\" (UID: \"354f29997baa583b6238f7de9108ee10\") " Mar 08 00:22:59.664846 master-0 kubenswrapper[7479]: I0308 00:22:59.664672 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"354f29997baa583b6238f7de9108ee10\" (UID: \"354f29997baa583b6238f7de9108ee10\") " Mar 08 00:22:59.664846 master-0 kubenswrapper[7479]: I0308 00:22:59.664743 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir" (OuterVolumeSpecName: "data-dir") pod "354f29997baa583b6238f7de9108ee10" (UID: "354f29997baa583b6238f7de9108ee10"). InnerVolumeSpecName "data-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:22:59.664955 master-0 kubenswrapper[7479]: I0308 00:22:59.664871 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs" (OuterVolumeSpecName: "certs") pod "354f29997baa583b6238f7de9108ee10" (UID: "354f29997baa583b6238f7de9108ee10"). 
InnerVolumeSpecName "certs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:22:59.665154 master-0 kubenswrapper[7479]: I0308 00:22:59.665116 7479 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") on node \"master-0\" DevicePath \"\"" Mar 08 00:22:59.665154 master-0 kubenswrapper[7479]: I0308 00:22:59.665151 7479 reconciler_common.go:293] "Volume detached for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 00:22:59.895467 master-0 kubenswrapper[7479]: I0308 00:22:59.895366 7479 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="354f29997baa583b6238f7de9108ee10" path="/var/lib/kubelet/pods/354f29997baa583b6238f7de9108ee10/volumes" Mar 08 00:22:59.895467 master-0 kubenswrapper[7479]: I0308 00:22:59.895873 7479 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Mar 08 00:23:00.005124 master-0 kubenswrapper[7479]: I0308 00:23:00.005049 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0-master-0_354f29997baa583b6238f7de9108ee10/etcdctl/0.log" Mar 08 00:23:00.005676 master-0 kubenswrapper[7479]: I0308 00:23:00.005243 7479 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Mar 08 00:23:00.273500 master-0 kubenswrapper[7479]: I0308 00:23:00.273416 7479 patch_prober.go:28] interesting pod/apiserver-85cb8cb9bb-bmx44 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 08 00:23:00.273500 master-0 kubenswrapper[7479]: [+]log ok Mar 08 00:23:00.273500 master-0 kubenswrapper[7479]: [-]etcd failed: reason withheld Mar 08 00:23:00.273500 master-0 kubenswrapper[7479]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 08 00:23:00.273500 master-0 kubenswrapper[7479]: [+]poststarthook/generic-apiserver-start-informers ok Mar 08 00:23:00.273500 master-0 kubenswrapper[7479]: [+]poststarthook/max-in-flight-filter ok Mar 08 00:23:00.273500 master-0 kubenswrapper[7479]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 08 00:23:00.273500 master-0 kubenswrapper[7479]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 08 00:23:00.273500 master-0 kubenswrapper[7479]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 08 00:23:00.273500 master-0 kubenswrapper[7479]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Mar 08 00:23:00.273500 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectcache ok Mar 08 00:23:00.273500 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 08 00:23:00.273500 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-startinformers ok Mar 08 00:23:00.273500 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 08 00:23:00.273500 master-0 kubenswrapper[7479]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 08 00:23:00.273500 master-0 kubenswrapper[7479]: livez check failed Mar 08 00:23:00.274352 master-0 kubenswrapper[7479]: I0308 
00:23:00.273501 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" podUID="1751db13-b792-43e2-8459-d1d4a0164dfb" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:23:01.003356 master-0 kubenswrapper[7479]: I0308 00:23:01.003170 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body= Mar 08 00:23:01.003356 master-0 kubenswrapper[7479]: I0308 00:23:01.003344 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" Mar 08 00:23:01.566479 master-0 kubenswrapper[7479]: I0308 00:23:01.566399 7479 patch_prober.go:28] interesting pod/authentication-operator-7c6989d6c4-dkqc4 container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.7:8443/healthz\": dial tcp 10.128.0.7:8443: connect: connection refused" start-of-body= Mar 08 00:23:01.566479 master-0 kubenswrapper[7479]: I0308 00:23:01.566465 7479 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4" podUID="58333089-2456-4a25-8ba7-6d557eefa177" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.7:8443/healthz\": dial tcp 10.128.0.7:8443: connect: connection refused" Mar 08 00:23:03.049264 master-0 kubenswrapper[7479]: E0308 00:23:03.049078 7479 event.go:359] "Server 
rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0-master-0.189ab5d976cf15f2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Killing,Message:Stopping container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:22:28.929525234 +0000 UTC m=+65.242434151,LastTimestamp:2026-03-08 00:22:28.929525234 +0000 UTC m=+65.242434151,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:23:03.049264 master-0 kubenswrapper[7479]: I0308 00:23:03.049149 7479 status_manager.go:875] "Failed to update status for pod" pod="openshift-marketplace/redhat-operators-mr22p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07f9c188-df80-4606-9a21-72228cffa706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with incomplete status: 
[extract-content]\\\",\\\"type\\\":\\\"Initialized\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422de388af3948465d14ca0009d8b790d44fde8b69f70afddfa992982d03c967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:41dbd66e9a886c1fd7a99752f358c6125a209e83c0dd37b35730baae58d82ee8\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:41dbd66e9a886c1fd7a99752f358c6125a209e83c0dd37b35730baae58d82ee8\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"extract-utilities\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://422de388af3948465d14ca0009d8b790d44fde8b69f70afddfa992982d03c967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/utilities\\\",\\\"name\\\":\\\"utilities\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t44t4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"extract-content\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/utilities\\\",\\\"name\\\":\\\"utilities\\\"},{\\\"mountPath\\\":\\\"/extracted-catalog\\\",\\\"name\\\":\\\"catalog-content\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t44t4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"10.128.0.54\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"10.128.0.54\\\"}]}}\" for pod \"openshift-marketplace\"/\"redhat-operators-mr22p\": Timeout: request did not 
complete within requested timeout - context deadline exceeded" Mar 08 00:23:04.002247 master-0 kubenswrapper[7479]: I0308 00:23:04.002130 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body= Mar 08 00:23:04.002247 master-0 kubenswrapper[7479]: I0308 00:23:04.002251 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" Mar 08 00:23:05.346340 master-0 kubenswrapper[7479]: E0308 00:23:05.346263 7479 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 00:23:07.002917 master-0 kubenswrapper[7479]: I0308 00:23:07.002818 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body= Mar 08 00:23:07.002917 master-0 kubenswrapper[7479]: I0308 00:23:07.002902 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: 
connect: connection refused" Mar 08 00:23:07.160308 master-0 kubenswrapper[7479]: I0308 00:23:07.160233 7479 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 00:23:07.921473 master-0 kubenswrapper[7479]: E0308 00:23:07.921384 7479 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 00:23:08.045562 master-0 kubenswrapper[7479]: I0308 00:23:08.045507 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7c649bf6d4-st2sr_ec2d22f2-c260-42a6-a9da-ee0f44f42303/network-operator/0.log" Mar 08 00:23:08.045562 master-0 kubenswrapper[7479]: I0308 00:23:08.045553 7479 generic.go:334] "Generic (PLEG): container finished" podID="ec2d22f2-c260-42a6-a9da-ee0f44f42303" containerID="06038340b4e3f2befb44d9c767edb4dd565cb0800261ba9f5e36429d3a7bf10d" exitCode=255 Mar 08 00:23:09.279719 master-0 kubenswrapper[7479]: I0308 00:23:09.279618 7479 patch_prober.go:28] interesting pod/apiserver-85cb8cb9bb-bmx44 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 08 00:23:09.279719 master-0 kubenswrapper[7479]: [+]log ok Mar 08 00:23:09.279719 master-0 kubenswrapper[7479]: [-]etcd failed: reason withheld Mar 08 00:23:09.279719 master-0 kubenswrapper[7479]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 08 00:23:09.279719 master-0 kubenswrapper[7479]: 
[+]poststarthook/generic-apiserver-start-informers ok Mar 08 00:23:09.279719 master-0 kubenswrapper[7479]: [+]poststarthook/max-in-flight-filter ok Mar 08 00:23:09.279719 master-0 kubenswrapper[7479]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 08 00:23:09.279719 master-0 kubenswrapper[7479]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 08 00:23:09.279719 master-0 kubenswrapper[7479]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 08 00:23:09.279719 master-0 kubenswrapper[7479]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Mar 08 00:23:09.279719 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectcache ok Mar 08 00:23:09.279719 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 08 00:23:09.279719 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-startinformers ok Mar 08 00:23:09.279719 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 08 00:23:09.279719 master-0 kubenswrapper[7479]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 08 00:23:09.279719 master-0 kubenswrapper[7479]: livez check failed Mar 08 00:23:09.280630 master-0 kubenswrapper[7479]: I0308 00:23:09.279722 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" podUID="1751db13-b792-43e2-8459-d1d4a0164dfb" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:23:09.988906 master-0 kubenswrapper[7479]: E0308 00:23:09.988830 7479 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 08 00:23:10.002705 master-0 kubenswrapper[7479]: I0308 00:23:10.002631 7479 patch_prober.go:28] interesting 
pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body=
Mar 08 00:23:10.002921 master-0 kubenswrapper[7479]: I0308 00:23:10.002705 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused"
Mar 08 00:23:10.068730 master-0 kubenswrapper[7479]: I0308 00:23:10.068541 7479 generic.go:334] "Generic (PLEG): container finished" podID="2fbed2b8-f4c5-4f52-b29c-1907a2034f6f" containerID="4ce369a140420a6c03e974e6eff3c092d5ec9b95e895b002c78c7a3f070c22b2" exitCode=0
Mar 08 00:23:11.074338 master-0 kubenswrapper[7479]: I0308 00:23:11.074290 7479 generic.go:334] "Generic (PLEG): container finished" podID="e76bc134-2a88-4f92-9aa7-f6854941b98f" containerID="ad08463ed7ab691e56f4dfe0288960876b6a58370e90937b6cc2efea5e0f4441" exitCode=0
Mar 08 00:23:11.077087 master-0 kubenswrapper[7479]: I0308 00:23:11.077061 7479 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="620aae0686e0d0747f86c66dccb5f833f425852d851da5976e803bb0ce3011ba" exitCode=0
Mar 08 00:23:11.566550 master-0 kubenswrapper[7479]: I0308 00:23:11.566488 7479 patch_prober.go:28] interesting pod/authentication-operator-7c6989d6c4-dkqc4 container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.7:8443/healthz\": dial tcp 10.128.0.7:8443: connect: connection refused" start-of-body=
Mar 08 00:23:11.566550 master-0 kubenswrapper[7479]: I0308 00:23:11.566539 7479 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4" podUID="58333089-2456-4a25-8ba7-6d557eefa177" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.7:8443/healthz\": dial tcp 10.128.0.7:8443: connect: connection refused"
Mar 08 00:23:13.002875 master-0 kubenswrapper[7479]: I0308 00:23:13.002667 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body=
Mar 08 00:23:13.002875 master-0 kubenswrapper[7479]: I0308 00:23:13.002730 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused"
Mar 08 00:23:15.100324 master-0 kubenswrapper[7479]: I0308 00:23:15.100191 7479 generic.go:334] "Generic (PLEG): container finished" podID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerID="155eb1b0e5550c7156a1e22df86f430ed6ad309e3fdf823622a81280cc05efe7" exitCode=0
Mar 08 00:23:15.346526 master-0 kubenswrapper[7479]: E0308 00:23:15.346471 7479 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 08 00:23:16.107625 master-0 kubenswrapper[7479]: I0308 00:23:16.107518 7479 generic.go:334] "Generic (PLEG): container finished" podID="ac523956-c8a3-4794-a1fa-660cd14966bb" containerID="322f3ad793e93ca7f32b8558fd2506b5cf8b8be4b12165040ac02501040fbe03" exitCode=0
Mar 08 00:23:16.109489 master-0 kubenswrapper[7479]: I0308 00:23:16.109389 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-m7549_af391724-079a-4bac-a89e-978ffd471763/approver/0.log"
Mar 08 00:23:16.110056 master-0 kubenswrapper[7479]: I0308 00:23:16.110007 7479 generic.go:334] "Generic (PLEG): container finished" podID="af391724-079a-4bac-a89e-978ffd471763" containerID="c9e6fa5d3ccf4015c27e14ffdb2578ad6435947b5bdd16e602ffdf86284246dc" exitCode=1
Mar 08 00:23:17.160640 master-0 kubenswrapper[7479]: I0308 00:23:17.160514 7479 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 08 00:23:17.922077 master-0 kubenswrapper[7479]: E0308 00:23:17.921964 7479 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 08 00:23:18.287847 master-0 kubenswrapper[7479]: I0308 00:23:18.287766 7479 patch_prober.go:28] interesting pod/apiserver-85cb8cb9bb-bmx44 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Mar 08 00:23:18.287847 master-0 kubenswrapper[7479]: [+]log ok
Mar 08 00:23:18.287847 master-0 kubenswrapper[7479]: [-]etcd failed: reason withheld
Mar 08 00:23:18.287847 master-0 kubenswrapper[7479]: [+]poststarthook/start-apiserver-admission-initializer ok
Mar 08 00:23:18.287847 master-0 kubenswrapper[7479]: [+]poststarthook/generic-apiserver-start-informers ok
Mar 08 00:23:18.287847 master-0 kubenswrapper[7479]: [+]poststarthook/max-in-flight-filter ok
Mar 08 00:23:18.287847 master-0 kubenswrapper[7479]: [+]poststarthook/storage-object-count-tracker-hook ok
Mar 08 00:23:18.287847 master-0 kubenswrapper[7479]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Mar 08 00:23:18.287847 master-0 kubenswrapper[7479]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Mar 08 00:23:18.287847 master-0 kubenswrapper[7479]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok
Mar 08 00:23:18.287847 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectcache ok
Mar 08 00:23:18.287847 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Mar 08 00:23:18.287847 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-startinformers ok
Mar 08 00:23:18.287847 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-restmapperupdater ok
Mar 08 00:23:18.287847 master-0 kubenswrapper[7479]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Mar 08 00:23:18.287847 master-0 kubenswrapper[7479]: livez check failed
Mar 08 00:23:18.287847 master-0 kubenswrapper[7479]: I0308 00:23:18.287833 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" podUID="1751db13-b792-43e2-8459-d1d4a0164dfb" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:23:19.002515 master-0 kubenswrapper[7479]: I0308 00:23:19.002343 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body=
Mar 08 00:23:19.002515 master-0 kubenswrapper[7479]: I0308 00:23:19.002428 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused"
Mar 08 00:23:19.722722 master-0 kubenswrapper[7479]: I0308 00:23:19.722636 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body=
Mar 08 00:23:19.723495 master-0 kubenswrapper[7479]: I0308 00:23:19.722712 7479 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused"
Mar 08 00:23:21.565806 master-0 kubenswrapper[7479]: I0308 00:23:21.565768 7479 patch_prober.go:28] interesting pod/authentication-operator-7c6989d6c4-dkqc4 container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.7:8443/healthz\": dial tcp 10.128.0.7:8443: connect: connection refused" start-of-body=
Mar 08 00:23:21.566386 master-0 kubenswrapper[7479]: I0308 00:23:21.566360 7479 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4" podUID="58333089-2456-4a25-8ba7-6d557eefa177" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.7:8443/healthz\": dial tcp 10.128.0.7:8443: connect: connection refused"
Mar 08 00:23:22.002762 master-0 kubenswrapper[7479]: I0308 00:23:22.002666 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body=
Mar 08 00:23:22.003046 master-0 kubenswrapper[7479]: I0308 00:23:22.002755 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused"
Mar 08 00:23:22.153347 master-0 kubenswrapper[7479]: I0308 00:23:22.153252 7479 generic.go:334] "Generic (PLEG): container finished" podID="b100ce12-965e-409e-8cdb-8f99ef51a82b" containerID="5883c7f053a567c57162616ec25d9b4c38f468aaa6a93afc0931684514320848" exitCode=0
Mar 08 00:23:22.723442 master-0 kubenswrapper[7479]: I0308 00:23:22.723385 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body=
Mar 08 00:23:22.723966 master-0 kubenswrapper[7479]: I0308 00:23:22.723458 7479 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused"
Mar 08 00:23:25.002341 master-0 kubenswrapper[7479]: I0308 00:23:25.002254 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body=
Mar 08 00:23:25.002873 master-0 kubenswrapper[7479]: I0308 00:23:25.002352 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused"
Mar 08 00:23:25.347604 master-0 kubenswrapper[7479]: E0308 00:23:25.347502 7479 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 08 00:23:25.347604 master-0 kubenswrapper[7479]: E0308 00:23:25.347579 7479 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 08 00:23:25.722936 master-0 kubenswrapper[7479]: I0308 00:23:25.722819 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body=
Mar 08 00:23:25.722936 master-0 kubenswrapper[7479]: I0308 00:23:25.722875 7479 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused"
Mar 08 00:23:26.173845 master-0 kubenswrapper[7479]: I0308 00:23:26.173801 7479 generic.go:334] "Generic (PLEG): container finished" podID="c2ce2ea7-bd25-4294-8f3a-11ce53577830" containerID="8c7c5dbb2587ce1659649afce2da4e5a5c04c0ab193dda1e438bb8ca083926e4" exitCode=0
Mar 08 00:23:26.175446 master-0 kubenswrapper[7479]: I0308 00:23:26.175424 7479 generic.go:334] "Generic (PLEG): container finished" podID="365dc4ac-fbc8-4589-a799-8327b3ebd0a5" containerID="08c17f5be4c6cd32671af564801dff89f871520231b6fd523ba49a05d5c50b3c" exitCode=0
Mar 08 00:23:27.295332 master-0 kubenswrapper[7479]: I0308 00:23:27.295274 7479 patch_prober.go:28] interesting pod/apiserver-85cb8cb9bb-bmx44 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Mar 08 00:23:27.295332 master-0 kubenswrapper[7479]: [+]log ok
Mar 08 00:23:27.295332 master-0 kubenswrapper[7479]: [-]etcd failed: reason withheld
Mar 08 00:23:27.295332 master-0 kubenswrapper[7479]: [+]poststarthook/start-apiserver-admission-initializer ok
Mar 08 00:23:27.295332 master-0 kubenswrapper[7479]: [+]poststarthook/generic-apiserver-start-informers ok
Mar 08 00:23:27.295332 master-0 kubenswrapper[7479]: [+]poststarthook/max-in-flight-filter ok
Mar 08 00:23:27.295332 master-0 kubenswrapper[7479]: [+]poststarthook/storage-object-count-tracker-hook ok
Mar 08 00:23:27.295332 master-0 kubenswrapper[7479]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Mar 08 00:23:27.295332 master-0 kubenswrapper[7479]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Mar 08 00:23:27.295332 master-0 kubenswrapper[7479]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok
Mar 08 00:23:27.295332 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectcache ok
Mar 08 00:23:27.295332 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Mar 08 00:23:27.295332 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-startinformers ok
Mar 08 00:23:27.295332 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-restmapperupdater ok
Mar 08 00:23:27.295332 master-0 kubenswrapper[7479]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Mar 08 00:23:27.295332 master-0 kubenswrapper[7479]: livez check failed
Mar 08 00:23:27.296181 master-0 kubenswrapper[7479]: I0308 00:23:27.295335 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" podUID="1751db13-b792-43e2-8459-d1d4a0164dfb" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:23:27.922720 master-0 kubenswrapper[7479]: E0308 00:23:27.922632 7479 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 08 00:23:27.923305 master-0 kubenswrapper[7479]: I0308 00:23:27.923018 7479 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Mar 08 00:23:28.003087 master-0 kubenswrapper[7479]: I0308 00:23:28.003025 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body=
Mar 08 00:23:28.003324 master-0 kubenswrapper[7479]: I0308 00:23:28.003105 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused"
Mar 08 00:23:30.197915 master-0 kubenswrapper[7479]: I0308 00:23:30.197851 7479 generic.go:334] "Generic (PLEG): container finished" podID="58333089-2456-4a25-8ba7-6d557eefa177" containerID="00aa20318a390dc28a1b90d9dfa760b9b264408ce2a090ec0af81099188274b0" exitCode=0
Mar 08 00:23:31.002400 master-0 kubenswrapper[7479]: I0308 00:23:31.002347 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body=
Mar 08 00:23:31.002757 master-0 kubenswrapper[7479]: I0308 00:23:31.002705 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused"
Mar 08 00:23:31.206290 master-0 kubenswrapper[7479]: I0308 00:23:31.206232 7479 generic.go:334] "Generic (PLEG): container finished" podID="3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab" containerID="459a84ed9e1a3d8f522635c123baf95a666dd88b0c40648d94dbbfdfad737d00" exitCode=0
Mar 08 00:23:31.238535 master-0 kubenswrapper[7479]: I0308 00:23:31.238313 7479 patch_prober.go:28] interesting pod/etcd-operator-5884b9cd56-27phk container/etcd-operator namespace/openshift-etcd-operator: Liveness probe status=failure output="Get \"https://10.128.0.10:8443/healthz\": dial tcp 10.128.0.10:8443: connect: connection refused" start-of-body=
Mar 08 00:23:31.238535 master-0 kubenswrapper[7479]: I0308 00:23:31.238394 7479 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk" podUID="2fbed2b8-f4c5-4f52-b29c-1907a2034f6f" containerName="etcd-operator" probeResult="failure" output="Get \"https://10.128.0.10:8443/healthz\": dial tcp 10.128.0.10:8443: connect: connection refused"
Mar 08 00:23:33.898560 master-0 kubenswrapper[7479]: E0308 00:23:33.898498 7479 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0"
Mar 08 00:23:33.899167 master-0 kubenswrapper[7479]: E0308 00:23:33.898706 7479 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.015s"
Mar 08 00:23:33.899167 master-0 kubenswrapper[7479]: I0308 00:23:33.898778 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-st2sr" event={"ID":"ec2d22f2-c260-42a6-a9da-ee0f44f42303","Type":"ContainerDied","Data":"06038340b4e3f2befb44d9c767edb4dd565cb0800261ba9f5e36429d3a7bf10d"}
Mar 08 00:23:33.899167 master-0 kubenswrapper[7479]: I0308 00:23:33.898798 7479 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 00:23:33.899167 master-0 kubenswrapper[7479]: I0308 00:23:33.898808 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk" event={"ID":"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f","Type":"ContainerDied","Data":"4ce369a140420a6c03e974e6eff3c092d5ec9b95e895b002c78c7a3f070c22b2"}
Mar 08 00:23:33.899167 master-0 kubenswrapper[7479]: I0308 00:23:33.898997 7479 scope.go:117] "RemoveContainer" containerID="b999c6f84ef35141ea9d9157df896d14bb08340f5b7476591f3ed6362f2a6196"
Mar 08 00:23:33.899567 master-0 kubenswrapper[7479]: I0308 00:23:33.899514 7479 scope.go:117] "RemoveContainer" containerID="4ce369a140420a6c03e974e6eff3c092d5ec9b95e895b002c78c7a3f070c22b2"
Mar 08 00:23:33.902130 master-0 kubenswrapper[7479]: I0308 00:23:33.902079 7479 scope.go:117] "RemoveContainer" containerID="06038340b4e3f2befb44d9c767edb4dd565cb0800261ba9f5e36429d3a7bf10d"
Mar 08 00:23:33.902779 master-0 kubenswrapper[7479]: I0308 00:23:33.902744 7479 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"e58959bf4fb7686cb173d693e7cd0607617c802ee64cc2a69626a79e65982a9e"} pod="kube-system/bootstrap-kube-controller-manager-master-0" containerMessage="Container kube-controller-manager failed startup probe, will be restarted"
Mar 08 00:23:33.902825 master-0 kubenswrapper[7479]: I0308 00:23:33.902802 7479 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" containerID="cri-o://e58959bf4fb7686cb173d693e7cd0607617c802ee64cc2a69626a79e65982a9e" gracePeriod=30
Mar 08 00:23:33.906227 master-0 kubenswrapper[7479]: I0308 00:23:33.906177 7479 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID=""
Mar 08 00:23:33.949430 master-0 kubenswrapper[7479]: I0308 00:23:33.949379 7479 scope.go:117] "RemoveContainer" containerID="da60beba23659d143e9020dc0409825d88a4d10b35b445c12b13ae8fc1310bdf"
Mar 08 00:23:34.002338 master-0 kubenswrapper[7479]: I0308 00:23:34.002295 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body=
Mar 08 00:23:34.002541 master-0 kubenswrapper[7479]: I0308 00:23:34.002352 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused"
Mar 08 00:23:34.235870 master-0 kubenswrapper[7479]: I0308 00:23:34.235832 7479 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="e58959bf4fb7686cb173d693e7cd0607617c802ee64cc2a69626a79e65982a9e" exitCode=2
Mar 08 00:23:34.237485 master-0 kubenswrapper[7479]: I0308 00:23:34.237448 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7c649bf6d4-st2sr_ec2d22f2-c260-42a6-a9da-ee0f44f42303/network-operator/0.log"
Mar 08 00:23:35.696368 master-0 kubenswrapper[7479]: E0308 00:23:35.696315 7479 log.go:32] "RunPodSandbox from runtime service failed" err=<
Mar 08 00:23:35.696368 master-0 kubenswrapper[7479]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-5-master-0_openshift-kube-scheduler_21dd42b1-2628-4a24-97e7-6759888ed316_0(1b24d26e9924406ab705c5b22ab8aabe5652dc45b1686bf53f21c2d4d1ba3adf): error adding pod openshift-kube-scheduler_installer-5-master-0 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"1b24d26e9924406ab705c5b22ab8aabe5652dc45b1686bf53f21c2d4d1ba3adf" Netns:"/var/run/netns/e8993663-ed4b-4910-bff9-50187d15a2a8" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-scheduler;K8S_POD_NAME=installer-5-master-0;K8S_POD_INFRA_CONTAINER_ID=1b24d26e9924406ab705c5b22ab8aabe5652dc45b1686bf53f21c2d4d1ba3adf;K8S_POD_UID=21dd42b1-2628-4a24-97e7-6759888ed316" Path:"" ERRORED: error configuring pod [openshift-kube-scheduler/installer-5-master-0] networking: Multus: [openshift-kube-scheduler/installer-5-master-0/21dd42b1-2628-4a24-97e7-6759888ed316]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-5-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod installer-5-master-0 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/installer-5-master-0?timeout=1m0s": context deadline exceeded
Mar 08 00:23:35.696368 master-0 kubenswrapper[7479]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Mar 08 00:23:35.696368 master-0 kubenswrapper[7479]: >
Mar 08 00:23:35.696805 master-0 kubenswrapper[7479]: E0308 00:23:35.696405 7479 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=<
Mar 08 00:23:35.696805 master-0 kubenswrapper[7479]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-5-master-0_openshift-kube-scheduler_21dd42b1-2628-4a24-97e7-6759888ed316_0(1b24d26e9924406ab705c5b22ab8aabe5652dc45b1686bf53f21c2d4d1ba3adf): error adding pod openshift-kube-scheduler_installer-5-master-0 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"1b24d26e9924406ab705c5b22ab8aabe5652dc45b1686bf53f21c2d4d1ba3adf" Netns:"/var/run/netns/e8993663-ed4b-4910-bff9-50187d15a2a8" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-scheduler;K8S_POD_NAME=installer-5-master-0;K8S_POD_INFRA_CONTAINER_ID=1b24d26e9924406ab705c5b22ab8aabe5652dc45b1686bf53f21c2d4d1ba3adf;K8S_POD_UID=21dd42b1-2628-4a24-97e7-6759888ed316" Path:"" ERRORED: error configuring pod [openshift-kube-scheduler/installer-5-master-0] networking: Multus: [openshift-kube-scheduler/installer-5-master-0/21dd42b1-2628-4a24-97e7-6759888ed316]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-5-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod installer-5-master-0 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/installer-5-master-0?timeout=1m0s": context deadline exceeded
Mar 08 00:23:35.696805 master-0 kubenswrapper[7479]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Mar 08 00:23:35.696805 master-0 kubenswrapper[7479]: > pod="openshift-kube-scheduler/installer-5-master-0"
Mar 08 00:23:35.696805 master-0 kubenswrapper[7479]: E0308 00:23:35.696429 7479 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=<
Mar 08 00:23:35.696805 master-0 kubenswrapper[7479]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-5-master-0_openshift-kube-scheduler_21dd42b1-2628-4a24-97e7-6759888ed316_0(1b24d26e9924406ab705c5b22ab8aabe5652dc45b1686bf53f21c2d4d1ba3adf): error adding pod openshift-kube-scheduler_installer-5-master-0 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"1b24d26e9924406ab705c5b22ab8aabe5652dc45b1686bf53f21c2d4d1ba3adf" Netns:"/var/run/netns/e8993663-ed4b-4910-bff9-50187d15a2a8" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-scheduler;K8S_POD_NAME=installer-5-master-0;K8S_POD_INFRA_CONTAINER_ID=1b24d26e9924406ab705c5b22ab8aabe5652dc45b1686bf53f21c2d4d1ba3adf;K8S_POD_UID=21dd42b1-2628-4a24-97e7-6759888ed316" Path:"" ERRORED: error configuring pod [openshift-kube-scheduler/installer-5-master-0] networking: Multus: [openshift-kube-scheduler/installer-5-master-0/21dd42b1-2628-4a24-97e7-6759888ed316]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-5-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod installer-5-master-0 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/installer-5-master-0?timeout=1m0s": context deadline exceeded
Mar 08 00:23:35.696805 master-0 kubenswrapper[7479]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Mar 08 00:23:35.696805 master-0 kubenswrapper[7479]: > pod="openshift-kube-scheduler/installer-5-master-0"
Mar 08 00:23:35.696805 master-0 kubenswrapper[7479]: E0308 00:23:35.696502 7479 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"installer-5-master-0_openshift-kube-scheduler(21dd42b1-2628-4a24-97e7-6759888ed316)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"installer-5-master-0_openshift-kube-scheduler(21dd42b1-2628-4a24-97e7-6759888ed316)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-5-master-0_openshift-kube-scheduler_21dd42b1-2628-4a24-97e7-6759888ed316_0(1b24d26e9924406ab705c5b22ab8aabe5652dc45b1686bf53f21c2d4d1ba3adf): error adding pod openshift-kube-scheduler_installer-5-master-0 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"1b24d26e9924406ab705c5b22ab8aabe5652dc45b1686bf53f21c2d4d1ba3adf\\\" Netns:\\\"/var/run/netns/e8993663-ed4b-4910-bff9-50187d15a2a8\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-scheduler;K8S_POD_NAME=installer-5-master-0;K8S_POD_INFRA_CONTAINER_ID=1b24d26e9924406ab705c5b22ab8aabe5652dc45b1686bf53f21c2d4d1ba3adf;K8S_POD_UID=21dd42b1-2628-4a24-97e7-6759888ed316\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-kube-scheduler/installer-5-master-0] networking: Multus: [openshift-kube-scheduler/installer-5-master-0/21dd42b1-2628-4a24-97e7-6759888ed316]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-5-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod installer-5-master-0 in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/installer-5-master-0?timeout=1m0s\\\": context deadline exceeded\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-kube-scheduler/installer-5-master-0" podUID="21dd42b1-2628-4a24-97e7-6759888ed316"
Mar 08 00:23:36.250154 master-0 kubenswrapper[7479]: I0308 00:23:36.250094 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0"
Mar 08 00:23:36.250536 master-0 kubenswrapper[7479]: I0308 00:23:36.250511 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0"
Mar 08 00:23:36.300415 master-0 kubenswrapper[7479]: I0308 00:23:36.300361 7479 patch_prober.go:28] interesting pod/apiserver-85cb8cb9bb-bmx44 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Mar 08 00:23:36.300415 master-0 kubenswrapper[7479]: [+]log ok
Mar 08 00:23:36.300415 master-0 kubenswrapper[7479]: [-]etcd failed: reason withheld
Mar 08 00:23:36.300415 master-0 kubenswrapper[7479]: [+]poststarthook/start-apiserver-admission-initializer ok
Mar 08 00:23:36.300415 master-0 kubenswrapper[7479]: [+]poststarthook/generic-apiserver-start-informers ok
Mar 08 00:23:36.300415 master-0 kubenswrapper[7479]: [+]poststarthook/max-in-flight-filter ok
Mar 08 00:23:36.300415 master-0 kubenswrapper[7479]: [+]poststarthook/storage-object-count-tracker-hook ok
Mar 08 00:23:36.300415 master-0 kubenswrapper[7479]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Mar 08 00:23:36.300415 master-0 kubenswrapper[7479]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Mar 08 00:23:36.300415 master-0 kubenswrapper[7479]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok
Mar 08 00:23:36.300415 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectcache ok
Mar 08 00:23:36.300415 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Mar 08 00:23:36.300415 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-startinformers ok
Mar 08 00:23:36.300415 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-restmapperupdater ok
Mar 08 00:23:36.300415 master-0 kubenswrapper[7479]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Mar 08 00:23:36.300415 master-0 kubenswrapper[7479]: livez check failed
Mar 08 00:23:36.301014 master-0 kubenswrapper[7479]: I0308 00:23:36.300433 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" podUID="1751db13-b792-43e2-8459-d1d4a0164dfb" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:23:37.002095 master-0 kubenswrapper[7479]: I0308 00:23:37.002033 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body=
Mar 08 00:23:37.002621 master-0 kubenswrapper[7479]: I0308 00:23:37.002126 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused"
Mar 08 00:23:37.051678 master-0 kubenswrapper[7479]: E0308 00:23:37.051543 7479 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{community-operators-ms5vp.189ab5db9f5b9bb2 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:community-operators-ms5vp,UID:668ffbde-4771-43e1-8f0e-d4b5d17ff693,APIVersion:v1,ResourceVersion:7236,FieldPath:spec.initContainers{extract-content},},Reason:Pulled,Message:Successfully pulled image \"registry.redhat.io/redhat/community-operator-index:v4.18\" in 23.702s (23.702s including waiting). Image size: 1220167376 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:22:38.199757746 +0000 UTC m=+74.512666663,LastTimestamp:2026-03-08 00:22:38.199757746 +0000 UTC m=+74.512666663,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 00:23:37.924971 master-0 kubenswrapper[7479]: E0308 00:23:37.924875 7479 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="200ms"
Mar 08 00:23:40.002700 master-0 kubenswrapper[7479]: I0308 00:23:40.002557 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body=
Mar 08 00:23:40.003757 master-0 kubenswrapper[7479]: I0308 00:23:40.002708 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused"
Mar 08 00:23:43.002423 master-0 kubenswrapper[7479]: I0308 00:23:43.002278 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body=
Mar 08 00:23:43.002423 master-0 kubenswrapper[7479]: I0308 00:23:43.002399 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused"
Mar 08 00:23:45.309232 master-0 kubenswrapper[7479]: I0308 00:23:45.309063 7479 patch_prober.go:28] interesting pod/apiserver-85cb8cb9bb-bmx44 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Mar 08 00:23:45.309232 master-0 kubenswrapper[7479]: [+]log ok
Mar 08 00:23:45.309232 master-0 kubenswrapper[7479]: [-]etcd failed: reason withheld
Mar 08 00:23:45.309232 master-0 kubenswrapper[7479]: [+]poststarthook/start-apiserver-admission-initializer ok
Mar 08 00:23:45.309232 master-0 kubenswrapper[7479]: [+]poststarthook/generic-apiserver-start-informers ok
Mar 08 00:23:45.309232 master-0 kubenswrapper[7479]: [+]poststarthook/max-in-flight-filter ok
Mar 08 00:23:45.309232 master-0 kubenswrapper[7479]: [+]poststarthook/storage-object-count-tracker-hook ok
Mar 08 00:23:45.309232 master-0 kubenswrapper[7479]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Mar 08 00:23:45.309232 master-0 kubenswrapper[7479]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Mar 08 00:23:45.309232 master-0 kubenswrapper[7479]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok
Mar
08 00:23:45.309232 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectcache ok Mar 08 00:23:45.309232 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 08 00:23:45.309232 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-startinformers ok Mar 08 00:23:45.309232 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 08 00:23:45.309232 master-0 kubenswrapper[7479]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 08 00:23:45.309232 master-0 kubenswrapper[7479]: livez check failed Mar 08 00:23:45.309232 master-0 kubenswrapper[7479]: I0308 00:23:45.309140 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" podUID="1751db13-b792-43e2-8459-d1d4a0164dfb" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:23:45.725086 master-0 kubenswrapper[7479]: E0308 00:23:45.724752 7479 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:23:35Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:23:35Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:23:35Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:23:35Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:ae042a5d32eb2f18d537f2068849e665b55df7d8360daedaaeea98bd2a79e769\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:d077bbabe6cb885ed229119008480493e8364e4bfddaa00b099f68c52b016e6b\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1733328350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f121f9d021a9843b9458f9f222c40f292f2c21dcfcf00f05daacaca8a949c0\\\"],\\\"sizeBytes\\\":1637445817},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:063b8972231e65eb43f6545ba37804f68138dc54d97b91a652a1c5bc7dc76aa5\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:cf682d23b2857e455609879a0867d171a221c18e2cec995dd79570b77c5a4705\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1272201949},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:381e96959e3c3b08a3e2715e6024697ae14af31bd0378b49f583e984b3b9a192\\\"],\\\"sizeBytes\\\":1238047254},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:e0c034ae18daa01af8d073f8cc24ae4af87883c664304910eab1167fdfd60c0b\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:ef0c6b9e405f7a452211e063ce07ded04ccbe38b53860bfd71b5a7cd5072830a\\\",\\\"registry.redhat.io/redhat/redhat-marketpl
ace-index:v4.18\\\"],\\\"sizeBytes\\\":1229556414},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:79984dfbdf9aeae3985c7fd7515e12328775c0e7fc4782929d0998f4dd2a87c6\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:7be89499615ec913d0fe40ca89682080a3f1181a066dbc501c877cc7ccbcc9ae\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1220167376},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c9330c756dd6ab107e9a4b671bc52742c90d5be11a8380d8b710e2bd4e0ed43c\\\"],\\\"sizeBytes\\\":992610645},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\\\"],\\\"sizeBytes\\\":943837171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff40e33e63d6c1f4e4393d5506e38def25ba20582d980fec8b81f81c867ceeec\\\"],\\\"sizeBytes\\\":918278686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e207c762b7802ee0e54507d21ed1f25b19eddc511a4b824934c16c163193be6a\\\"],\\\"sizeBytes\\\":876146500},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:41dbd66e9a886c1fd7a99752f358c6125a209e83c0dd37b35730baae58d82ee8\\\"],\\\"sizeBytes\\\":862633255},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bfcd8017eede3fb66fa3f5b47c27508b787d38455689154461f0e6a5dc303ff\\\"],\\\"sizeBytes\\\":772939850},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9c946fdc5a4cd16ff998c17844780e7efc38f7f38b97a8a40d75cd77b318ddef\\\"],\\\"sizeBytes\\\":687947017},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c03cb25dc6f6a865529ebc979e8d7d08492b28fd3fb93beddf30e1cb06f1245\\\"],\\\"sizeBytes\\\":683169303},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3f34dc492c80a3dee4643cc2291044750ac51e6e919b973de8723fa8b70bde70\\\"],\\\"sizeBytes\\\":677929075},{\\\"names\\\":[\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3\\\"],\\\"sizeBytes\\\":621647686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1ec9d3dbcc6f9817c0f6d09f64c0d98c91b03afbb1fcb3c1e1718aca900754b\\\"],\\\"sizeBytes\\\":589379637},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1575be013a898f153cbf012aeaf28ce720022f934dc05bdffbe479e30999d460\\\"],\\\"sizeBytes\\\":582153879},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eb82e437a701ce83b70e56be8477d987da67578714dda3d9fa6628804b1b56f5\\\"],\\\"sizeBytes\\\":558210153},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:28f33d62fd0b94c5ea0ebcd7a4216848c8dd671a38d901ce98f4c399b700e1c7\\\"],\\\"sizeBytes\\\":548751793},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b\\\"],\\\"sizeBytes\\\":529324693},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ac6f0695d3386e6d601f4ae507940981352fa3ad884b0fed6fb25698c5e6f916\\\"],\\\"sizeBytes\\\":528946249},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6088910bdc1583b275fab261e3234c0b63b4cc16d01bcea697b6a7f6db13bdf3\\\"],\\\"sizeBytes\\\":518384455},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:14bd3c04daa885009785d48f4973e2890751a7ec116cc14d17627245cda54d7b\\\"],\\\"sizeBytes\\\":517997625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\\\"],\\\"sizeBytes\\\":514980169},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd818e37e1f9dbe5393c557b89e81010d68171408e0e4157a3d92ae0ca1c953\\\"],\\\"sizeBytes\\\":513220825},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d601c8437b4d8bbe2da0f3b08f1bd8693f5a4ef6d83537
7ec029c79d9dca5dab\\\"],\\\"sizeBytes\\\":512273539},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b47d2b146e833bc1612a652136f43afcf1ba30f32cbd0a2f06ca9fc80d969f0\\\"],\\\"sizeBytes\\\":511226810},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:834063dd26fb3d2489e193489198a0d5fbe9c775a0e30173e5fcef6994fbf0f6\\\"],\\\"sizeBytes\\\":511164376},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee46e13e26156c904e5784e2d64511021ed0974a169ccd6476b05bff1c44ec56\\\"],\\\"sizeBytes\\\":508888174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7220d16ea511c0f0410cf45db45aaafcc64847c9cb5732ad1eff39ceb482cdba\\\"],\\\"sizeBytes\\\":508544235},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:526c5c02a8fa86a2fa83a7087d4a5c4b1c4072c0f3906163494cc3b3c1295e9b\\\"],\\\"sizeBytes\\\":507967997},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4010a8f9d932615336227e2fd43325d4fa9025dca4bebe032106efea733fcfc3\\\"],\\\"sizeBytes\\\":506479655},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76b719f5bd541eb1a8bae124d650896b533e7bc3107be536e598b3ab4e135282\\\"],\\\"sizeBytes\\\":506394574},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5de69354d08184ecd6144facc1461777674674e8304971216d4cf1a5025472b9\\\"],\\\"sizeBytes\\\":505344964},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a324f47cf789c0480fa4bcb0812152abc3cd844318bab193108fe4349eed609\\\"],\\\"sizeBytes\\\":505242594},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d11f13e867f4df046ca6789bb7273da5d0c08895b3dea00949c8a5458f9e22f9\\\"],\\\"sizeBytes\\\":504623546},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76bdc35338c4d0f5e5b9448fb73e3578656f908a962286692e12a0372ec721d5\\\"],\\\"sizeBytes\\\":495994161},{\\\"names\\\":[\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:ff2db11ce277288befab25ddb86177e832842d2edb5607a2da8f252a030e1cfc\\\"],\\\"sizeBytes\\\":495064829},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2fe5144b1f72bdcf5d5a52130f02ed86fbec3875cc4ac108ead00eaac1659e06\\\"],\\\"sizeBytes\\\":487090672},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4a4c3e6ca0cd26f7eb5270cfafbcf423cf2986d152bf5b9fc6469d40599e104e\\\"],\\\"sizeBytes\\\":484450382},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c54c3f7cffe057ae0bdf26163d5e46744685083ae16fc97112e32beacd2d8955\\\"],\\\"sizeBytes\\\":484175664},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9b8bc43bac294be3c7669cde049e388ad9d8751242051ba40f83e1c401eceda\\\"],\\\"sizeBytes\\\":468263999},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\\\"],\\\"sizeBytes\\\":465086330},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a85dab5856916220df6f05ce9d6aa10cd4fa0234093b55355246690bba05ad1\\\"],\\\"sizeBytes\\\":463700811},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b714a7ada1e295b599b432f32e1fd5b74c8cdbe6fe51e95306322b25cb873914\\\"],\\\"sizeBytes\\\":458126424},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5230462066ab36e3025524e948dd33fa6f51ee29a4f91fa469bfc268568b5fd9\\\"],\\\"sizeBytes\\\":456575686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:89cb093f319eaa04acfe9431b8697bffbc71ab670546f7ed257daa332165c626\\\"],\\\"sizeBytes\\\":448828105},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c680fcc9fd6b66099ca4c0f512521b6f8e0bc29273ddb9405730bc54bacb6783\\\"],\\\"sizeBytes\\\":448041621},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cf9670d0f269f8d49fd9ef4981999be195f6624a4146aa93d9201eb8acc81053\\\
"],\\\"sizeBytes\\\":443271011}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 00:23:46.002887 master-0 kubenswrapper[7479]: I0308 00:23:46.002811 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body= Mar 08 00:23:46.003116 master-0 kubenswrapper[7479]: I0308 00:23:46.002905 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" Mar 08 00:23:48.127000 master-0 kubenswrapper[7479]: E0308 00:23:48.126960 7479 controller.go:145] "Failed to ensure lease exists, will retry" err="the server was unable to return a response in the time allotted, but may still be processing the request (get leases.coordination.k8s.io master-0)" interval="400ms" Mar 08 00:23:49.003562 master-0 kubenswrapper[7479]: I0308 00:23:49.003456 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get 
\"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body= Mar 08 00:23:49.003830 master-0 kubenswrapper[7479]: I0308 00:23:49.003604 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" Mar 08 00:23:52.003070 master-0 kubenswrapper[7479]: I0308 00:23:52.002977 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body= Mar 08 00:23:52.004094 master-0 kubenswrapper[7479]: I0308 00:23:52.003075 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" Mar 08 00:23:54.316951 master-0 kubenswrapper[7479]: I0308 00:23:54.316844 7479 patch_prober.go:28] interesting pod/apiserver-85cb8cb9bb-bmx44 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 08 00:23:54.316951 master-0 kubenswrapper[7479]: [+]log ok Mar 08 00:23:54.316951 master-0 kubenswrapper[7479]: [-]etcd failed: reason withheld Mar 08 00:23:54.316951 master-0 kubenswrapper[7479]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 08 00:23:54.316951 master-0 kubenswrapper[7479]: 
[+]poststarthook/generic-apiserver-start-informers ok Mar 08 00:23:54.316951 master-0 kubenswrapper[7479]: [+]poststarthook/max-in-flight-filter ok Mar 08 00:23:54.316951 master-0 kubenswrapper[7479]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 08 00:23:54.316951 master-0 kubenswrapper[7479]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 08 00:23:54.316951 master-0 kubenswrapper[7479]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 08 00:23:54.316951 master-0 kubenswrapper[7479]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Mar 08 00:23:54.316951 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectcache ok Mar 08 00:23:54.316951 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 08 00:23:54.316951 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-startinformers ok Mar 08 00:23:54.316951 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 08 00:23:54.316951 master-0 kubenswrapper[7479]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 08 00:23:54.316951 master-0 kubenswrapper[7479]: livez check failed Mar 08 00:23:54.316951 master-0 kubenswrapper[7479]: I0308 00:23:54.316926 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" podUID="1751db13-b792-43e2-8459-d1d4a0164dfb" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:23:55.002470 master-0 kubenswrapper[7479]: I0308 00:23:55.002295 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body= Mar 08 00:23:55.002470 master-0 
kubenswrapper[7479]: I0308 00:23:55.002400 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" Mar 08 00:23:55.725361 master-0 kubenswrapper[7479]: E0308 00:23:55.725291 7479 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 00:23:58.002340 master-0 kubenswrapper[7479]: I0308 00:23:58.002179 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body= Mar 08 00:23:58.002340 master-0 kubenswrapper[7479]: I0308 00:23:58.002287 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" Mar 08 00:23:58.529044 master-0 kubenswrapper[7479]: E0308 00:23:58.528949 7479 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="800ms" Mar 08 00:24:01.002288 master-0 kubenswrapper[7479]: I0308 00:24:01.002087 7479 
patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body= Mar 08 00:24:01.002288 master-0 kubenswrapper[7479]: I0308 00:24:01.002158 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" Mar 08 00:24:03.050852 master-0 kubenswrapper[7479]: I0308 00:24:03.050761 7479 status_manager.go:851] "Failed to get status for pod" podUID="ada20442-bff5-477c-989e-3d921f5ede5e" pod="openshift-kube-controller-manager/installer-1-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods installer-1-master-0)" Mar 08 00:24:03.322637 master-0 kubenswrapper[7479]: I0308 00:24:03.322390 7479 patch_prober.go:28] interesting pod/apiserver-85cb8cb9bb-bmx44 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 08 00:24:03.322637 master-0 kubenswrapper[7479]: [+]log ok Mar 08 00:24:03.322637 master-0 kubenswrapper[7479]: [-]etcd failed: reason withheld Mar 08 00:24:03.322637 master-0 kubenswrapper[7479]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 08 00:24:03.322637 master-0 kubenswrapper[7479]: [+]poststarthook/generic-apiserver-start-informers ok Mar 08 00:24:03.322637 master-0 kubenswrapper[7479]: [+]poststarthook/max-in-flight-filter ok Mar 08 00:24:03.322637 master-0 kubenswrapper[7479]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 08 
00:24:03.322637 master-0 kubenswrapper[7479]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 08 00:24:03.322637 master-0 kubenswrapper[7479]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 08 00:24:03.322637 master-0 kubenswrapper[7479]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Mar 08 00:24:03.322637 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectcache ok Mar 08 00:24:03.322637 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 08 00:24:03.322637 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-startinformers ok Mar 08 00:24:03.322637 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 08 00:24:03.322637 master-0 kubenswrapper[7479]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 08 00:24:03.322637 master-0 kubenswrapper[7479]: livez check failed Mar 08 00:24:03.322637 master-0 kubenswrapper[7479]: I0308 00:24:03.322467 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" podUID="1751db13-b792-43e2-8459-d1d4a0164dfb" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:24:04.002442 master-0 kubenswrapper[7479]: I0308 00:24:04.002341 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body= Mar 08 00:24:04.002442 master-0 kubenswrapper[7479]: I0308 00:24:04.002417 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" 
probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" Mar 08 00:24:05.725576 master-0 kubenswrapper[7479]: E0308 00:24:05.725513 7479 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": context deadline exceeded" Mar 08 00:24:07.002502 master-0 kubenswrapper[7479]: I0308 00:24:07.002436 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body= Mar 08 00:24:07.002990 master-0 kubenswrapper[7479]: I0308 00:24:07.002509 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" Mar 08 00:24:07.908620 master-0 kubenswrapper[7479]: E0308 00:24:07.908548 7479 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Mar 08 00:24:07.908865 master-0 kubenswrapper[7479]: E0308 00:24:07.908802 7479 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.01s" Mar 08 00:24:07.919591 master-0 kubenswrapper[7479]: I0308 00:24:07.919544 7479 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Mar 08 00:24:09.330566 master-0 kubenswrapper[7479]: E0308 00:24:09.330462 7479 controller.go:145] "Failed to ensure 
lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="1.6s" Mar 08 00:24:10.002611 master-0 kubenswrapper[7479]: I0308 00:24:10.002516 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body= Mar 08 00:24:10.002866 master-0 kubenswrapper[7479]: I0308 00:24:10.002614 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" Mar 08 00:24:10.442461 master-0 kubenswrapper[7479]: I0308 00:24:10.442280 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8565d84698-49hzm_ef0a3c84-98bb-4915-9010-d66fcbeafe09/openshift-controller-manager-operator/1.log" Mar 08 00:24:10.443294 master-0 kubenswrapper[7479]: I0308 00:24:10.443249 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8565d84698-49hzm_ef0a3c84-98bb-4915-9010-d66fcbeafe09/openshift-controller-manager-operator/0.log" Mar 08 00:24:10.443342 master-0 kubenswrapper[7479]: I0308 00:24:10.443314 7479 generic.go:334] "Generic (PLEG): container finished" podID="ef0a3c84-98bb-4915-9010-d66fcbeafe09" containerID="5aac2b21c945fd8c5f04ccb41b60633f9bb7e3c9d3e901a7648d97792b4bc569" exitCode=255 Mar 08 00:24:11.054623 master-0 
kubenswrapper[7479]: E0308 00:24:11.054471 7479 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{redhat-marketplace-4r9ht.189ab5db9f6f8c47 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-marketplace-4r9ht,UID:6c644b9b-a551-48d2-8f16-e1a6da7d98c9,APIVersion:v1,ResourceVersion:7395,FieldPath:spec.initContainers{extract-content},},Reason:Pulled,Message:Successfully pulled image \"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\" in 22.688s (22.688s including waiting). Image size: 1229556414 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:22:38.201064519 +0000 UTC m=+74.513973436,LastTimestamp:2026-03-08 00:22:38.201064519 +0000 UTC m=+74.513973436,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:24:12.328419 master-0 kubenswrapper[7479]: I0308 00:24:12.328356 7479 patch_prober.go:28] interesting pod/apiserver-85cb8cb9bb-bmx44 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 08 00:24:12.328419 master-0 kubenswrapper[7479]: [+]log ok Mar 08 00:24:12.328419 master-0 kubenswrapper[7479]: [-]etcd failed: reason withheld Mar 08 00:24:12.328419 master-0 kubenswrapper[7479]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 08 00:24:12.328419 master-0 kubenswrapper[7479]: [+]poststarthook/generic-apiserver-start-informers ok Mar 08 00:24:12.328419 master-0 kubenswrapper[7479]: [+]poststarthook/max-in-flight-filter ok Mar 08 00:24:12.328419 master-0 kubenswrapper[7479]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 08 00:24:12.328419 master-0 
kubenswrapper[7479]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 08 00:24:12.328419 master-0 kubenswrapper[7479]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 08 00:24:12.328419 master-0 kubenswrapper[7479]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Mar 08 00:24:12.328419 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectcache ok Mar 08 00:24:12.328419 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 08 00:24:12.328419 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-startinformers ok Mar 08 00:24:12.328419 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 08 00:24:12.328419 master-0 kubenswrapper[7479]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 08 00:24:12.328419 master-0 kubenswrapper[7479]: livez check failed Mar 08 00:24:12.329549 master-0 kubenswrapper[7479]: I0308 00:24:12.328425 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" podUID="1751db13-b792-43e2-8459-d1d4a0164dfb" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:24:13.002887 master-0 kubenswrapper[7479]: I0308 00:24:13.002529 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body= Mar 08 00:24:13.002887 master-0 kubenswrapper[7479]: I0308 00:24:13.002610 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" 
output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" Mar 08 00:24:15.726265 master-0 kubenswrapper[7479]: E0308 00:24:15.726117 7479 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 00:24:16.483105 master-0 kubenswrapper[7479]: I0308 00:24:16.483039 7479 generic.go:334] "Generic (PLEG): container finished" podID="5cf5a2ef-2498-40a0-a189-0753076fd3b6" containerID="04817105ab63ed3d02352e545fc19277b913254d7947d42a71d84846748fcfc3" exitCode=0 Mar 08 00:24:16.974581 master-0 kubenswrapper[7479]: I0308 00:24:16.974415 7479 patch_prober.go:28] interesting pod/marketplace-operator-64bf9778cb-mgb5v container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.128.0.13:8080/healthz\": dial tcp 10.128.0.13:8080: connect: connection refused" start-of-body= Mar 08 00:24:16.974581 master-0 kubenswrapper[7479]: I0308 00:24:16.974506 7479 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" podUID="5cf5a2ef-2498-40a0-a189-0753076fd3b6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.13:8080/healthz\": dial tcp 10.128.0.13:8080: connect: connection refused" Mar 08 00:24:16.975315 master-0 kubenswrapper[7479]: I0308 00:24:16.975274 7479 patch_prober.go:28] interesting pod/marketplace-operator-64bf9778cb-mgb5v container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.13:8080/healthz\": dial tcp 10.128.0.13:8080: connect: connection refused" start-of-body= Mar 08 00:24:16.975395 master-0 kubenswrapper[7479]: I0308 00:24:16.975337 7479 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" podUID="5cf5a2ef-2498-40a0-a189-0753076fd3b6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.13:8080/healthz\": dial tcp 10.128.0.13:8080: connect: connection refused" Mar 08 00:24:17.003483 master-0 kubenswrapper[7479]: I0308 00:24:17.003427 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 00:24:17.003685 master-0 kubenswrapper[7479]: I0308 00:24:17.003506 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 00:24:20.003652 master-0 kubenswrapper[7479]: I0308 00:24:20.003533 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 00:24:20.003652 master-0 kubenswrapper[7479]: I0308 00:24:20.003624 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": net/http: 
request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 00:24:20.932647 master-0 kubenswrapper[7479]: E0308 00:24:20.932539 7479 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Mar 08 00:24:21.334192 master-0 kubenswrapper[7479]: I0308 00:24:21.334064 7479 patch_prober.go:28] interesting pod/apiserver-85cb8cb9bb-bmx44 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 08 00:24:21.334192 master-0 kubenswrapper[7479]: [+]log ok Mar 08 00:24:21.334192 master-0 kubenswrapper[7479]: [-]etcd failed: reason withheld Mar 08 00:24:21.334192 master-0 kubenswrapper[7479]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 08 00:24:21.334192 master-0 kubenswrapper[7479]: [+]poststarthook/generic-apiserver-start-informers ok Mar 08 00:24:21.334192 master-0 kubenswrapper[7479]: [+]poststarthook/max-in-flight-filter ok Mar 08 00:24:21.334192 master-0 kubenswrapper[7479]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 08 00:24:21.334192 master-0 kubenswrapper[7479]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 08 00:24:21.334192 master-0 kubenswrapper[7479]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 08 00:24:21.334192 master-0 kubenswrapper[7479]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Mar 08 00:24:21.334192 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectcache ok Mar 08 00:24:21.334192 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 08 00:24:21.334192 master-0 kubenswrapper[7479]: 
[+]poststarthook/openshift.io-startinformers ok Mar 08 00:24:21.334192 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 08 00:24:21.334192 master-0 kubenswrapper[7479]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 08 00:24:21.334192 master-0 kubenswrapper[7479]: livez check failed Mar 08 00:24:21.336083 master-0 kubenswrapper[7479]: I0308 00:24:21.334257 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" podUID="1751db13-b792-43e2-8459-d1d4a0164dfb" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:24:23.002685 master-0 kubenswrapper[7479]: I0308 00:24:23.002595 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 00:24:23.003164 master-0 kubenswrapper[7479]: I0308 00:24:23.002715 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 00:24:25.542259 master-0 kubenswrapper[7479]: I0308 00:24:25.542194 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-6598bfb6c4-7nhvs_1bb8fea7-71ca-43a3-839d-9c1459bf8dfa/manager/0.log" Mar 08 00:24:25.542887 master-0 kubenswrapper[7479]: I0308 00:24:25.542268 7479 generic.go:334] "Generic (PLEG): container finished" 
podID="1bb8fea7-71ca-43a3-839d-9c1459bf8dfa" containerID="1a894ff93f34b75d7c364cee700320b9938207036c1164fc914fd25a46ac6869" exitCode=1 Mar 08 00:24:25.726972 master-0 kubenswrapper[7479]: E0308 00:24:25.726894 7479 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 00:24:25.726972 master-0 kubenswrapper[7479]: E0308 00:24:25.726951 7479 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 00:24:26.002523 master-0 kubenswrapper[7479]: I0308 00:24:26.002443 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 00:24:26.002713 master-0 kubenswrapper[7479]: I0308 00:24:26.002545 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 00:24:26.548584 master-0 kubenswrapper[7479]: I0308 00:24:26.548541 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7f8b8b6f4c-w2q2q_d01c21a1-6c2c-49a7-9d85-254662851838/manager/0.log" Mar 08 00:24:26.549145 master-0 kubenswrapper[7479]: I0308 00:24:26.548947 7479 generic.go:334] "Generic (PLEG): container finished" 
podID="d01c21a1-6c2c-49a7-9d85-254662851838" containerID="f272f0c8300d99d74de3b6533eb08fc6f13727844131b874ef0ec089cec086c7" exitCode=1 Mar 08 00:24:26.974526 master-0 kubenswrapper[7479]: I0308 00:24:26.974373 7479 patch_prober.go:28] interesting pod/marketplace-operator-64bf9778cb-mgb5v container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.13:8080/healthz\": dial tcp 10.128.0.13:8080: connect: connection refused" start-of-body= Mar 08 00:24:26.974526 master-0 kubenswrapper[7479]: I0308 00:24:26.974446 7479 patch_prober.go:28] interesting pod/marketplace-operator-64bf9778cb-mgb5v container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.128.0.13:8080/healthz\": dial tcp 10.128.0.13:8080: connect: connection refused" start-of-body= Mar 08 00:24:26.974526 master-0 kubenswrapper[7479]: I0308 00:24:26.974466 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" podUID="5cf5a2ef-2498-40a0-a189-0753076fd3b6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.13:8080/healthz\": dial tcp 10.128.0.13:8080: connect: connection refused" Mar 08 00:24:26.974526 master-0 kubenswrapper[7479]: I0308 00:24:26.974496 7479 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" podUID="5cf5a2ef-2498-40a0-a189-0753076fd3b6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.13:8080/healthz\": dial tcp 10.128.0.13:8080: connect: connection refused" Mar 08 00:24:27.555045 master-0 kubenswrapper[7479]: I0308 00:24:27.554991 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-677db989d6-blw5x_4d0b9fbc-a1f8-4a98-99de-758734bd1a5b/ingress-operator/0.log" Mar 08 00:24:27.555045 master-0 
kubenswrapper[7479]: I0308 00:24:27.555030 7479 generic.go:334] "Generic (PLEG): container finished" podID="4d0b9fbc-a1f8-4a98-99de-758734bd1a5b" containerID="01f4711968edd90a03ce566521bccad3babf877143c30f69324972ce8a8bc2ae" exitCode=1 Mar 08 00:24:29.002533 master-0 kubenswrapper[7479]: I0308 00:24:29.002436 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 00:24:29.002533 master-0 kubenswrapper[7479]: I0308 00:24:29.002532 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 00:24:30.340986 master-0 kubenswrapper[7479]: I0308 00:24:30.340939 7479 patch_prober.go:28] interesting pod/apiserver-85cb8cb9bb-bmx44 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 08 00:24:30.340986 master-0 kubenswrapper[7479]: [+]log ok Mar 08 00:24:30.340986 master-0 kubenswrapper[7479]: [-]etcd failed: reason withheld Mar 08 00:24:30.340986 master-0 kubenswrapper[7479]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 08 00:24:30.340986 master-0 kubenswrapper[7479]: [+]poststarthook/generic-apiserver-start-informers ok Mar 08 00:24:30.340986 master-0 kubenswrapper[7479]: [+]poststarthook/max-in-flight-filter ok Mar 08 00:24:30.340986 master-0 kubenswrapper[7479]: 
[+]poststarthook/storage-object-count-tracker-hook ok Mar 08 00:24:30.340986 master-0 kubenswrapper[7479]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 08 00:24:30.340986 master-0 kubenswrapper[7479]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 08 00:24:30.340986 master-0 kubenswrapper[7479]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Mar 08 00:24:30.340986 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectcache ok Mar 08 00:24:30.340986 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 08 00:24:30.340986 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-startinformers ok Mar 08 00:24:30.340986 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 08 00:24:30.340986 master-0 kubenswrapper[7479]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 08 00:24:30.340986 master-0 kubenswrapper[7479]: livez check failed Mar 08 00:24:30.341835 master-0 kubenswrapper[7479]: I0308 00:24:30.341000 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" podUID="1751db13-b792-43e2-8459-d1d4a0164dfb" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:24:32.003365 master-0 kubenswrapper[7479]: I0308 00:24:32.003288 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 00:24:32.003830 master-0 kubenswrapper[7479]: I0308 00:24:32.003365 7479 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 00:24:33.440351 master-0 kubenswrapper[7479]: I0308 00:24:33.440257 7479 patch_prober.go:28] interesting pod/operator-controller-controller-manager-6598bfb6c4-7nhvs container/manager namespace/openshift-operator-controller: Readiness probe status=failure output="Get \"http://10.128.0.44:8081/readyz\": dial tcp 10.128.0.44:8081: connect: connection refused" start-of-body= Mar 08 00:24:33.441177 master-0 kubenswrapper[7479]: I0308 00:24:33.440366 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" podUID="1bb8fea7-71ca-43a3-839d-9c1459bf8dfa" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.44:8081/readyz\": dial tcp 10.128.0.44:8081: connect: connection refused" Mar 08 00:24:33.763811 master-0 kubenswrapper[7479]: I0308 00:24:33.763703 7479 patch_prober.go:28] interesting pod/catalogd-controller-manager-7f8b8b6f4c-w2q2q container/manager namespace/openshift-catalogd: Readiness probe status=failure output="Get \"http://10.128.0.45:8081/readyz\": dial tcp 10.128.0.45:8081: connect: connection refused" start-of-body= Mar 08 00:24:33.764114 master-0 kubenswrapper[7479]: I0308 00:24:33.763810 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" podUID="d01c21a1-6c2c-49a7-9d85-254662851838" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.45:8081/readyz\": dial tcp 10.128.0.45:8081: connect: connection refused" Mar 08 00:24:34.134454 master-0 kubenswrapper[7479]: E0308 00:24:34.134346 7479 controller.go:145] 
"Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="6.4s" Mar 08 00:24:35.002832 master-0 kubenswrapper[7479]: I0308 00:24:35.002624 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 00:24:35.003326 master-0 kubenswrapper[7479]: I0308 00:24:35.002850 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 00:24:35.607933 master-0 kubenswrapper[7479]: I0308 00:24:35.607803 7479 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="1218edf42145af942d644b560a08c25d071edefe5ebdbdbb1dda99cfd07700fd" exitCode=1 Mar 08 00:24:36.895139 master-0 kubenswrapper[7479]: E0308 00:24:36.895063 7479 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 08 00:24:36.895139 master-0 kubenswrapper[7479]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-5-master-0_openshift-kube-scheduler_21dd42b1-2628-4a24-97e7-6759888ed316_0(a17c4d8c7eb07aa5bdf2596382750aacc385edeceaae39266656d3bbbb603224): error adding pod openshift-kube-scheduler_installer-5-master-0 to CNI network "multus-cni-network": plugin type="multus-shim" 
name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a17c4d8c7eb07aa5bdf2596382750aacc385edeceaae39266656d3bbbb603224" Netns:"/var/run/netns/d5c17055-bd7a-44c3-91f8-32894633452e" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-scheduler;K8S_POD_NAME=installer-5-master-0;K8S_POD_INFRA_CONTAINER_ID=a17c4d8c7eb07aa5bdf2596382750aacc385edeceaae39266656d3bbbb603224;K8S_POD_UID=21dd42b1-2628-4a24-97e7-6759888ed316" Path:"" ERRORED: error configuring pod [openshift-kube-scheduler/installer-5-master-0] networking: Multus: [openshift-kube-scheduler/installer-5-master-0/21dd42b1-2628-4a24-97e7-6759888ed316]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-5-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod installer-5-master-0 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/installer-5-master-0?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 08 00:24:36.895139 master-0 kubenswrapper[7479]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 08 00:24:36.895139 master-0 kubenswrapper[7479]: > Mar 08 00:24:36.895771 master-0 kubenswrapper[7479]: E0308 00:24:36.895168 7479 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 08 00:24:36.895771 master-0 kubenswrapper[7479]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_installer-5-master-0_openshift-kube-scheduler_21dd42b1-2628-4a24-97e7-6759888ed316_0(a17c4d8c7eb07aa5bdf2596382750aacc385edeceaae39266656d3bbbb603224): error adding pod openshift-kube-scheduler_installer-5-master-0 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a17c4d8c7eb07aa5bdf2596382750aacc385edeceaae39266656d3bbbb603224" Netns:"/var/run/netns/d5c17055-bd7a-44c3-91f8-32894633452e" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-scheduler;K8S_POD_NAME=installer-5-master-0;K8S_POD_INFRA_CONTAINER_ID=a17c4d8c7eb07aa5bdf2596382750aacc385edeceaae39266656d3bbbb603224;K8S_POD_UID=21dd42b1-2628-4a24-97e7-6759888ed316" Path:"" ERRORED: error configuring pod [openshift-kube-scheduler/installer-5-master-0] networking: Multus: [openshift-kube-scheduler/installer-5-master-0/21dd42b1-2628-4a24-97e7-6759888ed316]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-5-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod installer-5-master-0 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/installer-5-master-0?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 08 00:24:36.895771 master-0 kubenswrapper[7479]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 08 00:24:36.895771 master-0 kubenswrapper[7479]: > pod="openshift-kube-scheduler/installer-5-master-0" Mar 08 00:24:36.895771 master-0 
kubenswrapper[7479]: E0308 00:24:36.895219 7479 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 08 00:24:36.895771 master-0 kubenswrapper[7479]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-5-master-0_openshift-kube-scheduler_21dd42b1-2628-4a24-97e7-6759888ed316_0(a17c4d8c7eb07aa5bdf2596382750aacc385edeceaae39266656d3bbbb603224): error adding pod openshift-kube-scheduler_installer-5-master-0 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a17c4d8c7eb07aa5bdf2596382750aacc385edeceaae39266656d3bbbb603224" Netns:"/var/run/netns/d5c17055-bd7a-44c3-91f8-32894633452e" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-scheduler;K8S_POD_NAME=installer-5-master-0;K8S_POD_INFRA_CONTAINER_ID=a17c4d8c7eb07aa5bdf2596382750aacc385edeceaae39266656d3bbbb603224;K8S_POD_UID=21dd42b1-2628-4a24-97e7-6759888ed316" Path:"" ERRORED: error configuring pod [openshift-kube-scheduler/installer-5-master-0] networking: Multus: [openshift-kube-scheduler/installer-5-master-0/21dd42b1-2628-4a24-97e7-6759888ed316]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-5-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod installer-5-master-0 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/installer-5-master-0?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 08 00:24:36.895771 master-0 kubenswrapper[7479]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 08 00:24:36.895771 master-0 kubenswrapper[7479]: > pod="openshift-kube-scheduler/installer-5-master-0" Mar 08 00:24:36.895771 master-0 kubenswrapper[7479]: E0308 00:24:36.895314 7479 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"installer-5-master-0_openshift-kube-scheduler(21dd42b1-2628-4a24-97e7-6759888ed316)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"installer-5-master-0_openshift-kube-scheduler(21dd42b1-2628-4a24-97e7-6759888ed316)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-5-master-0_openshift-kube-scheduler_21dd42b1-2628-4a24-97e7-6759888ed316_0(a17c4d8c7eb07aa5bdf2596382750aacc385edeceaae39266656d3bbbb603224): error adding pod openshift-kube-scheduler_installer-5-master-0 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"a17c4d8c7eb07aa5bdf2596382750aacc385edeceaae39266656d3bbbb603224\\\" Netns:\\\"/var/run/netns/d5c17055-bd7a-44c3-91f8-32894633452e\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-scheduler;K8S_POD_NAME=installer-5-master-0;K8S_POD_INFRA_CONTAINER_ID=a17c4d8c7eb07aa5bdf2596382750aacc385edeceaae39266656d3bbbb603224;K8S_POD_UID=21dd42b1-2628-4a24-97e7-6759888ed316\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-kube-scheduler/installer-5-master-0] networking: Multus: [openshift-kube-scheduler/installer-5-master-0/21dd42b1-2628-4a24-97e7-6759888ed316]: error setting the networks 
status: SetPodNetworkStatusAnnotation: failed to update the pod installer-5-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod installer-5-master-0 in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/installer-5-master-0?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-kube-scheduler/installer-5-master-0" podUID="21dd42b1-2628-4a24-97e7-6759888ed316" Mar 08 00:24:36.974431 master-0 kubenswrapper[7479]: I0308 00:24:36.974332 7479 patch_prober.go:28] interesting pod/marketplace-operator-64bf9778cb-mgb5v container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.13:8080/healthz\": dial tcp 10.128.0.13:8080: connect: connection refused" start-of-body= Mar 08 00:24:36.974431 master-0 kubenswrapper[7479]: I0308 00:24:36.974395 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" podUID="5cf5a2ef-2498-40a0-a189-0753076fd3b6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.13:8080/healthz\": dial tcp 10.128.0.13:8080: connect: connection refused" Mar 08 00:24:36.974954 master-0 kubenswrapper[7479]: I0308 00:24:36.974547 7479 patch_prober.go:28] interesting pod/marketplace-operator-64bf9778cb-mgb5v container/marketplace-operator 
namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.128.0.13:8080/healthz\": dial tcp 10.128.0.13:8080: connect: connection refused" start-of-body= Mar 08 00:24:36.974954 master-0 kubenswrapper[7479]: I0308 00:24:36.974596 7479 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" podUID="5cf5a2ef-2498-40a0-a189-0753076fd3b6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.13:8080/healthz\": dial tcp 10.128.0.13:8080: connect: connection refused" Mar 08 00:24:38.002539 master-0 kubenswrapper[7479]: I0308 00:24:38.002484 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 00:24:38.003158 master-0 kubenswrapper[7479]: I0308 00:24:38.002550 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 00:24:38.621484 master-0 kubenswrapper[7479]: I0308 00:24:38.621367 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-vd52m_e97435ee-522e-427d-9efc-40bc3d2b0d02/snapshot-controller/0.log" Mar 08 00:24:38.621484 master-0 kubenswrapper[7479]: I0308 00:24:38.621411 7479 generic.go:334] "Generic (PLEG): container finished" podID="e97435ee-522e-427d-9efc-40bc3d2b0d02" 
containerID="8ec59eab2cc718c53b1b2d5e8d3d1b9c2d696a3beee682fea6bed575feafd5ae" exitCode=1 Mar 08 00:24:39.346185 master-0 kubenswrapper[7479]: I0308 00:24:39.346071 7479 patch_prober.go:28] interesting pod/apiserver-85cb8cb9bb-bmx44 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 08 00:24:39.346185 master-0 kubenswrapper[7479]: [+]log ok Mar 08 00:24:39.346185 master-0 kubenswrapper[7479]: [-]etcd failed: reason withheld Mar 08 00:24:39.346185 master-0 kubenswrapper[7479]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 08 00:24:39.346185 master-0 kubenswrapper[7479]: [+]poststarthook/generic-apiserver-start-informers ok Mar 08 00:24:39.346185 master-0 kubenswrapper[7479]: [+]poststarthook/max-in-flight-filter ok Mar 08 00:24:39.346185 master-0 kubenswrapper[7479]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 08 00:24:39.346185 master-0 kubenswrapper[7479]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 08 00:24:39.346185 master-0 kubenswrapper[7479]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 08 00:24:39.346185 master-0 kubenswrapper[7479]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Mar 08 00:24:39.346185 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectcache ok Mar 08 00:24:39.346185 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 08 00:24:39.346185 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-startinformers ok Mar 08 00:24:39.346185 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 08 00:24:39.346185 master-0 kubenswrapper[7479]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 08 00:24:39.346185 master-0 kubenswrapper[7479]: livez check failed Mar 08 00:24:39.348283 master-0 
kubenswrapper[7479]: I0308 00:24:39.346182 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" podUID="1751db13-b792-43e2-8459-d1d4a0164dfb" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:24:41.002776 master-0 kubenswrapper[7479]: I0308 00:24:41.002681 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 00:24:41.002776 master-0 kubenswrapper[7479]: I0308 00:24:41.002757 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 00:24:41.921899 master-0 kubenswrapper[7479]: E0308 00:24:41.921829 7479 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Mar 08 00:24:41.922100 master-0 kubenswrapper[7479]: E0308 00:24:41.921984 7479 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.013s" Mar 08 00:24:41.922771 master-0 kubenswrapper[7479]: I0308 00:24:41.922736 7479 scope.go:117] "RemoveContainer" containerID="ad08463ed7ab691e56f4dfe0288960876b6a58370e90937b6cc2efea5e0f4441" Mar 08 00:24:41.922844 master-0 kubenswrapper[7479]: I0308 00:24:41.922793 7479 scope.go:117] "RemoveContainer" 
containerID="08c17f5be4c6cd32671af564801dff89f871520231b6fd523ba49a05d5c50b3c" Mar 08 00:24:41.922907 master-0 kubenswrapper[7479]: I0308 00:24:41.922793 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" Mar 08 00:24:41.922907 master-0 kubenswrapper[7479]: I0308 00:24:41.922878 7479 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4" Mar 08 00:24:41.923002 master-0 kubenswrapper[7479]: I0308 00:24:41.922894 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-bh886" event={"ID":"e76bc134-2a88-4f92-9aa7-f6854941b98f","Type":"ContainerDied","Data":"ad08463ed7ab691e56f4dfe0288960876b6a58370e90937b6cc2efea5e0f4441"} Mar 08 00:24:41.923002 master-0 kubenswrapper[7479]: I0308 00:24:41.922977 7479 scope.go:117] "RemoveContainer" containerID="322f3ad793e93ca7f32b8558fd2506b5cf8b8be4b12165040ac02501040fbe03" Mar 08 00:24:41.923932 master-0 kubenswrapper[7479]: I0308 00:24:41.923449 7479 scope.go:117] "RemoveContainer" containerID="5aac2b21c945fd8c5f04ccb41b60633f9bb7e3c9d3e901a7648d97792b4bc569" Mar 08 00:24:41.923932 master-0 kubenswrapper[7479]: I0308 00:24:41.923466 7479 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="openshift-config-operator" containerStatusID={"Type":"cri-o","ID":"3749dd604b27f2dbdd4183a6654ab983fd1314c356306c8758ffbb94949f42f3"} pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" containerMessage="Container openshift-config-operator failed liveness probe, will be restarted" Mar 08 00:24:41.923932 master-0 kubenswrapper[7479]: I0308 00:24:41.923520 7479 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" 
podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" containerID="cri-o://3749dd604b27f2dbdd4183a6654ab983fd1314c356306c8758ffbb94949f42f3" gracePeriod=30 Mar 08 00:24:41.923932 master-0 kubenswrapper[7479]: I0308 00:24:41.923484 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" Mar 08 00:24:41.923932 master-0 kubenswrapper[7479]: I0308 00:24:41.923615 7479 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" Mar 08 00:24:41.923932 master-0 kubenswrapper[7479]: I0308 00:24:41.923630 7479 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk" Mar 08 00:24:41.923932 master-0 kubenswrapper[7479]: I0308 00:24:41.923644 7479 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" Mar 08 00:24:41.924466 master-0 kubenswrapper[7479]: I0308 00:24:41.924059 7479 scope.go:117] "RemoveContainer" containerID="04817105ab63ed3d02352e545fc19277b913254d7947d42a71d84846748fcfc3" Mar 08 00:24:41.928169 master-0 kubenswrapper[7479]: I0308 00:24:41.928126 7479 scope.go:117] "RemoveContainer" containerID="8c7c5dbb2587ce1659649afce2da4e5a5c04c0ab193dda1e438bb8ca083926e4" Mar 08 00:24:41.928262 master-0 kubenswrapper[7479]: I0308 00:24:41.928174 7479 scope.go:117] "RemoveContainer" containerID="8ec59eab2cc718c53b1b2d5e8d3d1b9c2d696a3beee682fea6bed575feafd5ae" Mar 08 00:24:41.928486 master-0 kubenswrapper[7479]: I0308 00:24:41.928449 7479 scope.go:117] "RemoveContainer" containerID="1a894ff93f34b75d7c364cee700320b9938207036c1164fc914fd25a46ac6869" Mar 08 00:24:41.928822 master-0 kubenswrapper[7479]: I0308 00:24:41.928777 7479 scope.go:117] "RemoveContainer" 
containerID="5883c7f053a567c57162616ec25d9b4c38f468aaa6a93afc0931684514320848" Mar 08 00:24:41.928879 master-0 kubenswrapper[7479]: I0308 00:24:41.928821 7479 scope.go:117] "RemoveContainer" containerID="00aa20318a390dc28a1b90d9dfa760b9b264408ce2a090ec0af81099188274b0" Mar 08 00:24:41.929264 master-0 kubenswrapper[7479]: I0308 00:24:41.929225 7479 scope.go:117] "RemoveContainer" containerID="459a84ed9e1a3d8f522635c123baf95a666dd88b0c40648d94dbbfdfad737d00" Mar 08 00:24:41.929335 master-0 kubenswrapper[7479]: I0308 00:24:41.929263 7479 scope.go:117] "RemoveContainer" containerID="f272f0c8300d99d74de3b6533eb08fc6f13727844131b874ef0ec089cec086c7" Mar 08 00:24:41.930144 master-0 kubenswrapper[7479]: I0308 00:24:41.930104 7479 scope.go:117] "RemoveContainer" containerID="c9e6fa5d3ccf4015c27e14ffdb2578ad6435947b5bdd16e602ffdf86284246dc" Mar 08 00:24:41.936828 master-0 kubenswrapper[7479]: I0308 00:24:41.936561 7479 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Mar 08 00:24:41.941277 master-0 kubenswrapper[7479]: I0308 00:24:41.940956 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": read tcp 10.128.0.2:50520->10.128.0.26:8443: read: connection reset by peer" start-of-body= Mar 08 00:24:41.941277 master-0 kubenswrapper[7479]: I0308 00:24:41.941003 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": read tcp 10.128.0.2:50520->10.128.0.26:8443: read: connection reset by peer" Mar 08 00:24:42.643223 master-0 kubenswrapper[7479]: I0308 00:24:42.643170 7479 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-6598bfb6c4-7nhvs_1bb8fea7-71ca-43a3-839d-9c1459bf8dfa/manager/0.log" Mar 08 00:24:42.644750 master-0 kubenswrapper[7479]: I0308 00:24:42.644729 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7f8b8b6f4c-w2q2q_d01c21a1-6c2c-49a7-9d85-254662851838/manager/0.log" Mar 08 00:24:42.647164 master-0 kubenswrapper[7479]: I0308 00:24:42.647144 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8565d84698-49hzm_ef0a3c84-98bb-4915-9010-d66fcbeafe09/openshift-controller-manager-operator/1.log" Mar 08 00:24:42.647735 master-0 kubenswrapper[7479]: I0308 00:24:42.647704 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8565d84698-49hzm_ef0a3c84-98bb-4915-9010-d66fcbeafe09/openshift-controller-manager-operator/0.log" Mar 08 00:24:42.651136 master-0 kubenswrapper[7479]: I0308 00:24:42.651106 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-m7549_af391724-079a-4bac-a89e-978ffd471763/approver/0.log" Mar 08 00:24:42.658090 master-0 kubenswrapper[7479]: I0308 00:24:42.658058 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-vd52m_e97435ee-522e-427d-9efc-40bc3d2b0d02/snapshot-controller/0.log" Mar 08 00:24:42.664810 master-0 kubenswrapper[7479]: I0308 00:24:42.664786 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-64488f9d78-vnl28_2b1a69b5-c946-495d-ae02-c56f788279e8/openshift-config-operator/1.log" Mar 08 00:24:42.665400 master-0 kubenswrapper[7479]: I0308 00:24:42.665374 7479 generic.go:334] "Generic (PLEG): container finished" 
podID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerID="3749dd604b27f2dbdd4183a6654ab983fd1314c356306c8758ffbb94949f42f3" exitCode=255 Mar 08 00:24:45.057698 master-0 kubenswrapper[7479]: E0308 00:24:45.057526 7479 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{certified-operators-lqc4n.189ab5dba360596e openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:certified-operators-lqc4n,UID:8b94e1ca-5aef-49ae-928e-29cc0ce81d61,APIVersion:v1,ResourceVersion:7235,FieldPath:spec.initContainers{extract-content},},Reason:Pulled,Message:Successfully pulled image \"registry.redhat.io/redhat/certified-operator-index:v4.18\" in 23.779s (23.779s including waiting). Image size: 1272201949 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:22:38.267177326 +0000 UTC m=+74.580086243,LastTimestamp:2026-03-08 00:22:38.267177326 +0000 UTC m=+74.580086243,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:24:45.834463 master-0 kubenswrapper[7479]: E0308 00:24:45.834163 7479 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:24:35Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:24:35Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:24:35Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:24:35Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:ae042a5d32eb2f18d537f2068849e665b55df7d8360daedaaeea98bd2a79e769\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:d077bbabe6cb885ed229119008480493e8364e4bfddaa00b099f68c52b016e6b\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1733328350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f121f9d021a9843b9458f9f222c40f292f2c21dcfcf00f05daacaca8a949c0\\\"],\\\"sizeBytes\\\":1637445817},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:063b8972231e65eb43f6545ba37804f68138dc54d97b91a652a1c5bc7dc76aa5\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:cf682d23b2857e455609879a0867d171a221c18e2cec995dd79570b77c5a4705\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1272201949},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:381e96959e3c3b08a3e2715e6024697ae14af31bd0378b49f583e984b3b9a192\\\"],\\\"sizeBytes\\\":1238047254},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:e0c034ae18daa01af8d073f8cc24ae4af87883c664304910eab1167fdfd60c0b\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:ef0c6b9e405f7a452211e063ce07ded04ccbe38b53860bfd71b5a7cd5072830a\\\",\\\"registry.redhat.io/redhat/redhat-marketpl
ace-index:v4.18\\\"],\\\"sizeBytes\\\":1229556414},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:79984dfbdf9aeae3985c7fd7515e12328775c0e7fc4782929d0998f4dd2a87c6\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:7be89499615ec913d0fe40ca89682080a3f1181a066dbc501c877cc7ccbcc9ae\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1220167376},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c9330c756dd6ab107e9a4b671bc52742c90d5be11a8380d8b710e2bd4e0ed43c\\\"],\\\"sizeBytes\\\":992610645},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\\\"],\\\"sizeBytes\\\":943837171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff40e33e63d6c1f4e4393d5506e38def25ba20582d980fec8b81f81c867ceeec\\\"],\\\"sizeBytes\\\":918278686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e207c762b7802ee0e54507d21ed1f25b19eddc511a4b824934c16c163193be6a\\\"],\\\"sizeBytes\\\":876146500},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:41dbd66e9a886c1fd7a99752f358c6125a209e83c0dd37b35730baae58d82ee8\\\"],\\\"sizeBytes\\\":862633255},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bfcd8017eede3fb66fa3f5b47c27508b787d38455689154461f0e6a5dc303ff\\\"],\\\"sizeBytes\\\":772939850},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9c946fdc5a4cd16ff998c17844780e7efc38f7f38b97a8a40d75cd77b318ddef\\\"],\\\"sizeBytes\\\":687947017},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c03cb25dc6f6a865529ebc979e8d7d08492b28fd3fb93beddf30e1cb06f1245\\\"],\\\"sizeBytes\\\":683169303},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3f34dc492c80a3dee4643cc2291044750ac51e6e919b973de8723fa8b70bde70\\\"],\\\"sizeBytes\\\":677929075},{\\\"names\\\":[\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3\\\"],\\\"sizeBytes\\\":621647686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1ec9d3dbcc6f9817c0f6d09f64c0d98c91b03afbb1fcb3c1e1718aca900754b\\\"],\\\"sizeBytes\\\":589379637},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1575be013a898f153cbf012aeaf28ce720022f934dc05bdffbe479e30999d460\\\"],\\\"sizeBytes\\\":582153879},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eb82e437a701ce83b70e56be8477d987da67578714dda3d9fa6628804b1b56f5\\\"],\\\"sizeBytes\\\":558210153},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:28f33d62fd0b94c5ea0ebcd7a4216848c8dd671a38d901ce98f4c399b700e1c7\\\"],\\\"sizeBytes\\\":548751793},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b\\\"],\\\"sizeBytes\\\":529324693},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ac6f0695d3386e6d601f4ae507940981352fa3ad884b0fed6fb25698c5e6f916\\\"],\\\"sizeBytes\\\":528946249},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6088910bdc1583b275fab261e3234c0b63b4cc16d01bcea697b6a7f6db13bdf3\\\"],\\\"sizeBytes\\\":518384455},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:14bd3c04daa885009785d48f4973e2890751a7ec116cc14d17627245cda54d7b\\\"],\\\"sizeBytes\\\":517997625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\\\"],\\\"sizeBytes\\\":514980169},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd818e37e1f9dbe5393c557b89e81010d68171408e0e4157a3d92ae0ca1c953\\\"],\\\"sizeBytes\\\":513220825},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d601c8437b4d8bbe2da0f3b08f1bd8693f5a4ef6d83537
7ec029c79d9dca5dab\\\"],\\\"sizeBytes\\\":512273539},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b47d2b146e833bc1612a652136f43afcf1ba30f32cbd0a2f06ca9fc80d969f0\\\"],\\\"sizeBytes\\\":511226810},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:834063dd26fb3d2489e193489198a0d5fbe9c775a0e30173e5fcef6994fbf0f6\\\"],\\\"sizeBytes\\\":511164376},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee46e13e26156c904e5784e2d64511021ed0974a169ccd6476b05bff1c44ec56\\\"],\\\"sizeBytes\\\":508888174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7220d16ea511c0f0410cf45db45aaafcc64847c9cb5732ad1eff39ceb482cdba\\\"],\\\"sizeBytes\\\":508544235},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:526c5c02a8fa86a2fa83a7087d4a5c4b1c4072c0f3906163494cc3b3c1295e9b\\\"],\\\"sizeBytes\\\":507967997},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4010a8f9d932615336227e2fd43325d4fa9025dca4bebe032106efea733fcfc3\\\"],\\\"sizeBytes\\\":506479655},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76b719f5bd541eb1a8bae124d650896b533e7bc3107be536e598b3ab4e135282\\\"],\\\"sizeBytes\\\":506394574},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5de69354d08184ecd6144facc1461777674674e8304971216d4cf1a5025472b9\\\"],\\\"sizeBytes\\\":505344964},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a324f47cf789c0480fa4bcb0812152abc3cd844318bab193108fe4349eed609\\\"],\\\"sizeBytes\\\":505242594},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d11f13e867f4df046ca6789bb7273da5d0c08895b3dea00949c8a5458f9e22f9\\\"],\\\"sizeBytes\\\":504623546},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76bdc35338c4d0f5e5b9448fb73e3578656f908a962286692e12a0372ec721d5\\\"],\\\"sizeBytes\\\":495994161},{\\\"names\\\":[\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:ff2db11ce277288befab25ddb86177e832842d2edb5607a2da8f252a030e1cfc\\\"],\\\"sizeBytes\\\":495064829},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2fe5144b1f72bdcf5d5a52130f02ed86fbec3875cc4ac108ead00eaac1659e06\\\"],\\\"sizeBytes\\\":487090672},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4a4c3e6ca0cd26f7eb5270cfafbcf423cf2986d152bf5b9fc6469d40599e104e\\\"],\\\"sizeBytes\\\":484450382},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c54c3f7cffe057ae0bdf26163d5e46744685083ae16fc97112e32beacd2d8955\\\"],\\\"sizeBytes\\\":484175664},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9b8bc43bac294be3c7669cde049e388ad9d8751242051ba40f83e1c401eceda\\\"],\\\"sizeBytes\\\":468263999},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\\\"],\\\"sizeBytes\\\":465086330},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a85dab5856916220df6f05ce9d6aa10cd4fa0234093b55355246690bba05ad1\\\"],\\\"sizeBytes\\\":463700811},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b714a7ada1e295b599b432f32e1fd5b74c8cdbe6fe51e95306322b25cb873914\\\"],\\\"sizeBytes\\\":458126424},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5230462066ab36e3025524e948dd33fa6f51ee29a4f91fa469bfc268568b5fd9\\\"],\\\"sizeBytes\\\":456575686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:89cb093f319eaa04acfe9431b8697bffbc71ab670546f7ed257daa332165c626\\\"],\\\"sizeBytes\\\":448828105},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c680fcc9fd6b66099ca4c0f512521b6f8e0bc29273ddb9405730bc54bacb6783\\\"],\\\"sizeBytes\\\":448041621},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cf9670d0f269f8d49fd9ef4981999be195f6624a4146aa93d9201eb8acc81053\\\
"],\\\"sizeBytes\\\":443271011}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 00:24:46.002834 master-0 kubenswrapper[7479]: I0308 00:24:46.002741 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body= Mar 08 00:24:46.003147 master-0 kubenswrapper[7479]: I0308 00:24:46.002873 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" Mar 08 00:24:46.723279 master-0 kubenswrapper[7479]: I0308 00:24:46.723177 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body= Mar 08 00:24:46.724363 master-0 kubenswrapper[7479]: I0308 00:24:46.723292 7479 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" Mar 08 00:24:48.354586 master-0 kubenswrapper[7479]: I0308 00:24:48.354489 7479 patch_prober.go:28] interesting pod/apiserver-85cb8cb9bb-bmx44 
container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 08 00:24:48.354586 master-0 kubenswrapper[7479]: [+]log ok Mar 08 00:24:48.354586 master-0 kubenswrapper[7479]: [-]etcd failed: reason withheld Mar 08 00:24:48.354586 master-0 kubenswrapper[7479]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 08 00:24:48.354586 master-0 kubenswrapper[7479]: [+]poststarthook/generic-apiserver-start-informers ok Mar 08 00:24:48.354586 master-0 kubenswrapper[7479]: [+]poststarthook/max-in-flight-filter ok Mar 08 00:24:48.354586 master-0 kubenswrapper[7479]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 08 00:24:48.354586 master-0 kubenswrapper[7479]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 08 00:24:48.354586 master-0 kubenswrapper[7479]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 08 00:24:48.354586 master-0 kubenswrapper[7479]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Mar 08 00:24:48.354586 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectcache ok Mar 08 00:24:48.354586 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 08 00:24:48.354586 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-startinformers ok Mar 08 00:24:48.354586 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 08 00:24:48.354586 master-0 kubenswrapper[7479]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 08 00:24:48.354586 master-0 kubenswrapper[7479]: livez check failed Mar 08 00:24:48.356071 master-0 kubenswrapper[7479]: I0308 00:24:48.354607 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" podUID="1751db13-b792-43e2-8459-d1d4a0164dfb" containerName="openshift-apiserver" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:24:49.002981 master-0 kubenswrapper[7479]: I0308 00:24:49.002892 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body= Mar 08 00:24:49.002981 master-0 kubenswrapper[7479]: I0308 00:24:49.002959 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" Mar 08 00:24:49.722765 master-0 kubenswrapper[7479]: I0308 00:24:49.722685 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body= Mar 08 00:24:49.723373 master-0 kubenswrapper[7479]: I0308 00:24:49.722768 7479 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" Mar 08 00:24:50.536359 master-0 kubenswrapper[7479]: E0308 00:24:50.536290 7479 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled 
(Client.Timeout exceeded while awaiting headers)" interval="7s"
Mar 08 00:24:52.001941 master-0 kubenswrapper[7479]: I0308 00:24:52.001879 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body=
Mar 08 00:24:52.002545 master-0 kubenswrapper[7479]: I0308 00:24:52.001942 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused"
Mar 08 00:24:52.722365 master-0 kubenswrapper[7479]: I0308 00:24:52.722281 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body=
Mar 08 00:24:52.722666 master-0 kubenswrapper[7479]: I0308 00:24:52.722354 7479 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused"
Mar 08 00:24:54.738648 master-0 kubenswrapper[7479]: I0308 00:24:54.738537 7479 generic.go:334] "Generic (PLEG): container finished" podID="3fee96d7-75a7-46e4-9707-7bd292f10b84" containerID="52998e126ba781dde5afc9f3fdb3cf64a817b4497f29c74abbb0c4aa09aa4379" exitCode=0
Mar 08 00:24:54.937082 master-0 kubenswrapper[7479]: E0308 00:24:54.937011 7479 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0"
Mar 08 00:24:55.002130 master-0 kubenswrapper[7479]: I0308 00:24:55.002073 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body=
Mar 08 00:24:55.002332 master-0 kubenswrapper[7479]: I0308 00:24:55.002133 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused"
Mar 08 00:24:55.835363 master-0 kubenswrapper[7479]: E0308 00:24:55.835258 7479 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 08 00:24:57.361341 master-0 kubenswrapper[7479]: I0308 00:24:57.361252 7479 patch_prober.go:28] interesting pod/apiserver-85cb8cb9bb-bmx44 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Mar 08 00:24:57.361341 master-0 kubenswrapper[7479]: [+]log ok
Mar 08 00:24:57.361341 master-0 kubenswrapper[7479]: [-]etcd failed: reason withheld
Mar 08 00:24:57.361341 master-0 kubenswrapper[7479]: [+]poststarthook/start-apiserver-admission-initializer ok
Mar 08 00:24:57.361341 master-0 kubenswrapper[7479]: [+]poststarthook/generic-apiserver-start-informers ok
Mar 08 00:24:57.361341 master-0 kubenswrapper[7479]: [+]poststarthook/max-in-flight-filter ok
Mar 08 00:24:57.361341 master-0 kubenswrapper[7479]: [+]poststarthook/storage-object-count-tracker-hook ok
Mar 08 00:24:57.361341 master-0 kubenswrapper[7479]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Mar 08 00:24:57.361341 master-0 kubenswrapper[7479]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Mar 08 00:24:57.361341 master-0 kubenswrapper[7479]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok
Mar 08 00:24:57.361341 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectcache ok
Mar 08 00:24:57.361341 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Mar 08 00:24:57.361341 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-startinformers ok
Mar 08 00:24:57.361341 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-restmapperupdater ok
Mar 08 00:24:57.361341 master-0 kubenswrapper[7479]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Mar 08 00:24:57.361341 master-0 kubenswrapper[7479]: livez check failed
Mar 08 00:24:57.362560 master-0 kubenswrapper[7479]: I0308 00:24:57.361356 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" podUID="1751db13-b792-43e2-8459-d1d4a0164dfb" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:24:58.003032 master-0 kubenswrapper[7479]: I0308 00:24:58.002970 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body=
Mar 08 00:24:58.003032 master-0 kubenswrapper[7479]: I0308 00:24:58.003029 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused"
Mar 08 00:25:01.002156 master-0 kubenswrapper[7479]: I0308 00:25:01.002078 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body=
Mar 08 00:25:01.002156 master-0 kubenswrapper[7479]: I0308 00:25:01.002150 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused"
Mar 08 00:25:01.800648 master-0 kubenswrapper[7479]: I0308 00:25:01.800587 7479 generic.go:334] "Generic (PLEG): container finished" podID="cbcb0196-be5c-44a4-9749-5df9fbeaa718" containerID="92c985a5a70112d59265249efbf6fce7869432625027fbf9a567a14e08ff9807" exitCode=0
Mar 08 00:25:03.053005 master-0 kubenswrapper[7479]: I0308 00:25:03.052924 7479 status_manager.go:851] "Failed to get status for pod" podUID="4217b755-ca87-45cf-9e52-7b2681660f41" pod="openshift-etcd/installer-1-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods installer-1-master-0)"
Mar 08 00:25:03.145801 master-0 kubenswrapper[7479]: I0308 00:25:03.145740 7479 patch_prober.go:28] interesting pod/controller-manager-5b4bdf67b6-8rdjs container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.128.0.50:8443/healthz\": dial tcp 10.128.0.50:8443: connect: connection refused" start-of-body=
Mar 08 00:25:03.145801 master-0 kubenswrapper[7479]: I0308 00:25:03.145784 7479 patch_prober.go:28] interesting pod/controller-manager-5b4bdf67b6-8rdjs container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.50:8443/healthz\": dial tcp 10.128.0.50:8443: connect: connection refused" start-of-body=
Mar 08 00:25:03.146172 master-0 kubenswrapper[7479]: I0308 00:25:03.145801 7479 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" podUID="cbcb0196-be5c-44a4-9749-5df9fbeaa718" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.50:8443/healthz\": dial tcp 10.128.0.50:8443: connect: connection refused"
Mar 08 00:25:03.146172 master-0 kubenswrapper[7479]: I0308 00:25:03.145850 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" podUID="cbcb0196-be5c-44a4-9749-5df9fbeaa718" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.50:8443/healthz\": dial tcp 10.128.0.50:8443: connect: connection refused"
Mar 08 00:25:04.002773 master-0 kubenswrapper[7479]: I0308 00:25:04.002689 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body=
Mar 08 00:25:04.003037 master-0 kubenswrapper[7479]: I0308 00:25:04.002794 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused"
Mar 08 00:25:04.816320 master-0 kubenswrapper[7479]: I0308 00:25:04.816245 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5884b9cd56-27phk_2fbed2b8-f4c5-4f52-b29c-1907a2034f6f/etcd-operator/1.log"
Mar 08 00:25:04.817333 master-0 kubenswrapper[7479]: I0308 00:25:04.817263 7479 generic.go:334] "Generic (PLEG): container finished" podID="2fbed2b8-f4c5-4f52-b29c-1907a2034f6f" containerID="d2e8edf542df46c295f392d43d676bb039cfcddee9661264a6bee3005ba21922" exitCode=255
Mar 08 00:25:04.819665 master-0 kubenswrapper[7479]: I0308 00:25:04.819629 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7c649bf6d4-st2sr_ec2d22f2-c260-42a6-a9da-ee0f44f42303/network-operator/1.log"
Mar 08 00:25:04.820167 master-0 kubenswrapper[7479]: I0308 00:25:04.820146 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7c649bf6d4-st2sr_ec2d22f2-c260-42a6-a9da-ee0f44f42303/network-operator/0.log"
Mar 08 00:25:04.820252 master-0 kubenswrapper[7479]: I0308 00:25:04.820183 7479 generic.go:334] "Generic (PLEG): container finished" podID="ec2d22f2-c260-42a6-a9da-ee0f44f42303" containerID="25ae9c9f82c094082383cc214e49a9f1d3d4d26dc8ffcbe8cff3194531736ede" exitCode=255
Mar 08 00:25:05.836432 master-0 kubenswrapper[7479]: E0308 00:25:05.836367 7479 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 08 00:25:06.370785 master-0 kubenswrapper[7479]: I0308 00:25:06.370737 7479 patch_prober.go:28] interesting pod/apiserver-85cb8cb9bb-bmx44 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Mar 08 00:25:06.370785 master-0 kubenswrapper[7479]: [+]log ok
Mar 08 00:25:06.370785 master-0 kubenswrapper[7479]: [-]etcd failed: reason withheld
Mar 08 00:25:06.370785 master-0 kubenswrapper[7479]: [+]poststarthook/start-apiserver-admission-initializer ok
Mar 08 00:25:06.370785 master-0 kubenswrapper[7479]: [+]poststarthook/generic-apiserver-start-informers ok
Mar 08 00:25:06.370785 master-0 kubenswrapper[7479]: [+]poststarthook/max-in-flight-filter ok
Mar 08 00:25:06.370785 master-0 kubenswrapper[7479]: [+]poststarthook/storage-object-count-tracker-hook ok
Mar 08 00:25:06.370785 master-0 kubenswrapper[7479]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Mar 08 00:25:06.370785 master-0 kubenswrapper[7479]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Mar 08 00:25:06.370785 master-0 kubenswrapper[7479]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok
Mar 08 00:25:06.370785 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectcache ok
Mar 08 00:25:06.370785 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Mar 08 00:25:06.370785 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-startinformers ok
Mar 08 00:25:06.370785 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-restmapperupdater ok
Mar 08 00:25:06.370785 master-0 kubenswrapper[7479]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Mar 08 00:25:06.370785 master-0 kubenswrapper[7479]: livez check failed
Mar 08 00:25:06.372303 master-0 kubenswrapper[7479]: I0308 00:25:06.370793 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" podUID="1751db13-b792-43e2-8459-d1d4a0164dfb" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:25:07.002320 master-0 kubenswrapper[7479]: I0308 00:25:07.002281 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body=
Mar 08 00:25:07.002821 master-0 kubenswrapper[7479]: I0308 00:25:07.002794 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused"
Mar 08 00:25:07.538253 master-0 kubenswrapper[7479]: E0308 00:25:07.538092 7479 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s"
Mar 08 00:25:10.002366 master-0 kubenswrapper[7479]: I0308 00:25:10.002295 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body=
Mar 08 00:25:10.002366 master-0 kubenswrapper[7479]: I0308 00:25:10.002359 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused"
Mar 08 00:25:12.865561 master-0 kubenswrapper[7479]: I0308 00:25:12.865489 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-vd52m_e97435ee-522e-427d-9efc-40bc3d2b0d02/snapshot-controller/1.log"
Mar 08 00:25:12.867078 master-0 kubenswrapper[7479]: I0308 00:25:12.867021 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-vd52m_e97435ee-522e-427d-9efc-40bc3d2b0d02/snapshot-controller/0.log"
Mar 08 00:25:12.867233 master-0 kubenswrapper[7479]: I0308 00:25:12.867092 7479 generic.go:334] "Generic (PLEG): container finished" podID="e97435ee-522e-427d-9efc-40bc3d2b0d02" containerID="f8579510b3d4eb37fa166a47f1175d9203069f85aea52cc88554ccc7a9077266" exitCode=1
Mar 08 00:25:13.002658 master-0 kubenswrapper[7479]: I0308 00:25:13.002507 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body=
Mar 08 00:25:13.003044 master-0 kubenswrapper[7479]: I0308 00:25:13.002685 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused"
Mar 08 00:25:13.145895 master-0 kubenswrapper[7479]: I0308 00:25:13.145707 7479 patch_prober.go:28] interesting pod/controller-manager-5b4bdf67b6-8rdjs container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.50:8443/healthz\": dial tcp 10.128.0.50:8443: connect: connection refused" start-of-body=
Mar 08 00:25:13.145895 master-0 kubenswrapper[7479]: I0308 00:25:13.145772 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" podUID="cbcb0196-be5c-44a4-9749-5df9fbeaa718" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.50:8443/healthz\": dial tcp 10.128.0.50:8443: connect: connection refused"
Mar 08 00:25:13.146324 master-0 kubenswrapper[7479]: I0308 00:25:13.146157 7479 patch_prober.go:28] interesting pod/controller-manager-5b4bdf67b6-8rdjs container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.128.0.50:8443/healthz\": dial tcp 10.128.0.50:8443: connect: connection refused" start-of-body=
Mar 08 00:25:13.146324 master-0 kubenswrapper[7479]: I0308 00:25:13.146232 7479 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" podUID="cbcb0196-be5c-44a4-9749-5df9fbeaa718" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.50:8443/healthz\": dial tcp 10.128.0.50:8443: connect: connection refused"
Mar 08 00:25:15.375320 master-0 kubenswrapper[7479]: I0308 00:25:15.375186 7479 patch_prober.go:28] interesting pod/apiserver-85cb8cb9bb-bmx44 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Mar 08 00:25:15.375320 master-0 kubenswrapper[7479]: [+]log ok
Mar 08 00:25:15.375320 master-0 kubenswrapper[7479]: [-]etcd failed: reason withheld
Mar 08 00:25:15.375320 master-0 kubenswrapper[7479]: [+]poststarthook/start-apiserver-admission-initializer ok
Mar 08 00:25:15.375320 master-0 kubenswrapper[7479]: [+]poststarthook/generic-apiserver-start-informers ok
Mar 08 00:25:15.375320 master-0 kubenswrapper[7479]: [+]poststarthook/max-in-flight-filter ok
Mar 08 00:25:15.375320 master-0 kubenswrapper[7479]: [+]poststarthook/storage-object-count-tracker-hook ok
Mar 08 00:25:15.375320 master-0 kubenswrapper[7479]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Mar 08 00:25:15.375320 master-0 kubenswrapper[7479]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Mar 08 00:25:15.375320 master-0 kubenswrapper[7479]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok
Mar 08 00:25:15.375320 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectcache ok
Mar 08 00:25:15.375320 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Mar 08 00:25:15.375320 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-startinformers ok
Mar 08 00:25:15.375320 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-restmapperupdater ok
Mar 08 00:25:15.375320 master-0 kubenswrapper[7479]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Mar 08 00:25:15.375320 master-0 kubenswrapper[7479]: livez check failed
Mar 08 00:25:15.375320 master-0 kubenswrapper[7479]: I0308 00:25:15.375288 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" podUID="1751db13-b792-43e2-8459-d1d4a0164dfb" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:25:15.837416 master-0 kubenswrapper[7479]: E0308 00:25:15.837360 7479 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 08 00:25:15.938621 master-0 kubenswrapper[7479]: E0308 00:25:15.938559 7479 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0"
Mar 08 00:25:15.938882 master-0 kubenswrapper[7479]: E0308 00:25:15.938714 7479 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.015s"
Mar 08 00:25:15.938882 master-0 kubenswrapper[7479]: I0308 00:25:15.938768 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerDied","Data":"620aae0686e0d0747f86c66dccb5f833f425852d851da5976e803bb0ce3011ba"}
Mar 08 00:25:15.938882 master-0 kubenswrapper[7479]: I0308 00:25:15.938846 7479 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs"
Mar 08 00:25:15.938882 master-0 kubenswrapper[7479]: I0308 00:25:15.938859 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28"
Mar 08 00:25:15.939367 master-0 kubenswrapper[7479]: I0308 00:25:15.939266 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0"
Mar 08 00:25:15.939555 master-0 kubenswrapper[7479]: I0308 00:25:15.939526 7479 scope.go:117] "RemoveContainer" containerID="25ae9c9f82c094082383cc214e49a9f1d3d4d26dc8ffcbe8cff3194531736ede"
Mar 08 00:25:15.939948 master-0 kubenswrapper[7479]: I0308 00:25:15.939906 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0"
Mar 08 00:25:15.940897 master-0 kubenswrapper[7479]: I0308 00:25:15.940366 7479 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="openshift-config-operator" containerStatusID={"Type":"cri-o","ID":"4c55f1200add2af42f95d0106d6d887be04568b435704100c4cfbfdbdabd7d73"} pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" containerMessage="Container openshift-config-operator failed liveness probe, will be restarted"
Mar 08 00:25:15.940897 master-0 kubenswrapper[7479]: I0308 00:25:15.940399 7479 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" containerID="cri-o://4c55f1200add2af42f95d0106d6d887be04568b435704100c4cfbfdbdabd7d73" gracePeriod=30
Mar 08 00:25:15.940897 master-0 kubenswrapper[7479]: I0308 00:25:15.940638 7479 scope.go:117] "RemoveContainer" containerID="01f4711968edd90a03ce566521bccad3babf877143c30f69324972ce8a8bc2ae"
Mar 08 00:25:15.941324 master-0 kubenswrapper[7479]: I0308 00:25:15.941293 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body=
Mar 08 00:25:15.941441 master-0 kubenswrapper[7479]: I0308 00:25:15.941414 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused"
Mar 08 00:25:15.941771 master-0 kubenswrapper[7479]: I0308 00:25:15.941501 7479 scope.go:117] "RemoveContainer" containerID="1218edf42145af942d644b560a08c25d071edefe5ebdbdbb1dda99cfd07700fd"
Mar 08 00:25:15.948493 master-0 kubenswrapper[7479]: I0308 00:25:15.948423 7479 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID=""
Mar 08 00:25:16.298677 master-0 kubenswrapper[7479]: I0308 00:25:16.298588 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": read tcp 10.128.0.2:51650->10.128.0.26:8443: read: connection reset by peer" start-of-body=
Mar 08 00:25:16.298677 master-0 kubenswrapper[7479]: I0308 00:25:16.298650 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": read tcp 10.128.0.2:51650->10.128.0.26:8443: read: connection reset by peer"
Mar 08 00:25:16.893276 master-0 kubenswrapper[7479]: I0308 00:25:16.893241 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7c649bf6d4-st2sr_ec2d22f2-c260-42a6-a9da-ee0f44f42303/network-operator/1.log"
Mar 08 00:25:16.894293 master-0 kubenswrapper[7479]: I0308 00:25:16.894275 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7c649bf6d4-st2sr_ec2d22f2-c260-42a6-a9da-ee0f44f42303/network-operator/0.log"
Mar 08 00:25:16.896517 master-0 kubenswrapper[7479]: I0308 00:25:16.896488 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-677db989d6-blw5x_4d0b9fbc-a1f8-4a98-99de-758734bd1a5b/ingress-operator/0.log"
Mar 08 00:25:16.898239 master-0 kubenswrapper[7479]: I0308 00:25:16.898220 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-64488f9d78-vnl28_2b1a69b5-c946-495d-ae02-c56f788279e8/openshift-config-operator/2.log"
Mar 08 00:25:16.898625 master-0 kubenswrapper[7479]: I0308 00:25:16.898600 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-64488f9d78-vnl28_2b1a69b5-c946-495d-ae02-c56f788279e8/openshift-config-operator/1.log"
Mar 08 00:25:16.899391 master-0 kubenswrapper[7479]: I0308 00:25:16.899347 7479 generic.go:334] "Generic (PLEG): container finished" podID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerID="4c55f1200add2af42f95d0106d6d887be04568b435704100c4cfbfdbdabd7d73" exitCode=255
Mar 08 00:25:19.002914 master-0 kubenswrapper[7479]: I0308 00:25:19.002817 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body=
Mar 08 00:25:19.004245 master-0 kubenswrapper[7479]: I0308 00:25:19.002894 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused"
Mar 08 00:25:19.061466 master-0 kubenswrapper[7479]: E0308 00:25:19.061054 7479 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{redhat-operators-mr22p.189ab5dba5b22387 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-operators-mr22p,UID:07f9c188-df80-4606-9a21-72228cffa706,APIVersion:v1,ResourceVersion:7455,FieldPath:spec.initContainers{extract-content},},Reason:Pulled,Message:Successfully pulled image \"registry.redhat.io/redhat/redhat-operator-index:v4.18\" in 11.52s (11.52s including waiting). Image size: 1733328350 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:22:38.306091911 +0000 UTC m=+74.619000828,LastTimestamp:2026-03-08 00:22:38.306091911 +0000 UTC m=+74.619000828,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 00:25:21.567492 master-0 kubenswrapper[7479]: I0308 00:25:21.567438 7479 patch_prober.go:28] interesting pod/authentication-operator-7c6989d6c4-dkqc4 container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.7:8443/healthz\": dial tcp 10.128.0.7:8443: connect: connection refused" start-of-body=
Mar 08 00:25:21.568000 master-0 kubenswrapper[7479]: I0308 00:25:21.567493 7479 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4" podUID="58333089-2456-4a25-8ba7-6d557eefa177" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.7:8443/healthz\": dial tcp 10.128.0.7:8443: connect: connection refused"
Mar 08 00:25:22.002015 master-0 kubenswrapper[7479]: I0308 00:25:22.001922 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body=
Mar 08 00:25:22.002327 master-0 kubenswrapper[7479]: I0308 00:25:22.002031 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused"
Mar 08 00:25:23.146493 master-0 kubenswrapper[7479]: I0308 00:25:23.146450 7479 patch_prober.go:28] interesting pod/controller-manager-5b4bdf67b6-8rdjs container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.128.0.50:8443/healthz\": dial tcp 10.128.0.50:8443: connect: connection refused" start-of-body=
Mar 08 00:25:23.147096 master-0 kubenswrapper[7479]: I0308 00:25:23.146505 7479 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" podUID="cbcb0196-be5c-44a4-9749-5df9fbeaa718" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.50:8443/healthz\": dial tcp 10.128.0.50:8443: connect: connection refused"
Mar 08 00:25:23.147267 master-0 kubenswrapper[7479]: I0308 00:25:23.147236 7479 patch_prober.go:28] interesting pod/controller-manager-5b4bdf67b6-8rdjs container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.50:8443/healthz\": dial tcp 10.128.0.50:8443: connect: connection refused" start-of-body=
Mar 08 00:25:23.147468 master-0 kubenswrapper[7479]: I0308 00:25:23.147407 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" podUID="cbcb0196-be5c-44a4-9749-5df9fbeaa718" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.50:8443/healthz\": dial tcp 10.128.0.50:8443: connect: connection refused"
Mar 08 00:25:24.384535 master-0 kubenswrapper[7479]: I0308 00:25:24.384474 7479 patch_prober.go:28] interesting pod/apiserver-85cb8cb9bb-bmx44 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Mar 08 00:25:24.384535 master-0 kubenswrapper[7479]: [+]log ok
Mar 08 00:25:24.384535 master-0 kubenswrapper[7479]: [-]etcd failed: reason withheld
Mar 08 00:25:24.384535 master-0 kubenswrapper[7479]: [+]poststarthook/start-apiserver-admission-initializer ok
Mar 08 00:25:24.384535 master-0 kubenswrapper[7479]: [+]poststarthook/generic-apiserver-start-informers ok
Mar 08 00:25:24.384535 master-0 kubenswrapper[7479]: [+]poststarthook/max-in-flight-filter ok
Mar 08 00:25:24.384535 master-0 kubenswrapper[7479]: [+]poststarthook/storage-object-count-tracker-hook ok
Mar 08 00:25:24.384535 master-0 kubenswrapper[7479]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Mar 08 00:25:24.384535 master-0 kubenswrapper[7479]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Mar 08 00:25:24.384535 master-0 kubenswrapper[7479]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok
Mar 08 00:25:24.384535 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectcache ok
Mar 08 00:25:24.384535 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Mar 08 00:25:24.384535 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-startinformers ok
Mar 08 00:25:24.384535 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-restmapperupdater ok
Mar 08 00:25:24.384535 master-0 kubenswrapper[7479]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Mar 08 00:25:24.384535 master-0 kubenswrapper[7479]: livez check failed
Mar 08 00:25:24.386344 master-0 kubenswrapper[7479]: I0308 00:25:24.385338 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" podUID="1751db13-b792-43e2-8459-d1d4a0164dfb" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:25:25.002555 master-0 kubenswrapper[7479]: I0308 00:25:25.002438 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body=
Mar 08 00:25:25.002555 master-0 kubenswrapper[7479]: I0308 00:25:25.002518 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused"
Mar 08 00:25:25.838013 master-0 kubenswrapper[7479]: E0308 00:25:25.837884 7479 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 08 00:25:25.838013 master-0 kubenswrapper[7479]: E0308 00:25:25.837928 7479 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 08 00:25:28.002948 master-0 kubenswrapper[7479]: I0308 00:25:28.002862 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body=
Mar 08 00:25:28.002948 master-0 kubenswrapper[7479]: I0308 00:25:28.002923 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused"
Mar 08 00:25:31.003024 master-0 kubenswrapper[7479]: I0308 00:25:31.002972 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body=
Mar 08 00:25:31.003578 master-0 kubenswrapper[7479]: I0308 00:25:31.003044 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused"
Mar 08 00:25:31.237339 master-0 kubenswrapper[7479]: I0308 00:25:31.237278 7479 patch_prober.go:28] interesting pod/etcd-operator-5884b9cd56-27phk container/etcd-operator namespace/openshift-etcd-operator: Liveness probe status=failure output="Get \"https://10.128.0.10:8443/healthz\": dial tcp 10.128.0.10:8443: connect: connection refused" start-of-body=
Mar 08 00:25:31.237546 master-0 kubenswrapper[7479]: I0308 00:25:31.237358 7479 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk" podUID="2fbed2b8-f4c5-4f52-b29c-1907a2034f6f" containerName="etcd-operator" probeResult="failure" output="Get \"https://10.128.0.10:8443/healthz\": dial tcp 10.128.0.10:8443: connect: connection refused"
Mar 08 00:25:31.566506 master-0 kubenswrapper[7479]: I0308 00:25:31.566379 7479 patch_prober.go:28] interesting pod/authentication-operator-7c6989d6c4-dkqc4 container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.7:8443/healthz\": dial tcp 10.128.0.7:8443: connect: connection refused" start-of-body=
Mar 08 00:25:31.566506 master-0 kubenswrapper[7479]: I0308 00:25:31.566455 7479 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4" podUID="58333089-2456-4a25-8ba7-6d557eefa177" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.7:8443/healthz\": dial tcp 10.128.0.7:8443: connect: connection refused"
Mar 08 00:25:33.162260 master-0 kubenswrapper[7479]: I0308 00:25:33.162144 7479 patch_prober.go:28] interesting pod/controller-manager-5b4bdf67b6-8rdjs container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.50:8443/healthz\": dial tcp 10.128.0.50:8443: connect: connection refused" start-of-body=
Mar 08 00:25:33.162260 master-0 kubenswrapper[7479]: I0308 00:25:33.162227 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" podUID="cbcb0196-be5c-44a4-9749-5df9fbeaa718" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.50:8443/healthz\": dial tcp 10.128.0.50:8443: connect: connection refused"
Mar 08 00:25:33.390766 master-0 kubenswrapper[7479]: I0308 00:25:33.390705 7479 patch_prober.go:28] interesting pod/apiserver-85cb8cb9bb-bmx44 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Mar 08 00:25:33.390766 master-0 kubenswrapper[7479]: [+]log ok
Mar 08 00:25:33.390766 master-0 kubenswrapper[7479]: [-]etcd failed: reason withheld
Mar 08 00:25:33.390766 master-0 kubenswrapper[7479]:
[+]poststarthook/start-apiserver-admission-initializer ok Mar 08 00:25:33.390766 master-0 kubenswrapper[7479]: [+]poststarthook/generic-apiserver-start-informers ok Mar 08 00:25:33.390766 master-0 kubenswrapper[7479]: [+]poststarthook/max-in-flight-filter ok Mar 08 00:25:33.390766 master-0 kubenswrapper[7479]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 08 00:25:33.390766 master-0 kubenswrapper[7479]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 08 00:25:33.390766 master-0 kubenswrapper[7479]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 08 00:25:33.390766 master-0 kubenswrapper[7479]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Mar 08 00:25:33.390766 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectcache ok Mar 08 00:25:33.390766 master-0 kubenswrapper[7479]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 08 00:25:33.390766 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-startinformers ok Mar 08 00:25:33.390766 master-0 kubenswrapper[7479]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 08 00:25:33.390766 master-0 kubenswrapper[7479]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 08 00:25:33.390766 master-0 kubenswrapper[7479]: livez check failed Mar 08 00:25:33.391399 master-0 kubenswrapper[7479]: I0308 00:25:33.390789 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" podUID="1751db13-b792-43e2-8459-d1d4a0164dfb" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:25:34.001969 master-0 kubenswrapper[7479]: I0308 00:25:34.001933 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get 
\"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body= Mar 08 00:25:34.002246 master-0 kubenswrapper[7479]: I0308 00:25:34.002190 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" Mar 08 00:25:37.002788 master-0 kubenswrapper[7479]: I0308 00:25:37.002731 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body= Mar 08 00:25:37.003342 master-0 kubenswrapper[7479]: I0308 00:25:37.002801 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" Mar 08 00:25:40.002674 master-0 kubenswrapper[7479]: I0308 00:25:40.002614 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body= Mar 08 00:25:40.003178 master-0 kubenswrapper[7479]: I0308 00:25:40.002688 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" 
containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" Mar 08 00:25:41.566681 master-0 kubenswrapper[7479]: I0308 00:25:41.566600 7479 patch_prober.go:28] interesting pod/authentication-operator-7c6989d6c4-dkqc4 container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.7:8443/healthz\": dial tcp 10.128.0.7:8443: connect: connection refused" start-of-body= Mar 08 00:25:41.567309 master-0 kubenswrapper[7479]: I0308 00:25:41.566697 7479 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4" podUID="58333089-2456-4a25-8ba7-6d557eefa177" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.7:8443/healthz\": dial tcp 10.128.0.7:8443: connect: connection refused" Mar 08 00:25:43.002153 master-0 kubenswrapper[7479]: I0308 00:25:43.002087 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body= Mar 08 00:25:43.002153 master-0 kubenswrapper[7479]: I0308 00:25:43.002142 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" Mar 08 00:25:43.145572 master-0 kubenswrapper[7479]: I0308 00:25:43.145521 7479 patch_prober.go:28] interesting pod/controller-manager-5b4bdf67b6-8rdjs container/controller-manager 
namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.50:8443/healthz\": dial tcp 10.128.0.50:8443: connect: connection refused" start-of-body= Mar 08 00:25:43.145572 master-0 kubenswrapper[7479]: I0308 00:25:43.145567 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" podUID="cbcb0196-be5c-44a4-9749-5df9fbeaa718" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.50:8443/healthz\": dial tcp 10.128.0.50:8443: connect: connection refused" Mar 08 00:25:44.136502 master-0 kubenswrapper[7479]: E0308 00:25:44.133925 7479 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="28.195s" Mar 08 00:25:44.136502 master-0 kubenswrapper[7479]: I0308 00:25:44.133964 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" event={"ID":"2b1a69b5-c946-495d-ae02-c56f788279e8","Type":"ContainerDied","Data":"155eb1b0e5550c7156a1e22df86f430ed6ad309e3fdf823622a81280cc05efe7"} Mar 08 00:25:44.136502 master-0 kubenswrapper[7479]: I0308 00:25:44.134024 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" event={"ID":"2b1a69b5-c946-495d-ae02-c56f788279e8","Type":"ContainerStarted","Data":"3749dd604b27f2dbdd4183a6654ab983fd1314c356306c8758ffbb94949f42f3"} Mar 08 00:25:44.136502 master-0 kubenswrapper[7479]: I0308 00:25:44.134046 7479 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Mar 08 00:25:44.136502 master-0 kubenswrapper[7479]: I0308 00:25:44.134060 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-st7mk" 
event={"ID":"ac523956-c8a3-4794-a1fa-660cd14966bb","Type":"ContainerDied","Data":"322f3ad793e93ca7f32b8558fd2506b5cf8b8be4b12165040ac02501040fbe03"} Mar 08 00:25:44.136502 master-0 kubenswrapper[7479]: I0308 00:25:44.134082 7479 status_manager.go:317] "Container readiness changed for unknown container" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" containerID="cri-o://1a894ff93f34b75d7c364cee700320b9938207036c1164fc914fd25a46ac6869" Mar 08 00:25:44.136502 master-0 kubenswrapper[7479]: I0308 00:25:44.134093 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" Mar 08 00:25:44.136502 master-0 kubenswrapper[7479]: I0308 00:25:44.134110 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" Mar 08 00:25:44.136502 master-0 kubenswrapper[7479]: I0308 00:25:44.134121 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0" Mar 08 00:25:44.136502 master-0 kubenswrapper[7479]: I0308 00:25:44.134247 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:25:44.136502 master-0 kubenswrapper[7479]: I0308 00:25:44.134391 7479 scope.go:117] "RemoveContainer" containerID="3749dd604b27f2dbdd4183a6654ab983fd1314c356306c8758ffbb94949f42f3" Mar 08 00:25:44.136502 master-0 kubenswrapper[7479]: I0308 00:25:44.134435 7479 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0" Mar 08 00:25:44.136502 master-0 kubenswrapper[7479]: I0308 00:25:44.135117 7479 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vnl28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get 
\"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" start-of-body= Mar 08 00:25:44.136502 master-0 kubenswrapper[7479]: I0308 00:25:44.135155 7479 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" podUID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.26:8443/healthz\": dial tcp 10.128.0.26:8443: connect: connection refused" Mar 08 00:25:44.138564 master-0 kubenswrapper[7479]: I0308 00:25:44.136754 7479 scope.go:117] "RemoveContainer" containerID="92c985a5a70112d59265249efbf6fce7869432625027fbf9a567a14e08ff9807" Mar 08 00:25:44.154492 master-0 kubenswrapper[7479]: I0308 00:25:44.154151 7479 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.156930 7479 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.156986 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-m7549" event={"ID":"af391724-079a-4bac-a89e-978ffd471763","Type":"ContainerDied","Data":"c9e6fa5d3ccf4015c27e14ffdb2578ad6435947b5bdd16e602ffdf86284246dc"} Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.157027 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-7gtw2" event={"ID":"b100ce12-965e-409e-8cdb-8f99ef51a82b","Type":"ContainerDied","Data":"5883c7f053a567c57162616ec25d9b4c38f468aaa6a93afc0931684514320848"} Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.157112 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.157130 7479 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.157171 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.157196 7479 status_manager.go:379] "Container startup changed for unknown container" pod="kube-system/bootstrap-kube-controller-manager-master-0" containerID="cri-o://1218edf42145af942d644b560a08c25d071edefe5ebdbdbb1dda99cfd07700fd" Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.157227 7479 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.157243 7479 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.157257 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-p8hlq" event={"ID":"c2ce2ea7-bd25-4294-8f3a-11ce53577830","Type":"ContainerDied","Data":"8c7c5dbb2587ce1659649afce2da4e5a5c04c0ab193dda1e438bb8ca083926e4"} Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.157288 7479 status_manager.go:379] "Container startup changed for unknown container" pod="kube-system/bootstrap-kube-controller-manager-master-0" containerID="cri-o://1218edf42145af942d644b560a08c25d071edefe5ebdbdbb1dda99cfd07700fd" Mar 08 00:25:44.162722 master-0 
kubenswrapper[7479]: I0308 00:25:44.157300 7479 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.157312 7479 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.157323 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-pfdrx" event={"ID":"365dc4ac-fbc8-4589-a799-8327b3ebd0a5","Type":"ContainerDied","Data":"08c17f5be4c6cd32671af564801dff89f871520231b6fd523ba49a05d5c50b3c"} Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.157336 7479 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.157347 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4" event={"ID":"58333089-2456-4a25-8ba7-6d557eefa177","Type":"ContainerDied","Data":"00aa20318a390dc28a1b90d9dfa760b9b264408ce2a090ec0af81099188274b0"} Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.157372 7479 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4" Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.157384 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-rj9cl" event={"ID":"3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab","Type":"ContainerDied","Data":"459a84ed9e1a3d8f522635c123baf95a666dd88b0c40648d94dbbfdfad737d00"} Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 
00:25:44.157460 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk" event={"ID":"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f","Type":"ContainerStarted","Data":"d2e8edf542df46c295f392d43d676bb039cfcddee9661264a6bee3005ba21922"} Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.157542 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerDied","Data":"e58959bf4fb7686cb173d693e7cd0607617c802ee64cc2a69626a79e65982a9e"} Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.157586 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"1218edf42145af942d644b560a08c25d071edefe5ebdbdbb1dda99cfd07700fd"} Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.157623 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-st2sr" event={"ID":"ec2d22f2-c260-42a6-a9da-ee0f44f42303","Type":"ContainerStarted","Data":"25ae9c9f82c094082383cc214e49a9f1d3d4d26dc8ffcbe8cff3194531736ede"} Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.157695 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-49hzm" event={"ID":"ef0a3c84-98bb-4915-9010-d66fcbeafe09","Type":"ContainerDied","Data":"5aac2b21c945fd8c5f04ccb41b60633f9bb7e3c9d3e901a7648d97792b4bc569"} Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.157732 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" 
event={"ID":"5cf5a2ef-2498-40a0-a189-0753076fd3b6","Type":"ContainerDied","Data":"04817105ab63ed3d02352e545fc19277b913254d7947d42a71d84846748fcfc3"} Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.157766 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" event={"ID":"1bb8fea7-71ca-43a3-839d-9c1459bf8dfa","Type":"ContainerDied","Data":"1a894ff93f34b75d7c364cee700320b9938207036c1164fc914fd25a46ac6869"} Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.157802 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" event={"ID":"d01c21a1-6c2c-49a7-9d85-254662851838","Type":"ContainerDied","Data":"f272f0c8300d99d74de3b6533eb08fc6f13727844131b874ef0ec089cec086c7"} Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.157833 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x" event={"ID":"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b","Type":"ContainerDied","Data":"01f4711968edd90a03ce566521bccad3babf877143c30f69324972ce8a8bc2ae"} Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.157866 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerDied","Data":"1218edf42145af942d644b560a08c25d071edefe5ebdbdbb1dda99cfd07700fd"} Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.157917 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-vd52m" event={"ID":"e97435ee-522e-427d-9efc-40bc3d2b0d02","Type":"ContainerDied","Data":"8ec59eab2cc718c53b1b2d5e8d3d1b9c2d696a3beee682fea6bed575feafd5ae"} Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.157972 7479 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-p8hlq" event={"ID":"c2ce2ea7-bd25-4294-8f3a-11ce53577830","Type":"ContainerStarted","Data":"632cf41c6d751c39c9bc533a8eb31489a926eb05ad69c14fc4cbdd3ab7d57165"} Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.158003 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" event={"ID":"1bb8fea7-71ca-43a3-839d-9c1459bf8dfa","Type":"ContainerStarted","Data":"62a90dd1c822377c4aa48689f26940e9273c8eaf2e5b09cbf6dadaba768ab7d5"} Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.158062 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" event={"ID":"d01c21a1-6c2c-49a7-9d85-254662851838","Type":"ContainerStarted","Data":"dc254aaf3bd5aa2a3c6e69f8abd5a98d092e318f7ea622432462747a16cce142"} Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.158092 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-49hzm" event={"ID":"ef0a3c84-98bb-4915-9010-d66fcbeafe09","Type":"ContainerStarted","Data":"e48e7bed76a9d2cfdf09898508b2c13d610c4aac80f76a7b83dcca91233aa06a"} Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.158119 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4" event={"ID":"58333089-2456-4a25-8ba7-6d557eefa177","Type":"ContainerStarted","Data":"dc923284309376403cb95e44ae08001b8c778273ed731a0f98310a7899bb3d2d"} Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.158168 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-m7549" 
event={"ID":"af391724-079a-4bac-a89e-978ffd471763","Type":"ContainerStarted","Data":"171aa9f17bab1693340df88dc9687b17839bec3452bff1e75aeedd920e40b060"} Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.158235 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-7gtw2" event={"ID":"b100ce12-965e-409e-8cdb-8f99ef51a82b","Type":"ContainerStarted","Data":"76a35028a8d9b23a680ded5da7f57ea40c69742d5b697c8b44c79baa58b379ed"} Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.158265 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" event={"ID":"5cf5a2ef-2498-40a0-a189-0753076fd3b6","Type":"ContainerStarted","Data":"9640b5a39ba1c8d22970de560d1644963302e95dae8ebd4e31dc3deaa2d4d495"} Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.158301 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-vd52m" event={"ID":"e97435ee-522e-427d-9efc-40bc3d2b0d02","Type":"ContainerStarted","Data":"f8579510b3d4eb37fa166a47f1175d9203069f85aea52cc88554ccc7a9077266"} Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.158334 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-st7mk" event={"ID":"ac523956-c8a3-4794-a1fa-660cd14966bb","Type":"ContainerStarted","Data":"96ba39646fac17d0697e88bae6a2ecb9f089f04e9a05c825a6c18dbce7611ea1"} Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.158377 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-rj9cl" event={"ID":"3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab","Type":"ContainerStarted","Data":"75840c04f6b695db51ec61cebbf998b4b3060ea46b87261c880157ccbd62f9ba"} Mar 08 
00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.158423 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-bh886" event={"ID":"e76bc134-2a88-4f92-9aa7-f6854941b98f","Type":"ContainerStarted","Data":"780095bbe85e78933cef6be83dd1325e378e4033a839880b601dba51dbb6eb8a"} Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.158421 7479 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="authentication-operator" containerStatusID={"Type":"cri-o","ID":"dc923284309376403cb95e44ae08001b8c778273ed731a0f98310a7899bb3d2d"} pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4" containerMessage="Container authentication-operator failed liveness probe, will be restarted" Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.158455 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" event={"ID":"2b1a69b5-c946-495d-ae02-c56f788279e8","Type":"ContainerDied","Data":"3749dd604b27f2dbdd4183a6654ab983fd1314c356306c8758ffbb94949f42f3"} Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.158498 7479 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4" podUID="58333089-2456-4a25-8ba7-6d557eefa177" containerName="authentication-operator" containerID="cri-o://dc923284309376403cb95e44ae08001b8c778273ed731a0f98310a7899bb3d2d" gracePeriod=30 Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.158501 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" event={"ID":"2b1a69b5-c946-495d-ae02-c56f788279e8","Type":"ContainerStarted","Data":"4c55f1200add2af42f95d0106d6d887be04568b435704100c4cfbfdbdabd7d73"} Mar 08 00:25:44.162722 master-0 
kubenswrapper[7479]: I0308 00:25:44.158665 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-pfdrx" event={"ID":"365dc4ac-fbc8-4589-a799-8327b3ebd0a5","Type":"ContainerStarted","Data":"362e2ee6abee655de6dfb5a64eddbe14fe4be437a3b12293690dd8327410ffad"}
Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.158774 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2" event={"ID":"3fee96d7-75a7-46e4-9707-7bd292f10b84","Type":"ContainerDied","Data":"52998e126ba781dde5afc9f3fdb3cf64a817b4497f29c74abbb0c4aa09aa4379"}
Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.158796 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"0e06c006df1e1e63e0f6188a23b5e393fde4aa4984ad610de00e8c675da914c7"}
Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.158809 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"ea5ec65ba12dfaaa4f58b3b64547a3d98d2937c3aa58a7bc6dc14040003a38a9"}
Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.158822 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"d8889d6936248c826e33628006d790b900bbbcacc9529b4c35a79aa987893d39"}
Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.158842 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"787fa634ee36f327997b592447e9aadba40183c4e7e4d25f5519ae9957121e6e"}
Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.158854 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"4262f462df3c892c070c1769f302b6c7878bc5f82d5342928245d488b3431f6d"}
Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.158865 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" event={"ID":"cbcb0196-be5c-44a4-9749-5df9fbeaa718","Type":"ContainerDied","Data":"92c985a5a70112d59265249efbf6fce7869432625027fbf9a567a14e08ff9807"}
Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.158883 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk" event={"ID":"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f","Type":"ContainerDied","Data":"d2e8edf542df46c295f392d43d676bb039cfcddee9661264a6bee3005ba21922"}
Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.158898 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-st2sr" event={"ID":"ec2d22f2-c260-42a6-a9da-ee0f44f42303","Type":"ContainerDied","Data":"25ae9c9f82c094082383cc214e49a9f1d3d4d26dc8ffcbe8cff3194531736ede"}
Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.158914 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-vd52m" event={"ID":"e97435ee-522e-427d-9efc-40bc3d2b0d02","Type":"ContainerDied","Data":"f8579510b3d4eb37fa166a47f1175d9203069f85aea52cc88554ccc7a9077266"}
Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.158946 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"0ff7bbc7a985bc357be1e5b9c59697e1d623b2b3f3a45fbbbeea036170f5e8b0"}
Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.158960 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-st2sr" event={"ID":"ec2d22f2-c260-42a6-a9da-ee0f44f42303","Type":"ContainerStarted","Data":"a09a8a648b8b5d27ffa03ef33629e6462dc3a71bc00700334560a54ac0509ef1"}
Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.158973 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x" event={"ID":"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b","Type":"ContainerStarted","Data":"7e2036023e6d81478f29926d6902ad0782c672516fb8dbd568498926e21d680b"}
Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.159017 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" event={"ID":"2b1a69b5-c946-495d-ae02-c56f788279e8","Type":"ContainerDied","Data":"4c55f1200add2af42f95d0106d6d887be04568b435704100c4cfbfdbdabd7d73"}
Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.159035 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" event={"ID":"2b1a69b5-c946-495d-ae02-c56f788279e8","Type":"ContainerStarted","Data":"e8ef418892e89b7f3833d29c636f71f3f5b9cf6ffda7232c93e00417ddde5f8d"}
Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.160267 7479 scope.go:117] "RemoveContainer" containerID="f8579510b3d4eb37fa166a47f1175d9203069f85aea52cc88554ccc7a9077266"
Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.160879 7479 scope.go:117] "RemoveContainer" containerID="52998e126ba781dde5afc9f3fdb3cf64a817b4497f29c74abbb0c4aa09aa4379"
Mar 08 00:25:44.162722 master-0 kubenswrapper[7479]: I0308 00:25:44.161643 7479 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 00:25:44.167802 master-0 kubenswrapper[7479]: I0308 00:25:44.164543 7479 scope.go:117] "RemoveContainer" containerID="d2e8edf542df46c295f392d43d676bb039cfcddee9661264a6bee3005ba21922"
Mar 08 00:25:44.167802 master-0 kubenswrapper[7479]: I0308 00:25:44.166078 7479 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 00:25:44.182721 master-0 kubenswrapper[7479]: I0308 00:25:44.182610 7479 scope.go:117] "RemoveContainer" containerID="155eb1b0e5550c7156a1e22df86f430ed6ad309e3fdf823622a81280cc05efe7"
Mar 08 00:25:44.222077 master-0 kubenswrapper[7479]: I0308 00:25:44.220589 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0"]
Mar 08 00:25:44.223030 master-0 kubenswrapper[7479]: I0308 00:25:44.222985 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/etcd-master-0-master-0"]
Mar 08 00:25:44.223101 master-0 kubenswrapper[7479]: I0308 00:25:44.223019 7479 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-etcd/etcd-master-0-master-0" mirrorPodUID="e59c6407-b1d9-4828-a7a4-931a9e3567df"
Mar 08 00:25:44.238411 master-0 kubenswrapper[7479]: I0308 00:25:44.238266 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"]
Mar 08 00:25:44.241195 master-0 kubenswrapper[7479]: I0308 00:25:44.241123 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ms5vp" podStartSLOduration=184.85240499 podStartE2EDuration="3m32.241104861s" podCreationTimestamp="2026-03-08 00:22:12 +0000 UTC" firstStartedPulling="2026-03-08 00:22:14.497542111 +0000 UTC m=+50.810451028" lastFinishedPulling="2026-03-08 00:22:41.886241982 +0000 UTC m=+78.199150899" observedRunningTime="2026-03-08 00:25:44.130926541 +0000 UTC m=+260.443835488" watchObservedRunningTime="2026-03-08 00:25:44.241104861 +0000 UTC m=+260.554013778"
Mar 08 00:25:44.241531 master-0 kubenswrapper[7479]: I0308 00:25:44.241491 7479 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-etcd/etcd-master-0-master-0"]
Mar 08 00:25:44.241531 master-0 kubenswrapper[7479]: I0308 00:25:44.241522 7479 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-etcd/etcd-master-0-master-0" mirrorPodUID="e59c6407-b1d9-4828-a7a4-931a9e3567df"
Mar 08 00:25:44.282552 master-0 kubenswrapper[7479]: I0308 00:25:44.282504 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-8f89dfddd-brq9l"]
Mar 08 00:25:44.282875 master-0 kubenswrapper[7479]: E0308 00:25:44.282852 7479 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4217b755-ca87-45cf-9e52-7b2681660f41" containerName="installer"
Mar 08 00:25:44.282875 master-0 kubenswrapper[7479]: I0308 00:25:44.282876 7479 state_mem.go:107] "Deleted CPUSet assignment" podUID="4217b755-ca87-45cf-9e52-7b2681660f41" containerName="installer"
Mar 08 00:25:44.282955 master-0 kubenswrapper[7479]: E0308 00:25:44.282910 7479 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55216a56-677a-4f28-a530-77d44bded8a2" containerName="installer"
Mar 08 00:25:44.282955 master-0 kubenswrapper[7479]: I0308 00:25:44.282917 7479 state_mem.go:107] "Deleted CPUSet assignment" podUID="55216a56-677a-4f28-a530-77d44bded8a2" containerName="installer"
Mar 08 00:25:44.282955 master-0 kubenswrapper[7479]: E0308 00:25:44.282929 7479 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5df57519-dc14-4d18-8c24-cf2e6e122cff" containerName="installer"
Mar 08 00:25:44.282955 master-0 kubenswrapper[7479]: I0308 00:25:44.282936 7479 state_mem.go:107] "Deleted CPUSet assignment" podUID="5df57519-dc14-4d18-8c24-cf2e6e122cff" containerName="installer"
Mar 08 00:25:44.282955 master-0 kubenswrapper[7479]: E0308 00:25:44.282944 7479 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ada20442-bff5-477c-989e-3d921f5ede5e" containerName="installer"
Mar 08 00:25:44.282955 master-0 kubenswrapper[7479]: I0308 00:25:44.282949 7479 state_mem.go:107] "Deleted CPUSet assignment" podUID="ada20442-bff5-477c-989e-3d921f5ede5e" containerName="installer"
Mar 08 00:25:44.283122 master-0 kubenswrapper[7479]: I0308 00:25:44.283028 7479 memory_manager.go:354] "RemoveStaleState removing state" podUID="ada20442-bff5-477c-989e-3d921f5ede5e" containerName="installer"
Mar 08 00:25:44.283122 master-0 kubenswrapper[7479]: I0308 00:25:44.283036 7479 memory_manager.go:354] "RemoveStaleState removing state" podUID="5df57519-dc14-4d18-8c24-cf2e6e122cff" containerName="installer"
Mar 08 00:25:44.283122 master-0 kubenswrapper[7479]: I0308 00:25:44.283046 7479 memory_manager.go:354] "RemoveStaleState removing state" podUID="4217b755-ca87-45cf-9e52-7b2681660f41" containerName="installer"
Mar 08 00:25:44.283122 master-0 kubenswrapper[7479]: I0308 00:25:44.283056 7479 memory_manager.go:354] "RemoveStaleState removing state" podUID="55216a56-677a-4f28-a530-77d44bded8a2" containerName="installer"
Mar 08 00:25:44.283462 master-0 kubenswrapper[7479]: I0308 00:25:44.283431 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-8f89dfddd-brq9l"
Mar 08 00:25:44.288063 master-0 kubenswrapper[7479]: I0308 00:25:44.288019 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert"
Mar 08 00:25:44.288137 master-0 kubenswrapper[7479]: I0308 00:25:44.288065 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt"
Mar 08 00:25:44.288361 master-0 kubenswrapper[7479]: I0308 00:25:44.288338 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle"
Mar 08 00:25:44.288494 master-0 kubenswrapper[7479]: I0308 00:25:44.288472 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt"
Mar 08 00:25:44.288562 master-0 kubenswrapper[7479]: I0308 00:25:44.288497 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle"
Mar 08 00:25:44.301550 master-0 kubenswrapper[7479]: I0308 00:25:44.301515 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-8f89dfddd-brq9l"]
Mar 08 00:25:44.330872 master-0 kubenswrapper[7479]: I0308 00:25:44.330834 7479 scope.go:117] "RemoveContainer" containerID="e58959bf4fb7686cb173d693e7cd0607617c802ee64cc2a69626a79e65982a9e"
Mar 08 00:25:44.338216 master-0 kubenswrapper[7479]: I0308 00:25:44.337912 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lqc4n" podStartSLOduration=184.938690464 podStartE2EDuration="3m32.337897174s" podCreationTimestamp="2026-03-08 00:22:12 +0000 UTC" firstStartedPulling="2026-03-08 00:22:14.48805967 +0000 UTC m=+50.800968577" lastFinishedPulling="2026-03-08 00:22:41.88726635 +0000 UTC m=+78.200175287" observedRunningTime="2026-03-08 00:25:44.336359757 +0000 UTC m=+260.649268674" watchObservedRunningTime="2026-03-08 00:25:44.337897174 +0000 UTC m=+260.650806091"
Mar 08 00:25:44.362287 master-0 kubenswrapper[7479]: I0308 00:25:44.361581 7479 scope.go:117] "RemoveContainer" containerID="65f78e69463513d95a1d7e0bffe5e5d1bf7a6e5e4e7e1d096d77f2d24eb8e8b4"
Mar 08 00:25:44.402598 master-0 kubenswrapper[7479]: I0308 00:25:44.399657 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/614f0a0f-5853-4cf6-bd3d-174141f0f1e2-snapshots\") pod \"insights-operator-8f89dfddd-brq9l\" (UID: \"614f0a0f-5853-4cf6-bd3d-174141f0f1e2\") " pod="openshift-insights/insights-operator-8f89dfddd-brq9l"
Mar 08 00:25:44.402598 master-0 kubenswrapper[7479]: I0308 00:25:44.399732 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/614f0a0f-5853-4cf6-bd3d-174141f0f1e2-serving-cert\") pod \"insights-operator-8f89dfddd-brq9l\" (UID: \"614f0a0f-5853-4cf6-bd3d-174141f0f1e2\") " pod="openshift-insights/insights-operator-8f89dfddd-brq9l"
Mar 08 00:25:44.402598 master-0 kubenswrapper[7479]: I0308 00:25:44.399769 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/614f0a0f-5853-4cf6-bd3d-174141f0f1e2-service-ca-bundle\") pod \"insights-operator-8f89dfddd-brq9l\" (UID: \"614f0a0f-5853-4cf6-bd3d-174141f0f1e2\") " pod="openshift-insights/insights-operator-8f89dfddd-brq9l"
Mar 08 00:25:44.402598 master-0 kubenswrapper[7479]: I0308 00:25:44.399825 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/614f0a0f-5853-4cf6-bd3d-174141f0f1e2-trusted-ca-bundle\") pod \"insights-operator-8f89dfddd-brq9l\" (UID: \"614f0a0f-5853-4cf6-bd3d-174141f0f1e2\") " pod="openshift-insights/insights-operator-8f89dfddd-brq9l"
Mar 08 00:25:44.402598 master-0 kubenswrapper[7479]: I0308 00:25:44.399875 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v5hl\" (UniqueName: \"kubernetes.io/projected/614f0a0f-5853-4cf6-bd3d-174141f0f1e2-kube-api-access-8v5hl\") pod \"insights-operator-8f89dfddd-brq9l\" (UID: \"614f0a0f-5853-4cf6-bd3d-174141f0f1e2\") " pod="openshift-insights/insights-operator-8f89dfddd-brq9l"
Mar 08 00:25:44.422406 master-0 kubenswrapper[7479]: I0308 00:25:44.422375 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"]
Mar 08 00:25:44.429858 master-0 kubenswrapper[7479]: I0308 00:25:44.429799 7479 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"]
Mar 08 00:25:44.446745 master-0 kubenswrapper[7479]: I0308 00:25:44.446711 7479 scope.go:117] "RemoveContainer" containerID="ba0bd870ef36ff11021b6ac2e87095fcc7b137992295cf86faa86e55d1530ce8"
Mar 08 00:25:44.456361 master-0 kubenswrapper[7479]: I0308 00:25:44.456283 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4r9ht" podStartSLOduration=184.078974218 podStartE2EDuration="3m30.456261883s" podCreationTimestamp="2026-03-08 00:22:14 +0000 UTC" firstStartedPulling="2026-03-08 00:22:15.512406413 +0000 UTC m=+51.825315330" lastFinishedPulling="2026-03-08 00:22:41.889694068 +0000 UTC m=+78.202602995" observedRunningTime="2026-03-08 00:25:44.45549901 +0000 UTC m=+260.768407937" watchObservedRunningTime="2026-03-08 00:25:44.456261883 +0000 UTC m=+260.769170800"
Mar 08 00:25:44.501935 master-0 kubenswrapper[7479]: I0308 00:25:44.501885 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/614f0a0f-5853-4cf6-bd3d-174141f0f1e2-trusted-ca-bundle\") pod \"insights-operator-8f89dfddd-brq9l\" (UID: \"614f0a0f-5853-4cf6-bd3d-174141f0f1e2\") " pod="openshift-insights/insights-operator-8f89dfddd-brq9l"
Mar 08 00:25:44.502045 master-0 kubenswrapper[7479]: I0308 00:25:44.501951 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v5hl\" (UniqueName: \"kubernetes.io/projected/614f0a0f-5853-4cf6-bd3d-174141f0f1e2-kube-api-access-8v5hl\") pod \"insights-operator-8f89dfddd-brq9l\" (UID: \"614f0a0f-5853-4cf6-bd3d-174141f0f1e2\") " pod="openshift-insights/insights-operator-8f89dfddd-brq9l"
Mar 08 00:25:44.502852 master-0 kubenswrapper[7479]: I0308 00:25:44.502807 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/614f0a0f-5853-4cf6-bd3d-174141f0f1e2-trusted-ca-bundle\") pod \"insights-operator-8f89dfddd-brq9l\" (UID: \"614f0a0f-5853-4cf6-bd3d-174141f0f1e2\") " pod="openshift-insights/insights-operator-8f89dfddd-brq9l"
Mar 08 00:25:44.503074 master-0 kubenswrapper[7479]: I0308 00:25:44.503038 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/614f0a0f-5853-4cf6-bd3d-174141f0f1e2-snapshots\") pod \"insights-operator-8f89dfddd-brq9l\" (UID: \"614f0a0f-5853-4cf6-bd3d-174141f0f1e2\") " pod="openshift-insights/insights-operator-8f89dfddd-brq9l"
Mar 08 00:25:44.503157 master-0 kubenswrapper[7479]: I0308 00:25:44.503129 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/614f0a0f-5853-4cf6-bd3d-174141f0f1e2-serving-cert\") pod \"insights-operator-8f89dfddd-brq9l\" (UID: \"614f0a0f-5853-4cf6-bd3d-174141f0f1e2\") " pod="openshift-insights/insights-operator-8f89dfddd-brq9l"
Mar 08 00:25:44.503238 master-0 kubenswrapper[7479]: I0308 00:25:44.503217 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/614f0a0f-5853-4cf6-bd3d-174141f0f1e2-service-ca-bundle\") pod \"insights-operator-8f89dfddd-brq9l\" (UID: \"614f0a0f-5853-4cf6-bd3d-174141f0f1e2\") " pod="openshift-insights/insights-operator-8f89dfddd-brq9l"
Mar 08 00:25:44.504240 master-0 kubenswrapper[7479]: I0308 00:25:44.504196 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/614f0a0f-5853-4cf6-bd3d-174141f0f1e2-service-ca-bundle\") pod \"insights-operator-8f89dfddd-brq9l\" (UID: \"614f0a0f-5853-4cf6-bd3d-174141f0f1e2\") " pod="openshift-insights/insights-operator-8f89dfddd-brq9l"
Mar 08 00:25:44.505021 master-0 kubenswrapper[7479]: I0308 00:25:44.504965 7479 scope.go:117] "RemoveContainer" containerID="e58959bf4fb7686cb173d693e7cd0607617c802ee64cc2a69626a79e65982a9e"
Mar 08 00:25:44.505395 master-0 kubenswrapper[7479]: I0308 00:25:44.505348 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/614f0a0f-5853-4cf6-bd3d-174141f0f1e2-snapshots\") pod \"insights-operator-8f89dfddd-brq9l\" (UID: \"614f0a0f-5853-4cf6-bd3d-174141f0f1e2\") " pod="openshift-insights/insights-operator-8f89dfddd-brq9l"
Mar 08 00:25:44.511235 master-0 kubenswrapper[7479]: I0308 00:25:44.511172 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/614f0a0f-5853-4cf6-bd3d-174141f0f1e2-serving-cert\") pod \"insights-operator-8f89dfddd-brq9l\" (UID: \"614f0a0f-5853-4cf6-bd3d-174141f0f1e2\") " pod="openshift-insights/insights-operator-8f89dfddd-brq9l"
Mar 08 00:25:44.522727 master-0 kubenswrapper[7479]: E0308 00:25:44.522634 7479 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e58959bf4fb7686cb173d693e7cd0607617c802ee64cc2a69626a79e65982a9e\": container with ID starting with e58959bf4fb7686cb173d693e7cd0607617c802ee64cc2a69626a79e65982a9e not found: ID does not exist" containerID="e58959bf4fb7686cb173d693e7cd0607617c802ee64cc2a69626a79e65982a9e"
Mar 08 00:25:44.522727 master-0 kubenswrapper[7479]: I0308 00:25:44.522696 7479 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e58959bf4fb7686cb173d693e7cd0607617c802ee64cc2a69626a79e65982a9e"} err="failed to get container status \"e58959bf4fb7686cb173d693e7cd0607617c802ee64cc2a69626a79e65982a9e\": rpc error: code = NotFound desc = could not find container \"e58959bf4fb7686cb173d693e7cd0607617c802ee64cc2a69626a79e65982a9e\": container with ID starting with e58959bf4fb7686cb173d693e7cd0607617c802ee64cc2a69626a79e65982a9e not found: ID does not exist"
Mar 08 00:25:44.522727 master-0 kubenswrapper[7479]: I0308 00:25:44.522726 7479 scope.go:117] "RemoveContainer" containerID="65f78e69463513d95a1d7e0bffe5e5d1bf7a6e5e4e7e1d096d77f2d24eb8e8b4"
Mar 08 00:25:44.526290 master-0 kubenswrapper[7479]: I0308 00:25:44.523939 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mr22p" podStartSLOduration=194.265260083 podStartE2EDuration="3m29.52392434s" podCreationTimestamp="2026-03-08 00:22:15 +0000 UTC" firstStartedPulling="2026-03-08 00:22:26.78590394 +0000 UTC m=+63.098812857" lastFinishedPulling="2026-03-08 00:22:42.044568197 +0000 UTC m=+78.357477114" observedRunningTime="2026-03-08 00:25:44.487714829 +0000 UTC m=+260.800623756" watchObservedRunningTime="2026-03-08 00:25:44.52392434 +0000 UTC m=+260.836833257"
Mar 08 00:25:44.530409 master-0 kubenswrapper[7479]: E0308 00:25:44.529066 7479 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65f78e69463513d95a1d7e0bffe5e5d1bf7a6e5e4e7e1d096d77f2d24eb8e8b4\": container with ID starting with 65f78e69463513d95a1d7e0bffe5e5d1bf7a6e5e4e7e1d096d77f2d24eb8e8b4 not found: ID does not exist" containerID="65f78e69463513d95a1d7e0bffe5e5d1bf7a6e5e4e7e1d096d77f2d24eb8e8b4"
Mar 08 00:25:44.530409 master-0 kubenswrapper[7479]: I0308 00:25:44.529213 7479 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65f78e69463513d95a1d7e0bffe5e5d1bf7a6e5e4e7e1d096d77f2d24eb8e8b4"} err="failed to get container status \"65f78e69463513d95a1d7e0bffe5e5d1bf7a6e5e4e7e1d096d77f2d24eb8e8b4\": rpc error: code = NotFound desc = could not find container \"65f78e69463513d95a1d7e0bffe5e5d1bf7a6e5e4e7e1d096d77f2d24eb8e8b4\": container with ID starting with 65f78e69463513d95a1d7e0bffe5e5d1bf7a6e5e4e7e1d096d77f2d24eb8e8b4 not found: ID does not exist"
Mar 08 00:25:44.530409 master-0 kubenswrapper[7479]: I0308 00:25:44.529249 7479 scope.go:117] "RemoveContainer" containerID="8ec59eab2cc718c53b1b2d5e8d3d1b9c2d696a3beee682fea6bed575feafd5ae"
Mar 08 00:25:44.535889 master-0 kubenswrapper[7479]: I0308 00:25:44.535003 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v5hl\" (UniqueName: \"kubernetes.io/projected/614f0a0f-5853-4cf6-bd3d-174141f0f1e2-kube-api-access-8v5hl\") pod \"insights-operator-8f89dfddd-brq9l\" (UID: \"614f0a0f-5853-4cf6-bd3d-174141f0f1e2\") " pod="openshift-insights/insights-operator-8f89dfddd-brq9l"
Mar 08 00:25:44.541459 master-0 kubenswrapper[7479]: I0308 00:25:44.541421 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"]
Mar 08 00:25:44.552448 master-0 kubenswrapper[7479]: I0308 00:25:44.550659 7479 scope.go:117] "RemoveContainer" containerID="3749dd604b27f2dbdd4183a6654ab983fd1314c356306c8758ffbb94949f42f3"
Mar 08 00:25:44.552448 master-0 kubenswrapper[7479]: I0308 00:25:44.551699 7479 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"]
Mar 08 00:25:44.552616 master-0 kubenswrapper[7479]: E0308 00:25:44.552527 7479 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3749dd604b27f2dbdd4183a6654ab983fd1314c356306c8758ffbb94949f42f3\": container with ID starting with 3749dd604b27f2dbdd4183a6654ab983fd1314c356306c8758ffbb94949f42f3 not found: ID does not exist" containerID="3749dd604b27f2dbdd4183a6654ab983fd1314c356306c8758ffbb94949f42f3"
Mar 08 00:25:44.552616 master-0 kubenswrapper[7479]: I0308 00:25:44.552563 7479 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3749dd604b27f2dbdd4183a6654ab983fd1314c356306c8758ffbb94949f42f3"} err="failed to get container status \"3749dd604b27f2dbdd4183a6654ab983fd1314c356306c8758ffbb94949f42f3\": rpc error: code = NotFound desc = could not find container \"3749dd604b27f2dbdd4183a6654ab983fd1314c356306c8758ffbb94949f42f3\": container with ID starting with 3749dd604b27f2dbdd4183a6654ab983fd1314c356306c8758ffbb94949f42f3 not found: ID does not exist"
Mar 08 00:25:44.552616 master-0 kubenswrapper[7479]: I0308 00:25:44.552589 7479 scope.go:117] "RemoveContainer" containerID="155eb1b0e5550c7156a1e22df86f430ed6ad309e3fdf823622a81280cc05efe7"
Mar 08 00:25:44.552961 master-0 kubenswrapper[7479]: E0308 00:25:44.552943 7479 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"155eb1b0e5550c7156a1e22df86f430ed6ad309e3fdf823622a81280cc05efe7\": container with ID starting with 155eb1b0e5550c7156a1e22df86f430ed6ad309e3fdf823622a81280cc05efe7 not found: ID does not exist" containerID="155eb1b0e5550c7156a1e22df86f430ed6ad309e3fdf823622a81280cc05efe7"
Mar 08 00:25:44.552961 master-0 kubenswrapper[7479]: I0308 00:25:44.552963 7479 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"155eb1b0e5550c7156a1e22df86f430ed6ad309e3fdf823622a81280cc05efe7"} err="failed to get container status \"155eb1b0e5550c7156a1e22df86f430ed6ad309e3fdf823622a81280cc05efe7\": rpc error: code = NotFound desc = could not find container \"155eb1b0e5550c7156a1e22df86f430ed6ad309e3fdf823622a81280cc05efe7\": container with ID starting with 155eb1b0e5550c7156a1e22df86f430ed6ad309e3fdf823622a81280cc05efe7 not found: ID does not exist"
Mar 08 00:25:44.553312 master-0 kubenswrapper[7479]: I0308 00:25:44.552975 7479 scope.go:117] "RemoveContainer" containerID="4ce369a140420a6c03e974e6eff3c092d5ec9b95e895b002c78c7a3f070c22b2"
Mar 08 00:25:44.578444 master-0 kubenswrapper[7479]: I0308 00:25:44.578405 7479 scope.go:117] "RemoveContainer" containerID="06038340b4e3f2befb44d9c767edb4dd565cb0800261ba9f5e36429d3a7bf10d"
Mar 08 00:25:44.597836 master-0 kubenswrapper[7479]: I0308 00:25:44.597801 7479 scope.go:117] "RemoveContainer" containerID="8ec59eab2cc718c53b1b2d5e8d3d1b9c2d696a3beee682fea6bed575feafd5ae"
Mar 08 00:25:44.598181 master-0 kubenswrapper[7479]: E0308 00:25:44.598157 7479 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ec59eab2cc718c53b1b2d5e8d3d1b9c2d696a3beee682fea6bed575feafd5ae\": container with ID starting with 8ec59eab2cc718c53b1b2d5e8d3d1b9c2d696a3beee682fea6bed575feafd5ae not found: ID does not exist" containerID="8ec59eab2cc718c53b1b2d5e8d3d1b9c2d696a3beee682fea6bed575feafd5ae"
Mar 08 00:25:44.598231 master-0 kubenswrapper[7479]: I0308 00:25:44.598183 7479 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ec59eab2cc718c53b1b2d5e8d3d1b9c2d696a3beee682fea6bed575feafd5ae"} err="failed to get container status \"8ec59eab2cc718c53b1b2d5e8d3d1b9c2d696a3beee682fea6bed575feafd5ae\": rpc error: code = NotFound desc = could not find container \"8ec59eab2cc718c53b1b2d5e8d3d1b9c2d696a3beee682fea6bed575feafd5ae\": container with ID starting with 8ec59eab2cc718c53b1b2d5e8d3d1b9c2d696a3beee682fea6bed575feafd5ae not found: ID does not exist"
Mar 08 00:25:44.598231 master-0 kubenswrapper[7479]: I0308 00:25:44.598220 7479 scope.go:117] "RemoveContainer" containerID="3749dd604b27f2dbdd4183a6654ab983fd1314c356306c8758ffbb94949f42f3"
Mar 08 00:25:44.598428 master-0 kubenswrapper[7479]: I0308 00:25:44.598407 7479 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3749dd604b27f2dbdd4183a6654ab983fd1314c356306c8758ffbb94949f42f3"} err="failed to get container status \"3749dd604b27f2dbdd4183a6654ab983fd1314c356306c8758ffbb94949f42f3\": rpc error: code = NotFound desc = could not find container \"3749dd604b27f2dbdd4183a6654ab983fd1314c356306c8758ffbb94949f42f3\": container with ID starting with 3749dd604b27f2dbdd4183a6654ab983fd1314c356306c8758ffbb94949f42f3 not found: ID does not exist"
Mar 08 00:25:44.598428 master-0 kubenswrapper[7479]: I0308 00:25:44.598422 7479 scope.go:117] "RemoveContainer" containerID="155eb1b0e5550c7156a1e22df86f430ed6ad309e3fdf823622a81280cc05efe7"
Mar 08 00:25:44.598622 master-0 kubenswrapper[7479]: I0308 00:25:44.598601 7479 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"155eb1b0e5550c7156a1e22df86f430ed6ad309e3fdf823622a81280cc05efe7"} err="failed to get container status \"155eb1b0e5550c7156a1e22df86f430ed6ad309e3fdf823622a81280cc05efe7\": rpc error: code = NotFound desc = could not find container \"155eb1b0e5550c7156a1e22df86f430ed6ad309e3fdf823622a81280cc05efe7\": container with ID starting with 155eb1b0e5550c7156a1e22df86f430ed6ad309e3fdf823622a81280cc05efe7 not found: ID does not exist"
Mar 08 00:25:44.733718 master-0 kubenswrapper[7479]: I0308 00:25:44.733617 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-8f89dfddd-brq9l"
Mar 08 00:25:44.940472 master-0 kubenswrapper[7479]: I0308 00:25:44.940363 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0" podStartSLOduration=29.940345451 podStartE2EDuration="29.940345451s" podCreationTimestamp="2026-03-08 00:25:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:25:44.938312911 +0000 UTC m=+261.251221828" watchObservedRunningTime="2026-03-08 00:25:44.940345451 +0000 UTC m=+261.253254368"
Mar 08 00:25:45.076034 master-0 kubenswrapper[7479]: I0308 00:25:45.075976 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-vd52m_e97435ee-522e-427d-9efc-40bc3d2b0d02/snapshot-controller/1.log"
Mar 08 00:25:45.076334 master-0 kubenswrapper[7479]: I0308 00:25:45.076277 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-vd52m" event={"ID":"e97435ee-522e-427d-9efc-40bc3d2b0d02","Type":"ContainerStarted","Data":"12285832d9ae011d03a37f69d825d599f3efa2810a8db6a158e7e5aac2654198"}
Mar 08 00:25:45.079745 master-0 kubenswrapper[7479]: I0308 00:25:45.079699 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"21dd42b1-2628-4a24-97e7-6759888ed316","Type":"ContainerStarted","Data":"f70bb9a5f0e3f9b911feb28654c30e151d3e1fb5d9549e6e2016049387b17fb2"}
Mar 08 00:25:45.079822 master-0 kubenswrapper[7479]: I0308 00:25:45.079750 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"21dd42b1-2628-4a24-97e7-6759888ed316","Type":"ContainerStarted","Data":"f81e16a049afccd7df86e2ab910ff92e4bea5bed8e76ac4e62191e1c15f7228a"}
Mar 08 00:25:45.084497 master-0 kubenswrapper[7479]: I0308 00:25:45.084457 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 00:25:45.086283 master-0 kubenswrapper[7479]: I0308 00:25:45.086251 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7c649bf6d4-st2sr_ec2d22f2-c260-42a6-a9da-ee0f44f42303/network-operator/1.log"
Mar 08 00:25:45.089001 master-0 kubenswrapper[7479]: I0308 00:25:45.088966 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 00:25:45.093711 master-0 kubenswrapper[7479]: I0308 00:25:45.090473 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" event={"ID":"cbcb0196-be5c-44a4-9749-5df9fbeaa718","Type":"ContainerStarted","Data":"a89aafabc1e522f342463d98f2fa1cfd6a92e881b88c10677cf22bc178649255"}
Mar 08 00:25:45.093711 master-0 kubenswrapper[7479]: I0308 00:25:45.091264 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs"
Mar 08 00:25:45.094941 master-0 kubenswrapper[7479]: I0308 00:25:45.094908 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs"
Mar 08 00:25:45.096255 master-0 kubenswrapper[7479]: I0308 00:25:45.095655 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-64488f9d78-vnl28_2b1a69b5-c946-495d-ae02-c56f788279e8/openshift-config-operator/2.log"
Mar 08 00:25:45.098513 master-0 kubenswrapper[7479]: I0308 00:25:45.098462 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28"
Mar 08 00:25:45.102833 master-0 kubenswrapper[7479]: I0308 00:25:45.102794 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5884b9cd56-27phk_2fbed2b8-f4c5-4f52-b29c-1907a2034f6f/etcd-operator/1.log"
Mar 08 00:25:45.102915 master-0 kubenswrapper[7479]: I0308 00:25:45.102897 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk" event={"ID":"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f","Type":"ContainerStarted","Data":"94f6cbcf36ce22a8ad98b49d60bec50375421ad5c3b08a57f781b8f9d633b332"}
Mar 08 00:25:45.106392 master-0 kubenswrapper[7479]: I0308 00:25:45.106343 7479 generic.go:334] "Generic (PLEG): container finished" podID="58333089-2456-4a25-8ba7-6d557eefa177" containerID="dc923284309376403cb95e44ae08001b8c778273ed731a0f98310a7899bb3d2d" exitCode=0
Mar 08 00:25:45.106492 master-0 kubenswrapper[7479]: I0308 00:25:45.106406 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4" event={"ID":"58333089-2456-4a25-8ba7-6d557eefa177","Type":"ContainerDied","Data":"dc923284309376403cb95e44ae08001b8c778273ed731a0f98310a7899bb3d2d"}
Mar 08 00:25:45.106492 master-0 kubenswrapper[7479]: I0308 00:25:45.106436 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4" event={"ID":"58333089-2456-4a25-8ba7-6d557eefa177","Type":"ContainerStarted","Data":"a93852bdddf78dff65ecf8b8ffc6457b3e060c3ee09b055521d9a24e262b9408"}
Mar 08 00:25:45.106492 master-0 kubenswrapper[7479]: I0308 00:25:45.106454 7479 scope.go:117] "RemoveContainer" containerID="00aa20318a390dc28a1b90d9dfa760b9b264408ce2a090ec0af81099188274b0"
Mar 08 00:25:45.108062 master-0 kubenswrapper[7479]: I0308 00:25:45.108030 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28"
Mar 08 00:25:45.113446 master-0 kubenswrapper[7479]: I0308 00:25:45.112506 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2" event={"ID":"3fee96d7-75a7-46e4-9707-7bd292f10b84","Type":"ContainerStarted","Data":"c756595c785c16416805ae901384336bd79f4ee2a5921d1dafe30a90cfdb5b66"}
Mar 08 00:25:45.114535 master-0 kubenswrapper[7479]: I0308 00:25:45.114499 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8565d84698-49hzm_ef0a3c84-98bb-4915-9010-d66fcbeafe09/openshift-controller-manager-operator/1.log"
Mar 08 00:25:45.117944 master-0 kubenswrapper[7479]: I0308 00:25:45.117908 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v"
Mar 08 00:25:45.117944 master-0 kubenswrapper[7479]: I0308 00:25:45.117943 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q"
Mar 08 00:25:45.118057 master-0 kubenswrapper[7479]: I0308 00:25:45.117997 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q"
Mar 08 00:25:45.120730 master-0 kubenswrapper[7479]: I0308 00:25:45.120582 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v"
Mar 08 00:25:45.123347 master-0 kubenswrapper[7479]: I0308 00:25:45.123310 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44"
Mar 08 00:25:45.130465 master-0 kubenswrapper[7479]: I0308 00:25:45.130416 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0"
Mar 08 00:25:45.146261 master-0 kubenswrapper[7479]: E0308 00:25:45.145979 7479 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0\" already exists" pod="openshift-etcd/etcd-master-0"
Mar 08 00:25:45.163420 master-0 kubenswrapper[7479]: I0308 00:25:45.163330 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-8f89dfddd-brq9l"]
Mar 08 00:25:45.610810 master-0 kubenswrapper[7479]: I0308 00:25:45.610705 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-5-master-0" podStartSLOduration=197.610683494 podStartE2EDuration="3m17.610683494s" podCreationTimestamp="2026-03-08 00:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:25:45.608057504 +0000 UTC m=+261.920966421" watchObservedRunningTime="2026-03-08 00:25:45.610683494 +0000 UTC m=+261.923592411"
Mar 08 00:25:45.892818 master-0 kubenswrapper[7479]: I0308 00:25:45.891572 7479 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5df57519-dc14-4d18-8c24-cf2e6e122cff" path="/var/lib/kubelet/pods/5df57519-dc14-4d18-8c24-cf2e6e122cff/volumes"
Mar 08 00:25:45.892818 master-0 kubenswrapper[7479]: I0308 00:25:45.891983 7479 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ada20442-bff5-477c-989e-3d921f5ede5e" path="/var/lib/kubelet/pods/ada20442-bff5-477c-989e-3d921f5ede5e/volumes"
Mar 08 00:25:46.126547 master-0 kubenswrapper[7479]: I0308 00:25:46.126490 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-8f89dfddd-brq9l" event={"ID":"614f0a0f-5853-4cf6-bd3d-174141f0f1e2","Type":"ContainerStarted","Data":"a0d7955b7085045599d0a7ea45ff20f907bc225ec27c46ed3dcc33b59207b912"}
Mar 08 00:25:46.831580 master-0 kubenswrapper[7479]: I0308 00:25:46.831516 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ms5vp"]
Mar 08 00:25:46.832108 master-0 kubenswrapper[7479]: I0308 00:25:46.831738 7479
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ms5vp" podUID="668ffbde-4771-43e1-8f0e-d4b5d17ff693" containerName="registry-server" containerID="cri-o://637e374cbb2d700466609d264cbc2ba0c4e3852a252708b6a9d14095bf02d269" gracePeriod=2 Mar 08 00:25:47.709436 master-0 kubenswrapper[7479]: I0308 00:25:47.707353 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6t5lg"] Mar 08 00:25:47.709436 master-0 kubenswrapper[7479]: I0308 00:25:47.708854 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6t5lg" Mar 08 00:25:47.716172 master-0 kubenswrapper[7479]: I0308 00:25:47.716064 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-8z76k" Mar 08 00:25:47.761517 master-0 kubenswrapper[7479]: I0308 00:25:47.761460 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll99v\" (UniqueName: \"kubernetes.io/projected/a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b-kube-api-access-ll99v\") pod \"community-operators-6t5lg\" (UID: \"a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b\") " pod="openshift-marketplace/community-operators-6t5lg" Mar 08 00:25:47.761726 master-0 kubenswrapper[7479]: I0308 00:25:47.761574 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b-catalog-content\") pod \"community-operators-6t5lg\" (UID: \"a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b\") " pod="openshift-marketplace/community-operators-6t5lg" Mar 08 00:25:47.761807 master-0 kubenswrapper[7479]: I0308 00:25:47.761731 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b-utilities\") pod \"community-operators-6t5lg\" (UID: \"a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b\") " pod="openshift-marketplace/community-operators-6t5lg" Mar 08 00:25:47.783607 master-0 kubenswrapper[7479]: I0308 00:25:47.783562 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6t5lg"] Mar 08 00:25:47.885231 master-0 kubenswrapper[7479]: I0308 00:25:47.883038 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll99v\" (UniqueName: \"kubernetes.io/projected/a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b-kube-api-access-ll99v\") pod \"community-operators-6t5lg\" (UID: \"a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b\") " pod="openshift-marketplace/community-operators-6t5lg" Mar 08 00:25:47.885231 master-0 kubenswrapper[7479]: I0308 00:25:47.883107 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b-catalog-content\") pod \"community-operators-6t5lg\" (UID: \"a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b\") " pod="openshift-marketplace/community-operators-6t5lg" Mar 08 00:25:47.885231 master-0 kubenswrapper[7479]: I0308 00:25:47.883301 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b-utilities\") pod \"community-operators-6t5lg\" (UID: \"a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b\") " pod="openshift-marketplace/community-operators-6t5lg" Mar 08 00:25:47.885231 master-0 kubenswrapper[7479]: I0308 00:25:47.883680 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b-utilities\") pod \"community-operators-6t5lg\" (UID: \"a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b\") " 
pod="openshift-marketplace/community-operators-6t5lg" Mar 08 00:25:47.885231 master-0 kubenswrapper[7479]: I0308 00:25:47.883764 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b-catalog-content\") pod \"community-operators-6t5lg\" (UID: \"a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b\") " pod="openshift-marketplace/community-operators-6t5lg" Mar 08 00:25:47.891176 master-0 kubenswrapper[7479]: I0308 00:25:47.891139 7479 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ms5vp" Mar 08 00:25:47.984691 master-0 kubenswrapper[7479]: I0308 00:25:47.984659 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xs7rj\" (UniqueName: \"kubernetes.io/projected/668ffbde-4771-43e1-8f0e-d4b5d17ff693-kube-api-access-xs7rj\") pod \"668ffbde-4771-43e1-8f0e-d4b5d17ff693\" (UID: \"668ffbde-4771-43e1-8f0e-d4b5d17ff693\") " Mar 08 00:25:47.984932 master-0 kubenswrapper[7479]: I0308 00:25:47.984918 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/668ffbde-4771-43e1-8f0e-d4b5d17ff693-utilities\") pod \"668ffbde-4771-43e1-8f0e-d4b5d17ff693\" (UID: \"668ffbde-4771-43e1-8f0e-d4b5d17ff693\") " Mar 08 00:25:47.985071 master-0 kubenswrapper[7479]: I0308 00:25:47.985046 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/668ffbde-4771-43e1-8f0e-d4b5d17ff693-catalog-content\") pod \"668ffbde-4771-43e1-8f0e-d4b5d17ff693\" (UID: \"668ffbde-4771-43e1-8f0e-d4b5d17ff693\") " Mar 08 00:25:47.987050 master-0 kubenswrapper[7479]: I0308 00:25:47.987028 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/668ffbde-4771-43e1-8f0e-d4b5d17ff693-utilities" 
(OuterVolumeSpecName: "utilities") pod "668ffbde-4771-43e1-8f0e-d4b5d17ff693" (UID: "668ffbde-4771-43e1-8f0e-d4b5d17ff693"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:25:47.988620 master-0 kubenswrapper[7479]: I0308 00:25:47.988583 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/668ffbde-4771-43e1-8f0e-d4b5d17ff693-kube-api-access-xs7rj" (OuterVolumeSpecName: "kube-api-access-xs7rj") pod "668ffbde-4771-43e1-8f0e-d4b5d17ff693" (UID: "668ffbde-4771-43e1-8f0e-d4b5d17ff693"). InnerVolumeSpecName "kube-api-access-xs7rj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:25:48.064061 master-0 kubenswrapper[7479]: I0308 00:25:48.063957 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/668ffbde-4771-43e1-8f0e-d4b5d17ff693-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "668ffbde-4771-43e1-8f0e-d4b5d17ff693" (UID: "668ffbde-4771-43e1-8f0e-d4b5d17ff693"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:25:48.084120 master-0 kubenswrapper[7479]: I0308 00:25:48.084075 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll99v\" (UniqueName: \"kubernetes.io/projected/a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b-kube-api-access-ll99v\") pod \"community-operators-6t5lg\" (UID: \"a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b\") " pod="openshift-marketplace/community-operators-6t5lg" Mar 08 00:25:48.093817 master-0 kubenswrapper[7479]: I0308 00:25:48.087880 7479 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/668ffbde-4771-43e1-8f0e-d4b5d17ff693-catalog-content\") on node \"master-0\" DevicePath \"\"" Mar 08 00:25:48.093817 master-0 kubenswrapper[7479]: I0308 00:25:48.087923 7479 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xs7rj\" (UniqueName: \"kubernetes.io/projected/668ffbde-4771-43e1-8f0e-d4b5d17ff693-kube-api-access-xs7rj\") on node \"master-0\" DevicePath \"\"" Mar 08 00:25:48.093817 master-0 kubenswrapper[7479]: I0308 00:25:48.087938 7479 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/668ffbde-4771-43e1-8f0e-d4b5d17ff693-utilities\") on node \"master-0\" DevicePath \"\"" Mar 08 00:25:48.158743 master-0 kubenswrapper[7479]: I0308 00:25:48.158691 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-955fcfb87-rh4g5"] Mar 08 00:25:48.158969 master-0 kubenswrapper[7479]: E0308 00:25:48.158902 7479 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="668ffbde-4771-43e1-8f0e-d4b5d17ff693" containerName="extract-utilities" Mar 08 00:25:48.158969 master-0 kubenswrapper[7479]: I0308 00:25:48.158917 7479 state_mem.go:107] "Deleted CPUSet assignment" podUID="668ffbde-4771-43e1-8f0e-d4b5d17ff693" containerName="extract-utilities" Mar 08 00:25:48.158969 master-0 
kubenswrapper[7479]: E0308 00:25:48.158932 7479 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="668ffbde-4771-43e1-8f0e-d4b5d17ff693" containerName="extract-content" Mar 08 00:25:48.158969 master-0 kubenswrapper[7479]: I0308 00:25:48.158940 7479 state_mem.go:107] "Deleted CPUSet assignment" podUID="668ffbde-4771-43e1-8f0e-d4b5d17ff693" containerName="extract-content" Mar 08 00:25:48.158969 master-0 kubenswrapper[7479]: E0308 00:25:48.158951 7479 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="668ffbde-4771-43e1-8f0e-d4b5d17ff693" containerName="registry-server" Mar 08 00:25:48.158969 master-0 kubenswrapper[7479]: I0308 00:25:48.158961 7479 state_mem.go:107] "Deleted CPUSet assignment" podUID="668ffbde-4771-43e1-8f0e-d4b5d17ff693" containerName="registry-server" Mar 08 00:25:48.159185 master-0 kubenswrapper[7479]: I0308 00:25:48.159081 7479 memory_manager.go:354] "RemoveStaleState removing state" podUID="668ffbde-4771-43e1-8f0e-d4b5d17ff693" containerName="registry-server" Mar 08 00:25:48.160332 master-0 kubenswrapper[7479]: I0308 00:25:48.159901 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-rh4g5" Mar 08 00:25:48.182729 master-0 kubenswrapper[7479]: I0308 00:25:48.178470 7479 generic.go:334] "Generic (PLEG): container finished" podID="668ffbde-4771-43e1-8f0e-d4b5d17ff693" containerID="637e374cbb2d700466609d264cbc2ba0c4e3852a252708b6a9d14095bf02d269" exitCode=0 Mar 08 00:25:48.182729 master-0 kubenswrapper[7479]: I0308 00:25:48.178516 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ms5vp" event={"ID":"668ffbde-4771-43e1-8f0e-d4b5d17ff693","Type":"ContainerDied","Data":"637e374cbb2d700466609d264cbc2ba0c4e3852a252708b6a9d14095bf02d269"} Mar 08 00:25:48.182729 master-0 kubenswrapper[7479]: I0308 00:25:48.178543 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ms5vp" event={"ID":"668ffbde-4771-43e1-8f0e-d4b5d17ff693","Type":"ContainerDied","Data":"053ec9ee75c18a0fbe26d2f98131f6f6b38d1545596ef812b5dd85b824a65cfd"} Mar 08 00:25:48.182729 master-0 kubenswrapper[7479]: I0308 00:25:48.178561 7479 scope.go:117] "RemoveContainer" containerID="637e374cbb2d700466609d264cbc2ba0c4e3852a252708b6a9d14095bf02d269" Mar 08 00:25:48.182729 master-0 kubenswrapper[7479]: I0308 00:25:48.178700 7479 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ms5vp" Mar 08 00:25:48.183860 master-0 kubenswrapper[7479]: W0308 00:25:48.183838 7479 reflector.go:561] object-"openshift-cluster-machine-approver"/"machine-approver-tls": failed to list *v1.Secret: secrets "machine-approver-tls" is forbidden: User "system:node:master-0" cannot list resource "secrets" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'master-0' and this object Mar 08 00:25:48.184076 master-0 kubenswrapper[7479]: W0308 00:25:48.184043 7479 reflector.go:561] object-"openshift-cluster-machine-approver"/"machine-approver-config": failed to list *v1.ConfigMap: configmaps "machine-approver-config" is forbidden: User "system:node:master-0" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'master-0' and this object Mar 08 00:25:48.184141 master-0 kubenswrapper[7479]: E0308 00:25:48.184092 7479 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"machine-approver-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"machine-approver-config\" is forbidden: User \"system:node:master-0\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'master-0' and this object" logger="UnhandledError" Mar 08 00:25:48.184222 master-0 kubenswrapper[7479]: E0308 00:25:48.184063 7479 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"machine-approver-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-approver-tls\" is forbidden: User \"system:node:master-0\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'master-0' and this object" 
logger="UnhandledError" Mar 08 00:25:48.184282 master-0 kubenswrapper[7479]: W0308 00:25:48.183993 7479 reflector.go:561] object-"openshift-cluster-machine-approver"/"kube-rbac-proxy": failed to list *v1.ConfigMap: configmaps "kube-rbac-proxy" is forbidden: User "system:node:master-0" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'master-0' and this object Mar 08 00:25:48.184360 master-0 kubenswrapper[7479]: E0308 00:25:48.184346 7479 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"kube-rbac-proxy\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-rbac-proxy\" is forbidden: User \"system:node:master-0\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'master-0' and this object" logger="UnhandledError" Mar 08 00:25:48.184465 master-0 kubenswrapper[7479]: W0308 00:25:48.184453 7479 reflector.go:561] object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-njqpw": failed to list *v1.Secret: secrets "machine-approver-sa-dockercfg-njqpw" is forbidden: User "system:node:master-0" cannot list resource "secrets" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'master-0' and this object Mar 08 00:25:48.184542 master-0 kubenswrapper[7479]: E0308 00:25:48.184524 7479 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"machine-approver-sa-dockercfg-njqpw\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-approver-sa-dockercfg-njqpw\" is forbidden: User \"system:node:master-0\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'master-0' and this object" logger="UnhandledError" Mar 08 
00:25:48.184969 master-0 kubenswrapper[7479]: W0308 00:25:48.184604 7479 reflector.go:561] object-"openshift-cluster-machine-approver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:master-0" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'master-0' and this object Mar 08 00:25:48.185084 master-0 kubenswrapper[7479]: E0308 00:25:48.185065 7479 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:master-0\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'master-0' and this object" logger="UnhandledError" Mar 08 00:25:48.185425 master-0 kubenswrapper[7479]: W0308 00:25:48.185406 7479 reflector.go:561] object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:master-0" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'master-0' and this object Mar 08 00:25:48.185526 master-0 kubenswrapper[7479]: E0308 00:25:48.185505 7479 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:master-0\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'master-0' and this object" logger="UnhandledError" Mar 08 00:25:48.213338 master-0 kubenswrapper[7479]: I0308 
00:25:48.212734 7479 scope.go:117] "RemoveContainer" containerID="83487d45dedbfe9b5fe7bba1c70e4990a428d9f9c9fb3cb86a8a3aa56bb1ac0b" Mar 08 00:25:48.228652 master-0 kubenswrapper[7479]: I0308 00:25:48.220243 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-8lf4q"] Mar 08 00:25:48.228652 master-0 kubenswrapper[7479]: I0308 00:25:48.220880 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-8lf4q" Mar 08 00:25:48.228842 master-0 kubenswrapper[7479]: I0308 00:25:48.228809 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-94mhc" Mar 08 00:25:48.229023 master-0 kubenswrapper[7479]: I0308 00:25:48.229002 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 08 00:25:48.229150 master-0 kubenswrapper[7479]: I0308 00:25:48.229129 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 08 00:25:48.229302 master-0 kubenswrapper[7479]: I0308 00:25:48.229124 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 08 00:25:48.262232 master-0 kubenswrapper[7479]: I0308 00:25:48.253902 7479 scope.go:117] "RemoveContainer" containerID="e639c8e4390f3d9bd210ac1cb787b51a40fdf81916ce52641e0a03a8306a158e" Mar 08 00:25:48.303187 master-0 kubenswrapper[7479]: I0308 00:25:48.296963 7479 scope.go:117] "RemoveContainer" containerID="637e374cbb2d700466609d264cbc2ba0c4e3852a252708b6a9d14095bf02d269" Mar 08 00:25:48.303187 master-0 kubenswrapper[7479]: I0308 00:25:48.300742 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d0607e66-703f-4dc9-8aee-bb36c7e0a7b4-config\") pod \"machine-approver-955fcfb87-rh4g5\" (UID: \"d0607e66-703f-4dc9-8aee-bb36c7e0a7b4\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-rh4g5" Mar 08 00:25:48.303187 master-0 kubenswrapper[7479]: I0308 00:25:48.300777 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smnrc\" (UniqueName: \"kubernetes.io/projected/e3f42081-387d-4798-b981-ac232e851bb4-kube-api-access-smnrc\") pod \"cluster-samples-operator-664cb58b85-8lf4q\" (UID: \"e3f42081-387d-4798-b981-ac232e851bb4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-8lf4q" Mar 08 00:25:48.303187 master-0 kubenswrapper[7479]: I0308 00:25:48.300818 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e3f42081-387d-4798-b981-ac232e851bb4-samples-operator-tls\") pod \"cluster-samples-operator-664cb58b85-8lf4q\" (UID: \"e3f42081-387d-4798-b981-ac232e851bb4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-8lf4q" Mar 08 00:25:48.303187 master-0 kubenswrapper[7479]: I0308 00:25:48.300842 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d0607e66-703f-4dc9-8aee-bb36c7e0a7b4-auth-proxy-config\") pod \"machine-approver-955fcfb87-rh4g5\" (UID: \"d0607e66-703f-4dc9-8aee-bb36c7e0a7b4\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-rh4g5" Mar 08 00:25:48.303187 master-0 kubenswrapper[7479]: I0308 00:25:48.300863 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8lfn\" (UniqueName: \"kubernetes.io/projected/d0607e66-703f-4dc9-8aee-bb36c7e0a7b4-kube-api-access-s8lfn\") pod 
\"machine-approver-955fcfb87-rh4g5\" (UID: \"d0607e66-703f-4dc9-8aee-bb36c7e0a7b4\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-rh4g5" Mar 08 00:25:48.303187 master-0 kubenswrapper[7479]: I0308 00:25:48.300882 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d0607e66-703f-4dc9-8aee-bb36c7e0a7b4-machine-approver-tls\") pod \"machine-approver-955fcfb87-rh4g5\" (UID: \"d0607e66-703f-4dc9-8aee-bb36c7e0a7b4\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-rh4g5" Mar 08 00:25:48.306831 master-0 kubenswrapper[7479]: I0308 00:25:48.306550 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-8lf4q"] Mar 08 00:25:48.315591 master-0 kubenswrapper[7479]: E0308 00:25:48.315489 7479 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"637e374cbb2d700466609d264cbc2ba0c4e3852a252708b6a9d14095bf02d269\": container with ID starting with 637e374cbb2d700466609d264cbc2ba0c4e3852a252708b6a9d14095bf02d269 not found: ID does not exist" containerID="637e374cbb2d700466609d264cbc2ba0c4e3852a252708b6a9d14095bf02d269" Mar 08 00:25:48.315591 master-0 kubenswrapper[7479]: I0308 00:25:48.315539 7479 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"637e374cbb2d700466609d264cbc2ba0c4e3852a252708b6a9d14095bf02d269"} err="failed to get container status \"637e374cbb2d700466609d264cbc2ba0c4e3852a252708b6a9d14095bf02d269\": rpc error: code = NotFound desc = could not find container \"637e374cbb2d700466609d264cbc2ba0c4e3852a252708b6a9d14095bf02d269\": container with ID starting with 637e374cbb2d700466609d264cbc2ba0c4e3852a252708b6a9d14095bf02d269 not found: ID does not exist" Mar 08 00:25:48.315591 master-0 kubenswrapper[7479]: I0308 00:25:48.315564 7479 
scope.go:117] "RemoveContainer" containerID="83487d45dedbfe9b5fe7bba1c70e4990a428d9f9c9fb3cb86a8a3aa56bb1ac0b" Mar 08 00:25:48.317415 master-0 kubenswrapper[7479]: E0308 00:25:48.317375 7479 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83487d45dedbfe9b5fe7bba1c70e4990a428d9f9c9fb3cb86a8a3aa56bb1ac0b\": container with ID starting with 83487d45dedbfe9b5fe7bba1c70e4990a428d9f9c9fb3cb86a8a3aa56bb1ac0b not found: ID does not exist" containerID="83487d45dedbfe9b5fe7bba1c70e4990a428d9f9c9fb3cb86a8a3aa56bb1ac0b" Mar 08 00:25:48.317473 master-0 kubenswrapper[7479]: I0308 00:25:48.317412 7479 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83487d45dedbfe9b5fe7bba1c70e4990a428d9f9c9fb3cb86a8a3aa56bb1ac0b"} err="failed to get container status \"83487d45dedbfe9b5fe7bba1c70e4990a428d9f9c9fb3cb86a8a3aa56bb1ac0b\": rpc error: code = NotFound desc = could not find container \"83487d45dedbfe9b5fe7bba1c70e4990a428d9f9c9fb3cb86a8a3aa56bb1ac0b\": container with ID starting with 83487d45dedbfe9b5fe7bba1c70e4990a428d9f9c9fb3cb86a8a3aa56bb1ac0b not found: ID does not exist" Mar 08 00:25:48.317473 master-0 kubenswrapper[7479]: I0308 00:25:48.317435 7479 scope.go:117] "RemoveContainer" containerID="e639c8e4390f3d9bd210ac1cb787b51a40fdf81916ce52641e0a03a8306a158e" Mar 08 00:25:48.327744 master-0 kubenswrapper[7479]: E0308 00:25:48.327647 7479 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e639c8e4390f3d9bd210ac1cb787b51a40fdf81916ce52641e0a03a8306a158e\": container with ID starting with e639c8e4390f3d9bd210ac1cb787b51a40fdf81916ce52641e0a03a8306a158e not found: ID does not exist" containerID="e639c8e4390f3d9bd210ac1cb787b51a40fdf81916ce52641e0a03a8306a158e" Mar 08 00:25:48.327744 master-0 kubenswrapper[7479]: I0308 00:25:48.327697 7479 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"e639c8e4390f3d9bd210ac1cb787b51a40fdf81916ce52641e0a03a8306a158e"} err="failed to get container status \"e639c8e4390f3d9bd210ac1cb787b51a40fdf81916ce52641e0a03a8306a158e\": rpc error: code = NotFound desc = could not find container \"e639c8e4390f3d9bd210ac1cb787b51a40fdf81916ce52641e0a03a8306a158e\": container with ID starting with e639c8e4390f3d9bd210ac1cb787b51a40fdf81916ce52641e0a03a8306a158e not found: ID does not exist"
Mar 08 00:25:48.332587 master-0 kubenswrapper[7479]: I0308 00:25:48.332544 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6t5lg"
Mar 08 00:25:48.357985 master-0 kubenswrapper[7479]: I0308 00:25:48.354510 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-6686554ddc-8krst"]
Mar 08 00:25:48.357985 master-0 kubenswrapper[7479]: I0308 00:25:48.355074 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-8krst"
Mar 08 00:25:48.383682 master-0 kubenswrapper[7479]: I0308 00:25:48.383574 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 08 00:25:48.383856 master-0 kubenswrapper[7479]: I0308 00:25:48.383841 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 08 00:25:48.385435 master-0 kubenswrapper[7479]: I0308 00:25:48.385181 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 08 00:25:48.385435 master-0 kubenswrapper[7479]: I0308 00:25:48.385383 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-gqbjw"
Mar 08 00:25:48.407871 master-0 kubenswrapper[7479]: I0308 00:25:48.407830 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d0607e66-703f-4dc9-8aee-bb36c7e0a7b4-machine-approver-tls\") pod \"machine-approver-955fcfb87-rh4g5\" (UID: \"d0607e66-703f-4dc9-8aee-bb36c7e0a7b4\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-rh4g5"
Mar 08 00:25:48.408018 master-0 kubenswrapper[7479]: I0308 00:25:48.407900 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0607e66-703f-4dc9-8aee-bb36c7e0a7b4-config\") pod \"machine-approver-955fcfb87-rh4g5\" (UID: \"d0607e66-703f-4dc9-8aee-bb36c7e0a7b4\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-rh4g5"
Mar 08 00:25:48.408018 master-0 kubenswrapper[7479]: I0308 00:25:48.407925 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smnrc\" (UniqueName: \"kubernetes.io/projected/e3f42081-387d-4798-b981-ac232e851bb4-kube-api-access-smnrc\") pod \"cluster-samples-operator-664cb58b85-8lf4q\" (UID: \"e3f42081-387d-4798-b981-ac232e851bb4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-8lf4q"
Mar 08 00:25:48.408018 master-0 kubenswrapper[7479]: I0308 00:25:48.407962 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e3f42081-387d-4798-b981-ac232e851bb4-samples-operator-tls\") pod \"cluster-samples-operator-664cb58b85-8lf4q\" (UID: \"e3f42081-387d-4798-b981-ac232e851bb4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-8lf4q"
Mar 08 00:25:48.408018 master-0 kubenswrapper[7479]: I0308 00:25:48.407984 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d0607e66-703f-4dc9-8aee-bb36c7e0a7b4-auth-proxy-config\") pod \"machine-approver-955fcfb87-rh4g5\" (UID: \"d0607e66-703f-4dc9-8aee-bb36c7e0a7b4\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-rh4g5"
Mar 08 00:25:48.408018 master-0 kubenswrapper[7479]: I0308 00:25:48.408006 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8lfn\" (UniqueName: \"kubernetes.io/projected/d0607e66-703f-4dc9-8aee-bb36c7e0a7b4-kube-api-access-s8lfn\") pod \"machine-approver-955fcfb87-rh4g5\" (UID: \"d0607e66-703f-4dc9-8aee-bb36c7e0a7b4\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-rh4g5"
Mar 08 00:25:48.408837 master-0 kubenswrapper[7479]: I0308 00:25:48.408802 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ms5vp"]
Mar 08 00:25:48.412710 master-0 kubenswrapper[7479]: I0308 00:25:48.412684 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-6686554ddc-8krst"]
Mar 08 00:25:48.413013 master-0 kubenswrapper[7479]: I0308 00:25:48.412982 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e3f42081-387d-4798-b981-ac232e851bb4-samples-operator-tls\") pod \"cluster-samples-operator-664cb58b85-8lf4q\" (UID: \"e3f42081-387d-4798-b981-ac232e851bb4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-8lf4q"
Mar 08 00:25:48.443925 master-0 kubenswrapper[7479]: I0308 00:25:48.442431 7479 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ms5vp"]
Mar 08 00:25:48.452100 master-0 kubenswrapper[7479]: I0308 00:25:48.452056 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-8lgqf"]
Mar 08 00:25:48.452977 master-0 kubenswrapper[7479]: I0308 00:25:48.452948 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-8lgqf"
Mar 08 00:25:48.455921 master-0 kubenswrapper[7479]: I0308 00:25:48.455885 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-fp767"
Mar 08 00:25:48.460542 master-0 kubenswrapper[7479]: I0308 00:25:48.456224 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt"
Mar 08 00:25:48.460542 master-0 kubenswrapper[7479]: I0308 00:25:48.456405 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls"
Mar 08 00:25:48.466457 master-0 kubenswrapper[7479]: I0308 00:25:48.464370 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy"
Mar 08 00:25:48.466457 master-0 kubenswrapper[7479]: I0308 00:25:48.464565 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt"
Mar 08 00:25:48.466457 master-0 kubenswrapper[7479]: I0308 00:25:48.464683 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images"
Mar 08 00:25:48.478216 master-0 kubenswrapper[7479]: I0308 00:25:48.477703 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smnrc\" (UniqueName: \"kubernetes.io/projected/e3f42081-387d-4798-b981-ac232e851bb4-kube-api-access-smnrc\") pod \"cluster-samples-operator-664cb58b85-8lf4q\" (UID: \"e3f42081-387d-4798-b981-ac232e851bb4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-8lf4q"
Mar 08 00:25:48.509270 master-0 kubenswrapper[7479]: I0308 00:25:48.508707 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj8sl\" (UniqueName: \"kubernetes.io/projected/460f09d8-a143-48d2-9db0-be247386984a-kube-api-access-vj8sl\") pod \"control-plane-machine-set-operator-6686554ddc-8krst\" (UID: \"460f09d8-a143-48d2-9db0-be247386984a\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-8krst"
Mar 08 00:25:48.509270 master-0 kubenswrapper[7479]: I0308 00:25:48.508775 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/460f09d8-a143-48d2-9db0-be247386984a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6686554ddc-8krst\" (UID: \"460f09d8-a143-48d2-9db0-be247386984a\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-8krst"
Mar 08 00:25:48.568280 master-0 kubenswrapper[7479]: I0308 00:25:48.566099 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-69576476f7-dpg4q"]
Mar 08 00:25:48.568280 master-0 kubenswrapper[7479]: I0308 00:25:48.566929 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dpg4q"
Mar 08 00:25:48.570257 master-0 kubenswrapper[7479]: I0308 00:25:48.569085 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-g5h9b"
Mar 08 00:25:48.574794 master-0 kubenswrapper[7479]: I0308 00:25:48.572965 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator"
Mar 08 00:25:48.574794 master-0 kubenswrapper[7479]: I0308 00:25:48.573092 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert"
Mar 08 00:25:48.593402 master-0 kubenswrapper[7479]: I0308 00:25:48.590659 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-8lf4q"
Mar 08 00:25:48.593402 master-0 kubenswrapper[7479]: I0308 00:25:48.593052 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-69576476f7-dpg4q"]
Mar 08 00:25:48.619467 master-0 kubenswrapper[7479]: I0308 00:25:48.619329 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/3b9823a9-2491-44b5-8bf2-22352558a2a3-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-559568b945-8lgqf\" (UID: \"3b9823a9-2491-44b5-8bf2-22352558a2a3\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-8lgqf"
Mar 08 00:25:48.619467 master-0 kubenswrapper[7479]: I0308 00:25:48.619400 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/3b9823a9-2491-44b5-8bf2-22352558a2a3-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-559568b945-8lgqf\" (UID: \"3b9823a9-2491-44b5-8bf2-22352558a2a3\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-8lgqf"
Mar 08 00:25:48.619467 master-0 kubenswrapper[7479]: I0308 00:25:48.619423 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3b9823a9-2491-44b5-8bf2-22352558a2a3-images\") pod \"cluster-cloud-controller-manager-operator-559568b945-8lgqf\" (UID: \"3b9823a9-2491-44b5-8bf2-22352558a2a3\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-8lgqf"
Mar 08 00:25:48.619467 master-0 kubenswrapper[7479]: I0308 00:25:48.619445 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vj8sl\" (UniqueName: \"kubernetes.io/projected/460f09d8-a143-48d2-9db0-be247386984a-kube-api-access-vj8sl\") pod \"control-plane-machine-set-operator-6686554ddc-8krst\" (UID: \"460f09d8-a143-48d2-9db0-be247386984a\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-8krst"
Mar 08 00:25:48.619467 master-0 kubenswrapper[7479]: I0308 00:25:48.619469 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hksvd\" (UniqueName: \"kubernetes.io/projected/3b9823a9-2491-44b5-8bf2-22352558a2a3-kube-api-access-hksvd\") pod \"cluster-cloud-controller-manager-operator-559568b945-8lgqf\" (UID: \"3b9823a9-2491-44b5-8bf2-22352558a2a3\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-8lgqf"
Mar 08 00:25:48.619783 master-0 kubenswrapper[7479]: I0308 00:25:48.619488 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3b9823a9-2491-44b5-8bf2-22352558a2a3-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-559568b945-8lgqf\" (UID: \"3b9823a9-2491-44b5-8bf2-22352558a2a3\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-8lgqf"
Mar 08 00:25:48.619783 master-0 kubenswrapper[7479]: I0308 00:25:48.619515 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/460f09d8-a143-48d2-9db0-be247386984a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6686554ddc-8krst\" (UID: \"460f09d8-a143-48d2-9db0-be247386984a\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-8krst"
Mar 08 00:25:48.628881 master-0 kubenswrapper[7479]: I0308 00:25:48.628826 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/460f09d8-a143-48d2-9db0-be247386984a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6686554ddc-8krst\" (UID: \"460f09d8-a143-48d2-9db0-be247386984a\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-8krst"
Mar 08 00:25:48.652880 master-0 kubenswrapper[7479]: I0308 00:25:48.649269 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj8sl\" (UniqueName: \"kubernetes.io/projected/460f09d8-a143-48d2-9db0-be247386984a-kube-api-access-vj8sl\") pod \"control-plane-machine-set-operator-6686554ddc-8krst\" (UID: \"460f09d8-a143-48d2-9db0-be247386984a\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-8krst"
Mar 08 00:25:48.667233 master-0 kubenswrapper[7479]: I0308 00:25:48.657996 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mr22p"]
Mar 08 00:25:48.667233 master-0 kubenswrapper[7479]: I0308 00:25:48.659434 7479 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mr22p" podUID="07f9c188-df80-4606-9a21-72228cffa706" containerName="registry-server" containerID="cri-o://0a1a00b75e133f489a7a0acfadc6ee256a844ce0902dd263ed7d688506f410c0" gracePeriod=2
Mar 08 00:25:48.703279 master-0 kubenswrapper[7479]: I0308 00:25:48.703178 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6t5lg"]
Mar 08 00:25:48.708087 master-0 kubenswrapper[7479]: I0308 00:25:48.707853 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-8krst"
Mar 08 00:25:48.725226 master-0 kubenswrapper[7479]: I0308 00:25:48.722477 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3d2e1686-3a30-4021-9c03-02e472bc6ff3-auth-proxy-config\") pod \"cluster-autoscaler-operator-69576476f7-dpg4q\" (UID: \"3d2e1686-3a30-4021-9c03-02e472bc6ff3\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dpg4q"
Mar 08 00:25:48.725226 master-0 kubenswrapper[7479]: I0308 00:25:48.722569 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/3b9823a9-2491-44b5-8bf2-22352558a2a3-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-559568b945-8lgqf\" (UID: \"3b9823a9-2491-44b5-8bf2-22352558a2a3\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-8lgqf"
Mar 08 00:25:48.725226 master-0 kubenswrapper[7479]: I0308 00:25:48.722617 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3b9823a9-2491-44b5-8bf2-22352558a2a3-images\") pod \"cluster-cloud-controller-manager-operator-559568b945-8lgqf\" (UID: \"3b9823a9-2491-44b5-8bf2-22352558a2a3\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-8lgqf"
Mar 08 00:25:48.725226 master-0 kubenswrapper[7479]: I0308 00:25:48.722647 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hksvd\" (UniqueName: \"kubernetes.io/projected/3b9823a9-2491-44b5-8bf2-22352558a2a3-kube-api-access-hksvd\") pod \"cluster-cloud-controller-manager-operator-559568b945-8lgqf\" (UID: \"3b9823a9-2491-44b5-8bf2-22352558a2a3\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-8lgqf"
Mar 08 00:25:48.725226 master-0 kubenswrapper[7479]: I0308 00:25:48.722690 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3b9823a9-2491-44b5-8bf2-22352558a2a3-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-559568b945-8lgqf\" (UID: \"3b9823a9-2491-44b5-8bf2-22352558a2a3\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-8lgqf"
Mar 08 00:25:48.725226 master-0 kubenswrapper[7479]: I0308 00:25:48.722711 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv5kd\" (UniqueName: \"kubernetes.io/projected/3d2e1686-3a30-4021-9c03-02e472bc6ff3-kube-api-access-qv5kd\") pod \"cluster-autoscaler-operator-69576476f7-dpg4q\" (UID: \"3d2e1686-3a30-4021-9c03-02e472bc6ff3\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dpg4q"
Mar 08 00:25:48.725226 master-0 kubenswrapper[7479]: I0308 00:25:48.723449 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d2e1686-3a30-4021-9c03-02e472bc6ff3-cert\") pod \"cluster-autoscaler-operator-69576476f7-dpg4q\" (UID: \"3d2e1686-3a30-4021-9c03-02e472bc6ff3\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dpg4q"
Mar 08 00:25:48.725226 master-0 kubenswrapper[7479]: I0308 00:25:48.723601 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/3b9823a9-2491-44b5-8bf2-22352558a2a3-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-559568b945-8lgqf\" (UID: \"3b9823a9-2491-44b5-8bf2-22352558a2a3\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-8lgqf"
Mar 08 00:25:48.725226 master-0 kubenswrapper[7479]: I0308 00:25:48.724680 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/3b9823a9-2491-44b5-8bf2-22352558a2a3-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-559568b945-8lgqf\" (UID: \"3b9823a9-2491-44b5-8bf2-22352558a2a3\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-8lgqf"
Mar 08 00:25:48.725724 master-0 kubenswrapper[7479]: I0308 00:25:48.725296 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3b9823a9-2491-44b5-8bf2-22352558a2a3-images\") pod \"cluster-cloud-controller-manager-operator-559568b945-8lgqf\" (UID: \"3b9823a9-2491-44b5-8bf2-22352558a2a3\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-8lgqf"
Mar 08 00:25:48.725953 master-0 kubenswrapper[7479]: I0308 00:25:48.725880 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3b9823a9-2491-44b5-8bf2-22352558a2a3-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-559568b945-8lgqf\" (UID: \"3b9823a9-2491-44b5-8bf2-22352558a2a3\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-8lgqf"
Mar 08 00:25:48.732191 master-0 kubenswrapper[7479]: I0308 00:25:48.731194 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/3b9823a9-2491-44b5-8bf2-22352558a2a3-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-559568b945-8lgqf\" (UID: \"3b9823a9-2491-44b5-8bf2-22352558a2a3\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-8lgqf"
Mar 08 00:25:48.756479 master-0 kubenswrapper[7479]: I0308 00:25:48.756437 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hksvd\" (UniqueName: \"kubernetes.io/projected/3b9823a9-2491-44b5-8bf2-22352558a2a3-kube-api-access-hksvd\") pod \"cluster-cloud-controller-manager-operator-559568b945-8lgqf\" (UID: \"3b9823a9-2491-44b5-8bf2-22352558a2a3\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-8lgqf"
Mar 08 00:25:48.817242 master-0 kubenswrapper[7479]: I0308 00:25:48.816852 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-8lgqf"
Mar 08 00:25:48.825409 master-0 kubenswrapper[7479]: I0308 00:25:48.824721 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3d2e1686-3a30-4021-9c03-02e472bc6ff3-auth-proxy-config\") pod \"cluster-autoscaler-operator-69576476f7-dpg4q\" (UID: \"3d2e1686-3a30-4021-9c03-02e472bc6ff3\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dpg4q"
Mar 08 00:25:48.825409 master-0 kubenswrapper[7479]: I0308 00:25:48.824796 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv5kd\" (UniqueName: \"kubernetes.io/projected/3d2e1686-3a30-4021-9c03-02e472bc6ff3-kube-api-access-qv5kd\") pod \"cluster-autoscaler-operator-69576476f7-dpg4q\" (UID: \"3d2e1686-3a30-4021-9c03-02e472bc6ff3\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dpg4q"
Mar 08 00:25:48.825409 master-0 kubenswrapper[7479]: I0308 00:25:48.824834 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d2e1686-3a30-4021-9c03-02e472bc6ff3-cert\") pod \"cluster-autoscaler-operator-69576476f7-dpg4q\" (UID: \"3d2e1686-3a30-4021-9c03-02e472bc6ff3\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dpg4q"
Mar 08 00:25:48.831470 master-0 kubenswrapper[7479]: I0308 00:25:48.829747 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3d2e1686-3a30-4021-9c03-02e472bc6ff3-auth-proxy-config\") pod \"cluster-autoscaler-operator-69576476f7-dpg4q\" (UID: \"3d2e1686-3a30-4021-9c03-02e472bc6ff3\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dpg4q"
Mar 08 00:25:48.834796 master-0 kubenswrapper[7479]: I0308 00:25:48.834732 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d2e1686-3a30-4021-9c03-02e472bc6ff3-cert\") pod \"cluster-autoscaler-operator-69576476f7-dpg4q\" (UID: \"3d2e1686-3a30-4021-9c03-02e472bc6ff3\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dpg4q"
Mar 08 00:25:48.859682 master-0 kubenswrapper[7479]: I0308 00:25:48.859639 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv5kd\" (UniqueName: \"kubernetes.io/projected/3d2e1686-3a30-4021-9c03-02e472bc6ff3-kube-api-access-qv5kd\") pod \"cluster-autoscaler-operator-69576476f7-dpg4q\" (UID: \"3d2e1686-3a30-4021-9c03-02e472bc6ff3\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dpg4q"
Mar 08 00:25:48.976885 master-0 kubenswrapper[7479]: I0308 00:25:48.972425 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dpg4q"
Mar 08 00:25:48.989072 master-0 kubenswrapper[7479]: I0308 00:25:48.988997 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-nrb7q"]
Mar 08 00:25:48.990630 master-0 kubenswrapper[7479]: I0308 00:25:48.990599 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-nrb7q"
Mar 08 00:25:48.996699 master-0 kubenswrapper[7479]: I0308 00:25:48.993516 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert"
Mar 08 00:25:48.996699 master-0 kubenswrapper[7479]: I0308 00:25:48.994106 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca"
Mar 08 00:25:48.996699 master-0 kubenswrapper[7479]: I0308 00:25:48.994387 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-b4z2l"
Mar 08 00:25:48.996699 master-0 kubenswrapper[7479]: I0308 00:25:48.994543 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt"
Mar 08 00:25:49.010379 master-0 kubenswrapper[7479]: I0308 00:25:49.007573 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt"
Mar 08 00:25:49.011553 master-0 kubenswrapper[7479]: I0308 00:25:49.011088 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-nrb7q"]
Mar 08 00:25:49.055343 master-0 kubenswrapper[7479]: I0308 00:25:49.055293 7479 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mr22p"
Mar 08 00:25:49.061285 master-0 kubenswrapper[7479]: I0308 00:25:49.059088 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-8lf4q"]
Mar 08 00:25:49.061527 master-0 kubenswrapper[7479]: I0308 00:25:49.061324 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9j9zs"]
Mar 08 00:25:49.061770 master-0 kubenswrapper[7479]: E0308 00:25:49.061740 7479 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07f9c188-df80-4606-9a21-72228cffa706" containerName="extract-content"
Mar 08 00:25:49.061770 master-0 kubenswrapper[7479]: I0308 00:25:49.061761 7479 state_mem.go:107] "Deleted CPUSet assignment" podUID="07f9c188-df80-4606-9a21-72228cffa706" containerName="extract-content"
Mar 08 00:25:49.061981 master-0 kubenswrapper[7479]: E0308 00:25:49.061776 7479 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07f9c188-df80-4606-9a21-72228cffa706" containerName="extract-utilities"
Mar 08 00:25:49.061981 master-0 kubenswrapper[7479]: I0308 00:25:49.061977 7479 state_mem.go:107] "Deleted CPUSet assignment" podUID="07f9c188-df80-4606-9a21-72228cffa706" containerName="extract-utilities"
Mar 08 00:25:49.062035 master-0 kubenswrapper[7479]: E0308 00:25:49.061994 7479 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07f9c188-df80-4606-9a21-72228cffa706" containerName="registry-server"
Mar 08 00:25:49.062035 master-0 kubenswrapper[7479]: I0308 00:25:49.062001 7479 state_mem.go:107] "Deleted CPUSet assignment" podUID="07f9c188-df80-4606-9a21-72228cffa706" containerName="registry-server"
Mar 08 00:25:49.062172 master-0 kubenswrapper[7479]: I0308 00:25:49.062148 7479 memory_manager.go:354] "RemoveStaleState removing state" podUID="07f9c188-df80-4606-9a21-72228cffa706" containerName="registry-server"
Mar 08 00:25:49.067848 master-0 kubenswrapper[7479]: I0308 00:25:49.067192 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6"]
Mar 08 00:25:49.067848 master-0 kubenswrapper[7479]: I0308 00:25:49.067763 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6"
Mar 08 00:25:49.068151 master-0 kubenswrapper[7479]: I0308 00:25:49.068123 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9j9zs"
Mar 08 00:25:49.075215 master-0 kubenswrapper[7479]: I0308 00:25:49.073190 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-jt6pk"
Mar 08 00:25:49.075215 master-0 kubenswrapper[7479]: I0308 00:25:49.073431 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls"
Mar 08 00:25:49.075215 master-0 kubenswrapper[7479]: I0308 00:25:49.073534 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images"
Mar 08 00:25:49.075215 master-0 kubenswrapper[7479]: I0308 00:25:49.073631 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert"
Mar 08 00:25:49.075215 master-0 kubenswrapper[7479]: I0308 00:25:49.073754 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-dockercfg-nppj6"
Mar 08 00:25:49.075215 master-0 kubenswrapper[7479]: I0308 00:25:49.073859 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy"
Mar 08 00:25:49.095694 master-0 kubenswrapper[7479]: I0308 00:25:49.091050 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6"]
Mar 08 00:25:49.118360 master-0 kubenswrapper[7479]: I0308 00:25:49.111608 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9j9zs"]
Mar 08 00:25:49.132168 master-0 kubenswrapper[7479]: I0308 00:25:49.132122 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07f9c188-df80-4606-9a21-72228cffa706-catalog-content\") pod \"07f9c188-df80-4606-9a21-72228cffa706\" (UID: \"07f9c188-df80-4606-9a21-72228cffa706\") "
Mar 08 00:25:49.132321 master-0 kubenswrapper[7479]: I0308 00:25:49.132176 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t44t4\" (UniqueName: \"kubernetes.io/projected/07f9c188-df80-4606-9a21-72228cffa706-kube-api-access-t44t4\") pod \"07f9c188-df80-4606-9a21-72228cffa706\" (UID: \"07f9c188-df80-4606-9a21-72228cffa706\") "
Mar 08 00:25:49.132352 master-0 kubenswrapper[7479]: I0308 00:25:49.132330 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07f9c188-df80-4606-9a21-72228cffa706-utilities\") pod \"07f9c188-df80-4606-9a21-72228cffa706\" (UID: \"07f9c188-df80-4606-9a21-72228cffa706\") "
Mar 08 00:25:49.132543 master-0 kubenswrapper[7479]: I0308 00:25:49.132515 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e78057cd-5120-4a12-934d-9fed51e1bdc0-cco-trusted-ca\") pod \"cloud-credential-operator-55d85b7b47-nrb7q\" (UID: \"e78057cd-5120-4a12-934d-9fed51e1bdc0\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-nrb7q"
Mar 08 00:25:49.132586 master-0 kubenswrapper[7479]: I0308 00:25:49.132561 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/e78057cd-5120-4a12-934d-9fed51e1bdc0-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-55d85b7b47-nrb7q\" (UID: \"e78057cd-5120-4a12-934d-9fed51e1bdc0\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-nrb7q"
Mar 08 00:25:49.132586 master-0 kubenswrapper[7479]: I0308 00:25:49.132582 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgqmb\" (UniqueName: \"kubernetes.io/projected/e78057cd-5120-4a12-934d-9fed51e1bdc0-kube-api-access-zgqmb\") pod \"cloud-credential-operator-55d85b7b47-nrb7q\" (UID: \"e78057cd-5120-4a12-934d-9fed51e1bdc0\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-nrb7q"
Mar 08 00:25:49.135538 master-0 kubenswrapper[7479]: I0308 00:25:49.135441 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07f9c188-df80-4606-9a21-72228cffa706-utilities" (OuterVolumeSpecName: "utilities") pod "07f9c188-df80-4606-9a21-72228cffa706" (UID: "07f9c188-df80-4606-9a21-72228cffa706"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:25:49.138105 master-0 kubenswrapper[7479]: I0308 00:25:49.138063 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07f9c188-df80-4606-9a21-72228cffa706-kube-api-access-t44t4" (OuterVolumeSpecName: "kube-api-access-t44t4") pod "07f9c188-df80-4606-9a21-72228cffa706" (UID: "07f9c188-df80-4606-9a21-72228cffa706"). InnerVolumeSpecName "kube-api-access-t44t4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:25:49.153440 master-0 kubenswrapper[7479]: I0308 00:25:49.153399 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-6686554ddc-8krst"]
Mar 08 00:25:49.155626 master-0 kubenswrapper[7479]: W0308 00:25:49.155592 7479 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod460f09d8_a143_48d2_9db0_be247386984a.slice/crio-78f167041d0e1e5dfadee1e9a27a600120c1dc54a22d62ff9910e1942faef008 WatchSource:0}: Error finding container 78f167041d0e1e5dfadee1e9a27a600120c1dc54a22d62ff9910e1942faef008: Status 404 returned error can't find the container with id 78f167041d0e1e5dfadee1e9a27a600120c1dc54a22d62ff9910e1942faef008
Mar 08 00:25:49.204632 master-0 kubenswrapper[7479]: I0308 00:25:49.204047 7479 generic.go:334] "Generic (PLEG): container finished" podID="a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b" containerID="04665dc4db4c2d82c8d11a97a36abe0b11fe894bbbd6e5c64a1b3a502d59c374" exitCode=0
Mar 08 00:25:49.204632 master-0 kubenswrapper[7479]: I0308 00:25:49.204148 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6t5lg" event={"ID":"a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b","Type":"ContainerDied","Data":"04665dc4db4c2d82c8d11a97a36abe0b11fe894bbbd6e5c64a1b3a502d59c374"}
Mar 08 00:25:49.204632 master-0 kubenswrapper[7479]: I0308 00:25:49.204178 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6t5lg" event={"ID":"a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b","Type":"ContainerStarted","Data":"f661c7de8e4aded6ffb76b6f77c2ac0e5ed6e7e0e7ebfcafe40f9c953ec5ee63"}
Mar 08 00:25:49.211119 master-0 kubenswrapper[7479]: I0308 00:25:49.210861 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-8krst" event={"ID":"460f09d8-a143-48d2-9db0-be247386984a","Type":"ContainerStarted","Data":"78f167041d0e1e5dfadee1e9a27a600120c1dc54a22d62ff9910e1942faef008"}
Mar 08 00:25:49.216592 master-0 kubenswrapper[7479]: I0308 00:25:49.215895 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-8lf4q" event={"ID":"e3f42081-387d-4798-b981-ac232e851bb4","Type":"ContainerStarted","Data":"dc6431dd72c27cd0cc50f525ef4684b1138ca71254e30382dcc7425a8c604797"}
Mar 08 00:25:49.221595 master-0 kubenswrapper[7479]: I0308 00:25:49.220696 7479 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mr22p"
Mar 08 00:25:49.221595 master-0 kubenswrapper[7479]: I0308 00:25:49.220716 7479 generic.go:334] "Generic (PLEG): container finished" podID="07f9c188-df80-4606-9a21-72228cffa706" containerID="0a1a00b75e133f489a7a0acfadc6ee256a844ce0902dd263ed7d688506f410c0" exitCode=0
Mar 08 00:25:49.221595 master-0 kubenswrapper[7479]: I0308 00:25:49.220698 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mr22p" event={"ID":"07f9c188-df80-4606-9a21-72228cffa706","Type":"ContainerDied","Data":"0a1a00b75e133f489a7a0acfadc6ee256a844ce0902dd263ed7d688506f410c0"}
Mar 08 00:25:49.221595 master-0 kubenswrapper[7479]: I0308 00:25:49.220903 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mr22p" event={"ID":"07f9c188-df80-4606-9a21-72228cffa706","Type":"ContainerDied","Data":"9813fb2b0913beedd59707dab5262a0c2df306a822641a8265719695a9f73624"}
Mar 08 00:25:49.221595 master-0 kubenswrapper[7479]: I0308 00:25:49.220932 7479 scope.go:117] "RemoveContainer" containerID="0a1a00b75e133f489a7a0acfadc6ee256a844ce0902dd263ed7d688506f410c0"
Mar 08 00:25:49.222059 master-0 kubenswrapper[7479]: I0308 00:25:49.222010 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-8lgqf" event={"ID":"3b9823a9-2491-44b5-8bf2-22352558a2a3","Type":"ContainerStarted","Data":"2f2c834142b8089008ca2ed22b0fe66afaaaaa4b94fca36925b116feb711bdca"}
Mar 08 00:25:49.223109 master-0 kubenswrapper[7479]: I0308 00:25:49.223088 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-8f89dfddd-brq9l" event={"ID":"614f0a0f-5853-4cf6-bd3d-174141f0f1e2","Type":"ContainerStarted","Data":"ad3a46887dab7ea3bfa412ad6cf5418fcbb18c2c14aa2dc59012eeca70fc7d9a"}
Mar 08 00:25:49.233434 master-0 kubenswrapper[7479]: I0308 00:25:49.233389 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/e78057cd-5120-4a12-934d-9fed51e1bdc0-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-55d85b7b47-nrb7q\" (UID: \"e78057cd-5120-4a12-934d-9fed51e1bdc0\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-nrb7q"
Mar 08 00:25:49.233542 master-0 kubenswrapper[7479]: I0308 00:25:49.233436 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgqmb\" (UniqueName: \"kubernetes.io/projected/e78057cd-5120-4a12-934d-9fed51e1bdc0-kube-api-access-zgqmb\") pod \"cloud-credential-operator-55d85b7b47-nrb7q\" (UID: \"e78057cd-5120-4a12-934d-9fed51e1bdc0\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-nrb7q"
Mar 08 00:25:49.233542 master-0 kubenswrapper[7479]: I0308 00:25:49.233474 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84522c03-fd7b-4be7-9413-84e510b9dc5a-config\") pod \"cluster-baremetal-operator-5cdb4c5598-qldx6\" (UID: \"84522c03-fd7b-4be7-9413-84e510b9dc5a\") " 
pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6" Mar 08 00:25:49.233542 master-0 kubenswrapper[7479]: I0308 00:25:49.233491 7479 scope.go:117] "RemoveContainer" containerID="345069ea2b561934cab76bb302d8a23a6e9e55551f6cb176569e5310c259abec" Mar 08 00:25:49.235747 master-0 kubenswrapper[7479]: I0308 00:25:49.233496 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/84522c03-fd7b-4be7-9413-84e510b9dc5a-images\") pod \"cluster-baremetal-operator-5cdb4c5598-qldx6\" (UID: \"84522c03-fd7b-4be7-9413-84e510b9dc5a\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6" Mar 08 00:25:49.235747 master-0 kubenswrapper[7479]: I0308 00:25:49.233776 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2548aca-4a9d-4670-a60a-0d6361d1c441-utilities\") pod \"redhat-operators-9j9zs\" (UID: \"b2548aca-4a9d-4670-a60a-0d6361d1c441\") " pod="openshift-marketplace/redhat-operators-9j9zs" Mar 08 00:25:49.235747 master-0 kubenswrapper[7479]: I0308 00:25:49.233845 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht8zb\" (UniqueName: \"kubernetes.io/projected/84522c03-fd7b-4be7-9413-84e510b9dc5a-kube-api-access-ht8zb\") pod \"cluster-baremetal-operator-5cdb4c5598-qldx6\" (UID: \"84522c03-fd7b-4be7-9413-84e510b9dc5a\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6" Mar 08 00:25:49.235747 master-0 kubenswrapper[7479]: I0308 00:25:49.233890 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/84522c03-fd7b-4be7-9413-84e510b9dc5a-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-qldx6\" (UID: 
\"84522c03-fd7b-4be7-9413-84e510b9dc5a\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6" Mar 08 00:25:49.235747 master-0 kubenswrapper[7479]: I0308 00:25:49.233938 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84522c03-fd7b-4be7-9413-84e510b9dc5a-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-qldx6\" (UID: \"84522c03-fd7b-4be7-9413-84e510b9dc5a\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6" Mar 08 00:25:49.235747 master-0 kubenswrapper[7479]: I0308 00:25:49.234006 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvvvn\" (UniqueName: \"kubernetes.io/projected/b2548aca-4a9d-4670-a60a-0d6361d1c441-kube-api-access-dvvvn\") pod \"redhat-operators-9j9zs\" (UID: \"b2548aca-4a9d-4670-a60a-0d6361d1c441\") " pod="openshift-marketplace/redhat-operators-9j9zs" Mar 08 00:25:49.235747 master-0 kubenswrapper[7479]: I0308 00:25:49.234070 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e78057cd-5120-4a12-934d-9fed51e1bdc0-cco-trusted-ca\") pod \"cloud-credential-operator-55d85b7b47-nrb7q\" (UID: \"e78057cd-5120-4a12-934d-9fed51e1bdc0\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-nrb7q" Mar 08 00:25:49.235747 master-0 kubenswrapper[7479]: I0308 00:25:49.234127 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2548aca-4a9d-4670-a60a-0d6361d1c441-catalog-content\") pod \"redhat-operators-9j9zs\" (UID: \"b2548aca-4a9d-4670-a60a-0d6361d1c441\") " pod="openshift-marketplace/redhat-operators-9j9zs" Mar 08 00:25:49.235747 master-0 kubenswrapper[7479]: I0308 00:25:49.234194 7479 reconciler_common.go:293] "Volume detached for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07f9c188-df80-4606-9a21-72228cffa706-utilities\") on node \"master-0\" DevicePath \"\"" Mar 08 00:25:49.235747 master-0 kubenswrapper[7479]: I0308 00:25:49.234263 7479 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t44t4\" (UniqueName: \"kubernetes.io/projected/07f9c188-df80-4606-9a21-72228cffa706-kube-api-access-t44t4\") on node \"master-0\" DevicePath \"\"" Mar 08 00:25:49.235747 master-0 kubenswrapper[7479]: I0308 00:25:49.234846 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e78057cd-5120-4a12-934d-9fed51e1bdc0-cco-trusted-ca\") pod \"cloud-credential-operator-55d85b7b47-nrb7q\" (UID: \"e78057cd-5120-4a12-934d-9fed51e1bdc0\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-nrb7q" Mar 08 00:25:49.237752 master-0 kubenswrapper[7479]: I0308 00:25:49.237696 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/e78057cd-5120-4a12-934d-9fed51e1bdc0-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-55d85b7b47-nrb7q\" (UID: \"e78057cd-5120-4a12-934d-9fed51e1bdc0\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-nrb7q" Mar 08 00:25:49.256593 master-0 kubenswrapper[7479]: I0308 00:25:49.254252 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgqmb\" (UniqueName: \"kubernetes.io/projected/e78057cd-5120-4a12-934d-9fed51e1bdc0-kube-api-access-zgqmb\") pod \"cloud-credential-operator-55d85b7b47-nrb7q\" (UID: \"e78057cd-5120-4a12-934d-9fed51e1bdc0\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-nrb7q" Mar 08 00:25:49.263795 master-0 kubenswrapper[7479]: I0308 00:25:49.263409 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-insights/insights-operator-8f89dfddd-brq9l" podStartSLOduration=7.326627453 podStartE2EDuration="10.26338096s" podCreationTimestamp="2026-03-08 00:25:39 +0000 UTC" firstStartedPulling="2026-03-08 00:25:45.177615746 +0000 UTC m=+261.490524663" lastFinishedPulling="2026-03-08 00:25:48.114369253 +0000 UTC m=+264.427278170" observedRunningTime="2026-03-08 00:25:49.255574042 +0000 UTC m=+265.568482979" watchObservedRunningTime="2026-03-08 00:25:49.26338096 +0000 UTC m=+265.576289887" Mar 08 00:25:49.273187 master-0 kubenswrapper[7479]: I0308 00:25:49.273089 7479 scope.go:117] "RemoveContainer" containerID="422de388af3948465d14ca0009d8b790d44fde8b69f70afddfa992982d03c967" Mar 08 00:25:49.287645 master-0 kubenswrapper[7479]: I0308 00:25:49.287617 7479 scope.go:117] "RemoveContainer" containerID="0a1a00b75e133f489a7a0acfadc6ee256a844ce0902dd263ed7d688506f410c0" Mar 08 00:25:49.288089 master-0 kubenswrapper[7479]: E0308 00:25:49.288060 7479 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a1a00b75e133f489a7a0acfadc6ee256a844ce0902dd263ed7d688506f410c0\": container with ID starting with 0a1a00b75e133f489a7a0acfadc6ee256a844ce0902dd263ed7d688506f410c0 not found: ID does not exist" containerID="0a1a00b75e133f489a7a0acfadc6ee256a844ce0902dd263ed7d688506f410c0" Mar 08 00:25:49.288145 master-0 kubenswrapper[7479]: I0308 00:25:49.288099 7479 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a1a00b75e133f489a7a0acfadc6ee256a844ce0902dd263ed7d688506f410c0"} err="failed to get container status \"0a1a00b75e133f489a7a0acfadc6ee256a844ce0902dd263ed7d688506f410c0\": rpc error: code = NotFound desc = could not find container \"0a1a00b75e133f489a7a0acfadc6ee256a844ce0902dd263ed7d688506f410c0\": container with ID starting with 0a1a00b75e133f489a7a0acfadc6ee256a844ce0902dd263ed7d688506f410c0 not found: ID does not exist" Mar 08 00:25:49.288145 master-0 
kubenswrapper[7479]: I0308 00:25:49.288123 7479 scope.go:117] "RemoveContainer" containerID="345069ea2b561934cab76bb302d8a23a6e9e55551f6cb176569e5310c259abec" Mar 08 00:25:49.288503 master-0 kubenswrapper[7479]: E0308 00:25:49.288470 7479 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"345069ea2b561934cab76bb302d8a23a6e9e55551f6cb176569e5310c259abec\": container with ID starting with 345069ea2b561934cab76bb302d8a23a6e9e55551f6cb176569e5310c259abec not found: ID does not exist" containerID="345069ea2b561934cab76bb302d8a23a6e9e55551f6cb176569e5310c259abec" Mar 08 00:25:49.288568 master-0 kubenswrapper[7479]: I0308 00:25:49.288513 7479 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"345069ea2b561934cab76bb302d8a23a6e9e55551f6cb176569e5310c259abec"} err="failed to get container status \"345069ea2b561934cab76bb302d8a23a6e9e55551f6cb176569e5310c259abec\": rpc error: code = NotFound desc = could not find container \"345069ea2b561934cab76bb302d8a23a6e9e55551f6cb176569e5310c259abec\": container with ID starting with 345069ea2b561934cab76bb302d8a23a6e9e55551f6cb176569e5310c259abec not found: ID does not exist" Mar 08 00:25:49.288568 master-0 kubenswrapper[7479]: I0308 00:25:49.288540 7479 scope.go:117] "RemoveContainer" containerID="422de388af3948465d14ca0009d8b790d44fde8b69f70afddfa992982d03c967" Mar 08 00:25:49.288862 master-0 kubenswrapper[7479]: E0308 00:25:49.288838 7479 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"422de388af3948465d14ca0009d8b790d44fde8b69f70afddfa992982d03c967\": container with ID starting with 422de388af3948465d14ca0009d8b790d44fde8b69f70afddfa992982d03c967 not found: ID does not exist" containerID="422de388af3948465d14ca0009d8b790d44fde8b69f70afddfa992982d03c967" Mar 08 00:25:49.288927 master-0 kubenswrapper[7479]: I0308 00:25:49.288865 7479 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"422de388af3948465d14ca0009d8b790d44fde8b69f70afddfa992982d03c967"} err="failed to get container status \"422de388af3948465d14ca0009d8b790d44fde8b69f70afddfa992982d03c967\": rpc error: code = NotFound desc = could not find container \"422de388af3948465d14ca0009d8b790d44fde8b69f70afddfa992982d03c967\": container with ID starting with 422de388af3948465d14ca0009d8b790d44fde8b69f70afddfa992982d03c967 not found: ID does not exist" Mar 08 00:25:49.326730 master-0 kubenswrapper[7479]: I0308 00:25:49.326661 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07f9c188-df80-4606-9a21-72228cffa706-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "07f9c188-df80-4606-9a21-72228cffa706" (UID: "07f9c188-df80-4606-9a21-72228cffa706"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:25:49.329621 master-0 kubenswrapper[7479]: I0308 00:25:49.329596 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-njqpw" Mar 08 00:25:49.335635 master-0 kubenswrapper[7479]: I0308 00:25:49.335581 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/84522c03-fd7b-4be7-9413-84e510b9dc5a-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-qldx6\" (UID: \"84522c03-fd7b-4be7-9413-84e510b9dc5a\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6" Mar 08 00:25:49.335635 master-0 kubenswrapper[7479]: I0308 00:25:49.335623 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84522c03-fd7b-4be7-9413-84e510b9dc5a-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-qldx6\" (UID: 
\"84522c03-fd7b-4be7-9413-84e510b9dc5a\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6" Mar 08 00:25:49.336012 master-0 kubenswrapper[7479]: I0308 00:25:49.335659 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvvvn\" (UniqueName: \"kubernetes.io/projected/b2548aca-4a9d-4670-a60a-0d6361d1c441-kube-api-access-dvvvn\") pod \"redhat-operators-9j9zs\" (UID: \"b2548aca-4a9d-4670-a60a-0d6361d1c441\") " pod="openshift-marketplace/redhat-operators-9j9zs" Mar 08 00:25:49.336012 master-0 kubenswrapper[7479]: I0308 00:25:49.335707 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2548aca-4a9d-4670-a60a-0d6361d1c441-catalog-content\") pod \"redhat-operators-9j9zs\" (UID: \"b2548aca-4a9d-4670-a60a-0d6361d1c441\") " pod="openshift-marketplace/redhat-operators-9j9zs" Mar 08 00:25:49.336012 master-0 kubenswrapper[7479]: I0308 00:25:49.335751 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84522c03-fd7b-4be7-9413-84e510b9dc5a-config\") pod \"cluster-baremetal-operator-5cdb4c5598-qldx6\" (UID: \"84522c03-fd7b-4be7-9413-84e510b9dc5a\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6" Mar 08 00:25:49.336012 master-0 kubenswrapper[7479]: I0308 00:25:49.335868 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/84522c03-fd7b-4be7-9413-84e510b9dc5a-images\") pod \"cluster-baremetal-operator-5cdb4c5598-qldx6\" (UID: \"84522c03-fd7b-4be7-9413-84e510b9dc5a\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6" Mar 08 00:25:49.339663 master-0 kubenswrapper[7479]: I0308 00:25:49.336544 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/84522c03-fd7b-4be7-9413-84e510b9dc5a-config\") pod \"cluster-baremetal-operator-5cdb4c5598-qldx6\" (UID: \"84522c03-fd7b-4be7-9413-84e510b9dc5a\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6" Mar 08 00:25:49.339663 master-0 kubenswrapper[7479]: I0308 00:25:49.336751 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/84522c03-fd7b-4be7-9413-84e510b9dc5a-images\") pod \"cluster-baremetal-operator-5cdb4c5598-qldx6\" (UID: \"84522c03-fd7b-4be7-9413-84e510b9dc5a\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6" Mar 08 00:25:49.339663 master-0 kubenswrapper[7479]: I0308 00:25:49.336915 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2548aca-4a9d-4670-a60a-0d6361d1c441-utilities\") pod \"redhat-operators-9j9zs\" (UID: \"b2548aca-4a9d-4670-a60a-0d6361d1c441\") " pod="openshift-marketplace/redhat-operators-9j9zs" Mar 08 00:25:49.339663 master-0 kubenswrapper[7479]: I0308 00:25:49.336942 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht8zb\" (UniqueName: \"kubernetes.io/projected/84522c03-fd7b-4be7-9413-84e510b9dc5a-kube-api-access-ht8zb\") pod \"cluster-baremetal-operator-5cdb4c5598-qldx6\" (UID: \"84522c03-fd7b-4be7-9413-84e510b9dc5a\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6" Mar 08 00:25:49.339663 master-0 kubenswrapper[7479]: I0308 00:25:49.336983 7479 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07f9c188-df80-4606-9a21-72228cffa706-catalog-content\") on node \"master-0\" DevicePath \"\"" Mar 08 00:25:49.339663 master-0 kubenswrapper[7479]: I0308 00:25:49.337150 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b2548aca-4a9d-4670-a60a-0d6361d1c441-catalog-content\") pod \"redhat-operators-9j9zs\" (UID: \"b2548aca-4a9d-4670-a60a-0d6361d1c441\") " pod="openshift-marketplace/redhat-operators-9j9zs" Mar 08 00:25:49.339663 master-0 kubenswrapper[7479]: I0308 00:25:49.337387 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2548aca-4a9d-4670-a60a-0d6361d1c441-utilities\") pod \"redhat-operators-9j9zs\" (UID: \"b2548aca-4a9d-4670-a60a-0d6361d1c441\") " pod="openshift-marketplace/redhat-operators-9j9zs" Mar 08 00:25:49.339663 master-0 kubenswrapper[7479]: I0308 00:25:49.339537 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/84522c03-fd7b-4be7-9413-84e510b9dc5a-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-qldx6\" (UID: \"84522c03-fd7b-4be7-9413-84e510b9dc5a\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6" Mar 08 00:25:49.342601 master-0 kubenswrapper[7479]: I0308 00:25:49.342567 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84522c03-fd7b-4be7-9413-84e510b9dc5a-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-qldx6\" (UID: \"84522c03-fd7b-4be7-9413-84e510b9dc5a\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6" Mar 08 00:25:49.347514 master-0 kubenswrapper[7479]: I0308 00:25:49.347428 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-nrb7q" Mar 08 00:25:49.352301 master-0 kubenswrapper[7479]: I0308 00:25:49.352272 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht8zb\" (UniqueName: \"kubernetes.io/projected/84522c03-fd7b-4be7-9413-84e510b9dc5a-kube-api-access-ht8zb\") pod \"cluster-baremetal-operator-5cdb4c5598-qldx6\" (UID: \"84522c03-fd7b-4be7-9413-84e510b9dc5a\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6" Mar 08 00:25:49.374658 master-0 kubenswrapper[7479]: I0308 00:25:49.374575 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvvvn\" (UniqueName: \"kubernetes.io/projected/b2548aca-4a9d-4670-a60a-0d6361d1c441-kube-api-access-dvvvn\") pod \"redhat-operators-9j9zs\" (UID: \"b2548aca-4a9d-4670-a60a-0d6361d1c441\") " pod="openshift-marketplace/redhat-operators-9j9zs" Mar 08 00:25:49.409922 master-0 kubenswrapper[7479]: E0308 00:25:49.409522 7479 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Mar 08 00:25:49.409922 master-0 kubenswrapper[7479]: E0308 00:25:49.409588 7479 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/machine-approver-config: failed to sync configmap cache: timed out waiting for the condition Mar 08 00:25:49.409922 master-0 kubenswrapper[7479]: E0308 00:25:49.409526 7479 secret.go:189] Couldn't get secret openshift-cluster-machine-approver/machine-approver-tls: failed to sync secret cache: timed out waiting for the condition Mar 08 00:25:49.409922 master-0 kubenswrapper[7479]: E0308 00:25:49.409666 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d0607e66-703f-4dc9-8aee-bb36c7e0a7b4-auth-proxy-config podName:d0607e66-703f-4dc9-8aee-bb36c7e0a7b4 nodeName:}" failed. 
No retries permitted until 2026-03-08 00:25:49.909622297 +0000 UTC m=+266.222531214 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/d0607e66-703f-4dc9-8aee-bb36c7e0a7b4-auth-proxy-config") pod "machine-approver-955fcfb87-rh4g5" (UID: "d0607e66-703f-4dc9-8aee-bb36c7e0a7b4") : failed to sync configmap cache: timed out waiting for the condition Mar 08 00:25:49.409922 master-0 kubenswrapper[7479]: E0308 00:25:49.409687 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d0607e66-703f-4dc9-8aee-bb36c7e0a7b4-config podName:d0607e66-703f-4dc9-8aee-bb36c7e0a7b4 nodeName:}" failed. No retries permitted until 2026-03-08 00:25:49.909679228 +0000 UTC m=+266.222588145 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/d0607e66-703f-4dc9-8aee-bb36c7e0a7b4-config") pod "machine-approver-955fcfb87-rh4g5" (UID: "d0607e66-703f-4dc9-8aee-bb36c7e0a7b4") : failed to sync configmap cache: timed out waiting for the condition Mar 08 00:25:49.409922 master-0 kubenswrapper[7479]: E0308 00:25:49.409701 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0607e66-703f-4dc9-8aee-bb36c7e0a7b4-machine-approver-tls podName:d0607e66-703f-4dc9-8aee-bb36c7e0a7b4 nodeName:}" failed. No retries permitted until 2026-03-08 00:25:49.909694439 +0000 UTC m=+266.222603356 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "machine-approver-tls" (UniqueName: "kubernetes.io/secret/d0607e66-703f-4dc9-8aee-bb36c7e0a7b4-machine-approver-tls") pod "machine-approver-955fcfb87-rh4g5" (UID: "d0607e66-703f-4dc9-8aee-bb36c7e0a7b4") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:25:49.417469 master-0 kubenswrapper[7479]: I0308 00:25:49.416550 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-69576476f7-dpg4q"] Mar 08 00:25:49.427003 master-0 kubenswrapper[7479]: I0308 00:25:49.426484 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6" Mar 08 00:25:49.458433 master-0 kubenswrapper[7479]: I0308 00:25:49.458046 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9j9zs" Mar 08 00:25:49.464644 master-0 kubenswrapper[7479]: E0308 00:25:49.464552 7479 projected.go:288] Couldn't get configMap openshift-cluster-machine-approver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 08 00:25:49.499069 master-0 kubenswrapper[7479]: I0308 00:25:49.499025 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 08 00:25:49.574443 master-0 kubenswrapper[7479]: I0308 00:25:49.574403 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-sdsks"] Mar 08 00:25:49.575004 master-0 kubenswrapper[7479]: I0308 00:25:49.574979 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-sdsks" Mar 08 00:25:49.578927 master-0 kubenswrapper[7479]: I0308 00:25:49.578474 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-dockercfg-v5zml" Mar 08 00:25:49.578927 master-0 kubenswrapper[7479]: I0308 00:25:49.578924 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert" Mar 08 00:25:49.598371 master-0 kubenswrapper[7479]: I0308 00:25:49.597027 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-sdsks"] Mar 08 00:25:49.604188 master-0 kubenswrapper[7479]: I0308 00:25:49.604122 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mr22p"] Mar 08 00:25:49.605814 master-0 kubenswrapper[7479]: I0308 00:25:49.605760 7479 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mr22p"] Mar 08 00:25:49.610335 master-0 kubenswrapper[7479]: I0308 00:25:49.610305 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 08 00:25:49.645728 master-0 kubenswrapper[7479]: I0308 00:25:49.641722 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l22cn\" (UniqueName: \"kubernetes.io/projected/0f496486-70d5-4c5c-b4f3-6cc19427762f-kube-api-access-l22cn\") pod \"cluster-storage-operator-6fbfc8dc8f-sdsks\" (UID: \"0f496486-70d5-4c5c-b4f3-6cc19427762f\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-sdsks" Mar 08 00:25:49.645728 master-0 kubenswrapper[7479]: I0308 00:25:49.641787 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f496486-70d5-4c5c-b4f3-6cc19427762f-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-6fbfc8dc8f-sdsks\" (UID: \"0f496486-70d5-4c5c-b4f3-6cc19427762f\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-sdsks" Mar 08 00:25:49.650874 master-0 kubenswrapper[7479]: I0308 00:25:49.650811 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lqc4n"] Mar 08 00:25:49.651303 master-0 kubenswrapper[7479]: I0308 00:25:49.651240 7479 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lqc4n" podUID="8b94e1ca-5aef-49ae-928e-29cc0ce81d61" containerName="registry-server" containerID="cri-o://792385f3b070b6699aa94569fbbc4236ccf69daea01ea51c61866317c4985b03" gracePeriod=2 Mar 08 00:25:49.675402 master-0 kubenswrapper[7479]: I0308 00:25:49.672559 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 08 00:25:49.742719 master-0 kubenswrapper[7479]: I0308 00:25:49.742679 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f496486-70d5-4c5c-b4f3-6cc19427762f-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-6fbfc8dc8f-sdsks\" (UID: \"0f496486-70d5-4c5c-b4f3-6cc19427762f\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-sdsks" Mar 08 00:25:49.742930 master-0 kubenswrapper[7479]: I0308 00:25:49.742875 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l22cn\" (UniqueName: \"kubernetes.io/projected/0f496486-70d5-4c5c-b4f3-6cc19427762f-kube-api-access-l22cn\") pod \"cluster-storage-operator-6fbfc8dc8f-sdsks\" (UID: \"0f496486-70d5-4c5c-b4f3-6cc19427762f\") " 
pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-sdsks" Mar 08 00:25:49.748945 master-0 kubenswrapper[7479]: I0308 00:25:49.748886 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f496486-70d5-4c5c-b4f3-6cc19427762f-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-6fbfc8dc8f-sdsks\" (UID: \"0f496486-70d5-4c5c-b4f3-6cc19427762f\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-sdsks" Mar 08 00:25:49.760572 master-0 kubenswrapper[7479]: I0308 00:25:49.760526 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 08 00:25:49.765210 master-0 kubenswrapper[7479]: E0308 00:25:49.765162 7479 projected.go:194] Error preparing data for projected volume kube-api-access-s8lfn for pod openshift-cluster-machine-approver/machine-approver-955fcfb87-rh4g5: failed to sync configmap cache: timed out waiting for the condition Mar 08 00:25:49.765282 master-0 kubenswrapper[7479]: E0308 00:25:49.765258 7479 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d0607e66-703f-4dc9-8aee-bb36c7e0a7b4-kube-api-access-s8lfn podName:d0607e66-703f-4dc9-8aee-bb36c7e0a7b4 nodeName:}" failed. No retries permitted until 2026-03-08 00:25:50.26523808 +0000 UTC m=+266.578146997 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s8lfn" (UniqueName: "kubernetes.io/projected/d0607e66-703f-4dc9-8aee-bb36c7e0a7b4-kube-api-access-s8lfn") pod "machine-approver-955fcfb87-rh4g5" (UID: "d0607e66-703f-4dc9-8aee-bb36c7e0a7b4") : failed to sync configmap cache: timed out waiting for the condition Mar 08 00:25:49.776805 master-0 kubenswrapper[7479]: I0308 00:25:49.776391 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 08 00:25:49.793146 master-0 kubenswrapper[7479]: I0308 00:25:49.793099 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l22cn\" (UniqueName: \"kubernetes.io/projected/0f496486-70d5-4c5c-b4f3-6cc19427762f-kube-api-access-l22cn\") pod \"cluster-storage-operator-6fbfc8dc8f-sdsks\" (UID: \"0f496486-70d5-4c5c-b4f3-6cc19427762f\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-sdsks" Mar 08 00:25:49.868898 master-0 kubenswrapper[7479]: I0308 00:25:49.868797 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-nrb7q"] Mar 08 00:25:49.879447 master-0 kubenswrapper[7479]: W0308 00:25:49.879393 7479 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode78057cd_5120_4a12_934d_9fed51e1bdc0.slice/crio-c7b839bc1440105484eefd605ce2dd49ac3adf1072ca232cf569d9cfecdcc1f4 WatchSource:0}: Error finding container c7b839bc1440105484eefd605ce2dd49ac3adf1072ca232cf569d9cfecdcc1f4: Status 404 returned error can't find the container with id c7b839bc1440105484eefd605ce2dd49ac3adf1072ca232cf569d9cfecdcc1f4 Mar 08 00:25:49.894772 master-0 kubenswrapper[7479]: I0308 00:25:49.894723 7479 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07f9c188-df80-4606-9a21-72228cffa706" 
path="/var/lib/kubelet/pods/07f9c188-df80-4606-9a21-72228cffa706/volumes" Mar 08 00:25:49.895393 master-0 kubenswrapper[7479]: I0308 00:25:49.895363 7479 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="668ffbde-4771-43e1-8f0e-d4b5d17ff693" path="/var/lib/kubelet/pods/668ffbde-4771-43e1-8f0e-d4b5d17ff693/volumes" Mar 08 00:25:49.904860 master-0 kubenswrapper[7479]: I0308 00:25:49.904423 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-sdsks" Mar 08 00:25:49.945639 master-0 kubenswrapper[7479]: I0308 00:25:49.945587 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d0607e66-703f-4dc9-8aee-bb36c7e0a7b4-auth-proxy-config\") pod \"machine-approver-955fcfb87-rh4g5\" (UID: \"d0607e66-703f-4dc9-8aee-bb36c7e0a7b4\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-rh4g5" Mar 08 00:25:49.945639 master-0 kubenswrapper[7479]: I0308 00:25:49.945643 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d0607e66-703f-4dc9-8aee-bb36c7e0a7b4-machine-approver-tls\") pod \"machine-approver-955fcfb87-rh4g5\" (UID: \"d0607e66-703f-4dc9-8aee-bb36c7e0a7b4\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-rh4g5" Mar 08 00:25:49.946496 master-0 kubenswrapper[7479]: I0308 00:25:49.946462 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d0607e66-703f-4dc9-8aee-bb36c7e0a7b4-auth-proxy-config\") pod \"machine-approver-955fcfb87-rh4g5\" (UID: \"d0607e66-703f-4dc9-8aee-bb36c7e0a7b4\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-rh4g5" Mar 08 00:25:49.946568 master-0 kubenswrapper[7479]: I0308 00:25:49.946535 7479 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0607e66-703f-4dc9-8aee-bb36c7e0a7b4-config\") pod \"machine-approver-955fcfb87-rh4g5\" (UID: \"d0607e66-703f-4dc9-8aee-bb36c7e0a7b4\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-rh4g5" Mar 08 00:25:49.948140 master-0 kubenswrapper[7479]: I0308 00:25:49.947930 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0607e66-703f-4dc9-8aee-bb36c7e0a7b4-config\") pod \"machine-approver-955fcfb87-rh4g5\" (UID: \"d0607e66-703f-4dc9-8aee-bb36c7e0a7b4\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-rh4g5" Mar 08 00:25:49.952732 master-0 kubenswrapper[7479]: I0308 00:25:49.952659 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d0607e66-703f-4dc9-8aee-bb36c7e0a7b4-machine-approver-tls\") pod \"machine-approver-955fcfb87-rh4g5\" (UID: \"d0607e66-703f-4dc9-8aee-bb36c7e0a7b4\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-rh4g5" Mar 08 00:25:49.971473 master-0 kubenswrapper[7479]: I0308 00:25:49.971412 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-9c44c86f9-rplwv"] Mar 08 00:25:49.972990 master-0 kubenswrapper[7479]: I0308 00:25:49.972963 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-9c44c86f9-rplwv" Mar 08 00:25:49.975665 master-0 kubenswrapper[7479]: I0308 00:25:49.975627 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 08 00:25:49.975729 master-0 kubenswrapper[7479]: I0308 00:25:49.975687 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-7cd6d" Mar 08 00:25:50.003030 master-0 kubenswrapper[7479]: I0308 00:25:50.002026 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9j9zs"] Mar 08 00:25:50.005793 master-0 kubenswrapper[7479]: I0308 00:25:50.005757 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6"] Mar 08 00:25:50.008380 master-0 kubenswrapper[7479]: W0308 00:25:50.008337 7479 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84522c03_fd7b_4be7_9413_84e510b9dc5a.slice/crio-4ba757467f3e4fadf37ce1d9a907a1771ea5751b999a31bf5bb5f0ab9351aa7f WatchSource:0}: Error finding container 4ba757467f3e4fadf37ce1d9a907a1771ea5751b999a31bf5bb5f0ab9351aa7f: Status 404 returned error can't find the container with id 4ba757467f3e4fadf37ce1d9a907a1771ea5751b999a31bf5bb5f0ab9351aa7f Mar 08 00:25:50.008658 master-0 kubenswrapper[7479]: W0308 00:25:50.008632 7479 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2548aca_4a9d_4670_a60a_0d6361d1c441.slice/crio-28355b7f7227fe6a0abd3c3085ac0299e8c24ec4f49691a081d1fe68b8bde287 WatchSource:0}: Error finding container 28355b7f7227fe6a0abd3c3085ac0299e8c24ec4f49691a081d1fe68b8bde287: Status 404 returned error can't find the container with id 
28355b7f7227fe6a0abd3c3085ac0299e8c24ec4f49691a081d1fe68b8bde287 Mar 08 00:25:50.044901 master-0 kubenswrapper[7479]: I0308 00:25:50.044867 7479 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lqc4n" Mar 08 00:25:50.047738 master-0 kubenswrapper[7479]: I0308 00:25:50.047687 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d70f4efb-e61a-4e88-a271-2f4af21ecdf3-tmpfs\") pod \"packageserver-9c44c86f9-rplwv\" (UID: \"d70f4efb-e61a-4e88-a271-2f4af21ecdf3\") " pod="openshift-operator-lifecycle-manager/packageserver-9c44c86f9-rplwv" Mar 08 00:25:50.047828 master-0 kubenswrapper[7479]: I0308 00:25:50.047764 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d70f4efb-e61a-4e88-a271-2f4af21ecdf3-webhook-cert\") pod \"packageserver-9c44c86f9-rplwv\" (UID: \"d70f4efb-e61a-4e88-a271-2f4af21ecdf3\") " pod="openshift-operator-lifecycle-manager/packageserver-9c44c86f9-rplwv" Mar 08 00:25:50.047828 master-0 kubenswrapper[7479]: I0308 00:25:50.047808 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d70f4efb-e61a-4e88-a271-2f4af21ecdf3-apiservice-cert\") pod \"packageserver-9c44c86f9-rplwv\" (UID: \"d70f4efb-e61a-4e88-a271-2f4af21ecdf3\") " pod="openshift-operator-lifecycle-manager/packageserver-9c44c86f9-rplwv" Mar 08 00:25:50.047924 master-0 kubenswrapper[7479]: I0308 00:25:50.047861 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt6w4\" (UniqueName: \"kubernetes.io/projected/d70f4efb-e61a-4e88-a271-2f4af21ecdf3-kube-api-access-pt6w4\") pod \"packageserver-9c44c86f9-rplwv\" (UID: \"d70f4efb-e61a-4e88-a271-2f4af21ecdf3\") " 
pod="openshift-operator-lifecycle-manager/packageserver-9c44c86f9-rplwv" Mar 08 00:25:50.148935 master-0 kubenswrapper[7479]: I0308 00:25:50.148829 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlmg8\" (UniqueName: \"kubernetes.io/projected/8b94e1ca-5aef-49ae-928e-29cc0ce81d61-kube-api-access-hlmg8\") pod \"8b94e1ca-5aef-49ae-928e-29cc0ce81d61\" (UID: \"8b94e1ca-5aef-49ae-928e-29cc0ce81d61\") " Mar 08 00:25:50.149141 master-0 kubenswrapper[7479]: I0308 00:25:50.148951 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b94e1ca-5aef-49ae-928e-29cc0ce81d61-catalog-content\") pod \"8b94e1ca-5aef-49ae-928e-29cc0ce81d61\" (UID: \"8b94e1ca-5aef-49ae-928e-29cc0ce81d61\") " Mar 08 00:25:50.149271 master-0 kubenswrapper[7479]: I0308 00:25:50.149233 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b94e1ca-5aef-49ae-928e-29cc0ce81d61-utilities\") pod \"8b94e1ca-5aef-49ae-928e-29cc0ce81d61\" (UID: \"8b94e1ca-5aef-49ae-928e-29cc0ce81d61\") " Mar 08 00:25:50.149528 master-0 kubenswrapper[7479]: I0308 00:25:50.149501 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d70f4efb-e61a-4e88-a271-2f4af21ecdf3-tmpfs\") pod \"packageserver-9c44c86f9-rplwv\" (UID: \"d70f4efb-e61a-4e88-a271-2f4af21ecdf3\") " pod="openshift-operator-lifecycle-manager/packageserver-9c44c86f9-rplwv" Mar 08 00:25:50.149708 master-0 kubenswrapper[7479]: I0308 00:25:50.149684 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d70f4efb-e61a-4e88-a271-2f4af21ecdf3-webhook-cert\") pod \"packageserver-9c44c86f9-rplwv\" (UID: \"d70f4efb-e61a-4e88-a271-2f4af21ecdf3\") " 
pod="openshift-operator-lifecycle-manager/packageserver-9c44c86f9-rplwv" Mar 08 00:25:50.149894 master-0 kubenswrapper[7479]: I0308 00:25:50.149862 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d70f4efb-e61a-4e88-a271-2f4af21ecdf3-apiservice-cert\") pod \"packageserver-9c44c86f9-rplwv\" (UID: \"d70f4efb-e61a-4e88-a271-2f4af21ecdf3\") " pod="openshift-operator-lifecycle-manager/packageserver-9c44c86f9-rplwv" Mar 08 00:25:50.149938 master-0 kubenswrapper[7479]: I0308 00:25:50.149908 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt6w4\" (UniqueName: \"kubernetes.io/projected/d70f4efb-e61a-4e88-a271-2f4af21ecdf3-kube-api-access-pt6w4\") pod \"packageserver-9c44c86f9-rplwv\" (UID: \"d70f4efb-e61a-4e88-a271-2f4af21ecdf3\") " pod="openshift-operator-lifecycle-manager/packageserver-9c44c86f9-rplwv" Mar 08 00:25:50.149981 master-0 kubenswrapper[7479]: I0308 00:25:50.149960 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d70f4efb-e61a-4e88-a271-2f4af21ecdf3-tmpfs\") pod \"packageserver-9c44c86f9-rplwv\" (UID: \"d70f4efb-e61a-4e88-a271-2f4af21ecdf3\") " pod="openshift-operator-lifecycle-manager/packageserver-9c44c86f9-rplwv" Mar 08 00:25:50.150061 master-0 kubenswrapper[7479]: I0308 00:25:50.150023 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b94e1ca-5aef-49ae-928e-29cc0ce81d61-utilities" (OuterVolumeSpecName: "utilities") pod "8b94e1ca-5aef-49ae-928e-29cc0ce81d61" (UID: "8b94e1ca-5aef-49ae-928e-29cc0ce81d61"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:25:50.152427 master-0 kubenswrapper[7479]: I0308 00:25:50.152381 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b94e1ca-5aef-49ae-928e-29cc0ce81d61-kube-api-access-hlmg8" (OuterVolumeSpecName: "kube-api-access-hlmg8") pod "8b94e1ca-5aef-49ae-928e-29cc0ce81d61" (UID: "8b94e1ca-5aef-49ae-928e-29cc0ce81d61"). InnerVolumeSpecName "kube-api-access-hlmg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:25:50.152830 master-0 kubenswrapper[7479]: I0308 00:25:50.152798 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d70f4efb-e61a-4e88-a271-2f4af21ecdf3-webhook-cert\") pod \"packageserver-9c44c86f9-rplwv\" (UID: \"d70f4efb-e61a-4e88-a271-2f4af21ecdf3\") " pod="openshift-operator-lifecycle-manager/packageserver-9c44c86f9-rplwv" Mar 08 00:25:50.154106 master-0 kubenswrapper[7479]: I0308 00:25:50.154072 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d70f4efb-e61a-4e88-a271-2f4af21ecdf3-apiservice-cert\") pod \"packageserver-9c44c86f9-rplwv\" (UID: \"d70f4efb-e61a-4e88-a271-2f4af21ecdf3\") " pod="openshift-operator-lifecycle-manager/packageserver-9c44c86f9-rplwv" Mar 08 00:25:50.213747 master-0 kubenswrapper[7479]: I0308 00:25:50.213688 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b94e1ca-5aef-49ae-928e-29cc0ce81d61-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8b94e1ca-5aef-49ae-928e-29cc0ce81d61" (UID: "8b94e1ca-5aef-49ae-928e-29cc0ce81d61"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:25:50.227233 master-0 kubenswrapper[7479]: I0308 00:25:50.227158 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-9c44c86f9-rplwv"] Mar 08 00:25:50.242043 master-0 kubenswrapper[7479]: I0308 00:25:50.241157 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9j9zs" event={"ID":"b2548aca-4a9d-4670-a60a-0d6361d1c441","Type":"ContainerStarted","Data":"28355b7f7227fe6a0abd3c3085ac0299e8c24ec4f49691a081d1fe68b8bde287"} Mar 08 00:25:50.242473 master-0 kubenswrapper[7479]: I0308 00:25:50.242372 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6" event={"ID":"84522c03-fd7b-4be7-9413-84e510b9dc5a","Type":"ContainerStarted","Data":"4ba757467f3e4fadf37ce1d9a907a1771ea5751b999a31bf5bb5f0ab9351aa7f"} Mar 08 00:25:50.244029 master-0 kubenswrapper[7479]: I0308 00:25:50.243978 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-nrb7q" event={"ID":"e78057cd-5120-4a12-934d-9fed51e1bdc0","Type":"ContainerStarted","Data":"7ded812b6494fa846c4ec3519032a6a79758aaa664ea0250e508e50f52908363"} Mar 08 00:25:50.244084 master-0 kubenswrapper[7479]: I0308 00:25:50.244032 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-nrb7q" event={"ID":"e78057cd-5120-4a12-934d-9fed51e1bdc0","Type":"ContainerStarted","Data":"c7b839bc1440105484eefd605ce2dd49ac3adf1072ca232cf569d9cfecdcc1f4"} Mar 08 00:25:50.245256 master-0 kubenswrapper[7479]: I0308 00:25:50.245231 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dpg4q" 
event={"ID":"3d2e1686-3a30-4021-9c03-02e472bc6ff3","Type":"ContainerStarted","Data":"d3f16b3080bd84cd315c0103c50c0e4fe4f94ba52854cacca3c3dd9366155a93"} Mar 08 00:25:50.245377 master-0 kubenswrapper[7479]: I0308 00:25:50.245359 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dpg4q" event={"ID":"3d2e1686-3a30-4021-9c03-02e472bc6ff3","Type":"ContainerStarted","Data":"8ff474153830a652e4ddb7aadf249d8bcfad8aa4e41fc72213e841bb0817ffeb"} Mar 08 00:25:50.250011 master-0 kubenswrapper[7479]: I0308 00:25:50.249978 7479 generic.go:334] "Generic (PLEG): container finished" podID="8b94e1ca-5aef-49ae-928e-29cc0ce81d61" containerID="792385f3b070b6699aa94569fbbc4236ccf69daea01ea51c61866317c4985b03" exitCode=0 Mar 08 00:25:50.250080 master-0 kubenswrapper[7479]: I0308 00:25:50.250049 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lqc4n" event={"ID":"8b94e1ca-5aef-49ae-928e-29cc0ce81d61","Type":"ContainerDied","Data":"792385f3b070b6699aa94569fbbc4236ccf69daea01ea51c61866317c4985b03"} Mar 08 00:25:50.250132 master-0 kubenswrapper[7479]: I0308 00:25:50.250099 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lqc4n" event={"ID":"8b94e1ca-5aef-49ae-928e-29cc0ce81d61","Type":"ContainerDied","Data":"fbbedadab3e325405c3103b757378d37ed57beb86fa4dc9dfbd4a453372d9d42"} Mar 08 00:25:50.250132 master-0 kubenswrapper[7479]: I0308 00:25:50.250117 7479 scope.go:117] "RemoveContainer" containerID="792385f3b070b6699aa94569fbbc4236ccf69daea01ea51c61866317c4985b03" Mar 08 00:25:50.250276 master-0 kubenswrapper[7479]: I0308 00:25:50.250258 7479 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lqc4n" Mar 08 00:25:50.251031 master-0 kubenswrapper[7479]: I0308 00:25:50.251001 7479 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b94e1ca-5aef-49ae-928e-29cc0ce81d61-catalog-content\") on node \"master-0\" DevicePath \"\"" Mar 08 00:25:50.251100 master-0 kubenswrapper[7479]: I0308 00:25:50.251032 7479 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b94e1ca-5aef-49ae-928e-29cc0ce81d61-utilities\") on node \"master-0\" DevicePath \"\"" Mar 08 00:25:50.251100 master-0 kubenswrapper[7479]: I0308 00:25:50.251046 7479 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlmg8\" (UniqueName: \"kubernetes.io/projected/8b94e1ca-5aef-49ae-928e-29cc0ce81d61-kube-api-access-hlmg8\") on node \"master-0\" DevicePath \"\"" Mar 08 00:25:50.293957 master-0 kubenswrapper[7479]: I0308 00:25:50.293906 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9nqqp"] Mar 08 00:25:50.294229 master-0 kubenswrapper[7479]: E0308 00:25:50.294190 7479 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b94e1ca-5aef-49ae-928e-29cc0ce81d61" containerName="registry-server" Mar 08 00:25:50.294301 master-0 kubenswrapper[7479]: I0308 00:25:50.294229 7479 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b94e1ca-5aef-49ae-928e-29cc0ce81d61" containerName="registry-server" Mar 08 00:25:50.294301 master-0 kubenswrapper[7479]: E0308 00:25:50.294258 7479 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b94e1ca-5aef-49ae-928e-29cc0ce81d61" containerName="extract-utilities" Mar 08 00:25:50.294301 master-0 kubenswrapper[7479]: I0308 00:25:50.294265 7479 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b94e1ca-5aef-49ae-928e-29cc0ce81d61" containerName="extract-utilities" Mar 08 00:25:50.294301 master-0 
kubenswrapper[7479]: E0308 00:25:50.294276 7479 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b94e1ca-5aef-49ae-928e-29cc0ce81d61" containerName="extract-content" Mar 08 00:25:50.294301 master-0 kubenswrapper[7479]: I0308 00:25:50.294282 7479 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b94e1ca-5aef-49ae-928e-29cc0ce81d61" containerName="extract-content" Mar 08 00:25:50.294503 master-0 kubenswrapper[7479]: I0308 00:25:50.294420 7479 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b94e1ca-5aef-49ae-928e-29cc0ce81d61" containerName="registry-server" Mar 08 00:25:50.295273 master-0 kubenswrapper[7479]: I0308 00:25:50.295250 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9nqqp" Mar 08 00:25:50.296558 master-0 kubenswrapper[7479]: I0308 00:25:50.296506 7479 scope.go:117] "RemoveContainer" containerID="490b7ecde993a5c1e64ebdba9e4f11aa9028720e17a25cf6dd0957ad32e5b9a3" Mar 08 00:25:50.296864 master-0 kubenswrapper[7479]: I0308 00:25:50.296836 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4tn2t" Mar 08 00:25:50.327311 master-0 kubenswrapper[7479]: I0308 00:25:50.327278 7479 scope.go:117] "RemoveContainer" containerID="3e0f29e2f9929bf5e65bf1550df17f585847678dff9a47a9ff623356d95fb3eb" Mar 08 00:25:50.348190 master-0 kubenswrapper[7479]: I0308 00:25:50.348103 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt6w4\" (UniqueName: \"kubernetes.io/projected/d70f4efb-e61a-4e88-a271-2f4af21ecdf3-kube-api-access-pt6w4\") pod \"packageserver-9c44c86f9-rplwv\" (UID: \"d70f4efb-e61a-4e88-a271-2f4af21ecdf3\") " pod="openshift-operator-lifecycle-manager/packageserver-9c44c86f9-rplwv" Mar 08 00:25:50.356490 master-0 kubenswrapper[7479]: I0308 00:25:50.356442 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-s8lfn\" (UniqueName: \"kubernetes.io/projected/d0607e66-703f-4dc9-8aee-bb36c7e0a7b4-kube-api-access-s8lfn\") pod \"machine-approver-955fcfb87-rh4g5\" (UID: \"d0607e66-703f-4dc9-8aee-bb36c7e0a7b4\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-rh4g5" Mar 08 00:25:50.356667 master-0 kubenswrapper[7479]: I0308 00:25:50.356519 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56e11e7e-6946-4e11-bce9-e91a721fe4a7-catalog-content\") pod \"certified-operators-9nqqp\" (UID: \"56e11e7e-6946-4e11-bce9-e91a721fe4a7\") " pod="openshift-marketplace/certified-operators-9nqqp" Mar 08 00:25:50.356667 master-0 kubenswrapper[7479]: I0308 00:25:50.356554 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56e11e7e-6946-4e11-bce9-e91a721fe4a7-utilities\") pod \"certified-operators-9nqqp\" (UID: \"56e11e7e-6946-4e11-bce9-e91a721fe4a7\") " pod="openshift-marketplace/certified-operators-9nqqp" Mar 08 00:25:50.356753 master-0 kubenswrapper[7479]: I0308 00:25:50.356725 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmxq9\" (UniqueName: \"kubernetes.io/projected/56e11e7e-6946-4e11-bce9-e91a721fe4a7-kube-api-access-kmxq9\") pod \"certified-operators-9nqqp\" (UID: \"56e11e7e-6946-4e11-bce9-e91a721fe4a7\") " pod="openshift-marketplace/certified-operators-9nqqp" Mar 08 00:25:50.367375 master-0 kubenswrapper[7479]: I0308 00:25:50.364573 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8lfn\" (UniqueName: \"kubernetes.io/projected/d0607e66-703f-4dc9-8aee-bb36c7e0a7b4-kube-api-access-s8lfn\") pod \"machine-approver-955fcfb87-rh4g5\" (UID: \"d0607e66-703f-4dc9-8aee-bb36c7e0a7b4\") " 
pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-rh4g5" Mar 08 00:25:50.409009 master-0 kubenswrapper[7479]: I0308 00:25:50.395129 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9nqqp"] Mar 08 00:25:50.409009 master-0 kubenswrapper[7479]: I0308 00:25:50.406830 7479 scope.go:117] "RemoveContainer" containerID="792385f3b070b6699aa94569fbbc4236ccf69daea01ea51c61866317c4985b03" Mar 08 00:25:50.416054 master-0 kubenswrapper[7479]: E0308 00:25:50.415401 7479 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"792385f3b070b6699aa94569fbbc4236ccf69daea01ea51c61866317c4985b03\": container with ID starting with 792385f3b070b6699aa94569fbbc4236ccf69daea01ea51c61866317c4985b03 not found: ID does not exist" containerID="792385f3b070b6699aa94569fbbc4236ccf69daea01ea51c61866317c4985b03" Mar 08 00:25:50.416054 master-0 kubenswrapper[7479]: I0308 00:25:50.415450 7479 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"792385f3b070b6699aa94569fbbc4236ccf69daea01ea51c61866317c4985b03"} err="failed to get container status \"792385f3b070b6699aa94569fbbc4236ccf69daea01ea51c61866317c4985b03\": rpc error: code = NotFound desc = could not find container \"792385f3b070b6699aa94569fbbc4236ccf69daea01ea51c61866317c4985b03\": container with ID starting with 792385f3b070b6699aa94569fbbc4236ccf69daea01ea51c61866317c4985b03 not found: ID does not exist" Mar 08 00:25:50.416054 master-0 kubenswrapper[7479]: I0308 00:25:50.415492 7479 scope.go:117] "RemoveContainer" containerID="490b7ecde993a5c1e64ebdba9e4f11aa9028720e17a25cf6dd0957ad32e5b9a3" Mar 08 00:25:50.420293 master-0 kubenswrapper[7479]: E0308 00:25:50.417292 7479 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"490b7ecde993a5c1e64ebdba9e4f11aa9028720e17a25cf6dd0957ad32e5b9a3\": 
container with ID starting with 490b7ecde993a5c1e64ebdba9e4f11aa9028720e17a25cf6dd0957ad32e5b9a3 not found: ID does not exist" containerID="490b7ecde993a5c1e64ebdba9e4f11aa9028720e17a25cf6dd0957ad32e5b9a3" Mar 08 00:25:50.420293 master-0 kubenswrapper[7479]: I0308 00:25:50.417349 7479 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"490b7ecde993a5c1e64ebdba9e4f11aa9028720e17a25cf6dd0957ad32e5b9a3"} err="failed to get container status \"490b7ecde993a5c1e64ebdba9e4f11aa9028720e17a25cf6dd0957ad32e5b9a3\": rpc error: code = NotFound desc = could not find container \"490b7ecde993a5c1e64ebdba9e4f11aa9028720e17a25cf6dd0957ad32e5b9a3\": container with ID starting with 490b7ecde993a5c1e64ebdba9e4f11aa9028720e17a25cf6dd0957ad32e5b9a3 not found: ID does not exist" Mar 08 00:25:50.420293 master-0 kubenswrapper[7479]: I0308 00:25:50.417451 7479 scope.go:117] "RemoveContainer" containerID="3e0f29e2f9929bf5e65bf1550df17f585847678dff9a47a9ff623356d95fb3eb" Mar 08 00:25:50.432640 master-0 kubenswrapper[7479]: E0308 00:25:50.422708 7479 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e0f29e2f9929bf5e65bf1550df17f585847678dff9a47a9ff623356d95fb3eb\": container with ID starting with 3e0f29e2f9929bf5e65bf1550df17f585847678dff9a47a9ff623356d95fb3eb not found: ID does not exist" containerID="3e0f29e2f9929bf5e65bf1550df17f585847678dff9a47a9ff623356d95fb3eb" Mar 08 00:25:50.432640 master-0 kubenswrapper[7479]: I0308 00:25:50.422771 7479 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e0f29e2f9929bf5e65bf1550df17f585847678dff9a47a9ff623356d95fb3eb"} err="failed to get container status \"3e0f29e2f9929bf5e65bf1550df17f585847678dff9a47a9ff623356d95fb3eb\": rpc error: code = NotFound desc = could not find container \"3e0f29e2f9929bf5e65bf1550df17f585847678dff9a47a9ff623356d95fb3eb\": container with ID starting with 
3e0f29e2f9929bf5e65bf1550df17f585847678dff9a47a9ff623356d95fb3eb not found: ID does not exist" Mar 08 00:25:50.458722 master-0 kubenswrapper[7479]: I0308 00:25:50.457993 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmxq9\" (UniqueName: \"kubernetes.io/projected/56e11e7e-6946-4e11-bce9-e91a721fe4a7-kube-api-access-kmxq9\") pod \"certified-operators-9nqqp\" (UID: \"56e11e7e-6946-4e11-bce9-e91a721fe4a7\") " pod="openshift-marketplace/certified-operators-9nqqp" Mar 08 00:25:50.458722 master-0 kubenswrapper[7479]: I0308 00:25:50.458071 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56e11e7e-6946-4e11-bce9-e91a721fe4a7-utilities\") pod \"certified-operators-9nqqp\" (UID: \"56e11e7e-6946-4e11-bce9-e91a721fe4a7\") " pod="openshift-marketplace/certified-operators-9nqqp" Mar 08 00:25:50.458722 master-0 kubenswrapper[7479]: I0308 00:25:50.458092 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56e11e7e-6946-4e11-bce9-e91a721fe4a7-catalog-content\") pod \"certified-operators-9nqqp\" (UID: \"56e11e7e-6946-4e11-bce9-e91a721fe4a7\") " pod="openshift-marketplace/certified-operators-9nqqp" Mar 08 00:25:50.458722 master-0 kubenswrapper[7479]: I0308 00:25:50.458611 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56e11e7e-6946-4e11-bce9-e91a721fe4a7-catalog-content\") pod \"certified-operators-9nqqp\" (UID: \"56e11e7e-6946-4e11-bce9-e91a721fe4a7\") " pod="openshift-marketplace/certified-operators-9nqqp" Mar 08 00:25:50.459176 master-0 kubenswrapper[7479]: I0308 00:25:50.459117 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56e11e7e-6946-4e11-bce9-e91a721fe4a7-utilities\") pod 
\"certified-operators-9nqqp\" (UID: \"56e11e7e-6946-4e11-bce9-e91a721fe4a7\") " pod="openshift-marketplace/certified-operators-9nqqp"
Mar 08 00:25:50.529805 master-0 kubenswrapper[7479]: I0308 00:25:50.529273 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmxq9\" (UniqueName: \"kubernetes.io/projected/56e11e7e-6946-4e11-bce9-e91a721fe4a7-kube-api-access-kmxq9\") pod \"certified-operators-9nqqp\" (UID: \"56e11e7e-6946-4e11-bce9-e91a721fe4a7\") " pod="openshift-marketplace/certified-operators-9nqqp"
Mar 08 00:25:50.534362 master-0 kubenswrapper[7479]: I0308 00:25:50.534296 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lqc4n"]
Mar 08 00:25:50.539885 master-0 kubenswrapper[7479]: I0308 00:25:50.539852 7479 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lqc4n"]
Mar 08 00:25:50.590611 master-0 kubenswrapper[7479]: I0308 00:25:50.590462 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-9c44c86f9-rplwv"
Mar 08 00:25:50.611100 master-0 kubenswrapper[7479]: I0308 00:25:50.610980 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9nqqp"
Mar 08 00:25:50.626280 master-0 kubenswrapper[7479]: I0308 00:25:50.626187 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-rh4g5"
Mar 08 00:25:50.659027 master-0 kubenswrapper[7479]: W0308 00:25:50.658868 7479 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0607e66_703f_4dc9_8aee_bb36c7e0a7b4.slice/crio-9a5fe78f0b5d57d6ff9e871edd8427e5e35d07b2c99d16979b3d5431015eedb3 WatchSource:0}: Error finding container 9a5fe78f0b5d57d6ff9e871edd8427e5e35d07b2c99d16979b3d5431015eedb3: Status 404 returned error can't find the container with id 9a5fe78f0b5d57d6ff9e871edd8427e5e35d07b2c99d16979b3d5431015eedb3
Mar 08 00:25:50.740860 master-0 kubenswrapper[7479]: I0308 00:25:50.735845 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-fdb5c78b5-5nbfk"]
Mar 08 00:25:50.740860 master-0 kubenswrapper[7479]: I0308 00:25:50.736793 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-5nbfk"
Mar 08 00:25:50.741229 master-0 kubenswrapper[7479]: I0308 00:25:50.741079 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-fdb5c78b5-5nbfk"]
Mar 08 00:25:50.748867 master-0 kubenswrapper[7479]: I0308 00:25:50.748827 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 08 00:25:50.749063 master-0 kubenswrapper[7479]: I0308 00:25:50.748877 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 08 00:25:50.753176 master-0 kubenswrapper[7479]: I0308 00:25:50.749158 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 08 00:25:50.753176 master-0 kubenswrapper[7479]: I0308 00:25:50.749252 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 08 00:25:50.753176 master-0 kubenswrapper[7479]: I0308 00:25:50.749343 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-l6qr9"
Mar 08 00:25:50.753176 master-0 kubenswrapper[7479]: I0308 00:25:50.749944 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 08 00:25:50.807937 master-0 kubenswrapper[7479]: I0308 00:25:50.807777 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-sdsks"]
Mar 08 00:25:50.863449 master-0 kubenswrapper[7479]: I0308 00:25:50.863398 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9d810f7f-258a-47ce-9f99-7b1d93388aee-images\") pod \"machine-config-operator-fdb5c78b5-5nbfk\" (UID: \"9d810f7f-258a-47ce-9f99-7b1d93388aee\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-5nbfk"
Mar 08 00:25:50.863546 master-0 kubenswrapper[7479]: I0308 00:25:50.863450 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9d810f7f-258a-47ce-9f99-7b1d93388aee-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-5nbfk\" (UID: \"9d810f7f-258a-47ce-9f99-7b1d93388aee\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-5nbfk"
Mar 08 00:25:50.863753 master-0 kubenswrapper[7479]: I0308 00:25:50.863711 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz874\" (UniqueName: \"kubernetes.io/projected/9d810f7f-258a-47ce-9f99-7b1d93388aee-kube-api-access-dz874\") pod \"machine-config-operator-fdb5c78b5-5nbfk\" (UID: \"9d810f7f-258a-47ce-9f99-7b1d93388aee\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-5nbfk"
Mar 08 00:25:50.863800 master-0 kubenswrapper[7479]: I0308 00:25:50.863786 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9d810f7f-258a-47ce-9f99-7b1d93388aee-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-5nbfk\" (UID: \"9d810f7f-258a-47ce-9f99-7b1d93388aee\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-5nbfk"
Mar 08 00:25:50.904718 master-0 kubenswrapper[7479]: I0308 00:25:50.903299 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-84bf6db4f9-bncfj"]
Mar 08 00:25:50.905001 master-0 kubenswrapper[7479]: I0308 00:25:50.904806 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-bncfj"
Mar 08 00:25:50.907450 master-0 kubenswrapper[7479]: I0308 00:25:50.907416 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-ppd6p"
Mar 08 00:25:50.907964 master-0 kubenswrapper[7479]: I0308 00:25:50.907937 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 08 00:25:50.908435 master-0 kubenswrapper[7479]: I0308 00:25:50.908394 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 08 00:25:50.908703 master-0 kubenswrapper[7479]: I0308 00:25:50.908675 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 08 00:25:50.912844 master-0 kubenswrapper[7479]: I0308 00:25:50.912738 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-84bf6db4f9-bncfj"]
Mar 08 00:25:50.967359 master-0 kubenswrapper[7479]: I0308 00:25:50.967246 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c7097f64-1709-4f76-a725-5a6c6cc5919b-images\") pod \"machine-api-operator-84bf6db4f9-bncfj\" (UID: \"c7097f64-1709-4f76-a725-5a6c6cc5919b\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-bncfj"
Mar 08 00:25:50.967359 master-0 kubenswrapper[7479]: I0308 00:25:50.967334 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dz874\" (UniqueName: \"kubernetes.io/projected/9d810f7f-258a-47ce-9f99-7b1d93388aee-kube-api-access-dz874\") pod \"machine-config-operator-fdb5c78b5-5nbfk\" (UID: \"9d810f7f-258a-47ce-9f99-7b1d93388aee\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-5nbfk"
Mar 08 00:25:50.967359 master-0 kubenswrapper[7479]: I0308 00:25:50.967363 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9d810f7f-258a-47ce-9f99-7b1d93388aee-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-5nbfk\" (UID: \"9d810f7f-258a-47ce-9f99-7b1d93388aee\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-5nbfk"
Mar 08 00:25:50.967597 master-0 kubenswrapper[7479]: I0308 00:25:50.967384 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9d810f7f-258a-47ce-9f99-7b1d93388aee-images\") pod \"machine-config-operator-fdb5c78b5-5nbfk\" (UID: \"9d810f7f-258a-47ce-9f99-7b1d93388aee\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-5nbfk"
Mar 08 00:25:50.967597 master-0 kubenswrapper[7479]: I0308 00:25:50.967400 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9d810f7f-258a-47ce-9f99-7b1d93388aee-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-5nbfk\" (UID: \"9d810f7f-258a-47ce-9f99-7b1d93388aee\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-5nbfk"
Mar 08 00:25:50.967597 master-0 kubenswrapper[7479]: I0308 00:25:50.967426 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7097f64-1709-4f76-a725-5a6c6cc5919b-config\") pod \"machine-api-operator-84bf6db4f9-bncfj\" (UID: \"c7097f64-1709-4f76-a725-5a6c6cc5919b\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-bncfj"
Mar 08 00:25:50.967597 master-0 kubenswrapper[7479]: I0308 00:25:50.967448 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c7097f64-1709-4f76-a725-5a6c6cc5919b-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-bncfj\" (UID: \"c7097f64-1709-4f76-a725-5a6c6cc5919b\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-bncfj"
Mar 08 00:25:50.967597 master-0 kubenswrapper[7479]: I0308 00:25:50.967467 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvhx4\" (UniqueName: \"kubernetes.io/projected/c7097f64-1709-4f76-a725-5a6c6cc5919b-kube-api-access-zvhx4\") pod \"machine-api-operator-84bf6db4f9-bncfj\" (UID: \"c7097f64-1709-4f76-a725-5a6c6cc5919b\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-bncfj"
Mar 08 00:25:50.968839 master-0 kubenswrapper[7479]: I0308 00:25:50.968819 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9d810f7f-258a-47ce-9f99-7b1d93388aee-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-5nbfk\" (UID: \"9d810f7f-258a-47ce-9f99-7b1d93388aee\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-5nbfk"
Mar 08 00:25:50.968911 master-0 kubenswrapper[7479]: I0308 00:25:50.968881 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9d810f7f-258a-47ce-9f99-7b1d93388aee-images\") pod \"machine-config-operator-fdb5c78b5-5nbfk\" (UID: \"9d810f7f-258a-47ce-9f99-7b1d93388aee\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-5nbfk"
Mar 08 00:25:50.970232 master-0 kubenswrapper[7479]: I0308 00:25:50.970181 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9d810f7f-258a-47ce-9f99-7b1d93388aee-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-5nbfk\" (UID: \"9d810f7f-258a-47ce-9f99-7b1d93388aee\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-5nbfk"
Mar 08 00:25:50.985336 master-0 kubenswrapper[7479]: I0308 00:25:50.985304 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz874\" (UniqueName: \"kubernetes.io/projected/9d810f7f-258a-47ce-9f99-7b1d93388aee-kube-api-access-dz874\") pod \"machine-config-operator-fdb5c78b5-5nbfk\" (UID: \"9d810f7f-258a-47ce-9f99-7b1d93388aee\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-5nbfk"
Mar 08 00:25:51.016974 master-0 kubenswrapper[7479]: I0308 00:25:51.016934 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-9c44c86f9-rplwv"]
Mar 08 00:25:51.068449 master-0 kubenswrapper[7479]: I0308 00:25:51.068406 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7097f64-1709-4f76-a725-5a6c6cc5919b-config\") pod \"machine-api-operator-84bf6db4f9-bncfj\" (UID: \"c7097f64-1709-4f76-a725-5a6c6cc5919b\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-bncfj"
Mar 08 00:25:51.068561 master-0 kubenswrapper[7479]: I0308 00:25:51.068461 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c7097f64-1709-4f76-a725-5a6c6cc5919b-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-bncfj\" (UID: \"c7097f64-1709-4f76-a725-5a6c6cc5919b\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-bncfj"
Mar 08 00:25:51.068561 master-0 kubenswrapper[7479]: I0308 00:25:51.068499 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvhx4\" (UniqueName: \"kubernetes.io/projected/c7097f64-1709-4f76-a725-5a6c6cc5919b-kube-api-access-zvhx4\") pod \"machine-api-operator-84bf6db4f9-bncfj\" (UID: \"c7097f64-1709-4f76-a725-5a6c6cc5919b\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-bncfj"
Mar 08 00:25:51.068561 master-0 kubenswrapper[7479]: I0308 00:25:51.068548 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c7097f64-1709-4f76-a725-5a6c6cc5919b-images\") pod \"machine-api-operator-84bf6db4f9-bncfj\" (UID: \"c7097f64-1709-4f76-a725-5a6c6cc5919b\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-bncfj"
Mar 08 00:25:51.069754 master-0 kubenswrapper[7479]: I0308 00:25:51.069712 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7097f64-1709-4f76-a725-5a6c6cc5919b-config\") pod \"machine-api-operator-84bf6db4f9-bncfj\" (UID: \"c7097f64-1709-4f76-a725-5a6c6cc5919b\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-bncfj"
Mar 08 00:25:51.069908 master-0 kubenswrapper[7479]: I0308 00:25:51.069877 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c7097f64-1709-4f76-a725-5a6c6cc5919b-images\") pod \"machine-api-operator-84bf6db4f9-bncfj\" (UID: \"c7097f64-1709-4f76-a725-5a6c6cc5919b\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-bncfj"
Mar 08 00:25:51.073016 master-0 kubenswrapper[7479]: I0308 00:25:51.072994 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c7097f64-1709-4f76-a725-5a6c6cc5919b-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-bncfj\" (UID: \"c7097f64-1709-4f76-a725-5a6c6cc5919b\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-bncfj"
Mar 08 00:25:51.085435 master-0 kubenswrapper[7479]: I0308 00:25:51.085373 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvhx4\" (UniqueName: \"kubernetes.io/projected/c7097f64-1709-4f76-a725-5a6c6cc5919b-kube-api-access-zvhx4\") pod \"machine-api-operator-84bf6db4f9-bncfj\" (UID: \"c7097f64-1709-4f76-a725-5a6c6cc5919b\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-bncfj"
Mar 08 00:25:51.090067 master-0 kubenswrapper[7479]: I0308 00:25:51.090019 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-5nbfk"
Mar 08 00:25:51.112337 master-0 kubenswrapper[7479]: I0308 00:25:51.112289 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9nqqp"]
Mar 08 00:25:51.229296 master-0 kubenswrapper[7479]: I0308 00:25:51.229172 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-bncfj"
Mar 08 00:25:51.264723 master-0 kubenswrapper[7479]: I0308 00:25:51.264651 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-rh4g5" event={"ID":"d0607e66-703f-4dc9-8aee-bb36c7e0a7b4","Type":"ContainerStarted","Data":"bc8aef14f74b7b8301aa62bf52416d7aecfe942fa89230b452803b210256ff58"}
Mar 08 00:25:51.264723 master-0 kubenswrapper[7479]: I0308 00:25:51.264692 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-rh4g5" event={"ID":"d0607e66-703f-4dc9-8aee-bb36c7e0a7b4","Type":"ContainerStarted","Data":"9a5fe78f0b5d57d6ff9e871edd8427e5e35d07b2c99d16979b3d5431015eedb3"}
Mar 08 00:25:51.266822 master-0 kubenswrapper[7479]: I0308 00:25:51.266798 7479 generic.go:334] "Generic (PLEG): container finished" podID="b2548aca-4a9d-4670-a60a-0d6361d1c441" containerID="fe58071840dc6349204161e59ca64944f26b1ff66582767c1106a706a17472e1" exitCode=0
Mar 08 00:25:51.266893 master-0 kubenswrapper[7479]: I0308 00:25:51.266853 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9j9zs" event={"ID":"b2548aca-4a9d-4670-a60a-0d6361d1c441","Type":"ContainerDied","Data":"fe58071840dc6349204161e59ca64944f26b1ff66582767c1106a706a17472e1"}
Mar 08 00:25:51.268060 master-0 kubenswrapper[7479]: I0308 00:25:51.268023 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-sdsks" event={"ID":"0f496486-70d5-4c5c-b4f3-6cc19427762f","Type":"ContainerStarted","Data":"a68be094b9128e17cfcb273f66f3867ebf81ebb395668f57f098ee489c8a0035"}
Mar 08 00:25:51.269888 master-0 kubenswrapper[7479]: I0308 00:25:51.269865 7479 generic.go:334] "Generic (PLEG): container finished" podID="a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b" containerID="0238d925fb5b554e7f8df9102a9ba758748ba0abdd9b4e92ab97dadd2793a34a" exitCode=0
Mar 08 00:25:51.269956 master-0 kubenswrapper[7479]: I0308 00:25:51.269895 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6t5lg" event={"ID":"a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b","Type":"ContainerDied","Data":"0238d925fb5b554e7f8df9102a9ba758748ba0abdd9b4e92ab97dadd2793a34a"}
Mar 08 00:25:51.437009 master-0 kubenswrapper[7479]: I0308 00:25:51.436956 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4r9ht"]
Mar 08 00:25:51.437349 master-0 kubenswrapper[7479]: I0308 00:25:51.437296 7479 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4r9ht" podUID="6c644b9b-a551-48d2-8f16-e1a6da7d98c9" containerName="registry-server" containerID="cri-o://936db645ff2b40de0fcbea2669720f0e2d16e56c3a9987fff0ee1a1cff12a3c2" gracePeriod=2
Mar 08 00:25:51.670252 master-0 kubenswrapper[7479]: W0308 00:25:51.670166 7479 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd70f4efb_e61a_4e88_a271_2f4af21ecdf3.slice/crio-e0a85ed7ebd2e07f65048b3255f6189a3d4d65a56d9c1df41b7b05764ef3bd29 WatchSource:0}: Error finding container e0a85ed7ebd2e07f65048b3255f6189a3d4d65a56d9c1df41b7b05764ef3bd29: Status 404 returned error can't find the container with id e0a85ed7ebd2e07f65048b3255f6189a3d4d65a56d9c1df41b7b05764ef3bd29
Mar 08 00:25:51.687956 master-0 kubenswrapper[7479]: W0308 00:25:51.687905 7479 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56e11e7e_6946_4e11_bce9_e91a721fe4a7.slice/crio-7f2851a3eb6c41b727b5c53073d970f5dd84de3034b2055a355a0ab0bcf3b48d WatchSource:0}: Error finding container 7f2851a3eb6c41b727b5c53073d970f5dd84de3034b2055a355a0ab0bcf3b48d: Status 404 returned error can't find the container with id 7f2851a3eb6c41b727b5c53073d970f5dd84de3034b2055a355a0ab0bcf3b48d
Mar 08 00:25:51.852998 master-0 kubenswrapper[7479]: I0308 00:25:51.852847 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4fjw9"]
Mar 08 00:25:51.855115 master-0 kubenswrapper[7479]: I0308 00:25:51.855061 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4fjw9"
Mar 08 00:25:51.858824 master-0 kubenswrapper[7479]: I0308 00:25:51.858720 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-bkprm"
Mar 08 00:25:51.865349 master-0 kubenswrapper[7479]: I0308 00:25:51.865237 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4fjw9"]
Mar 08 00:25:51.893301 master-0 kubenswrapper[7479]: I0308 00:25:51.893244 7479 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b94e1ca-5aef-49ae-928e-29cc0ce81d61" path="/var/lib/kubelet/pods/8b94e1ca-5aef-49ae-928e-29cc0ce81d61/volumes"
Mar 08 00:25:51.980287 master-0 kubenswrapper[7479]: I0308 00:25:51.979620 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55c8d406-5448-4056-ab3c-c8399217c024-catalog-content\") pod \"redhat-marketplace-4fjw9\" (UID: \"55c8d406-5448-4056-ab3c-c8399217c024\") " pod="openshift-marketplace/redhat-marketplace-4fjw9"
Mar 08 00:25:51.980287 master-0 kubenswrapper[7479]: I0308 00:25:51.979694 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nljwf\" (UniqueName: \"kubernetes.io/projected/55c8d406-5448-4056-ab3c-c8399217c024-kube-api-access-nljwf\") pod \"redhat-marketplace-4fjw9\" (UID: \"55c8d406-5448-4056-ab3c-c8399217c024\") " pod="openshift-marketplace/redhat-marketplace-4fjw9"
Mar 08 00:25:51.980287 master-0 kubenswrapper[7479]: I0308 00:25:51.979715 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55c8d406-5448-4056-ab3c-c8399217c024-utilities\") pod \"redhat-marketplace-4fjw9\" (UID: \"55c8d406-5448-4056-ab3c-c8399217c024\") " pod="openshift-marketplace/redhat-marketplace-4fjw9"
Mar 08 00:25:52.081675 master-0 kubenswrapper[7479]: I0308 00:25:52.081596 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55c8d406-5448-4056-ab3c-c8399217c024-catalog-content\") pod \"redhat-marketplace-4fjw9\" (UID: \"55c8d406-5448-4056-ab3c-c8399217c024\") " pod="openshift-marketplace/redhat-marketplace-4fjw9"
Mar 08 00:25:52.081675 master-0 kubenswrapper[7479]: I0308 00:25:52.081664 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nljwf\" (UniqueName: \"kubernetes.io/projected/55c8d406-5448-4056-ab3c-c8399217c024-kube-api-access-nljwf\") pod \"redhat-marketplace-4fjw9\" (UID: \"55c8d406-5448-4056-ab3c-c8399217c024\") " pod="openshift-marketplace/redhat-marketplace-4fjw9"
Mar 08 00:25:52.082572 master-0 kubenswrapper[7479]: I0308 00:25:52.081795 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55c8d406-5448-4056-ab3c-c8399217c024-utilities\") pod \"redhat-marketplace-4fjw9\" (UID: \"55c8d406-5448-4056-ab3c-c8399217c024\") " pod="openshift-marketplace/redhat-marketplace-4fjw9"
Mar 08 00:25:52.082829 master-0 kubenswrapper[7479]: I0308 00:25:52.082774 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55c8d406-5448-4056-ab3c-c8399217c024-catalog-content\") pod \"redhat-marketplace-4fjw9\" (UID: \"55c8d406-5448-4056-ab3c-c8399217c024\") " pod="openshift-marketplace/redhat-marketplace-4fjw9"
Mar 08 00:25:52.083800 master-0 kubenswrapper[7479]: I0308 00:25:52.083751 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55c8d406-5448-4056-ab3c-c8399217c024-utilities\") pod \"redhat-marketplace-4fjw9\" (UID: \"55c8d406-5448-4056-ab3c-c8399217c024\") " pod="openshift-marketplace/redhat-marketplace-4fjw9"
Mar 08 00:25:52.096950 master-0 kubenswrapper[7479]: I0308 00:25:52.096903 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nljwf\" (UniqueName: \"kubernetes.io/projected/55c8d406-5448-4056-ab3c-c8399217c024-kube-api-access-nljwf\") pod \"redhat-marketplace-4fjw9\" (UID: \"55c8d406-5448-4056-ab3c-c8399217c024\") " pod="openshift-marketplace/redhat-marketplace-4fjw9"
Mar 08 00:25:52.197016 master-0 kubenswrapper[7479]: I0308 00:25:52.196971 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4fjw9"
Mar 08 00:25:52.277294 master-0 kubenswrapper[7479]: I0308 00:25:52.277254 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-9c44c86f9-rplwv" event={"ID":"d70f4efb-e61a-4e88-a271-2f4af21ecdf3","Type":"ContainerStarted","Data":"e0a85ed7ebd2e07f65048b3255f6189a3d4d65a56d9c1df41b7b05764ef3bd29"}
Mar 08 00:25:52.279064 master-0 kubenswrapper[7479]: I0308 00:25:52.278847 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9nqqp" event={"ID":"56e11e7e-6946-4e11-bce9-e91a721fe4a7","Type":"ContainerStarted","Data":"7f2851a3eb6c41b727b5c53073d970f5dd84de3034b2055a355a0ab0bcf3b48d"}
Mar 08 00:25:52.284174 master-0 kubenswrapper[7479]: I0308 00:25:52.284135 7479 generic.go:334] "Generic (PLEG): container finished" podID="6c644b9b-a551-48d2-8f16-e1a6da7d98c9" containerID="936db645ff2b40de0fcbea2669720f0e2d16e56c3a9987fff0ee1a1cff12a3c2" exitCode=0
Mar 08 00:25:52.284174 master-0 kubenswrapper[7479]: I0308 00:25:52.284173 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4r9ht" event={"ID":"6c644b9b-a551-48d2-8f16-e1a6da7d98c9","Type":"ContainerDied","Data":"936db645ff2b40de0fcbea2669720f0e2d16e56c3a9987fff0ee1a1cff12a3c2"}
Mar 08 00:25:53.786449 master-0 kubenswrapper[7479]: I0308 00:25:53.786393 7479 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4r9ht"
Mar 08 00:25:53.904062 master-0 kubenswrapper[7479]: I0308 00:25:53.903972 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhnf7\" (UniqueName: \"kubernetes.io/projected/6c644b9b-a551-48d2-8f16-e1a6da7d98c9-kube-api-access-rhnf7\") pod \"6c644b9b-a551-48d2-8f16-e1a6da7d98c9\" (UID: \"6c644b9b-a551-48d2-8f16-e1a6da7d98c9\") "
Mar 08 00:25:53.904062 master-0 kubenswrapper[7479]: I0308 00:25:53.904041 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c644b9b-a551-48d2-8f16-e1a6da7d98c9-utilities\") pod \"6c644b9b-a551-48d2-8f16-e1a6da7d98c9\" (UID: \"6c644b9b-a551-48d2-8f16-e1a6da7d98c9\") "
Mar 08 00:25:53.904437 master-0 kubenswrapper[7479]: I0308 00:25:53.904099 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c644b9b-a551-48d2-8f16-e1a6da7d98c9-catalog-content\") pod \"6c644b9b-a551-48d2-8f16-e1a6da7d98c9\" (UID: \"6c644b9b-a551-48d2-8f16-e1a6da7d98c9\") "
Mar 08 00:25:53.905197 master-0 kubenswrapper[7479]: I0308 00:25:53.904945 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c644b9b-a551-48d2-8f16-e1a6da7d98c9-utilities" (OuterVolumeSpecName: "utilities") pod "6c644b9b-a551-48d2-8f16-e1a6da7d98c9" (UID: "6c644b9b-a551-48d2-8f16-e1a6da7d98c9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:25:53.906367 master-0 kubenswrapper[7479]: I0308 00:25:53.906317 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c644b9b-a551-48d2-8f16-e1a6da7d98c9-kube-api-access-rhnf7" (OuterVolumeSpecName: "kube-api-access-rhnf7") pod "6c644b9b-a551-48d2-8f16-e1a6da7d98c9" (UID: "6c644b9b-a551-48d2-8f16-e1a6da7d98c9"). InnerVolumeSpecName "kube-api-access-rhnf7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:25:53.945185 master-0 kubenswrapper[7479]: I0308 00:25:53.945103 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c644b9b-a551-48d2-8f16-e1a6da7d98c9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6c644b9b-a551-48d2-8f16-e1a6da7d98c9" (UID: "6c644b9b-a551-48d2-8f16-e1a6da7d98c9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:25:54.006491 master-0 kubenswrapper[7479]: I0308 00:25:54.006367 7479 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhnf7\" (UniqueName: \"kubernetes.io/projected/6c644b9b-a551-48d2-8f16-e1a6da7d98c9-kube-api-access-rhnf7\") on node \"master-0\" DevicePath \"\""
Mar 08 00:25:54.006491 master-0 kubenswrapper[7479]: I0308 00:25:54.006414 7479 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c644b9b-a551-48d2-8f16-e1a6da7d98c9-utilities\") on node \"master-0\" DevicePath \"\""
Mar 08 00:25:54.006491 master-0 kubenswrapper[7479]: I0308 00:25:54.006429 7479 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c644b9b-a551-48d2-8f16-e1a6da7d98c9-catalog-content\") on node \"master-0\" DevicePath \"\""
Mar 08 00:25:54.298875 master-0 kubenswrapper[7479]: I0308 00:25:54.298763 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4r9ht" event={"ID":"6c644b9b-a551-48d2-8f16-e1a6da7d98c9","Type":"ContainerDied","Data":"6522e09e0271dba6e7e1bcdc92fb3a4714286d0628b2288932b8a0a7d3281419"}
Mar 08 00:25:54.298875 master-0 kubenswrapper[7479]: I0308 00:25:54.298834 7479 scope.go:117] "RemoveContainer" containerID="936db645ff2b40de0fcbea2669720f0e2d16e56c3a9987fff0ee1a1cff12a3c2"
Mar 08 00:25:54.298875 master-0 kubenswrapper[7479]: I0308 00:25:54.298839 7479 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4r9ht"
Mar 08 00:25:54.367278 master-0 kubenswrapper[7479]: I0308 00:25:54.366252 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4r9ht"]
Mar 08 00:25:54.372192 master-0 kubenswrapper[7479]: I0308 00:25:54.372153 7479 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4r9ht"]
Mar 08 00:25:54.874883 master-0 kubenswrapper[7479]: I0308 00:25:54.874516 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-84bf6db4f9-bncfj"]
Mar 08 00:25:54.891613 master-0 kubenswrapper[7479]: I0308 00:25:54.891553 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-fdb5c78b5-5nbfk"]
Mar 08 00:25:54.902402 master-0 kubenswrapper[7479]: I0308 00:25:54.898106 7479 scope.go:117] "RemoveContainer" containerID="d6ae963fd70f7061dfef7c8b6ee26bdbd4f75ddaaff7d7835ce22ba22a0fa9c1"
Mar 08 00:25:55.312345 master-0 kubenswrapper[7479]: I0308 00:25:55.312297 7479 generic.go:334] "Generic (PLEG): container finished" podID="614f0a0f-5853-4cf6-bd3d-174141f0f1e2" containerID="ad3a46887dab7ea3bfa412ad6cf5418fcbb18c2c14aa2dc59012eeca70fc7d9a" exitCode=0
Mar 08 00:25:55.312564 master-0 kubenswrapper[7479]: I0308 00:25:55.312392 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-8f89dfddd-brq9l" event={"ID":"614f0a0f-5853-4cf6-bd3d-174141f0f1e2","Type":"ContainerDied","Data":"ad3a46887dab7ea3bfa412ad6cf5418fcbb18c2c14aa2dc59012eeca70fc7d9a"}
Mar 08 00:25:55.313549 master-0 kubenswrapper[7479]: I0308 00:25:55.313520 7479 scope.go:117] "RemoveContainer" containerID="ad3a46887dab7ea3bfa412ad6cf5418fcbb18c2c14aa2dc59012eeca70fc7d9a"
Mar 08 00:25:55.764283 master-0 kubenswrapper[7479]: W0308 00:25:55.764240 7479 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d810f7f_258a_47ce_9f99_7b1d93388aee.slice/crio-3fcfcac3d94a68502eedf27bec2a63baba722b253947b783bc8a405ac2ab5cd7 WatchSource:0}: Error finding container 3fcfcac3d94a68502eedf27bec2a63baba722b253947b783bc8a405ac2ab5cd7: Status 404 returned error can't find the container with id 3fcfcac3d94a68502eedf27bec2a63baba722b253947b783bc8a405ac2ab5cd7
Mar 08 00:25:55.897464 master-0 kubenswrapper[7479]: I0308 00:25:55.897034 7479 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c644b9b-a551-48d2-8f16-e1a6da7d98c9" path="/var/lib/kubelet/pods/6c644b9b-a551-48d2-8f16-e1a6da7d98c9/volumes"
Mar 08 00:25:56.272999 master-0 kubenswrapper[7479]: I0308 00:25:56.272947 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4fjw9"]
Mar 08 00:25:56.320454 master-0 kubenswrapper[7479]: I0308 00:25:56.320380 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-bncfj" event={"ID":"c7097f64-1709-4f76-a725-5a6c6cc5919b","Type":"ContainerStarted","Data":"1cddeda960c60a71faf688d26e861f0212c8666ffc3672e89502d43761b93cd2"}
Mar 08 00:25:56.323725 master-0 kubenswrapper[7479]: I0308 00:25:56.322944 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9nqqp" event={"ID":"56e11e7e-6946-4e11-bce9-e91a721fe4a7","Type":"ContainerStarted","Data":"cceb2895b7ad1a9aea1a615553362ea80d4700a89b8411dc29278d45b0d40f09"}
Mar 08 00:25:56.324561 master-0 kubenswrapper[7479]: I0308 00:25:56.324523 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-5nbfk" event={"ID":"9d810f7f-258a-47ce-9f99-7b1d93388aee","Type":"ContainerStarted","Data":"3fcfcac3d94a68502eedf27bec2a63baba722b253947b783bc8a405ac2ab5cd7"}
Mar 08 00:25:56.327324 master-0 kubenswrapper[7479]: I0308 00:25:56.327290 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-9c44c86f9-rplwv" event={"ID":"d70f4efb-e61a-4e88-a271-2f4af21ecdf3","Type":"ContainerStarted","Data":"e019ef0a1aaa25b302b6691d82feab7cd7bb9ac300d9fa2874c54e4a866f472b"}
Mar 08 00:25:56.327582 master-0 kubenswrapper[7479]: I0308 00:25:56.327552 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-9c44c86f9-rplwv"
Mar 08 00:25:56.379640 master-0 kubenswrapper[7479]: I0308 00:25:56.379300 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-9c44c86f9-rplwv" podStartSLOduration=7.379265073 podStartE2EDuration="7.379265073s" podCreationTimestamp="2026-03-08 00:25:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:25:56.372726797 +0000 UTC m=+272.685635744" watchObservedRunningTime="2026-03-08 00:25:56.379265073 +0000 UTC m=+272.692173990"
Mar 08 00:25:56.406808 master-0 kubenswrapper[7479]: I0308 00:25:56.406756 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-9c44c86f9-rplwv"
Mar 08 00:25:58.536298 master-0 kubenswrapper[7479]: I0308 00:25:58.536256 7479 scope.go:117] "RemoveContainer" containerID="989c0be29898f604cd52cd2114aa3064cf0c55ea5a9ce0b189962fd1f75c107c"
Mar 08 00:25:58.546940 master-0 kubenswrapper[7479]: W0308 00:25:58.545001 7479 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55c8d406_5448_4056_ab3c_c8399217c024.slice/crio-ff2ce08940304b5b606944a45d5884c507d106440aae4429902a0d2f21368070 WatchSource:0}: Error finding container ff2ce08940304b5b606944a45d5884c507d106440aae4429902a0d2f21368070: Status 404 returned error can't find the container with id ff2ce08940304b5b606944a45d5884c507d106440aae4429902a0d2f21368070
Mar 08 00:25:59.374472 master-0 kubenswrapper[7479]: I0308 00:25:59.373724 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-8f89dfddd-brq9l" event={"ID":"614f0a0f-5853-4cf6-bd3d-174141f0f1e2","Type":"ContainerStarted","Data":"b19f92b1598dbc89d7fa4f28fc4aac7b76c5f4ec1d5d7efb6ada3eb88a730a6f"}
Mar 08 00:25:59.377835 master-0 kubenswrapper[7479]: I0308 00:25:59.376562 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-8krst" event={"ID":"460f09d8-a143-48d2-9db0-be247386984a","Type":"ContainerStarted","Data":"da7f059bc7425c70bc4a951221ce223000707cc405db21efd57cd77b538e3498"}
Mar 08 00:25:59.387582 master-0 kubenswrapper[7479]: I0308 00:25:59.384345 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-5nbfk" event={"ID":"9d810f7f-258a-47ce-9f99-7b1d93388aee","Type":"ContainerStarted","Data":"46b0fd729a946db9b13eac5c57c40b40e4b8a56cd0aeaad608c0b0bcae727675"}
Mar 08 00:25:59.387582 master-0 kubenswrapper[7479]: I0308 00:25:59.384390 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-5nbfk" event={"ID":"9d810f7f-258a-47ce-9f99-7b1d93388aee","Type":"ContainerStarted","Data":"4ade0408e709b8d3bfa126728a922decfde81b90bd3f67b5bee03661da1d8a83"}
Mar 08 00:25:59.387582 master-0 kubenswrapper[7479]: I0308 00:25:59.386720 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9j9zs" event={"ID":"b2548aca-4a9d-4670-a60a-0d6361d1c441","Type":"ContainerStarted","Data":"031c64f86b4914d8ed85469cff79e56b7a2e1cbd518e0fd70f47211192095f45"}
Mar 08 00:25:59.388292 master-0 kubenswrapper[7479]: I0308 00:25:59.387847 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-bncfj" event={"ID":"c7097f64-1709-4f76-a725-5a6c6cc5919b","Type":"ContainerStarted","Data":"594372803c90fd234334b17b9df7ae74ff21542e2952be96f9e083d29faca78a"}
Mar 08 00:25:59.393998 master-0 kubenswrapper[7479]: I0308 00:25:59.393827 7479 generic.go:334] "Generic (PLEG): container finished" podID="55c8d406-5448-4056-ab3c-c8399217c024" containerID="5c5fe88ca84d34535298e53e21f41989f9811c3fb403419a0f79b41f340064f5" exitCode=0
Mar 08 00:25:59.393998 master-0 kubenswrapper[7479]: I0308 00:25:59.393890 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4fjw9" event={"ID":"55c8d406-5448-4056-ab3c-c8399217c024","Type":"ContainerDied","Data":"5c5fe88ca84d34535298e53e21f41989f9811c3fb403419a0f79b41f340064f5"}
Mar 08 00:25:59.393998 master-0 kubenswrapper[7479]: I0308 00:25:59.393914 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4fjw9" event={"ID":"55c8d406-5448-4056-ab3c-c8399217c024","Type":"ContainerStarted","Data":"ff2ce08940304b5b606944a45d5884c507d106440aae4429902a0d2f21368070"}
Mar 08 00:25:59.398915 master-0 kubenswrapper[7479]: I0308 00:25:59.398882 7479 generic.go:334] "Generic (PLEG): container finished" podID="56e11e7e-6946-4e11-bce9-e91a721fe4a7" containerID="cceb2895b7ad1a9aea1a615553362ea80d4700a89b8411dc29278d45b0d40f09" exitCode=0
Mar 08 00:25:59.399004 master-0 kubenswrapper[7479]: I0308 00:25:59.398936 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9nqqp" event={"ID":"56e11e7e-6946-4e11-bce9-e91a721fe4a7","Type":"ContainerDied","Data":"cceb2895b7ad1a9aea1a615553362ea80d4700a89b8411dc29278d45b0d40f09"}
Mar 08 00:25:59.411856 master-0 kubenswrapper[7479]: I0308 00:25:59.411806 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-8lf4q"
event={"ID":"e3f42081-387d-4798-b981-ac232e851bb4","Type":"ContainerStarted","Data":"b2272201017b4214b0d3b2d37079305086623f271eb44fd6320c5be45bef2f26"} Mar 08 00:25:59.411995 master-0 kubenswrapper[7479]: I0308 00:25:59.411857 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-8lf4q" event={"ID":"e3f42081-387d-4798-b981-ac232e851bb4","Type":"ContainerStarted","Data":"1581c52c50b103d88a3f7e59b35292fc1d1154d3b7d7ca2cbb56b6eef1ed3e4b"} Mar 08 00:25:59.428997 master-0 kubenswrapper[7479]: I0308 00:25:59.427362 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-rh4g5" event={"ID":"d0607e66-703f-4dc9-8aee-bb36c7e0a7b4","Type":"ContainerStarted","Data":"788cd6ec7c24f7bc899952e78b4164fdd4945980da6cb205e7a4ac8c582f3eb5"} Mar 08 00:25:59.454998 master-0 kubenswrapper[7479]: I0308 00:25:59.454405 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6t5lg" event={"ID":"a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b","Type":"ContainerStarted","Data":"96340e4adcba39009221d3be0b5592f41b18ec7a6d4f125088b3408673ad95fe"} Mar 08 00:25:59.460669 master-0 kubenswrapper[7479]: I0308 00:25:59.460623 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-nrb7q" event={"ID":"e78057cd-5120-4a12-934d-9fed51e1bdc0","Type":"ContainerStarted","Data":"55fbbec4f49e2e61889c0fced169d57405e19efe1cb7fb53095eac0414a18aa2"} Mar 08 00:25:59.466272 master-0 kubenswrapper[7479]: I0308 00:25:59.466219 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dpg4q" event={"ID":"3d2e1686-3a30-4021-9c03-02e472bc6ff3","Type":"ContainerStarted","Data":"34ce99c1480780527cadfa670226036ef9c17ba4caf6288b67da10db8e7da68e"} Mar 08 00:25:59.477936 master-0 kubenswrapper[7479]: I0308 
00:25:59.477709 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-8lgqf" event={"ID":"3b9823a9-2491-44b5-8bf2-22352558a2a3","Type":"ContainerStarted","Data":"5a500d2c1f8696d0304f6d90b8b1ba2343bb37980187644821c808366f21e1a3"} Mar 08 00:25:59.484825 master-0 kubenswrapper[7479]: I0308 00:25:59.484773 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-sdsks" event={"ID":"0f496486-70d5-4c5c-b4f3-6cc19427762f","Type":"ContainerStarted","Data":"f74d256abcdb5398186b869309f30f30a8ba6d7a0454838bd1b4e98ad498b4cd"} Mar 08 00:25:59.495701 master-0 kubenswrapper[7479]: I0308 00:25:59.491125 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6" event={"ID":"84522c03-fd7b-4be7-9413-84e510b9dc5a","Type":"ContainerStarted","Data":"10e105765ad69984ad662df10f70f89fc3258bff9a6fa6179599d2b62b4cdd81"} Mar 08 00:25:59.495701 master-0 kubenswrapper[7479]: I0308 00:25:59.491176 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6" event={"ID":"84522c03-fd7b-4be7-9413-84e510b9dc5a","Type":"ContainerStarted","Data":"8db7391cc36022b8c4fa21dd3d33b8e00c7e53dfad0cc53ffef3d1fff055fc5c"} Mar 08 00:25:59.534059 master-0 kubenswrapper[7479]: I0308 00:25:59.533981 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-5nbfk" podStartSLOduration=9.533960138 podStartE2EDuration="9.533960138s" podCreationTimestamp="2026-03-08 00:25:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:25:59.531145305 +0000 UTC m=+275.844054222" watchObservedRunningTime="2026-03-08 00:25:59.533960138 +0000 
UTC m=+275.846869055" Mar 08 00:25:59.604727 master-0 kubenswrapper[7479]: I0308 00:25:59.604645 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-8krst" podStartSLOduration=5.886869891 podStartE2EDuration="11.604624475s" podCreationTimestamp="2026-03-08 00:25:48 +0000 UTC" firstStartedPulling="2026-03-08 00:25:49.157518161 +0000 UTC m=+265.470427078" lastFinishedPulling="2026-03-08 00:25:54.875272745 +0000 UTC m=+271.188181662" observedRunningTime="2026-03-08 00:25:59.601988615 +0000 UTC m=+275.914897542" watchObservedRunningTime="2026-03-08 00:25:59.604624475 +0000 UTC m=+275.917533392" Mar 08 00:25:59.684999 master-0 kubenswrapper[7479]: I0308 00:25:59.682371 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-rh4g5" podStartSLOduration=4.216840078 podStartE2EDuration="11.682350984s" podCreationTimestamp="2026-03-08 00:25:48 +0000 UTC" firstStartedPulling="2026-03-08 00:25:51.077872733 +0000 UTC m=+267.390781650" lastFinishedPulling="2026-03-08 00:25:58.543383639 +0000 UTC m=+274.856292556" observedRunningTime="2026-03-08 00:25:59.647738894 +0000 UTC m=+275.960647821" watchObservedRunningTime="2026-03-08 00:25:59.682350984 +0000 UTC m=+275.995259901" Mar 08 00:25:59.684999 master-0 kubenswrapper[7479]: I0308 00:25:59.683531 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6" podStartSLOduration=2.3669240240000002 podStartE2EDuration="10.683524058s" podCreationTimestamp="2026-03-08 00:25:49 +0000 UTC" firstStartedPulling="2026-03-08 00:25:50.232686883 +0000 UTC m=+266.545595800" lastFinishedPulling="2026-03-08 00:25:58.549286917 +0000 UTC m=+274.862195834" observedRunningTime="2026-03-08 00:25:59.681323182 +0000 UTC m=+275.994232099" watchObservedRunningTime="2026-03-08 00:25:59.683524058 
+0000 UTC m=+275.996432975" Mar 08 00:25:59.727221 master-0 kubenswrapper[7479]: I0308 00:25:59.720155 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6t5lg" podStartSLOduration=6.106110486 podStartE2EDuration="12.720134741s" podCreationTimestamp="2026-03-08 00:25:47 +0000 UTC" firstStartedPulling="2026-03-08 00:25:49.206695726 +0000 UTC m=+265.519604643" lastFinishedPulling="2026-03-08 00:25:55.820719981 +0000 UTC m=+272.133628898" observedRunningTime="2026-03-08 00:25:59.709519049 +0000 UTC m=+276.022427966" watchObservedRunningTime="2026-03-08 00:25:59.720134741 +0000 UTC m=+276.033043658" Mar 08 00:25:59.739487 master-0 kubenswrapper[7479]: I0308 00:25:59.737429 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-nrb7q" podStartSLOduration=3.319586298 podStartE2EDuration="11.737410501s" podCreationTimestamp="2026-03-08 00:25:48 +0000 UTC" firstStartedPulling="2026-03-08 00:25:50.248730601 +0000 UTC m=+266.561639518" lastFinishedPulling="2026-03-08 00:25:58.666554764 +0000 UTC m=+274.979463721" observedRunningTime="2026-03-08 00:25:59.735040954 +0000 UTC m=+276.047949871" watchObservedRunningTime="2026-03-08 00:25:59.737410501 +0000 UTC m=+276.050319418" Mar 08 00:25:59.794026 master-0 kubenswrapper[7479]: I0308 00:25:59.793924 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dpg4q" podStartSLOduration=6.585622682 podStartE2EDuration="11.793886395s" podCreationTimestamp="2026-03-08 00:25:48 +0000 UTC" firstStartedPulling="2026-03-08 00:25:49.666432635 +0000 UTC m=+265.979341552" lastFinishedPulling="2026-03-08 00:25:54.874696348 +0000 UTC m=+271.187605265" observedRunningTime="2026-03-08 00:25:59.788702445 +0000 UTC m=+276.101611362" watchObservedRunningTime="2026-03-08 00:25:59.793886395 +0000 UTC 
m=+276.106795312" Mar 08 00:25:59.819692 master-0 kubenswrapper[7479]: I0308 00:25:59.818703 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-8lf4q" podStartSLOduration=6.118761698 podStartE2EDuration="11.818679121s" podCreationTimestamp="2026-03-08 00:25:48 +0000 UTC" firstStartedPulling="2026-03-08 00:25:49.174239179 +0000 UTC m=+265.487148096" lastFinishedPulling="2026-03-08 00:25:54.874156562 +0000 UTC m=+271.187065519" observedRunningTime="2026-03-08 00:25:59.815460734 +0000 UTC m=+276.128369661" watchObservedRunningTime="2026-03-08 00:25:59.818679121 +0000 UTC m=+276.131588038" Mar 08 00:25:59.854610 master-0 kubenswrapper[7479]: I0308 00:25:59.854534 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-sdsks" podStartSLOduration=3.06216482 podStartE2EDuration="10.854500116s" podCreationTimestamp="2026-03-08 00:25:49 +0000 UTC" firstStartedPulling="2026-03-08 00:25:50.849690004 +0000 UTC m=+267.162598921" lastFinishedPulling="2026-03-08 00:25:58.64202527 +0000 UTC m=+274.954934217" observedRunningTime="2026-03-08 00:25:59.854245133 +0000 UTC m=+276.167154080" watchObservedRunningTime="2026-03-08 00:25:59.854500116 +0000 UTC m=+276.167409043" Mar 08 00:26:00.499894 master-0 kubenswrapper[7479]: I0308 00:26:00.499850 7479 generic.go:334] "Generic (PLEG): container finished" podID="b2548aca-4a9d-4670-a60a-0d6361d1c441" containerID="031c64f86b4914d8ed85469cff79e56b7a2e1cbd518e0fd70f47211192095f45" exitCode=0 Mar 08 00:26:00.500064 master-0 kubenswrapper[7479]: I0308 00:26:00.499935 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9j9zs" event={"ID":"b2548aca-4a9d-4670-a60a-0d6361d1c441","Type":"ContainerDied","Data":"031c64f86b4914d8ed85469cff79e56b7a2e1cbd518e0fd70f47211192095f45"} Mar 08 00:26:00.502867 master-0 
kubenswrapper[7479]: I0308 00:26:00.502627 7479 generic.go:334] "Generic (PLEG): container finished" podID="55c8d406-5448-4056-ab3c-c8399217c024" containerID="f1165833632b857988bef725397f89c163ab44ca5ba27c1f2f567224751fe8ad" exitCode=0 Mar 08 00:26:00.502958 master-0 kubenswrapper[7479]: I0308 00:26:00.502894 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4fjw9" event={"ID":"55c8d406-5448-4056-ab3c-c8399217c024","Type":"ContainerDied","Data":"f1165833632b857988bef725397f89c163ab44ca5ba27c1f2f567224751fe8ad"} Mar 08 00:26:00.505814 master-0 kubenswrapper[7479]: I0308 00:26:00.504889 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9nqqp" event={"ID":"56e11e7e-6946-4e11-bce9-e91a721fe4a7","Type":"ContainerStarted","Data":"819bab5050551748fadc71568c0e7c229f38c2b2cb38e16a3bd09395d5299f4e"} Mar 08 00:26:00.510750 master-0 kubenswrapper[7479]: I0308 00:26:00.510665 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-8lgqf" event={"ID":"3b9823a9-2491-44b5-8bf2-22352558a2a3","Type":"ContainerStarted","Data":"2c55e9027e9db2cb3df9959e5475f9fd769e23cb7ecb353d1a2f6789fe41833c"} Mar 08 00:26:00.510750 master-0 kubenswrapper[7479]: I0308 00:26:00.510702 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-8lgqf" event={"ID":"3b9823a9-2491-44b5-8bf2-22352558a2a3","Type":"ContainerStarted","Data":"448ae8ff53f7646a273cdf09b220fc2247ebe60a128d876e614d7cb7d241e38b"} Mar 08 00:26:00.577976 master-0 kubenswrapper[7479]: I0308 00:26:00.577912 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-8lgqf" podStartSLOduration=6.549267805 
podStartE2EDuration="12.577894624s" podCreationTimestamp="2026-03-08 00:25:48 +0000 UTC" firstStartedPulling="2026-03-08 00:25:48.846470243 +0000 UTC m=+265.159379160" lastFinishedPulling="2026-03-08 00:25:54.875097022 +0000 UTC m=+271.188005979" observedRunningTime="2026-03-08 00:26:00.576833032 +0000 UTC m=+276.889741949" watchObservedRunningTime="2026-03-08 00:26:00.577894624 +0000 UTC m=+276.890803541" Mar 08 00:26:01.520671 master-0 kubenswrapper[7479]: I0308 00:26:01.520573 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9j9zs" event={"ID":"b2548aca-4a9d-4670-a60a-0d6361d1c441","Type":"ContainerStarted","Data":"dac2b4107815aa7d9649b2815ef78f301ab7916075e5059aa3a49b2c981a36fe"} Mar 08 00:26:01.533595 master-0 kubenswrapper[7479]: I0308 00:26:01.533499 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4fjw9" event={"ID":"55c8d406-5448-4056-ab3c-c8399217c024","Type":"ContainerStarted","Data":"bb5c6100970e1f98de7541d0e14fa48c4311bd4754ce859444be673afbee41d8"} Mar 08 00:26:01.536448 master-0 kubenswrapper[7479]: I0308 00:26:01.536405 7479 generic.go:334] "Generic (PLEG): container finished" podID="56e11e7e-6946-4e11-bce9-e91a721fe4a7" containerID="819bab5050551748fadc71568c0e7c229f38c2b2cb38e16a3bd09395d5299f4e" exitCode=0 Mar 08 00:26:01.536573 master-0 kubenswrapper[7479]: I0308 00:26:01.536510 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9nqqp" event={"ID":"56e11e7e-6946-4e11-bce9-e91a721fe4a7","Type":"ContainerDied","Data":"819bab5050551748fadc71568c0e7c229f38c2b2cb38e16a3bd09395d5299f4e"} Mar 08 00:26:01.555847 master-0 kubenswrapper[7479]: I0308 00:26:01.554808 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9j9zs" podStartSLOduration=3.308028638 podStartE2EDuration="12.554782046s" podCreationTimestamp="2026-03-08 00:25:49 +0000 UTC" 
firstStartedPulling="2026-03-08 00:25:51.662565541 +0000 UTC m=+267.975474458" lastFinishedPulling="2026-03-08 00:26:00.909318949 +0000 UTC m=+277.222227866" observedRunningTime="2026-03-08 00:26:01.552811583 +0000 UTC m=+277.865720540" watchObservedRunningTime="2026-03-08 00:26:01.554782046 +0000 UTC m=+277.867690963" Mar 08 00:26:01.601288 master-0 kubenswrapper[7479]: I0308 00:26:01.601220 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4fjw9" podStartSLOduration=8.903575024 podStartE2EDuration="10.601186053s" podCreationTimestamp="2026-03-08 00:25:51 +0000 UTC" firstStartedPulling="2026-03-08 00:25:59.396978513 +0000 UTC m=+275.709887430" lastFinishedPulling="2026-03-08 00:26:01.094589532 +0000 UTC m=+277.407498459" observedRunningTime="2026-03-08 00:26:01.599875488 +0000 UTC m=+277.912784425" watchObservedRunningTime="2026-03-08 00:26:01.601186053 +0000 UTC m=+277.914094970" Mar 08 00:26:02.197372 master-0 kubenswrapper[7479]: I0308 00:26:02.197228 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4fjw9" Mar 08 00:26:02.197660 master-0 kubenswrapper[7479]: I0308 00:26:02.197616 7479 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4fjw9" Mar 08 00:26:02.377840 master-0 kubenswrapper[7479]: I0308 00:26:02.377785 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-k7pnc"] Mar 08 00:26:02.378109 master-0 kubenswrapper[7479]: E0308 00:26:02.378033 7479 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c644b9b-a551-48d2-8f16-e1a6da7d98c9" containerName="registry-server" Mar 08 00:26:02.378109 master-0 kubenswrapper[7479]: I0308 00:26:02.378051 7479 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c644b9b-a551-48d2-8f16-e1a6da7d98c9" containerName="registry-server" Mar 08 
00:26:02.378109 master-0 kubenswrapper[7479]: E0308 00:26:02.378061 7479 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c644b9b-a551-48d2-8f16-e1a6da7d98c9" containerName="extract-utilities" Mar 08 00:26:02.378109 master-0 kubenswrapper[7479]: I0308 00:26:02.378069 7479 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c644b9b-a551-48d2-8f16-e1a6da7d98c9" containerName="extract-utilities" Mar 08 00:26:02.378109 master-0 kubenswrapper[7479]: E0308 00:26:02.378080 7479 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c644b9b-a551-48d2-8f16-e1a6da7d98c9" containerName="extract-content" Mar 08 00:26:02.378109 master-0 kubenswrapper[7479]: I0308 00:26:02.378086 7479 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c644b9b-a551-48d2-8f16-e1a6da7d98c9" containerName="extract-content" Mar 08 00:26:02.378381 master-0 kubenswrapper[7479]: I0308 00:26:02.378190 7479 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c644b9b-a551-48d2-8f16-e1a6da7d98c9" containerName="registry-server" Mar 08 00:26:02.378766 master-0 kubenswrapper[7479]: I0308 00:26:02.378730 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-k7pnc" Mar 08 00:26:02.382990 master-0 kubenswrapper[7479]: I0308 00:26:02.382960 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 08 00:26:02.383163 master-0 kubenswrapper[7479]: I0308 00:26:02.383149 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-wlrqc" Mar 08 00:26:02.435394 master-0 kubenswrapper[7479]: I0308 00:26:02.435348 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7-rootfs\") pod \"machine-config-daemon-k7pnc\" (UID: \"ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7\") " pod="openshift-machine-config-operator/machine-config-daemon-k7pnc" Mar 08 00:26:02.435394 master-0 kubenswrapper[7479]: I0308 00:26:02.435402 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ggmz\" (UniqueName: \"kubernetes.io/projected/ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7-kube-api-access-2ggmz\") pod \"machine-config-daemon-k7pnc\" (UID: \"ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7\") " pod="openshift-machine-config-operator/machine-config-daemon-k7pnc" Mar 08 00:26:02.435815 master-0 kubenswrapper[7479]: I0308 00:26:02.435427 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7-proxy-tls\") pod \"machine-config-daemon-k7pnc\" (UID: \"ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7\") " pod="openshift-machine-config-operator/machine-config-daemon-k7pnc" Mar 08 00:26:02.435815 master-0 kubenswrapper[7479]: I0308 00:26:02.435450 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7-mcd-auth-proxy-config\") pod \"machine-config-daemon-k7pnc\" (UID: \"ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7\") " pod="openshift-machine-config-operator/machine-config-daemon-k7pnc" Mar 08 00:26:02.536679 master-0 kubenswrapper[7479]: I0308 00:26:02.536631 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ggmz\" (UniqueName: \"kubernetes.io/projected/ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7-kube-api-access-2ggmz\") pod \"machine-config-daemon-k7pnc\" (UID: \"ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7\") " pod="openshift-machine-config-operator/machine-config-daemon-k7pnc" Mar 08 00:26:02.537128 master-0 kubenswrapper[7479]: I0308 00:26:02.536694 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7-proxy-tls\") pod \"machine-config-daemon-k7pnc\" (UID: \"ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7\") " pod="openshift-machine-config-operator/machine-config-daemon-k7pnc" Mar 08 00:26:02.537128 master-0 kubenswrapper[7479]: I0308 00:26:02.536734 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7-mcd-auth-proxy-config\") pod \"machine-config-daemon-k7pnc\" (UID: \"ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7\") " pod="openshift-machine-config-operator/machine-config-daemon-k7pnc" Mar 08 00:26:02.537128 master-0 kubenswrapper[7479]: I0308 00:26:02.536825 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7-rootfs\") pod \"machine-config-daemon-k7pnc\" (UID: \"ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7\") " pod="openshift-machine-config-operator/machine-config-daemon-k7pnc" Mar 08 00:26:02.537128 
master-0 kubenswrapper[7479]: I0308 00:26:02.536911 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7-rootfs\") pod \"machine-config-daemon-k7pnc\" (UID: \"ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7\") " pod="openshift-machine-config-operator/machine-config-daemon-k7pnc" Mar 08 00:26:02.537999 master-0 kubenswrapper[7479]: I0308 00:26:02.537980 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7-mcd-auth-proxy-config\") pod \"machine-config-daemon-k7pnc\" (UID: \"ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7\") " pod="openshift-machine-config-operator/machine-config-daemon-k7pnc" Mar 08 00:26:02.540994 master-0 kubenswrapper[7479]: I0308 00:26:02.540950 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7-proxy-tls\") pod \"machine-config-daemon-k7pnc\" (UID: \"ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7\") " pod="openshift-machine-config-operator/machine-config-daemon-k7pnc" Mar 08 00:26:02.547705 master-0 kubenswrapper[7479]: I0308 00:26:02.547657 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9nqqp" event={"ID":"56e11e7e-6946-4e11-bce9-e91a721fe4a7","Type":"ContainerStarted","Data":"384c65ce883105e112d84de0e43a4a493c36b10bc529d9576a7501903ba90ca3"} Mar 08 00:26:02.553615 master-0 kubenswrapper[7479]: I0308 00:26:02.553548 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ggmz\" (UniqueName: \"kubernetes.io/projected/ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7-kube-api-access-2ggmz\") pod \"machine-config-daemon-k7pnc\" (UID: \"ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7\") " pod="openshift-machine-config-operator/machine-config-daemon-k7pnc" Mar 08 00:26:02.572596 
master-0 kubenswrapper[7479]: I0308 00:26:02.572455 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9nqqp" podStartSLOduration=9.674743186 podStartE2EDuration="12.572398278s" podCreationTimestamp="2026-03-08 00:25:50 +0000 UTC" firstStartedPulling="2026-03-08 00:25:59.400441213 +0000 UTC m=+275.713350130" lastFinishedPulling="2026-03-08 00:26:02.298096295 +0000 UTC m=+278.611005222" observedRunningTime="2026-03-08 00:26:02.565994214 +0000 UTC m=+278.878903131" watchObservedRunningTime="2026-03-08 00:26:02.572398278 +0000 UTC m=+278.885307205" Mar 08 00:26:02.723603 master-0 kubenswrapper[7479]: I0308 00:26:02.723153 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-k7pnc" Mar 08 00:26:02.745777 master-0 kubenswrapper[7479]: W0308 00:26:02.745689 7479 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce9b1b97_d4f1_4e1f_9a96_8b67c3fd84f7.slice/crio-60f1d2698bbdc9af90765d1ef46cd020d376aa4c007400334c8fc83e64d3d86f WatchSource:0}: Error finding container 60f1d2698bbdc9af90765d1ef46cd020d376aa4c007400334c8fc83e64d3d86f: Status 404 returned error can't find the container with id 60f1d2698bbdc9af90765d1ef46cd020d376aa4c007400334c8fc83e64d3d86f Mar 08 00:26:03.242585 master-0 kubenswrapper[7479]: I0308 00:26:03.242495 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-4fjw9" podUID="55c8d406-5448-4056-ab3c-c8399217c024" containerName="registry-server" probeResult="failure" output=< Mar 08 00:26:03.242585 master-0 kubenswrapper[7479]: timeout: failed to connect service ":50051" within 1s Mar 08 00:26:03.242585 master-0 kubenswrapper[7479]: > Mar 08 00:26:03.555858 master-0 kubenswrapper[7479]: I0308 00:26:03.555802 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-k7pnc" event={"ID":"ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7","Type":"ContainerStarted","Data":"881492ede708564c2b50f2504981788dae1af5d233f3feb7510c408faa94d0fe"} Mar 08 00:26:03.555858 master-0 kubenswrapper[7479]: I0308 00:26:03.555857 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7pnc" event={"ID":"ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7","Type":"ContainerStarted","Data":"7ab4fa4e971789d5db1c529b4678cdec74ff9e32562173d88e9c894bbbe80a3b"} Mar 08 00:26:03.555858 master-0 kubenswrapper[7479]: I0308 00:26:03.555869 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7pnc" event={"ID":"ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7","Type":"ContainerStarted","Data":"60f1d2698bbdc9af90765d1ef46cd020d376aa4c007400334c8fc83e64d3d86f"} Mar 08 00:26:03.576918 master-0 kubenswrapper[7479]: I0308 00:26:03.576553 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-k7pnc" podStartSLOduration=1.576532424 podStartE2EDuration="1.576532424s" podCreationTimestamp="2026-03-08 00:26:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:26:03.574648952 +0000 UTC m=+279.887557879" watchObservedRunningTime="2026-03-08 00:26:03.576532424 +0000 UTC m=+279.889441341" Mar 08 00:26:04.291361 master-0 kubenswrapper[7479]: I0308 00:26:04.291305 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-955fcfb87-rh4g5"] Mar 08 00:26:04.293244 master-0 kubenswrapper[7479]: I0308 00:26:04.291576 7479 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-rh4g5" podUID="d0607e66-703f-4dc9-8aee-bb36c7e0a7b4" 
containerName="machine-approver-controller" containerID="cri-o://788cd6ec7c24f7bc899952e78b4164fdd4945980da6cb205e7a4ac8c582f3eb5" gracePeriod=30 Mar 08 00:26:04.293244 master-0 kubenswrapper[7479]: I0308 00:26:04.292591 7479 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-rh4g5" podUID="d0607e66-703f-4dc9-8aee-bb36c7e0a7b4" containerName="kube-rbac-proxy" containerID="cri-o://bc8aef14f74b7b8301aa62bf52416d7aecfe942fa89230b452803b210256ff58" gracePeriod=30 Mar 08 00:26:04.569538 master-0 kubenswrapper[7479]: I0308 00:26:04.569468 7479 generic.go:334] "Generic (PLEG): container finished" podID="d0607e66-703f-4dc9-8aee-bb36c7e0a7b4" containerID="788cd6ec7c24f7bc899952e78b4164fdd4945980da6cb205e7a4ac8c582f3eb5" exitCode=0 Mar 08 00:26:04.569538 master-0 kubenswrapper[7479]: I0308 00:26:04.569504 7479 generic.go:334] "Generic (PLEG): container finished" podID="d0607e66-703f-4dc9-8aee-bb36c7e0a7b4" containerID="bc8aef14f74b7b8301aa62bf52416d7aecfe942fa89230b452803b210256ff58" exitCode=0 Mar 08 00:26:04.569538 master-0 kubenswrapper[7479]: I0308 00:26:04.569541 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-rh4g5" event={"ID":"d0607e66-703f-4dc9-8aee-bb36c7e0a7b4","Type":"ContainerDied","Data":"788cd6ec7c24f7bc899952e78b4164fdd4945980da6cb205e7a4ac8c582f3eb5"} Mar 08 00:26:04.572103 master-0 kubenswrapper[7479]: I0308 00:26:04.569594 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-rh4g5" event={"ID":"d0607e66-703f-4dc9-8aee-bb36c7e0a7b4","Type":"ContainerDied","Data":"bc8aef14f74b7b8301aa62bf52416d7aecfe942fa89230b452803b210256ff58"} Mar 08 00:26:06.895430 master-0 kubenswrapper[7479]: I0308 00:26:06.895364 7479 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-machine-config-operator/machine-config-controller-ff46b7bdf-z5fkp"] Mar 08 00:26:06.896367 master-0 kubenswrapper[7479]: I0308 00:26:06.896340 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-z5fkp" Mar 08 00:26:06.899472 master-0 kubenswrapper[7479]: I0308 00:26:06.899086 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 08 00:26:06.899472 master-0 kubenswrapper[7479]: I0308 00:26:06.899099 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-ff46b7bdf-z5fkp"] Mar 08 00:26:06.899472 master-0 kubenswrapper[7479]: I0308 00:26:06.899174 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-qvcg8" Mar 08 00:26:06.999634 master-0 kubenswrapper[7479]: I0308 00:26:06.999567 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0-proxy-tls\") pod \"machine-config-controller-ff46b7bdf-z5fkp\" (UID: \"2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-z5fkp" Mar 08 00:26:06.999847 master-0 kubenswrapper[7479]: I0308 00:26:06.999651 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h65c2\" (UniqueName: \"kubernetes.io/projected/2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0-kube-api-access-h65c2\") pod \"machine-config-controller-ff46b7bdf-z5fkp\" (UID: \"2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-z5fkp" Mar 08 00:26:06.999847 master-0 kubenswrapper[7479]: I0308 00:26:06.999722 7479 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0-mcc-auth-proxy-config\") pod \"machine-config-controller-ff46b7bdf-z5fkp\" (UID: \"2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-z5fkp" Mar 08 00:26:07.101514 master-0 kubenswrapper[7479]: I0308 00:26:07.100833 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h65c2\" (UniqueName: \"kubernetes.io/projected/2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0-kube-api-access-h65c2\") pod \"machine-config-controller-ff46b7bdf-z5fkp\" (UID: \"2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-z5fkp" Mar 08 00:26:07.101514 master-0 kubenswrapper[7479]: I0308 00:26:07.100915 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0-mcc-auth-proxy-config\") pod \"machine-config-controller-ff46b7bdf-z5fkp\" (UID: \"2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-z5fkp" Mar 08 00:26:07.101514 master-0 kubenswrapper[7479]: I0308 00:26:07.100947 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0-proxy-tls\") pod \"machine-config-controller-ff46b7bdf-z5fkp\" (UID: \"2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-z5fkp" Mar 08 00:26:07.102002 master-0 kubenswrapper[7479]: I0308 00:26:07.101979 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0-mcc-auth-proxy-config\") pod \"machine-config-controller-ff46b7bdf-z5fkp\" (UID: \"2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-z5fkp" Mar 08 00:26:07.106544 master-0 kubenswrapper[7479]: I0308 00:26:07.106511 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0-proxy-tls\") pod \"machine-config-controller-ff46b7bdf-z5fkp\" (UID: \"2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-z5fkp" Mar 08 00:26:07.118919 master-0 kubenswrapper[7479]: I0308 00:26:07.118778 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h65c2\" (UniqueName: \"kubernetes.io/projected/2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0-kube-api-access-h65c2\") pod \"machine-config-controller-ff46b7bdf-z5fkp\" (UID: \"2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-z5fkp" Mar 08 00:26:07.222888 master-0 kubenswrapper[7479]: I0308 00:26:07.222859 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-z5fkp" Mar 08 00:26:07.226670 master-0 kubenswrapper[7479]: I0308 00:26:07.226419 7479 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-rh4g5" Mar 08 00:26:07.303757 master-0 kubenswrapper[7479]: I0308 00:26:07.303715 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0607e66-703f-4dc9-8aee-bb36c7e0a7b4-config\") pod \"d0607e66-703f-4dc9-8aee-bb36c7e0a7b4\" (UID: \"d0607e66-703f-4dc9-8aee-bb36c7e0a7b4\") " Mar 08 00:26:07.303862 master-0 kubenswrapper[7479]: I0308 00:26:07.303840 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d0607e66-703f-4dc9-8aee-bb36c7e0a7b4-auth-proxy-config\") pod \"d0607e66-703f-4dc9-8aee-bb36c7e0a7b4\" (UID: \"d0607e66-703f-4dc9-8aee-bb36c7e0a7b4\") " Mar 08 00:26:07.303923 master-0 kubenswrapper[7479]: I0308 00:26:07.303879 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d0607e66-703f-4dc9-8aee-bb36c7e0a7b4-machine-approver-tls\") pod \"d0607e66-703f-4dc9-8aee-bb36c7e0a7b4\" (UID: \"d0607e66-703f-4dc9-8aee-bb36c7e0a7b4\") " Mar 08 00:26:07.303963 master-0 kubenswrapper[7479]: I0308 00:26:07.303935 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8lfn\" (UniqueName: \"kubernetes.io/projected/d0607e66-703f-4dc9-8aee-bb36c7e0a7b4-kube-api-access-s8lfn\") pod \"d0607e66-703f-4dc9-8aee-bb36c7e0a7b4\" (UID: \"d0607e66-703f-4dc9-8aee-bb36c7e0a7b4\") " Mar 08 00:26:07.304703 master-0 kubenswrapper[7479]: I0308 00:26:07.304559 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0607e66-703f-4dc9-8aee-bb36c7e0a7b4-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "d0607e66-703f-4dc9-8aee-bb36c7e0a7b4" (UID: "d0607e66-703f-4dc9-8aee-bb36c7e0a7b4"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:26:07.304917 master-0 kubenswrapper[7479]: I0308 00:26:07.304890 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0607e66-703f-4dc9-8aee-bb36c7e0a7b4-config" (OuterVolumeSpecName: "config") pod "d0607e66-703f-4dc9-8aee-bb36c7e0a7b4" (UID: "d0607e66-703f-4dc9-8aee-bb36c7e0a7b4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:26:07.311650 master-0 kubenswrapper[7479]: I0308 00:26:07.310808 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0607e66-703f-4dc9-8aee-bb36c7e0a7b4-kube-api-access-s8lfn" (OuterVolumeSpecName: "kube-api-access-s8lfn") pod "d0607e66-703f-4dc9-8aee-bb36c7e0a7b4" (UID: "d0607e66-703f-4dc9-8aee-bb36c7e0a7b4"). InnerVolumeSpecName "kube-api-access-s8lfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:26:07.311650 master-0 kubenswrapper[7479]: I0308 00:26:07.311036 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0607e66-703f-4dc9-8aee-bb36c7e0a7b4-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "d0607e66-703f-4dc9-8aee-bb36c7e0a7b4" (UID: "d0607e66-703f-4dc9-8aee-bb36c7e0a7b4"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:26:07.405041 master-0 kubenswrapper[7479]: I0308 00:26:07.405003 7479 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0607e66-703f-4dc9-8aee-bb36c7e0a7b4-config\") on node \"master-0\" DevicePath \"\"" Mar 08 00:26:07.405041 master-0 kubenswrapper[7479]: I0308 00:26:07.405035 7479 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d0607e66-703f-4dc9-8aee-bb36c7e0a7b4-auth-proxy-config\") on node \"master-0\" DevicePath \"\"" Mar 08 00:26:07.405041 master-0 kubenswrapper[7479]: I0308 00:26:07.405046 7479 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d0607e66-703f-4dc9-8aee-bb36c7e0a7b4-machine-approver-tls\") on node \"master-0\" DevicePath \"\"" Mar 08 00:26:07.405321 master-0 kubenswrapper[7479]: I0308 00:26:07.405055 7479 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8lfn\" (UniqueName: \"kubernetes.io/projected/d0607e66-703f-4dc9-8aee-bb36c7e0a7b4-kube-api-access-s8lfn\") on node \"master-0\" DevicePath \"\"" Mar 08 00:26:07.586986 master-0 kubenswrapper[7479]: I0308 00:26:07.586887 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-rh4g5" event={"ID":"d0607e66-703f-4dc9-8aee-bb36c7e0a7b4","Type":"ContainerDied","Data":"9a5fe78f0b5d57d6ff9e871edd8427e5e35d07b2c99d16979b3d5431015eedb3"} Mar 08 00:26:07.586986 master-0 kubenswrapper[7479]: I0308 00:26:07.586965 7479 scope.go:117] "RemoveContainer" containerID="788cd6ec7c24f7bc899952e78b4164fdd4945980da6cb205e7a4ac8c582f3eb5" Mar 08 00:26:07.586986 master-0 kubenswrapper[7479]: I0308 00:26:07.586990 7479 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-rh4g5" Mar 08 00:26:07.589800 master-0 kubenswrapper[7479]: I0308 00:26:07.589752 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-bncfj" event={"ID":"c7097f64-1709-4f76-a725-5a6c6cc5919b","Type":"ContainerStarted","Data":"0bfb5bceaa149162c15931fa6c19adc19bff0abfffe5914519da3718cfa8c3bf"} Mar 08 00:26:07.599087 master-0 kubenswrapper[7479]: I0308 00:26:07.599039 7479 scope.go:117] "RemoveContainer" containerID="bc8aef14f74b7b8301aa62bf52416d7aecfe942fa89230b452803b210256ff58" Mar 08 00:26:07.652196 master-0 kubenswrapper[7479]: I0308 00:26:07.652076 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-bncfj" podStartSLOduration=9.303738765 podStartE2EDuration="17.652043951s" podCreationTimestamp="2026-03-08 00:25:50 +0000 UTC" firstStartedPulling="2026-03-08 00:25:58.952838786 +0000 UTC m=+275.265747713" lastFinishedPulling="2026-03-08 00:26:07.301143982 +0000 UTC m=+283.614052899" observedRunningTime="2026-03-08 00:26:07.620531557 +0000 UTC m=+283.933440514" watchObservedRunningTime="2026-03-08 00:26:07.652043951 +0000 UTC m=+283.964952908" Mar 08 00:26:07.657683 master-0 kubenswrapper[7479]: I0308 00:26:07.656939 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-955fcfb87-rh4g5"] Mar 08 00:26:07.668596 master-0 kubenswrapper[7479]: I0308 00:26:07.668343 7479 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-955fcfb87-rh4g5"] Mar 08 00:26:07.675393 master-0 kubenswrapper[7479]: I0308 00:26:07.675321 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-ff46b7bdf-z5fkp"] Mar 08 00:26:07.697354 master-0 kubenswrapper[7479]: I0308 00:26:07.697243 7479 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-754bdc9f9d-xpl2b"] Mar 08 00:26:07.697481 master-0 kubenswrapper[7479]: E0308 00:26:07.697467 7479 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0607e66-703f-4dc9-8aee-bb36c7e0a7b4" containerName="kube-rbac-proxy" Mar 08 00:26:07.697481 master-0 kubenswrapper[7479]: I0308 00:26:07.697479 7479 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0607e66-703f-4dc9-8aee-bb36c7e0a7b4" containerName="kube-rbac-proxy" Mar 08 00:26:07.697566 master-0 kubenswrapper[7479]: E0308 00:26:07.697495 7479 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0607e66-703f-4dc9-8aee-bb36c7e0a7b4" containerName="machine-approver-controller" Mar 08 00:26:07.697566 master-0 kubenswrapper[7479]: I0308 00:26:07.697503 7479 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0607e66-703f-4dc9-8aee-bb36c7e0a7b4" containerName="machine-approver-controller" Mar 08 00:26:07.698240 master-0 kubenswrapper[7479]: I0308 00:26:07.698191 7479 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0607e66-703f-4dc9-8aee-bb36c7e0a7b4" containerName="kube-rbac-proxy" Mar 08 00:26:07.698322 master-0 kubenswrapper[7479]: I0308 00:26:07.698275 7479 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0607e66-703f-4dc9-8aee-bb36c7e0a7b4" containerName="machine-approver-controller" Mar 08 00:26:07.698962 master-0 kubenswrapper[7479]: I0308 00:26:07.698926 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-xpl2b" Mar 08 00:26:07.700954 master-0 kubenswrapper[7479]: I0308 00:26:07.700884 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 08 00:26:07.701785 master-0 kubenswrapper[7479]: I0308 00:26:07.701388 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-njqpw" Mar 08 00:26:07.701785 master-0 kubenswrapper[7479]: I0308 00:26:07.701526 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 08 00:26:07.701785 master-0 kubenswrapper[7479]: I0308 00:26:07.701560 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 08 00:26:07.701785 master-0 kubenswrapper[7479]: I0308 00:26:07.701692 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 08 00:26:07.702702 master-0 kubenswrapper[7479]: I0308 00:26:07.702680 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 08 00:26:07.811025 master-0 kubenswrapper[7479]: I0308 00:26:07.810629 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7317ceda-df6f-4826-aa1a-15304c2b0fcd-machine-approver-tls\") pod \"machine-approver-754bdc9f9d-xpl2b\" (UID: \"7317ceda-df6f-4826-aa1a-15304c2b0fcd\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-xpl2b" Mar 08 00:26:07.811025 master-0 kubenswrapper[7479]: I0308 00:26:07.810681 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/7317ceda-df6f-4826-aa1a-15304c2b0fcd-auth-proxy-config\") pod \"machine-approver-754bdc9f9d-xpl2b\" (UID: \"7317ceda-df6f-4826-aa1a-15304c2b0fcd\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-xpl2b" Mar 08 00:26:07.811025 master-0 kubenswrapper[7479]: I0308 00:26:07.810721 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw6xw\" (UniqueName: \"kubernetes.io/projected/7317ceda-df6f-4826-aa1a-15304c2b0fcd-kube-api-access-cw6xw\") pod \"machine-approver-754bdc9f9d-xpl2b\" (UID: \"7317ceda-df6f-4826-aa1a-15304c2b0fcd\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-xpl2b" Mar 08 00:26:07.811025 master-0 kubenswrapper[7479]: I0308 00:26:07.810769 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7317ceda-df6f-4826-aa1a-15304c2b0fcd-config\") pod \"machine-approver-754bdc9f9d-xpl2b\" (UID: \"7317ceda-df6f-4826-aa1a-15304c2b0fcd\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-xpl2b" Mar 08 00:26:07.892820 master-0 kubenswrapper[7479]: I0308 00:26:07.891737 7479 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0607e66-703f-4dc9-8aee-bb36c7e0a7b4" path="/var/lib/kubelet/pods/d0607e66-703f-4dc9-8aee-bb36c7e0a7b4/volumes" Mar 08 00:26:07.912223 master-0 kubenswrapper[7479]: I0308 00:26:07.912158 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7317ceda-df6f-4826-aa1a-15304c2b0fcd-machine-approver-tls\") pod \"machine-approver-754bdc9f9d-xpl2b\" (UID: \"7317ceda-df6f-4826-aa1a-15304c2b0fcd\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-xpl2b" Mar 08 00:26:07.912742 master-0 kubenswrapper[7479]: I0308 00:26:07.912722 7479 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7317ceda-df6f-4826-aa1a-15304c2b0fcd-auth-proxy-config\") pod \"machine-approver-754bdc9f9d-xpl2b\" (UID: \"7317ceda-df6f-4826-aa1a-15304c2b0fcd\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-xpl2b" Mar 08 00:26:07.912867 master-0 kubenswrapper[7479]: I0308 00:26:07.912853 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw6xw\" (UniqueName: \"kubernetes.io/projected/7317ceda-df6f-4826-aa1a-15304c2b0fcd-kube-api-access-cw6xw\") pod \"machine-approver-754bdc9f9d-xpl2b\" (UID: \"7317ceda-df6f-4826-aa1a-15304c2b0fcd\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-xpl2b" Mar 08 00:26:07.912989 master-0 kubenswrapper[7479]: I0308 00:26:07.912976 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7317ceda-df6f-4826-aa1a-15304c2b0fcd-config\") pod \"machine-approver-754bdc9f9d-xpl2b\" (UID: \"7317ceda-df6f-4826-aa1a-15304c2b0fcd\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-xpl2b" Mar 08 00:26:07.916223 master-0 kubenswrapper[7479]: I0308 00:26:07.913802 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7317ceda-df6f-4826-aa1a-15304c2b0fcd-config\") pod \"machine-approver-754bdc9f9d-xpl2b\" (UID: \"7317ceda-df6f-4826-aa1a-15304c2b0fcd\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-xpl2b" Mar 08 00:26:07.916223 master-0 kubenswrapper[7479]: I0308 00:26:07.914011 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7317ceda-df6f-4826-aa1a-15304c2b0fcd-auth-proxy-config\") pod \"machine-approver-754bdc9f9d-xpl2b\" (UID: \"7317ceda-df6f-4826-aa1a-15304c2b0fcd\") " 
pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-xpl2b" Mar 08 00:26:07.916223 master-0 kubenswrapper[7479]: I0308 00:26:07.915324 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7317ceda-df6f-4826-aa1a-15304c2b0fcd-machine-approver-tls\") pod \"machine-approver-754bdc9f9d-xpl2b\" (UID: \"7317ceda-df6f-4826-aa1a-15304c2b0fcd\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-xpl2b" Mar 08 00:26:07.933100 master-0 kubenswrapper[7479]: I0308 00:26:07.933057 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw6xw\" (UniqueName: \"kubernetes.io/projected/7317ceda-df6f-4826-aa1a-15304c2b0fcd-kube-api-access-cw6xw\") pod \"machine-approver-754bdc9f9d-xpl2b\" (UID: \"7317ceda-df6f-4826-aa1a-15304c2b0fcd\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-xpl2b" Mar 08 00:26:08.008382 master-0 kubenswrapper[7479]: I0308 00:26:08.006766 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-st8tx"] Mar 08 00:26:08.008382 master-0 kubenswrapper[7479]: I0308 00:26:08.008270 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-st8tx" Mar 08 00:26:08.011961 master-0 kubenswrapper[7479]: I0308 00:26:08.011921 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Mar 08 00:26:08.013875 master-0 kubenswrapper[7479]: I0308 00:26:08.012824 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-79f8cd6fdd-r6nkv"] Mar 08 00:26:08.013875 master-0 kubenswrapper[7479]: I0308 00:26:08.013639 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" Mar 08 00:26:08.015839 master-0 kubenswrapper[7479]: I0308 00:26:08.015792 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 08 00:26:08.016107 master-0 kubenswrapper[7479]: I0308 00:26:08.016062 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 08 00:26:08.016181 master-0 kubenswrapper[7479]: I0308 00:26:08.016165 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 08 00:26:08.016316 master-0 kubenswrapper[7479]: I0308 00:26:08.016280 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 08 00:26:08.016358 master-0 kubenswrapper[7479]: I0308 00:26:08.016315 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 08 00:26:08.016423 master-0 kubenswrapper[7479]: I0308 00:26:08.016389 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7c67b67d47-sctv9"] Mar 08 00:26:08.016622 master-0 kubenswrapper[7479]: I0308 00:26:08.016402 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 08 00:26:08.017599 master-0 kubenswrapper[7479]: I0308 00:26:08.017565 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7c67b67d47-sctv9" Mar 08 00:26:08.021026 master-0 kubenswrapper[7479]: I0308 00:26:08.020981 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-xpl2b" Mar 08 00:26:08.025061 master-0 kubenswrapper[7479]: I0308 00:26:08.024997 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7c67b67d47-sctv9"] Mar 08 00:26:08.050795 master-0 kubenswrapper[7479]: I0308 00:26:08.050764 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-st8tx"] Mar 08 00:26:08.117023 master-0 kubenswrapper[7479]: I0308 00:26:08.116970 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6-default-certificate\") pod \"router-default-79f8cd6fdd-r6nkv\" (UID: \"6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6\") " pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" Mar 08 00:26:08.117023 master-0 kubenswrapper[7479]: I0308 00:26:08.117020 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6-metrics-certs\") pod \"router-default-79f8cd6fdd-r6nkv\" (UID: \"6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6\") " pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" Mar 08 00:26:08.117392 master-0 kubenswrapper[7479]: I0308 00:26:08.117060 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/0d0cb126-341c-4215-ad2e-a008193cc0b5-tls-certificates\") pod \"prometheus-operator-admission-webhook-8464df8497-st8tx\" (UID: \"0d0cb126-341c-4215-ad2e-a008193cc0b5\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-st8tx" Mar 08 00:26:08.117392 master-0 kubenswrapper[7479]: I0308 00:26:08.117081 7479 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6-service-ca-bundle\") pod \"router-default-79f8cd6fdd-r6nkv\" (UID: \"6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6\") " pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" Mar 08 00:26:08.117392 master-0 kubenswrapper[7479]: I0308 00:26:08.117109 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvglb\" (UniqueName: \"kubernetes.io/projected/786e30f1-d30a-43e1-85cb-d8ea1495422e-kube-api-access-dvglb\") pod \"network-check-source-7c67b67d47-sctv9\" (UID: \"786e30f1-d30a-43e1-85cb-d8ea1495422e\") " pod="openshift-network-diagnostics/network-check-source-7c67b67d47-sctv9" Mar 08 00:26:08.117392 master-0 kubenswrapper[7479]: I0308 00:26:08.117130 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9fv4\" (UniqueName: \"kubernetes.io/projected/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6-kube-api-access-x9fv4\") pod \"router-default-79f8cd6fdd-r6nkv\" (UID: \"6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6\") " pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" Mar 08 00:26:08.117392 master-0 kubenswrapper[7479]: I0308 00:26:08.117148 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6-stats-auth\") pod \"router-default-79f8cd6fdd-r6nkv\" (UID: \"6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6\") " pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" Mar 08 00:26:08.218846 master-0 kubenswrapper[7479]: I0308 00:26:08.218769 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6-default-certificate\") pod \"router-default-79f8cd6fdd-r6nkv\" (UID: 
\"6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6\") " pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" Mar 08 00:26:08.219010 master-0 kubenswrapper[7479]: I0308 00:26:08.218961 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6-metrics-certs\") pod \"router-default-79f8cd6fdd-r6nkv\" (UID: \"6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6\") " pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" Mar 08 00:26:08.219099 master-0 kubenswrapper[7479]: I0308 00:26:08.219075 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/0d0cb126-341c-4215-ad2e-a008193cc0b5-tls-certificates\") pod \"prometheus-operator-admission-webhook-8464df8497-st8tx\" (UID: \"0d0cb126-341c-4215-ad2e-a008193cc0b5\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-st8tx" Mar 08 00:26:08.219151 master-0 kubenswrapper[7479]: I0308 00:26:08.219117 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6-service-ca-bundle\") pod \"router-default-79f8cd6fdd-r6nkv\" (UID: \"6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6\") " pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" Mar 08 00:26:08.219186 master-0 kubenswrapper[7479]: I0308 00:26:08.219159 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvglb\" (UniqueName: \"kubernetes.io/projected/786e30f1-d30a-43e1-85cb-d8ea1495422e-kube-api-access-dvglb\") pod \"network-check-source-7c67b67d47-sctv9\" (UID: \"786e30f1-d30a-43e1-85cb-d8ea1495422e\") " pod="openshift-network-diagnostics/network-check-source-7c67b67d47-sctv9" Mar 08 00:26:08.219232 master-0 kubenswrapper[7479]: I0308 00:26:08.219189 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-x9fv4\" (UniqueName: \"kubernetes.io/projected/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6-kube-api-access-x9fv4\") pod \"router-default-79f8cd6fdd-r6nkv\" (UID: \"6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6\") " pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" Mar 08 00:26:08.219263 master-0 kubenswrapper[7479]: I0308 00:26:08.219242 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6-stats-auth\") pod \"router-default-79f8cd6fdd-r6nkv\" (UID: \"6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6\") " pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" Mar 08 00:26:08.220050 master-0 kubenswrapper[7479]: I0308 00:26:08.220030 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6-service-ca-bundle\") pod \"router-default-79f8cd6fdd-r6nkv\" (UID: \"6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6\") " pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" Mar 08 00:26:08.222411 master-0 kubenswrapper[7479]: I0308 00:26:08.222286 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/0d0cb126-341c-4215-ad2e-a008193cc0b5-tls-certificates\") pod \"prometheus-operator-admission-webhook-8464df8497-st8tx\" (UID: \"0d0cb126-341c-4215-ad2e-a008193cc0b5\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-st8tx" Mar 08 00:26:08.222411 master-0 kubenswrapper[7479]: I0308 00:26:08.222322 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6-default-certificate\") pod \"router-default-79f8cd6fdd-r6nkv\" (UID: \"6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6\") " pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" Mar 08 00:26:08.226088 master-0 
kubenswrapper[7479]: I0308 00:26:08.223345 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6-metrics-certs\") pod \"router-default-79f8cd6fdd-r6nkv\" (UID: \"6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6\") " pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" Mar 08 00:26:08.226088 master-0 kubenswrapper[7479]: I0308 00:26:08.223450 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6-stats-auth\") pod \"router-default-79f8cd6fdd-r6nkv\" (UID: \"6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6\") " pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" Mar 08 00:26:08.234582 master-0 kubenswrapper[7479]: I0308 00:26:08.234542 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9fv4\" (UniqueName: \"kubernetes.io/projected/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6-kube-api-access-x9fv4\") pod \"router-default-79f8cd6fdd-r6nkv\" (UID: \"6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6\") " pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" Mar 08 00:26:08.236563 master-0 kubenswrapper[7479]: I0308 00:26:08.236525 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvglb\" (UniqueName: \"kubernetes.io/projected/786e30f1-d30a-43e1-85cb-d8ea1495422e-kube-api-access-dvglb\") pod \"network-check-source-7c67b67d47-sctv9\" (UID: \"786e30f1-d30a-43e1-85cb-d8ea1495422e\") " pod="openshift-network-diagnostics/network-check-source-7c67b67d47-sctv9" Mar 08 00:26:08.333825 master-0 kubenswrapper[7479]: I0308 00:26:08.333749 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6t5lg" Mar 08 00:26:08.333825 master-0 kubenswrapper[7479]: I0308 00:26:08.333814 7479 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-6t5lg" Mar 08 00:26:08.347864 master-0 kubenswrapper[7479]: I0308 00:26:08.347754 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-st8tx" Mar 08 00:26:08.363950 master-0 kubenswrapper[7479]: I0308 00:26:08.363906 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" Mar 08 00:26:08.373029 master-0 kubenswrapper[7479]: I0308 00:26:08.372960 7479 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6t5lg" Mar 08 00:26:08.376688 master-0 kubenswrapper[7479]: I0308 00:26:08.375678 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7c67b67d47-sctv9" Mar 08 00:26:08.598572 master-0 kubenswrapper[7479]: I0308 00:26:08.598525 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-xpl2b" event={"ID":"7317ceda-df6f-4826-aa1a-15304c2b0fcd","Type":"ContainerStarted","Data":"4bf845493478fab338d4b9ab87cadf5b607d6c9eebb501f29c76a34495978f4a"} Mar 08 00:26:08.598572 master-0 kubenswrapper[7479]: I0308 00:26:08.598572 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-xpl2b" event={"ID":"7317ceda-df6f-4826-aa1a-15304c2b0fcd","Type":"ContainerStarted","Data":"b6fa88efbe7764411e628b9931e04b59a0f6aad2f1656156d14674b5a960082d"} Mar 08 00:26:08.598572 master-0 kubenswrapper[7479]: I0308 00:26:08.598583 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-xpl2b" event={"ID":"7317ceda-df6f-4826-aa1a-15304c2b0fcd","Type":"ContainerStarted","Data":"f3cab32904f1f3dc9eae1dc7b47ec8d51b63661baeb9517ad66b59248d52dfef"} Mar 08 
00:26:08.607280 master-0 kubenswrapper[7479]: I0308 00:26:08.606630 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-z5fkp" event={"ID":"2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0","Type":"ContainerStarted","Data":"e06e3b0cd0c498549672bad1fef5caf7eaac361c9fc1607113d2582022a9ec7a"} Mar 08 00:26:08.607280 master-0 kubenswrapper[7479]: I0308 00:26:08.606673 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-z5fkp" event={"ID":"2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0","Type":"ContainerStarted","Data":"849f8c9c0130860af59ecc5126efd43b717473a9bed214260e499c901acfe39b"} Mar 08 00:26:08.607280 master-0 kubenswrapper[7479]: I0308 00:26:08.606683 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-z5fkp" event={"ID":"2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0","Type":"ContainerStarted","Data":"1cc242574263ef7c849076452db10d6f32fa75aeb983a9e0f9150bc85db0911a"} Mar 08 00:26:08.608527 master-0 kubenswrapper[7479]: I0308 00:26:08.608471 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" event={"ID":"6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6","Type":"ContainerStarted","Data":"f860ea80aed55d2d8aefcd014e94c8e07b481ea1bac54429f957dafad3d193dc"} Mar 08 00:26:08.616410 master-0 kubenswrapper[7479]: I0308 00:26:08.615339 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-xpl2b" podStartSLOduration=1.6153233249999999 podStartE2EDuration="1.615323325s" podCreationTimestamp="2026-03-08 00:26:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:26:08.612507083 +0000 UTC m=+284.925416000" watchObservedRunningTime="2026-03-08 
00:26:08.615323325 +0000 UTC m=+284.928232242" Mar 08 00:26:08.631113 master-0 kubenswrapper[7479]: I0308 00:26:08.631038 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-z5fkp" podStartSLOduration=2.631018177 podStartE2EDuration="2.631018177s" podCreationTimestamp="2026-03-08 00:26:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:26:08.628810731 +0000 UTC m=+284.941719648" watchObservedRunningTime="2026-03-08 00:26:08.631018177 +0000 UTC m=+284.943927084" Mar 08 00:26:08.651252 master-0 kubenswrapper[7479]: I0308 00:26:08.650979 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6t5lg" Mar 08 00:26:08.787482 master-0 kubenswrapper[7479]: I0308 00:26:08.787114 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-st8tx"] Mar 08 00:26:08.800605 master-0 kubenswrapper[7479]: W0308 00:26:08.800031 7479 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d0cb126_341c_4215_ad2e_a008193cc0b5.slice/crio-27f4354a5f2d519381a516d1dc4209edc63d8a7a92b44222c7f0143dbf2a908f WatchSource:0}: Error finding container 27f4354a5f2d519381a516d1dc4209edc63d8a7a92b44222c7f0143dbf2a908f: Status 404 returned error can't find the container with id 27f4354a5f2d519381a516d1dc4209edc63d8a7a92b44222c7f0143dbf2a908f Mar 08 00:26:08.844042 master-0 kubenswrapper[7479]: I0308 00:26:08.843964 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7c67b67d47-sctv9"] Mar 08 00:26:08.851463 master-0 kubenswrapper[7479]: W0308 00:26:08.851400 7479 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod786e30f1_d30a_43e1_85cb_d8ea1495422e.slice/crio-8d0d8e23ae25ced02b7cdc0775a6f94c8fcc52f337331a56804c82208fb25ced WatchSource:0}: Error finding container 8d0d8e23ae25ced02b7cdc0775a6f94c8fcc52f337331a56804c82208fb25ced: Status 404 returned error can't find the container with id 8d0d8e23ae25ced02b7cdc0775a6f94c8fcc52f337331a56804c82208fb25ced Mar 08 00:26:08.913448 master-0 kubenswrapper[7479]: I0308 00:26:08.913391 7479 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 08 00:26:09.460067 master-0 kubenswrapper[7479]: I0308 00:26:09.459228 7479 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9j9zs" Mar 08 00:26:09.460067 master-0 kubenswrapper[7479]: I0308 00:26:09.459273 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9j9zs" Mar 08 00:26:09.505815 master-0 kubenswrapper[7479]: I0308 00:26:09.505752 7479 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9j9zs" Mar 08 00:26:09.617683 master-0 kubenswrapper[7479]: I0308 00:26:09.617606 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-st8tx" event={"ID":"0d0cb126-341c-4215-ad2e-a008193cc0b5","Type":"ContainerStarted","Data":"27f4354a5f2d519381a516d1dc4209edc63d8a7a92b44222c7f0143dbf2a908f"} Mar 08 00:26:09.619220 master-0 kubenswrapper[7479]: I0308 00:26:09.619141 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7c67b67d47-sctv9" event={"ID":"786e30f1-d30a-43e1-85cb-d8ea1495422e","Type":"ContainerStarted","Data":"cdc0e9685b65a455abbaad494c4a6513ad0b9438ee9d2cc8a13ca432a7107cef"} Mar 08 00:26:09.619295 master-0 kubenswrapper[7479]: I0308 
00:26:09.619235 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7c67b67d47-sctv9" event={"ID":"786e30f1-d30a-43e1-85cb-d8ea1495422e","Type":"ContainerStarted","Data":"8d0d8e23ae25ced02b7cdc0775a6f94c8fcc52f337331a56804c82208fb25ced"} Mar 08 00:26:09.663383 master-0 kubenswrapper[7479]: I0308 00:26:09.663312 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9j9zs" Mar 08 00:26:10.250909 master-0 kubenswrapper[7479]: I0308 00:26:10.250836 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7c67b67d47-sctv9" podStartSLOduration=347.250817655 podStartE2EDuration="5m47.250817655s" podCreationTimestamp="2026-03-08 00:20:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:26:10.246184851 +0000 UTC m=+286.559093768" watchObservedRunningTime="2026-03-08 00:26:10.250817655 +0000 UTC m=+286.563726572" Mar 08 00:26:10.612353 master-0 kubenswrapper[7479]: I0308 00:26:10.612240 7479 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9nqqp" Mar 08 00:26:10.612353 master-0 kubenswrapper[7479]: I0308 00:26:10.612294 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9nqqp" Mar 08 00:26:10.665011 master-0 kubenswrapper[7479]: I0308 00:26:10.664958 7479 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9nqqp" Mar 08 00:26:10.698420 master-0 kubenswrapper[7479]: I0308 00:26:10.698370 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9nqqp" Mar 08 00:26:11.019988 master-0 kubenswrapper[7479]: I0308 00:26:11.019903 7479 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-8lgqf"] Mar 08 00:26:11.020365 master-0 kubenswrapper[7479]: I0308 00:26:11.020145 7479 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-8lgqf" podUID="3b9823a9-2491-44b5-8bf2-22352558a2a3" containerName="cluster-cloud-controller-manager" containerID="cri-o://5a500d2c1f8696d0304f6d90b8b1ba2343bb37980187644821c808366f21e1a3" gracePeriod=30 Mar 08 00:26:11.021014 master-0 kubenswrapper[7479]: I0308 00:26:11.020612 7479 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-8lgqf" podUID="3b9823a9-2491-44b5-8bf2-22352558a2a3" containerName="config-sync-controllers" containerID="cri-o://448ae8ff53f7646a273cdf09b220fc2247ebe60a128d876e614d7cb7d241e38b" gracePeriod=30 Mar 08 00:26:11.021014 master-0 kubenswrapper[7479]: I0308 00:26:11.020717 7479 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-8lgqf" podUID="3b9823a9-2491-44b5-8bf2-22352558a2a3" containerName="kube-rbac-proxy" containerID="cri-o://2c55e9027e9db2cb3df9959e5475f9fd769e23cb7ecb353d1a2f6789fe41833c" gracePeriod=30 Mar 08 00:26:12.230574 master-0 kubenswrapper[7479]: I0308 00:26:12.230478 7479 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4fjw9" Mar 08 00:26:12.266038 master-0 kubenswrapper[7479]: I0308 00:26:12.265977 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4fjw9" Mar 08 00:26:12.647399 master-0 kubenswrapper[7479]: I0308 00:26:12.647321 7479 
generic.go:334] "Generic (PLEG): container finished" podID="3b9823a9-2491-44b5-8bf2-22352558a2a3" containerID="448ae8ff53f7646a273cdf09b220fc2247ebe60a128d876e614d7cb7d241e38b" exitCode=0 Mar 08 00:26:12.647399 master-0 kubenswrapper[7479]: I0308 00:26:12.647366 7479 generic.go:334] "Generic (PLEG): container finished" podID="3b9823a9-2491-44b5-8bf2-22352558a2a3" containerID="5a500d2c1f8696d0304f6d90b8b1ba2343bb37980187644821c808366f21e1a3" exitCode=0 Mar 08 00:26:12.647399 master-0 kubenswrapper[7479]: I0308 00:26:12.647380 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-8lgqf" event={"ID":"3b9823a9-2491-44b5-8bf2-22352558a2a3","Type":"ContainerDied","Data":"448ae8ff53f7646a273cdf09b220fc2247ebe60a128d876e614d7cb7d241e38b"} Mar 08 00:26:12.647846 master-0 kubenswrapper[7479]: I0308 00:26:12.647446 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-8lgqf" event={"ID":"3b9823a9-2491-44b5-8bf2-22352558a2a3","Type":"ContainerDied","Data":"5a500d2c1f8696d0304f6d90b8b1ba2343bb37980187644821c808366f21e1a3"} Mar 08 00:26:14.660112 master-0 kubenswrapper[7479]: I0308 00:26:14.660028 7479 generic.go:334] "Generic (PLEG): container finished" podID="3b9823a9-2491-44b5-8bf2-22352558a2a3" containerID="2c55e9027e9db2cb3df9959e5475f9fd769e23cb7ecb353d1a2f6789fe41833c" exitCode=0 Mar 08 00:26:14.660112 master-0 kubenswrapper[7479]: I0308 00:26:14.660080 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-8lgqf" event={"ID":"3b9823a9-2491-44b5-8bf2-22352558a2a3","Type":"ContainerDied","Data":"2c55e9027e9db2cb3df9959e5475f9fd769e23cb7ecb353d1a2f6789fe41833c"} Mar 08 00:26:16.446065 master-0 kubenswrapper[7479]: I0308 00:26:16.446017 7479 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-8lgqf" Mar 08 00:26:16.643092 master-0 kubenswrapper[7479]: I0308 00:26:16.643012 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/3b9823a9-2491-44b5-8bf2-22352558a2a3-cloud-controller-manager-operator-tls\") pod \"3b9823a9-2491-44b5-8bf2-22352558a2a3\" (UID: \"3b9823a9-2491-44b5-8bf2-22352558a2a3\") " Mar 08 00:26:16.643092 master-0 kubenswrapper[7479]: I0308 00:26:16.643093 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hksvd\" (UniqueName: \"kubernetes.io/projected/3b9823a9-2491-44b5-8bf2-22352558a2a3-kube-api-access-hksvd\") pod \"3b9823a9-2491-44b5-8bf2-22352558a2a3\" (UID: \"3b9823a9-2491-44b5-8bf2-22352558a2a3\") " Mar 08 00:26:16.643406 master-0 kubenswrapper[7479]: I0308 00:26:16.643157 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3b9823a9-2491-44b5-8bf2-22352558a2a3-auth-proxy-config\") pod \"3b9823a9-2491-44b5-8bf2-22352558a2a3\" (UID: \"3b9823a9-2491-44b5-8bf2-22352558a2a3\") " Mar 08 00:26:16.643406 master-0 kubenswrapper[7479]: I0308 00:26:16.643186 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3b9823a9-2491-44b5-8bf2-22352558a2a3-images\") pod \"3b9823a9-2491-44b5-8bf2-22352558a2a3\" (UID: \"3b9823a9-2491-44b5-8bf2-22352558a2a3\") " Mar 08 00:26:16.643406 master-0 kubenswrapper[7479]: I0308 00:26:16.643220 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/3b9823a9-2491-44b5-8bf2-22352558a2a3-host-etc-kube\") pod \"3b9823a9-2491-44b5-8bf2-22352558a2a3\" (UID: 
\"3b9823a9-2491-44b5-8bf2-22352558a2a3\") " Mar 08 00:26:16.643612 master-0 kubenswrapper[7479]: I0308 00:26:16.643517 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b9823a9-2491-44b5-8bf2-22352558a2a3-host-etc-kube" (OuterVolumeSpecName: "host-etc-kube") pod "3b9823a9-2491-44b5-8bf2-22352558a2a3" (UID: "3b9823a9-2491-44b5-8bf2-22352558a2a3"). InnerVolumeSpecName "host-etc-kube". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:26:16.643666 master-0 kubenswrapper[7479]: I0308 00:26:16.643627 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b9823a9-2491-44b5-8bf2-22352558a2a3-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "3b9823a9-2491-44b5-8bf2-22352558a2a3" (UID: "3b9823a9-2491-44b5-8bf2-22352558a2a3"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:26:16.643761 master-0 kubenswrapper[7479]: I0308 00:26:16.643707 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b9823a9-2491-44b5-8bf2-22352558a2a3-images" (OuterVolumeSpecName: "images") pod "3b9823a9-2491-44b5-8bf2-22352558a2a3" (UID: "3b9823a9-2491-44b5-8bf2-22352558a2a3"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:26:16.645801 master-0 kubenswrapper[7479]: I0308 00:26:16.645730 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b9823a9-2491-44b5-8bf2-22352558a2a3-cloud-controller-manager-operator-tls" (OuterVolumeSpecName: "cloud-controller-manager-operator-tls") pod "3b9823a9-2491-44b5-8bf2-22352558a2a3" (UID: "3b9823a9-2491-44b5-8bf2-22352558a2a3"). InnerVolumeSpecName "cloud-controller-manager-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:26:16.645801 master-0 kubenswrapper[7479]: I0308 00:26:16.645756 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b9823a9-2491-44b5-8bf2-22352558a2a3-kube-api-access-hksvd" (OuterVolumeSpecName: "kube-api-access-hksvd") pod "3b9823a9-2491-44b5-8bf2-22352558a2a3" (UID: "3b9823a9-2491-44b5-8bf2-22352558a2a3"). InnerVolumeSpecName "kube-api-access-hksvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:26:16.672471 master-0 kubenswrapper[7479]: I0308 00:26:16.672407 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-8lgqf" event={"ID":"3b9823a9-2491-44b5-8bf2-22352558a2a3","Type":"ContainerDied","Data":"2f2c834142b8089008ca2ed22b0fe66afaaaaa4b94fca36925b116feb711bdca"} Mar 08 00:26:16.672471 master-0 kubenswrapper[7479]: I0308 00:26:16.672475 7479 scope.go:117] "RemoveContainer" containerID="2c55e9027e9db2cb3df9959e5475f9fd769e23cb7ecb353d1a2f6789fe41833c" Mar 08 00:26:16.672717 master-0 kubenswrapper[7479]: I0308 00:26:16.672473 7479 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-8lgqf" Mar 08 00:26:16.745281 master-0 kubenswrapper[7479]: I0308 00:26:16.745186 7479 reconciler_common.go:293] "Volume detached for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/3b9823a9-2491-44b5-8bf2-22352558a2a3-cloud-controller-manager-operator-tls\") on node \"master-0\" DevicePath \"\"" Mar 08 00:26:16.745281 master-0 kubenswrapper[7479]: I0308 00:26:16.745258 7479 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hksvd\" (UniqueName: \"kubernetes.io/projected/3b9823a9-2491-44b5-8bf2-22352558a2a3-kube-api-access-hksvd\") on node \"master-0\" DevicePath \"\"" Mar 08 00:26:16.745281 master-0 kubenswrapper[7479]: I0308 00:26:16.745273 7479 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3b9823a9-2491-44b5-8bf2-22352558a2a3-auth-proxy-config\") on node \"master-0\" DevicePath \"\"" Mar 08 00:26:16.745281 master-0 kubenswrapper[7479]: I0308 00:26:16.745288 7479 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3b9823a9-2491-44b5-8bf2-22352558a2a3-images\") on node \"master-0\" DevicePath \"\"" Mar 08 00:26:16.745591 master-0 kubenswrapper[7479]: I0308 00:26:16.745301 7479 reconciler_common.go:293] "Volume detached for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/3b9823a9-2491-44b5-8bf2-22352558a2a3-host-etc-kube\") on node \"master-0\" DevicePath \"\"" Mar 08 00:26:17.454701 master-0 kubenswrapper[7479]: I0308 00:26:17.454629 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-wkt98"] Mar 08 00:26:17.455363 master-0 kubenswrapper[7479]: E0308 00:26:17.455106 7479 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b9823a9-2491-44b5-8bf2-22352558a2a3" 
containerName="config-sync-controllers" Mar 08 00:26:17.455363 master-0 kubenswrapper[7479]: I0308 00:26:17.455120 7479 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b9823a9-2491-44b5-8bf2-22352558a2a3" containerName="config-sync-controllers" Mar 08 00:26:17.455363 master-0 kubenswrapper[7479]: E0308 00:26:17.455162 7479 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b9823a9-2491-44b5-8bf2-22352558a2a3" containerName="cluster-cloud-controller-manager" Mar 08 00:26:17.455363 master-0 kubenswrapper[7479]: I0308 00:26:17.455171 7479 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b9823a9-2491-44b5-8bf2-22352558a2a3" containerName="cluster-cloud-controller-manager" Mar 08 00:26:17.455363 master-0 kubenswrapper[7479]: E0308 00:26:17.455194 7479 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b9823a9-2491-44b5-8bf2-22352558a2a3" containerName="kube-rbac-proxy" Mar 08 00:26:17.455363 master-0 kubenswrapper[7479]: I0308 00:26:17.455221 7479 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b9823a9-2491-44b5-8bf2-22352558a2a3" containerName="kube-rbac-proxy" Mar 08 00:26:17.455580 master-0 kubenswrapper[7479]: I0308 00:26:17.455442 7479 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b9823a9-2491-44b5-8bf2-22352558a2a3" containerName="cluster-cloud-controller-manager" Mar 08 00:26:17.455580 master-0 kubenswrapper[7479]: I0308 00:26:17.455453 7479 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b9823a9-2491-44b5-8bf2-22352558a2a3" containerName="config-sync-controllers" Mar 08 00:26:17.455580 master-0 kubenswrapper[7479]: I0308 00:26:17.455467 7479 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b9823a9-2491-44b5-8bf2-22352558a2a3" containerName="kube-rbac-proxy" Mar 08 00:26:17.456064 master-0 kubenswrapper[7479]: I0308 00:26:17.456038 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-wkt98" Mar 08 00:26:17.462119 master-0 kubenswrapper[7479]: I0308 00:26:17.462060 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 08 00:26:17.462323 master-0 kubenswrapper[7479]: I0308 00:26:17.462079 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-wkgmq" Mar 08 00:26:17.463913 master-0 kubenswrapper[7479]: I0308 00:26:17.463843 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 08 00:26:17.556648 master-0 kubenswrapper[7479]: I0308 00:26:17.556556 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a68ad726-392e-4a7a-a384-409108df9c8b-certs\") pod \"machine-config-server-wkt98\" (UID: \"a68ad726-392e-4a7a-a384-409108df9c8b\") " pod="openshift-machine-config-operator/machine-config-server-wkt98" Mar 08 00:26:17.556853 master-0 kubenswrapper[7479]: I0308 00:26:17.556660 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a68ad726-392e-4a7a-a384-409108df9c8b-node-bootstrap-token\") pod \"machine-config-server-wkt98\" (UID: \"a68ad726-392e-4a7a-a384-409108df9c8b\") " pod="openshift-machine-config-operator/machine-config-server-wkt98" Mar 08 00:26:17.556853 master-0 kubenswrapper[7479]: I0308 00:26:17.556705 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncncc\" (UniqueName: \"kubernetes.io/projected/a68ad726-392e-4a7a-a384-409108df9c8b-kube-api-access-ncncc\") pod \"machine-config-server-wkt98\" (UID: \"a68ad726-392e-4a7a-a384-409108df9c8b\") " 
pod="openshift-machine-config-operator/machine-config-server-wkt98" Mar 08 00:26:17.658350 master-0 kubenswrapper[7479]: I0308 00:26:17.658244 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a68ad726-392e-4a7a-a384-409108df9c8b-node-bootstrap-token\") pod \"machine-config-server-wkt98\" (UID: \"a68ad726-392e-4a7a-a384-409108df9c8b\") " pod="openshift-machine-config-operator/machine-config-server-wkt98" Mar 08 00:26:17.658586 master-0 kubenswrapper[7479]: I0308 00:26:17.658372 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncncc\" (UniqueName: \"kubernetes.io/projected/a68ad726-392e-4a7a-a384-409108df9c8b-kube-api-access-ncncc\") pod \"machine-config-server-wkt98\" (UID: \"a68ad726-392e-4a7a-a384-409108df9c8b\") " pod="openshift-machine-config-operator/machine-config-server-wkt98" Mar 08 00:26:17.658586 master-0 kubenswrapper[7479]: I0308 00:26:17.658413 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a68ad726-392e-4a7a-a384-409108df9c8b-certs\") pod \"machine-config-server-wkt98\" (UID: \"a68ad726-392e-4a7a-a384-409108df9c8b\") " pod="openshift-machine-config-operator/machine-config-server-wkt98" Mar 08 00:26:17.661585 master-0 kubenswrapper[7479]: I0308 00:26:17.661543 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a68ad726-392e-4a7a-a384-409108df9c8b-certs\") pod \"machine-config-server-wkt98\" (UID: \"a68ad726-392e-4a7a-a384-409108df9c8b\") " pod="openshift-machine-config-operator/machine-config-server-wkt98" Mar 08 00:26:17.662113 master-0 kubenswrapper[7479]: I0308 00:26:17.662077 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/a68ad726-392e-4a7a-a384-409108df9c8b-node-bootstrap-token\") pod \"machine-config-server-wkt98\" (UID: \"a68ad726-392e-4a7a-a384-409108df9c8b\") " pod="openshift-machine-config-operator/machine-config-server-wkt98" Mar 08 00:26:17.798897 master-0 kubenswrapper[7479]: I0308 00:26:17.798855 7479 scope.go:117] "RemoveContainer" containerID="448ae8ff53f7646a273cdf09b220fc2247ebe60a128d876e614d7cb7d241e38b" Mar 08 00:26:17.825322 master-0 kubenswrapper[7479]: I0308 00:26:17.824896 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncncc\" (UniqueName: \"kubernetes.io/projected/a68ad726-392e-4a7a-a384-409108df9c8b-kube-api-access-ncncc\") pod \"machine-config-server-wkt98\" (UID: \"a68ad726-392e-4a7a-a384-409108df9c8b\") " pod="openshift-machine-config-operator/machine-config-server-wkt98" Mar 08 00:26:17.831954 master-0 kubenswrapper[7479]: I0308 00:26:17.825915 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-8lgqf"] Mar 08 00:26:17.835860 master-0 kubenswrapper[7479]: I0308 00:26:17.832495 7479 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-8lgqf"] Mar 08 00:26:17.853638 master-0 kubenswrapper[7479]: I0308 00:26:17.853601 7479 scope.go:117] "RemoveContainer" containerID="5a500d2c1f8696d0304f6d90b8b1ba2343bb37980187644821c808366f21e1a3" Mar 08 00:26:17.872288 master-0 kubenswrapper[7479]: I0308 00:26:17.871174 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq"] Mar 08 00:26:17.872457 master-0 kubenswrapper[7479]: I0308 00:26:17.872357 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq"
Mar 08 00:26:17.875882 master-0 kubenswrapper[7479]: I0308 00:26:17.875834 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images"
Mar 08 00:26:17.876308 master-0 kubenswrapper[7479]: I0308 00:26:17.876263 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy"
Mar 08 00:26:17.876565 master-0 kubenswrapper[7479]: I0308 00:26:17.876539 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls"
Mar 08 00:26:17.876688 master-0 kubenswrapper[7479]: I0308 00:26:17.876665 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt"
Mar 08 00:26:17.876800 master-0 kubenswrapper[7479]: I0308 00:26:17.876771 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt"
Mar 08 00:26:17.877004 master-0 kubenswrapper[7479]: I0308 00:26:17.876885 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-fp767"
Mar 08 00:26:17.891684 master-0 kubenswrapper[7479]: I0308 00:26:17.891641 7479 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b9823a9-2491-44b5-8bf2-22352558a2a3" path="/var/lib/kubelet/pods/3b9823a9-2491-44b5-8bf2-22352558a2a3/volumes"
Mar 08 00:26:17.933834 master-0 kubenswrapper[7479]: I0308 00:26:17.933775 7479 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["kube-system/bootstrap-kube-scheduler-master-0"]
Mar 08 00:26:17.934053 master-0 kubenswrapper[7479]: I0308 00:26:17.934017 7479 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" containerID="cri-o://0fe11e31bc3fff8b9610286a4d61bcdc774b24a696a35e7bd68af0798051cd1f" gracePeriod=30
Mar 08 00:26:17.937032 master-0 kubenswrapper[7479]: I0308 00:26:17.936989 7479 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"]
Mar 08 00:26:17.937363 master-0 kubenswrapper[7479]: E0308 00:26:17.937328 7479 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler"
Mar 08 00:26:17.937363 master-0 kubenswrapper[7479]: I0308 00:26:17.937352 7479 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler"
Mar 08 00:26:17.937649 master-0 kubenswrapper[7479]: I0308 00:26:17.937585 7479 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler"
Mar 08 00:26:17.937840 master-0 kubenswrapper[7479]: E0308 00:26:17.937744 7479 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler"
Mar 08 00:26:17.937840 master-0 kubenswrapper[7479]: I0308 00:26:17.937759 7479 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler"
Mar 08 00:26:17.937940 master-0 kubenswrapper[7479]: I0308 00:26:17.937858 7479 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler"
Mar 08 00:26:17.938804 master-0 kubenswrapper[7479]: I0308 00:26:17.938768 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 08 00:26:18.064012 master-0 kubenswrapper[7479]: I0308 00:26:18.063972 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/3b4f8517-1e54-4b41-ba6b-6c56fe66831a-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-nwttq\" (UID: \"3b4f8517-1e54-4b41-ba6b-6c56fe66831a\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq"
Mar 08 00:26:18.064230 master-0 kubenswrapper[7479]: I0308 00:26:18.064192 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1453f6461bf5d599ad65a4656343ee91\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 08 00:26:18.064290 master-0 kubenswrapper[7479]: I0308 00:26:18.064240 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1453f6461bf5d599ad65a4656343ee91\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 08 00:26:18.064290 master-0 kubenswrapper[7479]: I0308 00:26:18.064271 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb4n9\" (UniqueName: \"kubernetes.io/projected/3b4f8517-1e54-4b41-ba6b-6c56fe66831a-kube-api-access-vb4n9\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-nwttq\" (UID: \"3b4f8517-1e54-4b41-ba6b-6c56fe66831a\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq"
Mar 08 00:26:18.064391 master-0 kubenswrapper[7479]: I0308 00:26:18.064293 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3b4f8517-1e54-4b41-ba6b-6c56fe66831a-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-nwttq\" (UID: \"3b4f8517-1e54-4b41-ba6b-6c56fe66831a\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq"
Mar 08 00:26:18.064391 master-0 kubenswrapper[7479]: I0308 00:26:18.064316 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/3b4f8517-1e54-4b41-ba6b-6c56fe66831a-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-nwttq\" (UID: \"3b4f8517-1e54-4b41-ba6b-6c56fe66831a\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq"
Mar 08 00:26:18.064538 master-0 kubenswrapper[7479]: I0308 00:26:18.064493 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3b4f8517-1e54-4b41-ba6b-6c56fe66831a-images\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-nwttq\" (UID: \"3b4f8517-1e54-4b41-ba6b-6c56fe66831a\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq"
Mar 08 00:26:18.075535 master-0 kubenswrapper[7479]: I0308 00:26:18.075478 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-wkt98"
Mar 08 00:26:18.134388 master-0 kubenswrapper[7479]: W0308 00:26:18.134353 7479 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda68ad726_392e_4a7a_a384_409108df9c8b.slice/crio-8e70531b1dbd5c8e6c17416c362305f1eea7b7b018f96a22eb1f0bb98b78a034 WatchSource:0}: Error finding container 8e70531b1dbd5c8e6c17416c362305f1eea7b7b018f96a22eb1f0bb98b78a034: Status 404 returned error can't find the container with id 8e70531b1dbd5c8e6c17416c362305f1eea7b7b018f96a22eb1f0bb98b78a034
Mar 08 00:26:18.166269 master-0 kubenswrapper[7479]: I0308 00:26:18.166054 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1453f6461bf5d599ad65a4656343ee91\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 08 00:26:18.166269 master-0 kubenswrapper[7479]: I0308 00:26:18.166105 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1453f6461bf5d599ad65a4656343ee91\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 08 00:26:18.166269 master-0 kubenswrapper[7479]: I0308 00:26:18.166176 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1453f6461bf5d599ad65a4656343ee91\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 08 00:26:18.166269 master-0 kubenswrapper[7479]: I0308 00:26:18.166249 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vb4n9\" (UniqueName: \"kubernetes.io/projected/3b4f8517-1e54-4b41-ba6b-6c56fe66831a-kube-api-access-vb4n9\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-nwttq\" (UID: \"3b4f8517-1e54-4b41-ba6b-6c56fe66831a\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq"
Mar 08 00:26:18.166269 master-0 kubenswrapper[7479]: I0308 00:26:18.166272 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3b4f8517-1e54-4b41-ba6b-6c56fe66831a-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-nwttq\" (UID: \"3b4f8517-1e54-4b41-ba6b-6c56fe66831a\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq"
Mar 08 00:26:18.166578 master-0 kubenswrapper[7479]: I0308 00:26:18.166302 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/3b4f8517-1e54-4b41-ba6b-6c56fe66831a-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-nwttq\" (UID: \"3b4f8517-1e54-4b41-ba6b-6c56fe66831a\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq"
Mar 08 00:26:18.166578 master-0 kubenswrapper[7479]: I0308 00:26:18.166347 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3b4f8517-1e54-4b41-ba6b-6c56fe66831a-images\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-nwttq\" (UID: \"3b4f8517-1e54-4b41-ba6b-6c56fe66831a\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq"
Mar 08 00:26:18.166578 master-0 kubenswrapper[7479]: I0308 00:26:18.166381 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1453f6461bf5d599ad65a4656343ee91\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 08 00:26:18.166578 master-0 kubenswrapper[7479]: I0308 00:26:18.166403 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/3b4f8517-1e54-4b41-ba6b-6c56fe66831a-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-nwttq\" (UID: \"3b4f8517-1e54-4b41-ba6b-6c56fe66831a\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq"
Mar 08 00:26:18.166578 master-0 kubenswrapper[7479]: I0308 00:26:18.166447 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/3b4f8517-1e54-4b41-ba6b-6c56fe66831a-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-nwttq\" (UID: \"3b4f8517-1e54-4b41-ba6b-6c56fe66831a\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq"
Mar 08 00:26:18.167061 master-0 kubenswrapper[7479]: I0308 00:26:18.167024 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3b4f8517-1e54-4b41-ba6b-6c56fe66831a-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-nwttq\" (UID: \"3b4f8517-1e54-4b41-ba6b-6c56fe66831a\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq"
Mar 08 00:26:18.167110 master-0 kubenswrapper[7479]: I0308 00:26:18.167060 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3b4f8517-1e54-4b41-ba6b-6c56fe66831a-images\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-nwttq\" (UID: \"3b4f8517-1e54-4b41-ba6b-6c56fe66831a\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq"
Mar 08 00:26:18.168961 master-0 kubenswrapper[7479]: I0308 00:26:18.168922 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/3b4f8517-1e54-4b41-ba6b-6c56fe66831a-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-nwttq\" (UID: \"3b4f8517-1e54-4b41-ba6b-6c56fe66831a\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq"
Mar 08 00:26:18.181780 master-0 kubenswrapper[7479]: I0308 00:26:18.181742 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb4n9\" (UniqueName: \"kubernetes.io/projected/3b4f8517-1e54-4b41-ba6b-6c56fe66831a-kube-api-access-vb4n9\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-nwttq\" (UID: \"3b4f8517-1e54-4b41-ba6b-6c56fe66831a\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq"
Mar 08 00:26:18.280379 master-0 kubenswrapper[7479]: I0308 00:26:18.280347 7479 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 00:26:18.290144 master-0 kubenswrapper[7479]: I0308 00:26:18.290102 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 08 00:26:18.294551 master-0 kubenswrapper[7479]: I0308 00:26:18.294490 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"]
Mar 08 00:26:18.298208 master-0 kubenswrapper[7479]: I0308 00:26:18.298154 7479 kubelet.go:2706] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="c4f5789d-5581-488e-9b73-530c8e4fa71e"
Mar 08 00:26:18.307887 master-0 kubenswrapper[7479]: W0308 00:26:18.307809 7479 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1453f6461bf5d599ad65a4656343ee91.slice/crio-635c9c2985fac1a14beab73539e4661fa51cd796fbfb9d8b1faa5701a4b68e88 WatchSource:0}: Error finding container 635c9c2985fac1a14beab73539e4661fa51cd796fbfb9d8b1faa5701a4b68e88: Status 404 returned error can't find the container with id 635c9c2985fac1a14beab73539e4661fa51cd796fbfb9d8b1faa5701a4b68e88
Mar 08 00:26:18.453471 master-0 kubenswrapper[7479]: I0308 00:26:18.453426 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq"
Mar 08 00:26:18.471020 master-0 kubenswrapper[7479]: I0308 00:26:18.470985 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"a1a56802af72ce1aac6b5077f1695ac0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") "
Mar 08 00:26:18.473388 master-0 kubenswrapper[7479]: I0308 00:26:18.471078 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"a1a56802af72ce1aac6b5077f1695ac0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") "
Mar 08 00:26:18.473388 master-0 kubenswrapper[7479]: I0308 00:26:18.471065 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets" (OuterVolumeSpecName: "secrets") pod "a1a56802af72ce1aac6b5077f1695ac0" (UID: "a1a56802af72ce1aac6b5077f1695ac0"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:26:18.473388 master-0 kubenswrapper[7479]: I0308 00:26:18.471127 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs" (OuterVolumeSpecName: "logs") pod "a1a56802af72ce1aac6b5077f1695ac0" (UID: "a1a56802af72ce1aac6b5077f1695ac0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:26:18.473388 master-0 kubenswrapper[7479]: I0308 00:26:18.471274 7479 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") on node \"master-0\" DevicePath \"\""
Mar 08 00:26:18.473388 master-0 kubenswrapper[7479]: I0308 00:26:18.471290 7479 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") on node \"master-0\" DevicePath \"\""
Mar 08 00:26:18.477683 master-0 kubenswrapper[7479]: W0308 00:26:18.475362 7479 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b4f8517_1e54_4b41_ba6b_6c56fe66831a.slice/crio-79898c1495b01b774fa3705ded4d271b0617e5b224dd28c48dac5c9a238260f3 WatchSource:0}: Error finding container 79898c1495b01b774fa3705ded4d271b0617e5b224dd28c48dac5c9a238260f3: Status 404 returned error can't find the container with id 79898c1495b01b774fa3705ded4d271b0617e5b224dd28c48dac5c9a238260f3
Mar 08 00:26:18.688834 master-0 kubenswrapper[7479]: I0308 00:26:18.688803 7479 generic.go:334] "Generic (PLEG): container finished" podID="a1a56802af72ce1aac6b5077f1695ac0" containerID="0fe11e31bc3fff8b9610286a4d61bcdc774b24a696a35e7bd68af0798051cd1f" exitCode=0
Mar 08 00:26:18.688948 master-0 kubenswrapper[7479]: I0308 00:26:18.688858 7479 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="233074eccbbd3406930dc094592b256b0710cbbbba4d96b37f6401353d1f1651"
Mar 08 00:26:18.688948 master-0 kubenswrapper[7479]: I0308 00:26:18.688873 7479 scope.go:117] "RemoveContainer" containerID="88fd43c8fda6129c4f06b24e2a215771ea123f05c39828ad062d2af5324239c2"
Mar 08 00:26:18.688948 master-0 kubenswrapper[7479]: I0308 00:26:18.688931 7479 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 00:26:18.693529 master-0 kubenswrapper[7479]: I0308 00:26:18.693174 7479 generic.go:334] "Generic (PLEG): container finished" podID="21dd42b1-2628-4a24-97e7-6759888ed316" containerID="f70bb9a5f0e3f9b911feb28654c30e151d3e1fb5d9549e6e2016049387b17fb2" exitCode=0
Mar 08 00:26:18.693529 master-0 kubenswrapper[7479]: I0308 00:26:18.693236 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"21dd42b1-2628-4a24-97e7-6759888ed316","Type":"ContainerDied","Data":"f70bb9a5f0e3f9b911feb28654c30e151d3e1fb5d9549e6e2016049387b17fb2"}
Mar 08 00:26:18.697704 master-0 kubenswrapper[7479]: I0308 00:26:18.697136 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-wkt98" event={"ID":"a68ad726-392e-4a7a-a384-409108df9c8b","Type":"ContainerStarted","Data":"51d72f735ac2d22ad572e2bfd6c4c3d9ef60ea8d95d8d615afffbd72430f0283"}
Mar 08 00:26:18.697704 master-0 kubenswrapper[7479]: I0308 00:26:18.697173 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-wkt98" event={"ID":"a68ad726-392e-4a7a-a384-409108df9c8b","Type":"ContainerStarted","Data":"8e70531b1dbd5c8e6c17416c362305f1eea7b7b018f96a22eb1f0bb98b78a034"}
Mar 08 00:26:18.698615 master-0 kubenswrapper[7479]: I0308 00:26:18.698556 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" event={"ID":"6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6","Type":"ContainerStarted","Data":"915f71c7c1f314a02b658e63c673b9b34d83af6828634db211d8fa70c691f01b"}
Mar 08 00:26:18.700454 master-0 kubenswrapper[7479]: I0308 00:26:18.700427 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-st8tx" event={"ID":"0d0cb126-341c-4215-ad2e-a008193cc0b5","Type":"ContainerStarted","Data":"6e25dc9a5f14568083319c0b4bbd12c19766fcb10a82c2e247c421c6684c8ec8"}
Mar 08 00:26:18.702345 master-0 kubenswrapper[7479]: I0308 00:26:18.702026 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-st8tx"
Mar 08 00:26:18.708270 master-0 kubenswrapper[7479]: I0308 00:26:18.704735 7479 generic.go:334] "Generic (PLEG): container finished" podID="1453f6461bf5d599ad65a4656343ee91" containerID="16143328d55448f305f6ab28c116011527d147a9f464f1696ddaa4f87b24902d" exitCode=0
Mar 08 00:26:18.708270 master-0 kubenswrapper[7479]: I0308 00:26:18.704810 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1453f6461bf5d599ad65a4656343ee91","Type":"ContainerDied","Data":"16143328d55448f305f6ab28c116011527d147a9f464f1696ddaa4f87b24902d"}
Mar 08 00:26:18.708270 master-0 kubenswrapper[7479]: I0308 00:26:18.704837 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1453f6461bf5d599ad65a4656343ee91","Type":"ContainerStarted","Data":"635c9c2985fac1a14beab73539e4661fa51cd796fbfb9d8b1faa5701a4b68e88"}
Mar 08 00:26:18.708270 master-0 kubenswrapper[7479]: I0308 00:26:18.707615 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-st8tx"
Mar 08 00:26:18.717008 master-0 kubenswrapper[7479]: I0308 00:26:18.716644 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq" event={"ID":"3b4f8517-1e54-4b41-ba6b-6c56fe66831a","Type":"ContainerStarted","Data":"6b085935f4ebb70afc5a958163f7053b9a42b89c690b039c32d56dcc51668fae"}
Mar 08 00:26:18.717008 master-0 kubenswrapper[7479]: I0308 00:26:18.716697 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq" event={"ID":"3b4f8517-1e54-4b41-ba6b-6c56fe66831a","Type":"ContainerStarted","Data":"79898c1495b01b774fa3705ded4d271b0617e5b224dd28c48dac5c9a238260f3"}
Mar 08 00:26:18.759179 master-0 kubenswrapper[7479]: I0308 00:26:18.759087 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-st8tx" podStartSLOduration=238.744733712 podStartE2EDuration="4m7.759067223s" podCreationTimestamp="2026-03-08 00:22:11 +0000 UTC" firstStartedPulling="2026-03-08 00:26:08.80330869 +0000 UTC m=+285.116217607" lastFinishedPulling="2026-03-08 00:26:17.817642201 +0000 UTC m=+294.130551118" observedRunningTime="2026-03-08 00:26:18.754837914 +0000 UTC m=+295.067746831" watchObservedRunningTime="2026-03-08 00:26:18.759067223 +0000 UTC m=+295.071976140"
Mar 08 00:26:18.778637 master-0 kubenswrapper[7479]: I0308 00:26:18.778458 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podStartSLOduration=252.376849615 podStartE2EDuration="4m21.778440807s" podCreationTimestamp="2026-03-08 00:21:57 +0000 UTC" firstStartedPulling="2026-03-08 00:26:08.407495441 +0000 UTC m=+284.720404358" lastFinishedPulling="2026-03-08 00:26:17.809086633 +0000 UTC m=+294.121995550" observedRunningTime="2026-03-08 00:26:18.776637906 +0000 UTC m=+295.089546843" watchObservedRunningTime="2026-03-08 00:26:18.778440807 +0000 UTC m=+295.091349724"
Mar 08 00:26:18.797131 master-0 kubenswrapper[7479]: I0308 00:26:18.795501 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-wkt98" podStartSLOduration=2.795483764 podStartE2EDuration="2.795483764s" podCreationTimestamp="2026-03-08 00:26:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:26:18.795388773 +0000 UTC m=+295.108297700" watchObservedRunningTime="2026-03-08 00:26:18.795483764 +0000 UTC m=+295.108392681"
Mar 08 00:26:19.365386 master-0 kubenswrapper[7479]: I0308 00:26:19.364795 7479 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv"
Mar 08 00:26:19.369009 master-0 kubenswrapper[7479]: I0308 00:26:19.368960 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:26:19.369009 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:26:19.369009 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:26:19.369009 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:26:19.369312 master-0 kubenswrapper[7479]: I0308 00:26:19.369020 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:26:19.735115 master-0 kubenswrapper[7479]: I0308 00:26:19.735055 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1453f6461bf5d599ad65a4656343ee91","Type":"ContainerStarted","Data":"4a9781cd54b6849919a2e1ded759e631816b24203f18a3cce8ca11053a994a64"}
Mar 08 00:26:19.735115 master-0 kubenswrapper[7479]: I0308 00:26:19.735113 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1453f6461bf5d599ad65a4656343ee91","Type":"ContainerStarted","Data":"2bda97f02cc22c73814013d78c2e90a28eb3ed0437db127445efbed0e90aa23d"}
Mar 08 00:26:19.737274 master-0 kubenswrapper[7479]: I0308 00:26:19.737238 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq" event={"ID":"3b4f8517-1e54-4b41-ba6b-6c56fe66831a","Type":"ContainerStarted","Data":"2a75a237fef308cfc9e8dc829c307d2c38c0fdad09816e4ff80123079e47f8b1"}
Mar 08 00:26:19.737324 master-0 kubenswrapper[7479]: I0308 00:26:19.737276 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq" event={"ID":"3b4f8517-1e54-4b41-ba6b-6c56fe66831a","Type":"ContainerStarted","Data":"6703d449ef58e82f6711f4fb4077c407ce4e8f1fc186664220b3722e268d3aa7"}
Mar 08 00:26:19.767096 master-0 kubenswrapper[7479]: I0308 00:26:19.767024 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq" podStartSLOduration=2.766997563 podStartE2EDuration="2.766997563s" podCreationTimestamp="2026-03-08 00:26:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:26:19.763154709 +0000 UTC m=+296.076063636" watchObservedRunningTime="2026-03-08 00:26:19.766997563 +0000 UTC m=+296.079906480"
Mar 08 00:26:19.774291 master-0 kubenswrapper[7479]: I0308 00:26:19.774227 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5ff8674d55-qxpv9"]
Mar 08 00:26:19.775652 master-0 kubenswrapper[7479]: I0308 00:26:19.775345 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5ff8674d55-qxpv9"
Mar 08 00:26:19.780557 master-0 kubenswrapper[7479]: I0308 00:26:19.780247 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca"
Mar 08 00:26:19.780557 master-0 kubenswrapper[7479]: I0308 00:26:19.780292 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-qlv59"
Mar 08 00:26:19.780557 master-0 kubenswrapper[7479]: I0308 00:26:19.780413 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls"
Mar 08 00:26:19.786592 master-0 kubenswrapper[7479]: I0308 00:26:19.786541 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config"
Mar 08 00:26:19.790562 master-0 kubenswrapper[7479]: I0308 00:26:19.790502 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5ff8674d55-qxpv9"]
Mar 08 00:26:19.894089 master-0 kubenswrapper[7479]: I0308 00:26:19.892642 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e237ed52-5561-44c5-bcb1-de62691d6431-metrics-client-ca\") pod \"prometheus-operator-5ff8674d55-qxpv9\" (UID: \"e237ed52-5561-44c5-bcb1-de62691d6431\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-qxpv9"
Mar 08 00:26:19.894089 master-0 kubenswrapper[7479]: I0308 00:26:19.892748 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/e237ed52-5561-44c5-bcb1-de62691d6431-prometheus-operator-tls\") pod \"prometheus-operator-5ff8674d55-qxpv9\" (UID: \"e237ed52-5561-44c5-bcb1-de62691d6431\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-qxpv9"
Mar 08 00:26:19.894089 master-0 kubenswrapper[7479]: I0308 00:26:19.892894 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t99pg\" (UniqueName: \"kubernetes.io/projected/e237ed52-5561-44c5-bcb1-de62691d6431-kube-api-access-t99pg\") pod \"prometheus-operator-5ff8674d55-qxpv9\" (UID: \"e237ed52-5561-44c5-bcb1-de62691d6431\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-qxpv9"
Mar 08 00:26:19.894089 master-0 kubenswrapper[7479]: I0308 00:26:19.892961 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e237ed52-5561-44c5-bcb1-de62691d6431-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5ff8674d55-qxpv9\" (UID: \"e237ed52-5561-44c5-bcb1-de62691d6431\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-qxpv9"
Mar 08 00:26:19.908455 master-0 kubenswrapper[7479]: I0308 00:26:19.908400 7479 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1a56802af72ce1aac6b5077f1695ac0" path="/var/lib/kubelet/pods/a1a56802af72ce1aac6b5077f1695ac0/volumes"
Mar 08 00:26:19.908665 master-0 kubenswrapper[7479]: I0308 00:26:19.908641 7479 mirror_client.go:130] "Deleting a mirror pod" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID=""
Mar 08 00:26:19.927124 master-0 kubenswrapper[7479]: I0308 00:26:19.927078 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"]
Mar 08 00:26:19.927435 master-0 kubenswrapper[7479]: I0308 00:26:19.927413 7479 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="c4f5789d-5581-488e-9b73-530c8e4fa71e"
Mar 08 00:26:19.927532 master-0 kubenswrapper[7479]: I0308 00:26:19.927520 7479 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"]
Mar 08 00:26:19.927610 master-0 kubenswrapper[7479]: I0308 00:26:19.927597 7479 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="c4f5789d-5581-488e-9b73-530c8e4fa71e"
Mar 08 00:26:19.993941 master-0 kubenswrapper[7479]: I0308 00:26:19.993805 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t99pg\" (UniqueName: \"kubernetes.io/projected/e237ed52-5561-44c5-bcb1-de62691d6431-kube-api-access-t99pg\") pod \"prometheus-operator-5ff8674d55-qxpv9\" (UID: \"e237ed52-5561-44c5-bcb1-de62691d6431\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-qxpv9"
Mar 08 00:26:19.993941 master-0 kubenswrapper[7479]: I0308 00:26:19.993885 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e237ed52-5561-44c5-bcb1-de62691d6431-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5ff8674d55-qxpv9\" (UID: \"e237ed52-5561-44c5-bcb1-de62691d6431\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-qxpv9"
Mar 08 00:26:19.998486 master-0 kubenswrapper[7479]: I0308 00:26:19.994048 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e237ed52-5561-44c5-bcb1-de62691d6431-metrics-client-ca\") pod \"prometheus-operator-5ff8674d55-qxpv9\" (UID: \"e237ed52-5561-44c5-bcb1-de62691d6431\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-qxpv9"
Mar 08 00:26:19.998486 master-0 kubenswrapper[7479]: I0308 00:26:19.994197 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/e237ed52-5561-44c5-bcb1-de62691d6431-prometheus-operator-tls\") pod \"prometheus-operator-5ff8674d55-qxpv9\" (UID: \"e237ed52-5561-44c5-bcb1-de62691d6431\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-qxpv9"
Mar 08 00:26:19.998486 master-0 kubenswrapper[7479]: I0308 00:26:19.994796 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e237ed52-5561-44c5-bcb1-de62691d6431-metrics-client-ca\") pod \"prometheus-operator-5ff8674d55-qxpv9\" (UID: \"e237ed52-5561-44c5-bcb1-de62691d6431\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-qxpv9"
Mar 08 00:26:19.998486 master-0 kubenswrapper[7479]: I0308 00:26:19.997955 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/e237ed52-5561-44c5-bcb1-de62691d6431-prometheus-operator-tls\") pod \"prometheus-operator-5ff8674d55-qxpv9\" (UID: \"e237ed52-5561-44c5-bcb1-de62691d6431\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-qxpv9"
Mar 08 00:26:19.998486 master-0 kubenswrapper[7479]: I0308 00:26:19.997945 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e237ed52-5561-44c5-bcb1-de62691d6431-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5ff8674d55-qxpv9\" (UID: \"e237ed52-5561-44c5-bcb1-de62691d6431\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-qxpv9"
Mar 08 00:26:20.012689 master-0 kubenswrapper[7479]: I0308 00:26:20.012661 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t99pg\" (UniqueName: \"kubernetes.io/projected/e237ed52-5561-44c5-bcb1-de62691d6431-kube-api-access-t99pg\") pod \"prometheus-operator-5ff8674d55-qxpv9\" (UID: \"e237ed52-5561-44c5-bcb1-de62691d6431\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-qxpv9"
Mar 08 00:26:20.032986 master-0 kubenswrapper[7479]: I0308 00:26:20.032946 7479 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0"
Mar 08 00:26:20.113738 master-0 kubenswrapper[7479]: I0308 00:26:20.113547 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5ff8674d55-qxpv9"
Mar 08 00:26:20.197742 master-0 kubenswrapper[7479]: I0308 00:26:20.197612 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/21dd42b1-2628-4a24-97e7-6759888ed316-kubelet-dir\") pod \"21dd42b1-2628-4a24-97e7-6759888ed316\" (UID: \"21dd42b1-2628-4a24-97e7-6759888ed316\") "
Mar 08 00:26:20.197742 master-0 kubenswrapper[7479]: I0308 00:26:20.197700 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21dd42b1-2628-4a24-97e7-6759888ed316-kube-api-access\") pod \"21dd42b1-2628-4a24-97e7-6759888ed316\" (UID: \"21dd42b1-2628-4a24-97e7-6759888ed316\") "
Mar 08 00:26:20.197742 master-0 kubenswrapper[7479]: I0308 00:26:20.197724 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/21dd42b1-2628-4a24-97e7-6759888ed316-var-lock\") pod \"21dd42b1-2628-4a24-97e7-6759888ed316\" (UID: \"21dd42b1-2628-4a24-97e7-6759888ed316\") "
Mar 08 00:26:20.198007 master-0 kubenswrapper[7479]: I0308 00:26:20.197983 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21dd42b1-2628-4a24-97e7-6759888ed316-var-lock" (OuterVolumeSpecName: "var-lock") pod "21dd42b1-2628-4a24-97e7-6759888ed316" (UID: "21dd42b1-2628-4a24-97e7-6759888ed316"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:26:20.198057 master-0 kubenswrapper[7479]: I0308 00:26:20.198019 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21dd42b1-2628-4a24-97e7-6759888ed316-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "21dd42b1-2628-4a24-97e7-6759888ed316" (UID: "21dd42b1-2628-4a24-97e7-6759888ed316"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:26:20.201299 master-0 kubenswrapper[7479]: I0308 00:26:20.201264 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21dd42b1-2628-4a24-97e7-6759888ed316-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "21dd42b1-2628-4a24-97e7-6759888ed316" (UID: "21dd42b1-2628-4a24-97e7-6759888ed316"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:26:20.299422 master-0 kubenswrapper[7479]: I0308 00:26:20.299362 7479 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/21dd42b1-2628-4a24-97e7-6759888ed316-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 08 00:26:20.299422 master-0 kubenswrapper[7479]: I0308 00:26:20.299394 7479 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21dd42b1-2628-4a24-97e7-6759888ed316-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 08 00:26:20.299422 master-0 kubenswrapper[7479]: I0308 00:26:20.299404 7479 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/21dd42b1-2628-4a24-97e7-6759888ed316-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 08 00:26:20.369032 master-0 kubenswrapper[7479]: I0308 00:26:20.368885 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup
probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:26:20.369032 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:26:20.369032 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:26:20.369032 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:26:20.369032 master-0 kubenswrapper[7479]: I0308 00:26:20.368948 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:26:20.508980 master-0 kubenswrapper[7479]: I0308 00:26:20.508901 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5ff8674d55-qxpv9"] Mar 08 00:26:20.748464 master-0 kubenswrapper[7479]: I0308 00:26:20.748416 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"21dd42b1-2628-4a24-97e7-6759888ed316","Type":"ContainerDied","Data":"f81e16a049afccd7df86e2ab910ff92e4bea5bed8e76ac4e62191e1c15f7228a"} Mar 08 00:26:20.748464 master-0 kubenswrapper[7479]: I0308 00:26:20.748457 7479 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f81e16a049afccd7df86e2ab910ff92e4bea5bed8e76ac4e62191e1c15f7228a" Mar 08 00:26:20.749120 master-0 kubenswrapper[7479]: I0308 00:26:20.748480 7479 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Mar 08 00:26:20.749555 master-0 kubenswrapper[7479]: I0308 00:26:20.749529 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5ff8674d55-qxpv9" event={"ID":"e237ed52-5561-44c5-bcb1-de62691d6431","Type":"ContainerStarted","Data":"aaafa12a616f7369af11bbeebe18962338e3a83e1b72c0a692864a7176225e0a"} Mar 08 00:26:20.751914 master-0 kubenswrapper[7479]: I0308 00:26:20.751881 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1453f6461bf5d599ad65a4656343ee91","Type":"ContainerStarted","Data":"c01e48ad99b01d18f3c32d8971fb8a634df39b838fcb697c02d699ac7e0bf59b"} Mar 08 00:26:20.773099 master-0 kubenswrapper[7479]: I0308 00:26:20.773025 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podStartSLOduration=2.7730048800000002 podStartE2EDuration="2.77300488s" podCreationTimestamp="2026-03-08 00:26:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:26:20.772172221 +0000 UTC m=+297.085081138" watchObservedRunningTime="2026-03-08 00:26:20.77300488 +0000 UTC m=+297.085913797" Mar 08 00:26:21.367290 master-0 kubenswrapper[7479]: I0308 00:26:21.367239 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:26:21.367290 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:26:21.367290 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:26:21.367290 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:26:21.367552 master-0 kubenswrapper[7479]: I0308 
00:26:21.367314 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:26:21.757785 master-0 kubenswrapper[7479]: I0308 00:26:21.757715 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 00:26:22.367064 master-0 kubenswrapper[7479]: I0308 00:26:22.366941 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:26:22.367064 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:26:22.367064 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:26:22.367064 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:26:22.367064 master-0 kubenswrapper[7479]: I0308 00:26:22.367019 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:26:22.773189 master-0 kubenswrapper[7479]: I0308 00:26:22.773104 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5ff8674d55-qxpv9" event={"ID":"e237ed52-5561-44c5-bcb1-de62691d6431","Type":"ContainerStarted","Data":"8024d8e07c10843d58afa6b354d719252942b7cc674963d8b1fab2a5ad838405"} Mar 08 00:26:22.773189 master-0 kubenswrapper[7479]: I0308 00:26:22.773187 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5ff8674d55-qxpv9" 
event={"ID":"e237ed52-5561-44c5-bcb1-de62691d6431","Type":"ContainerStarted","Data":"91975b539efd51be35527a0d8a61481b74eddf77df1b9a337c3002feaa1bf444"} Mar 08 00:26:22.799102 master-0 kubenswrapper[7479]: I0308 00:26:22.798951 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5ff8674d55-qxpv9" podStartSLOduration=2.4356817680000002 podStartE2EDuration="3.798935508s" podCreationTimestamp="2026-03-08 00:26:19 +0000 UTC" firstStartedPulling="2026-03-08 00:26:20.518069702 +0000 UTC m=+296.830978619" lastFinishedPulling="2026-03-08 00:26:21.881323442 +0000 UTC m=+298.194232359" observedRunningTime="2026-03-08 00:26:22.797223108 +0000 UTC m=+299.110132045" watchObservedRunningTime="2026-03-08 00:26:22.798935508 +0000 UTC m=+299.111844415" Mar 08 00:26:23.367540 master-0 kubenswrapper[7479]: I0308 00:26:23.367476 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:26:23.367540 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:26:23.367540 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:26:23.367540 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:26:23.367540 master-0 kubenswrapper[7479]: I0308 00:26:23.367538 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:26:24.367748 master-0 kubenswrapper[7479]: I0308 00:26:24.367628 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Mar 08 00:26:24.367748 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:26:24.367748 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:26:24.367748 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:26:24.367748 master-0 kubenswrapper[7479]: I0308 00:26:24.367708 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:26:24.876549 master-0 kubenswrapper[7479]: I0308 00:26:24.876474 7479 scope.go:117] "RemoveContainer" containerID="79807bacb8255c5e003178362fd0a6e9b3e5481074aa31458cc27f40ce6114ac" Mar 08 00:26:25.369484 master-0 kubenswrapper[7479]: I0308 00:26:25.367215 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:26:25.369484 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:26:25.369484 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:26:25.369484 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:26:25.369484 master-0 kubenswrapper[7479]: I0308 00:26:25.367277 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:26:25.450278 master-0 kubenswrapper[7479]: I0308 00:26:25.450220 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-74cc79fd76-s9b9v"] Mar 08 00:26:25.450531 master-0 kubenswrapper[7479]: E0308 00:26:25.450503 7479 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="21dd42b1-2628-4a24-97e7-6759888ed316" containerName="installer" Mar 08 00:26:25.450531 master-0 kubenswrapper[7479]: I0308 00:26:25.450525 7479 state_mem.go:107] "Deleted CPUSet assignment" podUID="21dd42b1-2628-4a24-97e7-6759888ed316" containerName="installer" Mar 08 00:26:25.450705 master-0 kubenswrapper[7479]: I0308 00:26:25.450659 7479 memory_manager.go:354] "RemoveStaleState removing state" podUID="21dd42b1-2628-4a24-97e7-6759888ed316" containerName="installer" Mar 08 00:26:25.451608 master-0 kubenswrapper[7479]: I0308 00:26:25.451585 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-s9b9v" Mar 08 00:26:25.455475 master-0 kubenswrapper[7479]: I0308 00:26:25.455176 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Mar 08 00:26:25.455610 master-0 kubenswrapper[7479]: I0308 00:26:25.455495 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-bmgck" Mar 08 00:26:25.455696 master-0 kubenswrapper[7479]: I0308 00:26:25.455666 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Mar 08 00:26:25.500991 master-0 kubenswrapper[7479]: I0308 00:26:25.500936 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-74cc79fd76-s9b9v"] Mar 08 00:26:25.503907 master-0 kubenswrapper[7479]: I0308 00:26:25.503853 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-bx9dn"] Mar 08 00:26:25.505272 master-0 kubenswrapper[7479]: I0308 00:26:25.505242 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-bx9dn" Mar 08 00:26:25.507908 master-0 kubenswrapper[7479]: I0308 00:26:25.507699 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-2mmjv" Mar 08 00:26:25.507908 master-0 kubenswrapper[7479]: I0308 00:26:25.507879 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Mar 08 00:26:25.510541 master-0 kubenswrapper[7479]: I0308 00:26:25.508318 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Mar 08 00:26:25.550227 master-0 kubenswrapper[7479]: I0308 00:26:25.549459 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc"] Mar 08 00:26:25.552522 master-0 kubenswrapper[7479]: I0308 00:26:25.550486 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc" Mar 08 00:26:25.552522 master-0 kubenswrapper[7479]: I0308 00:26:25.552169 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Mar 08 00:26:25.556926 master-0 kubenswrapper[7479]: I0308 00:26:25.554398 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Mar 08 00:26:25.556926 master-0 kubenswrapper[7479]: I0308 00:26:25.555078 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-5qzcm" Mar 08 00:26:25.556926 master-0 kubenswrapper[7479]: I0308 00:26:25.555162 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Mar 08 00:26:25.566498 master-0 kubenswrapper[7479]: I0308 00:26:25.566440 7479 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-node-exporter-wtmp\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn" Mar 08 00:26:25.566667 master-0 kubenswrapper[7479]: I0308 00:26:25.566636 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-root\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn" Mar 08 00:26:25.566705 master-0 kubenswrapper[7479]: I0308 00:26:25.566667 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-sys\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn" Mar 08 00:26:25.566705 master-0 kubenswrapper[7479]: I0308 00:26:25.566689 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65-openshift-state-metrics-tls\") pod \"openshift-state-metrics-74cc79fd76-s9b9v\" (UID: \"5b9f4db1-3ba9-49a5-9a65-1d770ee59a65\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-s9b9v" Mar 08 00:26:25.566789 master-0 kubenswrapper[7479]: I0308 00:26:25.566718 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-node-exporter-tls\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn" Mar 08 
00:26:25.566789 master-0 kubenswrapper[7479]: I0308 00:26:25.566735 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stxt7\" (UniqueName: \"kubernetes.io/projected/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65-kube-api-access-stxt7\") pod \"openshift-state-metrics-74cc79fd76-s9b9v\" (UID: \"5b9f4db1-3ba9-49a5-9a65-1d770ee59a65\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-s9b9v" Mar 08 00:26:25.566789 master-0 kubenswrapper[7479]: I0308 00:26:25.566771 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65-metrics-client-ca\") pod \"openshift-state-metrics-74cc79fd76-s9b9v\" (UID: \"5b9f4db1-3ba9-49a5-9a65-1d770ee59a65\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-s9b9v" Mar 08 00:26:25.566789 master-0 kubenswrapper[7479]: I0308 00:26:25.566786 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn" Mar 08 00:26:25.566912 master-0 kubenswrapper[7479]: I0308 00:26:25.566811 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-74cc79fd76-s9b9v\" (UID: \"5b9f4db1-3ba9-49a5-9a65-1d770ee59a65\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-s9b9v" Mar 08 00:26:25.566912 master-0 kubenswrapper[7479]: I0308 00:26:25.566836 7479 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-node-exporter-textfile\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn" Mar 08 00:26:25.566912 master-0 kubenswrapper[7479]: I0308 00:26:25.566855 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-metrics-client-ca\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn" Mar 08 00:26:25.566912 master-0 kubenswrapper[7479]: I0308 00:26:25.566877 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wllt8\" (UniqueName: \"kubernetes.io/projected/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-kube-api-access-wllt8\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn" Mar 08 00:26:25.581164 master-0 kubenswrapper[7479]: I0308 00:26:25.581092 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc"] Mar 08 00:26:25.668512 master-0 kubenswrapper[7479]: I0308 00:26:25.668385 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65-metrics-client-ca\") pod \"openshift-state-metrics-74cc79fd76-s9b9v\" (UID: \"5b9f4db1-3ba9-49a5-9a65-1d770ee59a65\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-s9b9v" Mar 08 00:26:25.668512 master-0 kubenswrapper[7479]: I0308 00:26:25.668447 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn" Mar 08 00:26:25.668512 master-0 kubenswrapper[7479]: I0308 00:26:25.668481 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-74cc79fd76-s9b9v\" (UID: \"5b9f4db1-3ba9-49a5-9a65-1d770ee59a65\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-s9b9v" Mar 08 00:26:25.668512 master-0 kubenswrapper[7479]: I0308 00:26:25.668507 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-node-exporter-textfile\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn" Mar 08 00:26:25.668767 master-0 kubenswrapper[7479]: I0308 00:26:25.668534 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-metrics-client-ca\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn" Mar 08 00:26:25.668767 master-0 kubenswrapper[7479]: I0308 00:26:25.668557 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wllt8\" (UniqueName: \"kubernetes.io/projected/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-kube-api-access-wllt8\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn" Mar 
08 00:26:25.668767 master-0 kubenswrapper[7479]: I0308 00:26:25.668588 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ae061e84-5e6a-415c-a735-fa14add7318a-kube-state-metrics-tls\") pod \"kube-state-metrics-68b88f8cb5-qjxhc\" (UID: \"ae061e84-5e6a-415c-a735-fa14add7318a\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc" Mar 08 00:26:25.668767 master-0 kubenswrapper[7479]: I0308 00:26:25.668613 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-node-exporter-wtmp\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn" Mar 08 00:26:25.668767 master-0 kubenswrapper[7479]: I0308 00:26:25.668634 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-root\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn" Mar 08 00:26:25.668767 master-0 kubenswrapper[7479]: I0308 00:26:25.668669 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-sys\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn" Mar 08 00:26:25.668767 master-0 kubenswrapper[7479]: I0308 00:26:25.668695 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65-openshift-state-metrics-tls\") pod \"openshift-state-metrics-74cc79fd76-s9b9v\" (UID: \"5b9f4db1-3ba9-49a5-9a65-1d770ee59a65\") " 
pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-s9b9v" Mar 08 00:26:25.668767 master-0 kubenswrapper[7479]: I0308 00:26:25.668722 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/ae061e84-5e6a-415c-a735-fa14add7318a-volume-directive-shadow\") pod \"kube-state-metrics-68b88f8cb5-qjxhc\" (UID: \"ae061e84-5e6a-415c-a735-fa14add7318a\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc" Mar 08 00:26:25.668767 master-0 kubenswrapper[7479]: I0308 00:26:25.668749 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-node-exporter-tls\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn" Mar 08 00:26:25.669030 master-0 kubenswrapper[7479]: I0308 00:26:25.668783 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stxt7\" (UniqueName: \"kubernetes.io/projected/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65-kube-api-access-stxt7\") pod \"openshift-state-metrics-74cc79fd76-s9b9v\" (UID: \"5b9f4db1-3ba9-49a5-9a65-1d770ee59a65\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-s9b9v" Mar 08 00:26:25.669030 master-0 kubenswrapper[7479]: I0308 00:26:25.668812 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ae061e84-5e6a-415c-a735-fa14add7318a-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-68b88f8cb5-qjxhc\" (UID: \"ae061e84-5e6a-415c-a735-fa14add7318a\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc" Mar 08 00:26:25.669030 master-0 kubenswrapper[7479]: I0308 00:26:25.668842 7479 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/ae061e84-5e6a-415c-a735-fa14add7318a-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-68b88f8cb5-qjxhc\" (UID: \"ae061e84-5e6a-415c-a735-fa14add7318a\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc" Mar 08 00:26:25.669030 master-0 kubenswrapper[7479]: I0308 00:26:25.668877 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qznbf\" (UniqueName: \"kubernetes.io/projected/ae061e84-5e6a-415c-a735-fa14add7318a-kube-api-access-qznbf\") pod \"kube-state-metrics-68b88f8cb5-qjxhc\" (UID: \"ae061e84-5e6a-415c-a735-fa14add7318a\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc" Mar 08 00:26:25.669030 master-0 kubenswrapper[7479]: I0308 00:26:25.668897 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ae061e84-5e6a-415c-a735-fa14add7318a-metrics-client-ca\") pod \"kube-state-metrics-68b88f8cb5-qjxhc\" (UID: \"ae061e84-5e6a-415c-a735-fa14add7318a\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc" Mar 08 00:26:25.670066 master-0 kubenswrapper[7479]: I0308 00:26:25.669609 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-node-exporter-wtmp\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn" Mar 08 00:26:25.670066 master-0 kubenswrapper[7479]: I0308 00:26:25.669678 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-root\") pod \"node-exporter-bx9dn\" (UID: 
\"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn" Mar 08 00:26:25.670313 master-0 kubenswrapper[7479]: I0308 00:26:25.670280 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-sys\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn" Mar 08 00:26:25.670367 master-0 kubenswrapper[7479]: I0308 00:26:25.670316 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65-metrics-client-ca\") pod \"openshift-state-metrics-74cc79fd76-s9b9v\" (UID: \"5b9f4db1-3ba9-49a5-9a65-1d770ee59a65\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-s9b9v" Mar 08 00:26:25.671517 master-0 kubenswrapper[7479]: I0308 00:26:25.671486 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-node-exporter-textfile\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn" Mar 08 00:26:25.671851 master-0 kubenswrapper[7479]: I0308 00:26:25.671816 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-metrics-client-ca\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn" Mar 08 00:26:25.679625 master-0 kubenswrapper[7479]: I0308 00:26:25.673582 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-74cc79fd76-s9b9v\" (UID: \"5b9f4db1-3ba9-49a5-9a65-1d770ee59a65\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-s9b9v" Mar 08 00:26:25.679625 master-0 kubenswrapper[7479]: I0308 00:26:25.674048 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-node-exporter-tls\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn" Mar 08 00:26:25.679625 master-0 kubenswrapper[7479]: I0308 00:26:25.678893 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65-openshift-state-metrics-tls\") pod \"openshift-state-metrics-74cc79fd76-s9b9v\" (UID: \"5b9f4db1-3ba9-49a5-9a65-1d770ee59a65\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-s9b9v" Mar 08 00:26:25.722490 master-0 kubenswrapper[7479]: I0308 00:26:25.686586 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stxt7\" (UniqueName: \"kubernetes.io/projected/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65-kube-api-access-stxt7\") pod \"openshift-state-metrics-74cc79fd76-s9b9v\" (UID: \"5b9f4db1-3ba9-49a5-9a65-1d770ee59a65\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-s9b9v" Mar 08 00:26:25.722490 master-0 kubenswrapper[7479]: I0308 00:26:25.691269 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wllt8\" (UniqueName: \"kubernetes.io/projected/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-kube-api-access-wllt8\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn" Mar 08 00:26:25.723031 master-0 
kubenswrapper[7479]: I0308 00:26:25.722902 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn" Mar 08 00:26:25.767381 master-0 kubenswrapper[7479]: I0308 00:26:25.767269 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-s9b9v" Mar 08 00:26:25.773769 master-0 kubenswrapper[7479]: I0308 00:26:25.769716 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/ae061e84-5e6a-415c-a735-fa14add7318a-volume-directive-shadow\") pod \"kube-state-metrics-68b88f8cb5-qjxhc\" (UID: \"ae061e84-5e6a-415c-a735-fa14add7318a\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc" Mar 08 00:26:25.773769 master-0 kubenswrapper[7479]: I0308 00:26:25.769792 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ae061e84-5e6a-415c-a735-fa14add7318a-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-68b88f8cb5-qjxhc\" (UID: \"ae061e84-5e6a-415c-a735-fa14add7318a\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc" Mar 08 00:26:25.773769 master-0 kubenswrapper[7479]: I0308 00:26:25.769829 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/ae061e84-5e6a-415c-a735-fa14add7318a-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-68b88f8cb5-qjxhc\" (UID: \"ae061e84-5e6a-415c-a735-fa14add7318a\") " 
pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc" Mar 08 00:26:25.773769 master-0 kubenswrapper[7479]: I0308 00:26:25.769876 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qznbf\" (UniqueName: \"kubernetes.io/projected/ae061e84-5e6a-415c-a735-fa14add7318a-kube-api-access-qznbf\") pod \"kube-state-metrics-68b88f8cb5-qjxhc\" (UID: \"ae061e84-5e6a-415c-a735-fa14add7318a\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc" Mar 08 00:26:25.773769 master-0 kubenswrapper[7479]: I0308 00:26:25.770165 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/ae061e84-5e6a-415c-a735-fa14add7318a-volume-directive-shadow\") pod \"kube-state-metrics-68b88f8cb5-qjxhc\" (UID: \"ae061e84-5e6a-415c-a735-fa14add7318a\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc" Mar 08 00:26:25.773769 master-0 kubenswrapper[7479]: I0308 00:26:25.770318 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ae061e84-5e6a-415c-a735-fa14add7318a-metrics-client-ca\") pod \"kube-state-metrics-68b88f8cb5-qjxhc\" (UID: \"ae061e84-5e6a-415c-a735-fa14add7318a\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc" Mar 08 00:26:25.773769 master-0 kubenswrapper[7479]: I0308 00:26:25.770429 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ae061e84-5e6a-415c-a735-fa14add7318a-kube-state-metrics-tls\") pod \"kube-state-metrics-68b88f8cb5-qjxhc\" (UID: \"ae061e84-5e6a-415c-a735-fa14add7318a\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc" Mar 08 00:26:25.773769 master-0 kubenswrapper[7479]: I0308 00:26:25.771846 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/ae061e84-5e6a-415c-a735-fa14add7318a-metrics-client-ca\") pod \"kube-state-metrics-68b88f8cb5-qjxhc\" (UID: \"ae061e84-5e6a-415c-a735-fa14add7318a\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc" Mar 08 00:26:25.773769 master-0 kubenswrapper[7479]: I0308 00:26:25.771923 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/ae061e84-5e6a-415c-a735-fa14add7318a-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-68b88f8cb5-qjxhc\" (UID: \"ae061e84-5e6a-415c-a735-fa14add7318a\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc" Mar 08 00:26:25.775794 master-0 kubenswrapper[7479]: I0308 00:26:25.775064 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ae061e84-5e6a-415c-a735-fa14add7318a-kube-state-metrics-tls\") pod \"kube-state-metrics-68b88f8cb5-qjxhc\" (UID: \"ae061e84-5e6a-415c-a735-fa14add7318a\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc" Mar 08 00:26:25.777975 master-0 kubenswrapper[7479]: I0308 00:26:25.776810 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ae061e84-5e6a-415c-a735-fa14add7318a-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-68b88f8cb5-qjxhc\" (UID: \"ae061e84-5e6a-415c-a735-fa14add7318a\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc" Mar 08 00:26:25.797429 master-0 kubenswrapper[7479]: I0308 00:26:25.797375 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qznbf\" (UniqueName: \"kubernetes.io/projected/ae061e84-5e6a-415c-a735-fa14add7318a-kube-api-access-qznbf\") pod \"kube-state-metrics-68b88f8cb5-qjxhc\" (UID: \"ae061e84-5e6a-415c-a735-fa14add7318a\") " 
pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc" Mar 08 00:26:25.830170 master-0 kubenswrapper[7479]: I0308 00:26:25.830073 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-bx9dn" Mar 08 00:26:25.852356 master-0 kubenswrapper[7479]: W0308 00:26:25.850980 7479 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24ef1fb7_c8a1_4b50_b89f_2a81848ebb25.slice/crio-88364d0cec48d65744e1beec8c11b2e217cd014d5b9879cec4ffa6513fb0fe68 WatchSource:0}: Error finding container 88364d0cec48d65744e1beec8c11b2e217cd014d5b9879cec4ffa6513fb0fe68: Status 404 returned error can't find the container with id 88364d0cec48d65744e1beec8c11b2e217cd014d5b9879cec4ffa6513fb0fe68 Mar 08 00:26:25.855132 master-0 kubenswrapper[7479]: I0308 00:26:25.855110 7479 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 00:26:25.881286 master-0 kubenswrapper[7479]: I0308 00:26:25.881237 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc" Mar 08 00:26:26.163029 master-0 kubenswrapper[7479]: I0308 00:26:26.162391 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-74cc79fd76-s9b9v"] Mar 08 00:26:26.164168 master-0 kubenswrapper[7479]: W0308 00:26:26.164050 7479 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b9f4db1_3ba9_49a5_9a65_1d770ee59a65.slice/crio-55b01a8834cc0e66e80c4742dda9dcd76cc7d21fc646a73322aabbcb9e7a815d WatchSource:0}: Error finding container 55b01a8834cc0e66e80c4742dda9dcd76cc7d21fc646a73322aabbcb9e7a815d: Status 404 returned error can't find the container with id 55b01a8834cc0e66e80c4742dda9dcd76cc7d21fc646a73322aabbcb9e7a815d Mar 08 00:26:26.280101 master-0 kubenswrapper[7479]: I0308 00:26:26.279904 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc"] Mar 08 00:26:26.283311 master-0 kubenswrapper[7479]: W0308 00:26:26.283270 7479 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae061e84_5e6a_415c_a735_fa14add7318a.slice/crio-c6dfb6a757149a4059a400948a504adf47ce562d49ab223062b37eafa8275000 WatchSource:0}: Error finding container c6dfb6a757149a4059a400948a504adf47ce562d49ab223062b37eafa8275000: Status 404 returned error can't find the container with id c6dfb6a757149a4059a400948a504adf47ce562d49ab223062b37eafa8275000 Mar 08 00:26:26.367016 master-0 kubenswrapper[7479]: I0308 00:26:26.366667 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:26:26.367016 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld 
Mar 08 00:26:26.367016 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:26:26.367016 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:26:26.367016 master-0 kubenswrapper[7479]: I0308 00:26:26.366714 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:26:26.800938 master-0 kubenswrapper[7479]: I0308 00:26:26.800856 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc" event={"ID":"ae061e84-5e6a-415c-a735-fa14add7318a","Type":"ContainerStarted","Data":"c6dfb6a757149a4059a400948a504adf47ce562d49ab223062b37eafa8275000"} Mar 08 00:26:26.802054 master-0 kubenswrapper[7479]: I0308 00:26:26.802015 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bx9dn" event={"ID":"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25","Type":"ContainerStarted","Data":"88364d0cec48d65744e1beec8c11b2e217cd014d5b9879cec4ffa6513fb0fe68"} Mar 08 00:26:26.803888 master-0 kubenswrapper[7479]: I0308 00:26:26.803850 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-s9b9v" event={"ID":"5b9f4db1-3ba9-49a5-9a65-1d770ee59a65","Type":"ContainerStarted","Data":"348aea2a915fd68a226048223a20a87a7f16c78c005410713b0290068a8f6dc3"} Mar 08 00:26:26.803888 master-0 kubenswrapper[7479]: I0308 00:26:26.803883 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-s9b9v" event={"ID":"5b9f4db1-3ba9-49a5-9a65-1d770ee59a65","Type":"ContainerStarted","Data":"324a3f66919d93d357f8f2bce22ca197a2c40c573bb476ff1dafbf1389ca9177"} Mar 08 00:26:26.803969 master-0 kubenswrapper[7479]: I0308 00:26:26.803894 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-s9b9v" event={"ID":"5b9f4db1-3ba9-49a5-9a65-1d770ee59a65","Type":"ContainerStarted","Data":"55b01a8834cc0e66e80c4742dda9dcd76cc7d21fc646a73322aabbcb9e7a815d"} Mar 08 00:26:27.366503 master-0 kubenswrapper[7479]: I0308 00:26:27.366354 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:26:27.366503 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:26:27.366503 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:26:27.366503 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:26:27.366738 master-0 kubenswrapper[7479]: I0308 00:26:27.366466 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:26:27.809633 master-0 kubenswrapper[7479]: I0308 00:26:27.809568 7479 generic.go:334] "Generic (PLEG): container finished" podID="24ef1fb7-c8a1-4b50-b89f-2a81848ebb25" containerID="95c20172ebbb05524877a835e30132f4f70ded4813cb99373d344901a324181d" exitCode=0 Mar 08 00:26:27.809633 master-0 kubenswrapper[7479]: I0308 00:26:27.809626 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bx9dn" event={"ID":"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25","Type":"ContainerDied","Data":"95c20172ebbb05524877a835e30132f4f70ded4813cb99373d344901a324181d"} Mar 08 00:26:28.365569 master-0 kubenswrapper[7479]: I0308 00:26:28.365378 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" Mar 08 00:26:28.393890 master-0 kubenswrapper[7479]: I0308 00:26:28.393823 7479 
patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:26:28.393890 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:26:28.393890 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:26:28.393890 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:26:28.394331 master-0 kubenswrapper[7479]: I0308 00:26:28.393891 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:26:28.821998 master-0 kubenswrapper[7479]: I0308 00:26:28.821910 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bx9dn" event={"ID":"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25","Type":"ContainerStarted","Data":"32dcf127d578ad6c3485b23863e0464ac0748c6e4e51332f9bfa899ee478383c"} Mar 08 00:26:28.821998 master-0 kubenswrapper[7479]: I0308 00:26:28.821997 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bx9dn" event={"ID":"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25","Type":"ContainerStarted","Data":"7861ba3338916d9e9552052b5b66db2f7a34066b6d4805406b4ac88bb57796dc"} Mar 08 00:26:28.934413 master-0 kubenswrapper[7479]: I0308 00:26:28.934300 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-bx9dn" podStartSLOduration=2.830750987 podStartE2EDuration="3.934264903s" podCreationTimestamp="2026-03-08 00:26:25 +0000 UTC" firstStartedPulling="2026-03-08 00:26:25.855016752 +0000 UTC m=+302.167925669" lastFinishedPulling="2026-03-08 00:26:26.958530678 +0000 UTC m=+303.271439585" observedRunningTime="2026-03-08 00:26:28.929352606 +0000 
UTC m=+305.242261523" watchObservedRunningTime="2026-03-08 00:26:28.934264903 +0000 UTC m=+305.247173820" Mar 08 00:26:29.631585 master-0 kubenswrapper[7479]: I0308 00:26:29.631514 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:26:29.631585 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:26:29.631585 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:26:29.631585 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:26:29.632025 master-0 kubenswrapper[7479]: I0308 00:26:29.631628 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:26:30.367106 master-0 kubenswrapper[7479]: I0308 00:26:30.366946 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:26:30.367106 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:26:30.367106 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:26:30.367106 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:26:30.367106 master-0 kubenswrapper[7479]: I0308 00:26:30.367079 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:26:30.848272 master-0 kubenswrapper[7479]: I0308 00:26:30.848182 7479 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc" event={"ID":"ae061e84-5e6a-415c-a735-fa14add7318a","Type":"ContainerStarted","Data":"9fbf00cbaa1fd82a7fe4efbcd60b1cc35a5cc55ea94035411c2b6572009208c7"} Mar 08 00:26:30.848272 master-0 kubenswrapper[7479]: I0308 00:26:30.848252 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc" event={"ID":"ae061e84-5e6a-415c-a735-fa14add7318a","Type":"ContainerStarted","Data":"562107d3f93627171c40e2da601929ce58908bf598b7af4d1af0d420323bb2a7"} Mar 08 00:26:30.851266 master-0 kubenswrapper[7479]: I0308 00:26:30.851226 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-s9b9v" event={"ID":"5b9f4db1-3ba9-49a5-9a65-1d770ee59a65","Type":"ContainerStarted","Data":"117c49c3263ee766fe1829d23251703c5640786c8cddbb7c33f70514fe438945"} Mar 08 00:26:30.879315 master-0 kubenswrapper[7479]: I0308 00:26:30.877301 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-s9b9v" podStartSLOduration=1.819164666 podStartE2EDuration="5.877284402s" podCreationTimestamp="2026-03-08 00:26:25 +0000 UTC" firstStartedPulling="2026-03-08 00:26:26.381450682 +0000 UTC m=+302.694359599" lastFinishedPulling="2026-03-08 00:26:30.439570418 +0000 UTC m=+306.752479335" observedRunningTime="2026-03-08 00:26:30.874627681 +0000 UTC m=+307.187536588" watchObservedRunningTime="2026-03-08 00:26:30.877284402 +0000 UTC m=+307.190193319" Mar 08 00:26:31.010800 master-0 kubenswrapper[7479]: I0308 00:26:31.010731 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-6474759988-dnw4m"] Mar 08 00:26:31.012535 master-0 kubenswrapper[7479]: I0308 00:26:31.011473 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-6474759988-dnw4m" Mar 08 00:26:31.013375 master-0 kubenswrapper[7479]: I0308 00:26:31.013332 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Mar 08 00:26:31.014834 master-0 kubenswrapper[7479]: I0308 00:26:31.013995 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Mar 08 00:26:31.014834 master-0 kubenswrapper[7479]: I0308 00:26:31.014323 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Mar 08 00:26:31.014834 master-0 kubenswrapper[7479]: I0308 00:26:31.014570 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Mar 08 00:26:31.014834 master-0 kubenswrapper[7479]: I0308 00:26:31.014707 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-ffspe3f0nbfal" Mar 08 00:26:31.015584 master-0 kubenswrapper[7479]: I0308 00:26:31.015515 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-xsc4j" Mar 08 00:26:31.028096 master-0 kubenswrapper[7479]: I0308 00:26:31.028042 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6474759988-dnw4m"] Mar 08 00:26:31.146087 master-0 kubenswrapper[7479]: I0308 00:26:31.146002 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-secret-metrics-server-tls\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m" Mar 08 00:26:31.146087 master-0 kubenswrapper[7479]: I0308 00:26:31.146068 7479 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-client-ca-bundle\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m" Mar 08 00:26:31.146087 master-0 kubenswrapper[7479]: I0308 00:26:31.146118 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m" Mar 08 00:26:31.146699 master-0 kubenswrapper[7479]: I0308 00:26:31.146157 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-metrics-server-audit-profiles\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m" Mar 08 00:26:31.146699 master-0 kubenswrapper[7479]: I0308 00:26:31.146184 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b66xq\" (UniqueName: \"kubernetes.io/projected/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-kube-api-access-b66xq\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m" Mar 08 00:26:31.146699 master-0 kubenswrapper[7479]: I0308 00:26:31.146221 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-secret-metrics-client-certs\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m" Mar 08 00:26:31.146699 master-0 kubenswrapper[7479]: I0308 00:26:31.146244 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-audit-log\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m" Mar 08 00:26:31.247860 master-0 kubenswrapper[7479]: I0308 00:26:31.247674 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m" Mar 08 00:26:31.247860 master-0 kubenswrapper[7479]: I0308 00:26:31.247748 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-metrics-server-audit-profiles\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m" Mar 08 00:26:31.247860 master-0 kubenswrapper[7479]: I0308 00:26:31.247772 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b66xq\" (UniqueName: \"kubernetes.io/projected/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-kube-api-access-b66xq\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " 
pod="openshift-monitoring/metrics-server-6474759988-dnw4m" Mar 08 00:26:31.247860 master-0 kubenswrapper[7479]: I0308 00:26:31.247805 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-secret-metrics-client-certs\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m" Mar 08 00:26:31.247860 master-0 kubenswrapper[7479]: I0308 00:26:31.247833 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-audit-log\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m" Mar 08 00:26:31.247860 master-0 kubenswrapper[7479]: I0308 00:26:31.247861 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-secret-metrics-server-tls\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m" Mar 08 00:26:31.247860 master-0 kubenswrapper[7479]: I0308 00:26:31.247880 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-client-ca-bundle\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m" Mar 08 00:26:31.249896 master-0 kubenswrapper[7479]: I0308 00:26:31.249575 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: 
\"kubernetes.io/empty-dir/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-audit-log\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m"
Mar 08 00:26:31.250150 master-0 kubenswrapper[7479]: I0308 00:26:31.250007 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m"
Mar 08 00:26:31.250705 master-0 kubenswrapper[7479]: I0308 00:26:31.250672 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-metrics-server-audit-profiles\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m"
Mar 08 00:26:31.251243 master-0 kubenswrapper[7479]: I0308 00:26:31.251221 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-client-ca-bundle\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m"
Mar 08 00:26:31.253536 master-0 kubenswrapper[7479]: I0308 00:26:31.253499 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-secret-metrics-client-certs\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m"
Mar 08 00:26:31.253929 master-0 kubenswrapper[7479]: I0308 00:26:31.253892 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-secret-metrics-server-tls\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m"
Mar 08 00:26:31.268426 master-0 kubenswrapper[7479]: I0308 00:26:31.268371 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b66xq\" (UniqueName: \"kubernetes.io/projected/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-kube-api-access-b66xq\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m"
Mar 08 00:26:31.328722 master-0 kubenswrapper[7479]: I0308 00:26:31.328641 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6474759988-dnw4m"
Mar 08 00:26:31.368390 master-0 kubenswrapper[7479]: I0308 00:26:31.368325 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:26:31.368390 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:26:31.368390 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:26:31.368390 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:26:31.368997 master-0 kubenswrapper[7479]: I0308 00:26:31.368413 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:26:31.752074 master-0 kubenswrapper[7479]: I0308 00:26:31.752002 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6474759988-dnw4m"]
Mar 08 00:26:31.864577 master-0 kubenswrapper[7479]: I0308 00:26:31.864532 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc" event={"ID":"ae061e84-5e6a-415c-a735-fa14add7318a","Type":"ContainerStarted","Data":"259daa6bdeec002c66ab5644c463905cc1e9ced2ca36801084d0b2095f73b07b"}
Mar 08 00:26:31.867161 master-0 kubenswrapper[7479]: I0308 00:26:31.867090 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6474759988-dnw4m" event={"ID":"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb","Type":"ContainerStarted","Data":"e690a192a3d0aa0e87e9cbde66640402b6c73d23b93fc09f09a46f66f560f7c6"}
Mar 08 00:26:31.890413 master-0 kubenswrapper[7479]: I0308 00:26:31.890156 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc" podStartSLOduration=2.7368042299999997 podStartE2EDuration="6.890139018s" podCreationTimestamp="2026-03-08 00:26:25 +0000 UTC" firstStartedPulling="2026-03-08 00:26:26.285130157 +0000 UTC m=+302.598039084" lastFinishedPulling="2026-03-08 00:26:30.438464965 +0000 UTC m=+306.751373872" observedRunningTime="2026-03-08 00:26:31.887249494 +0000 UTC m=+308.200158451" watchObservedRunningTime="2026-03-08 00:26:31.890139018 +0000 UTC m=+308.203047935"
Mar 08 00:26:32.367467 master-0 kubenswrapper[7479]: I0308 00:26:32.367401 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:26:32.367467 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:26:32.367467 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:26:32.367467 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:26:32.367751 master-0 kubenswrapper[7479]: I0308 00:26:32.367481 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:26:33.367687 master-0 kubenswrapper[7479]: I0308 00:26:33.367610 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:26:33.367687 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:26:33.367687 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:26:33.367687 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:26:33.368316 master-0 kubenswrapper[7479]: I0308 00:26:33.367725 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:26:34.367910 master-0 kubenswrapper[7479]: I0308 00:26:34.367811 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:26:34.367910 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:26:34.367910 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:26:34.367910 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:26:34.368631 master-0 kubenswrapper[7479]: I0308 00:26:34.367940 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:26:34.890411 master-0 kubenswrapper[7479]: I0308 00:26:34.890320 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6474759988-dnw4m" event={"ID":"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb","Type":"ContainerStarted","Data":"d10ba8d248cc13e58fc18237bf3fc8704307376acdb97eeeff019b2173aa233c"}
Mar 08 00:26:34.915708 master-0 kubenswrapper[7479]: I0308 00:26:34.915620 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-6474759988-dnw4m" podStartSLOduration=2.858902406 podStartE2EDuration="4.915593389s" podCreationTimestamp="2026-03-08 00:26:30 +0000 UTC" firstStartedPulling="2026-03-08 00:26:31.760422407 +0000 UTC m=+308.073331334" lastFinishedPulling="2026-03-08 00:26:33.81711341 +0000 UTC m=+310.130022317" observedRunningTime="2026-03-08 00:26:34.914773479 +0000 UTC m=+311.227682386" watchObservedRunningTime="2026-03-08 00:26:34.915593389 +0000 UTC m=+311.228502346"
Mar 08 00:26:35.367277 master-0 kubenswrapper[7479]: I0308 00:26:35.367223 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:26:35.367277 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:26:35.367277 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:26:35.367277 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:26:35.367667 master-0 kubenswrapper[7479]: I0308 00:26:35.367638 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:26:36.367436 master-0 kubenswrapper[7479]: I0308 00:26:36.367345 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:26:36.367436 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:26:36.367436 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:26:36.367436 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:26:36.368544 master-0 kubenswrapper[7479]: I0308 00:26:36.367440 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:26:37.367534 master-0 kubenswrapper[7479]: I0308 00:26:37.367483 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:26:37.367534 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:26:37.367534 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:26:37.367534 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:26:37.367534 master-0 kubenswrapper[7479]: I0308 00:26:37.367535 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:26:38.372287 master-0 kubenswrapper[7479]: I0308 00:26:38.368361 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:26:38.372287 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:26:38.372287 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:26:38.372287 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:26:38.372287 master-0 kubenswrapper[7479]: I0308 00:26:38.368444 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:26:39.367804 master-0 kubenswrapper[7479]: I0308 00:26:39.367720 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:26:39.367804 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:26:39.367804 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:26:39.367804 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:26:39.368301 master-0 kubenswrapper[7479]: I0308 00:26:39.367825 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:26:40.367254 master-0 kubenswrapper[7479]: I0308 00:26:40.367177 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:26:40.367254 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:26:40.367254 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:26:40.367254 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:26:40.367816 master-0 kubenswrapper[7479]: I0308 00:26:40.367265 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:26:41.370451 master-0 kubenswrapper[7479]: I0308 00:26:41.368487 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:26:41.370451 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:26:41.370451 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:26:41.370451 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:26:41.370451 master-0 kubenswrapper[7479]: I0308 00:26:41.368578 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:26:42.367657 master-0 kubenswrapper[7479]: I0308 00:26:42.367552 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:26:42.367657 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:26:42.367657 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:26:42.367657 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:26:42.367657 master-0 kubenswrapper[7479]: I0308 00:26:42.367633 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:26:43.367184 master-0 kubenswrapper[7479]: I0308 00:26:43.367120 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:26:43.367184 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:26:43.367184 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:26:43.367184 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:26:43.367897 master-0 kubenswrapper[7479]: I0308 00:26:43.367185 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:26:44.367486 master-0 kubenswrapper[7479]: I0308 00:26:44.367409 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:26:44.367486 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:26:44.367486 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:26:44.367486 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:26:44.367486 master-0 kubenswrapper[7479]: I0308 00:26:44.367474 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:26:45.369056 master-0 kubenswrapper[7479]: I0308 00:26:45.368965 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:26:45.369056 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:26:45.369056 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:26:45.369056 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:26:45.369743 master-0 kubenswrapper[7479]: I0308 00:26:45.369066 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:26:46.367713 master-0 kubenswrapper[7479]: I0308 00:26:46.367624 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:26:46.367713 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:26:46.367713 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:26:46.367713 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:26:46.368058 master-0 kubenswrapper[7479]: I0308 00:26:46.367724 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:26:47.367864 master-0 kubenswrapper[7479]: I0308 00:26:47.367780 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:26:47.367864 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:26:47.367864 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:26:47.367864 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:26:47.368907 master-0 kubenswrapper[7479]: I0308 00:26:47.367871 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:26:48.367153 master-0 kubenswrapper[7479]: I0308 00:26:48.367084 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:26:48.367153 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:26:48.367153 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:26:48.367153 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:26:48.367153 master-0 kubenswrapper[7479]: I0308 00:26:48.367148 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:26:49.367581 master-0 kubenswrapper[7479]: I0308 00:26:49.367517 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:26:49.367581 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:26:49.367581 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:26:49.367581 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:26:49.368273 master-0 kubenswrapper[7479]: I0308 00:26:49.367593 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:26:50.369229 master-0 kubenswrapper[7479]: I0308 00:26:50.369074 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:26:50.369229 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:26:50.369229 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:26:50.369229 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:26:50.369229 master-0 kubenswrapper[7479]: I0308 00:26:50.369180 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:26:51.329399 master-0 kubenswrapper[7479]: I0308 00:26:51.329343 7479 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-6474759988-dnw4m"
Mar 08 00:26:51.329711 master-0 kubenswrapper[7479]: I0308 00:26:51.329693 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-6474759988-dnw4m"
Mar 08 00:26:51.368349 master-0 kubenswrapper[7479]: I0308 00:26:51.368285 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:26:51.368349 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:26:51.368349 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:26:51.368349 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:26:51.368634 master-0 kubenswrapper[7479]: I0308 00:26:51.368367 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:26:52.371305 master-0 kubenswrapper[7479]: I0308 00:26:52.371182 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:26:52.371305 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:26:52.371305 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:26:52.371305 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:26:52.372162 master-0 kubenswrapper[7479]: I0308 00:26:52.371344 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:26:53.368463 master-0 kubenswrapper[7479]: I0308 00:26:53.368357 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:26:53.368463 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:26:53.368463 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:26:53.368463 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:26:53.368463 master-0 kubenswrapper[7479]: I0308 00:26:53.368475 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:26:54.367734 master-0 kubenswrapper[7479]: I0308 00:26:54.367617 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:26:54.367734 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:26:54.367734 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:26:54.367734 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:26:54.367734 master-0 kubenswrapper[7479]: I0308 00:26:54.367688 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:26:55.370990 master-0 kubenswrapper[7479]: I0308 00:26:55.370928 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:26:55.370990 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:26:55.370990 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:26:55.370990 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:26:55.370990 master-0 kubenswrapper[7479]: I0308 00:26:55.371009 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:26:56.369027 master-0 kubenswrapper[7479]: I0308 00:26:56.368906 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:26:56.369027 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:26:56.369027 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:26:56.369027 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:26:56.369671 master-0 kubenswrapper[7479]: I0308 00:26:56.369060 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:26:57.370661 master-0 kubenswrapper[7479]: I0308 00:26:57.370565 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:26:57.370661 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:26:57.370661 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:26:57.370661 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:26:57.371860 master-0 kubenswrapper[7479]: I0308 00:26:57.370712 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:26:58.371367 master-0 kubenswrapper[7479]: I0308 00:26:58.371232 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:26:58.371367 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:26:58.371367 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:26:58.371367 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:26:58.372122 master-0 kubenswrapper[7479]: I0308 00:26:58.371433 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:26:59.367831 master-0 kubenswrapper[7479]: I0308 00:26:59.367718 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:26:59.367831 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:26:59.367831 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:26:59.367831 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:26:59.368652 master-0 kubenswrapper[7479]: I0308 00:26:59.367831 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:27:00.368501 master-0 kubenswrapper[7479]: I0308 00:27:00.368372 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:27:00.368501 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:27:00.368501 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:27:00.368501 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:27:00.369452 master-0 kubenswrapper[7479]: I0308 00:27:00.368518 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:27:01.368385 master-0 kubenswrapper[7479]: I0308 00:27:01.368292 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:27:01.368385 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:27:01.368385 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:27:01.368385 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:27:01.370002 master-0 kubenswrapper[7479]: I0308 00:27:01.369930 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:27:02.367811 master-0 kubenswrapper[7479]: I0308 00:27:02.367727 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:27:02.367811 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:27:02.367811 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:27:02.367811 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:27:02.367811 master-0 kubenswrapper[7479]: I0308 00:27:02.367812 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:27:03.368826 master-0 kubenswrapper[7479]: I0308 00:27:03.368659 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:27:03.368826 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:27:03.368826 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:27:03.368826 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:27:03.369997 master-0 kubenswrapper[7479]: I0308 00:27:03.368833 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:27:04.367416 master-0 kubenswrapper[7479]: I0308 00:27:04.367283 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:27:04.367416 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:27:04.367416 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:27:04.367416 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:27:04.368094 master-0 kubenswrapper[7479]: I0308 00:27:04.367432 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:27:05.368023 master-0 kubenswrapper[7479]: I0308 00:27:05.367954 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:27:05.368023 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:27:05.368023 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:27:05.368023 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:27:05.368023 master-0 kubenswrapper[7479]: I0308 00:27:05.368015 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:27:06.368312 master-0 kubenswrapper[7479]: I0308 00:27:06.368233 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:27:06.368312 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:27:06.368312 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:27:06.368312 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:27:06.368312 master-0 kubenswrapper[7479]: I0308 00:27:06.368295 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:27:07.368420 master-0 kubenswrapper[7479]: I0308 00:27:07.368330 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:27:07.368420 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:27:07.368420 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:27:07.368420 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:27:07.368420 master-0 kubenswrapper[7479]: I0308 00:27:07.368408 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:27:08.301811 master-0 kubenswrapper[7479]: I0308 00:27:08.301686 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 08 00:27:08.374680 master-0 kubenswrapper[7479]: I0308 00:27:08.370962 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:27:08.374680 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:27:08.374680 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:27:08.374680 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:27:08.374680 master-0 kubenswrapper[7479]: I0308 00:27:08.371068 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:27:09.369546 master-0 kubenswrapper[7479]: I0308 00:27:09.369453 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:27:09.369546 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:27:09.369546 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:27:09.369546 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:27:09.369842 master-0 kubenswrapper[7479]: I0308 00:27:09.369578 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:27:10.369557 master-0 kubenswrapper[7479]: I0308 00:27:10.369469 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:27:10.369557 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:27:10.369557 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:27:10.369557 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:27:10.369557 master-0 kubenswrapper[7479]: I0308 00:27:10.369549 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode:
500" Mar 08 00:27:11.337354 master-0 kubenswrapper[7479]: I0308 00:27:11.337257 7479 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-6474759988-dnw4m" Mar 08 00:27:11.342834 master-0 kubenswrapper[7479]: I0308 00:27:11.342770 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-6474759988-dnw4m" Mar 08 00:27:11.371157 master-0 kubenswrapper[7479]: I0308 00:27:11.371085 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:27:11.371157 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:27:11.371157 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:27:11.371157 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:27:11.372586 master-0 kubenswrapper[7479]: I0308 00:27:11.372525 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:27:12.367081 master-0 kubenswrapper[7479]: I0308 00:27:12.366995 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:27:12.367081 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:27:12.367081 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:27:12.367081 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:27:12.367914 master-0 kubenswrapper[7479]: I0308 00:27:12.367847 7479 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:27:13.368935 master-0 kubenswrapper[7479]: I0308 00:27:13.368857 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:27:13.368935 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:27:13.368935 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:27:13.368935 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:27:13.368935 master-0 kubenswrapper[7479]: I0308 00:27:13.368927 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:27:14.367694 master-0 kubenswrapper[7479]: I0308 00:27:14.367562 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:27:14.367694 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:27:14.367694 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:27:14.367694 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:27:14.368004 master-0 kubenswrapper[7479]: I0308 00:27:14.367705 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 
08 00:27:15.369092 master-0 kubenswrapper[7479]: I0308 00:27:15.368998 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:27:15.369092 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:27:15.369092 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:27:15.369092 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:27:15.369092 master-0 kubenswrapper[7479]: I0308 00:27:15.369081 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:27:16.367409 master-0 kubenswrapper[7479]: I0308 00:27:16.367321 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:27:16.367409 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:27:16.367409 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:27:16.367409 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:27:16.367767 master-0 kubenswrapper[7479]: I0308 00:27:16.367439 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:27:17.367451 master-0 kubenswrapper[7479]: I0308 00:27:17.367375 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure 
output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:27:17.367451 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:27:17.367451 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:27:17.367451 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:27:17.368194 master-0 kubenswrapper[7479]: I0308 00:27:17.367474 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:27:18.391975 master-0 kubenswrapper[7479]: I0308 00:27:18.391913 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:27:18.391975 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:27:18.391975 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:27:18.391975 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:27:18.392788 master-0 kubenswrapper[7479]: I0308 00:27:18.391983 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:27:19.367969 master-0 kubenswrapper[7479]: I0308 00:27:19.367798 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:27:19.367969 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:27:19.367969 
master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:27:19.367969 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:27:19.368828 master-0 kubenswrapper[7479]: I0308 00:27:19.367964 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:27:20.367234 master-0 kubenswrapper[7479]: I0308 00:27:20.367116 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:27:20.367234 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:27:20.367234 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:27:20.367234 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:27:20.367234 master-0 kubenswrapper[7479]: I0308 00:27:20.367178 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:27:21.367559 master-0 kubenswrapper[7479]: I0308 00:27:21.367491 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:27:21.367559 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:27:21.367559 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:27:21.367559 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:27:21.368161 master-0 kubenswrapper[7479]: I0308 00:27:21.367562 7479 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:27:22.367957 master-0 kubenswrapper[7479]: I0308 00:27:22.367880 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:27:22.367957 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:27:22.367957 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:27:22.367957 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:27:22.367957 master-0 kubenswrapper[7479]: I0308 00:27:22.367955 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:27:23.367947 master-0 kubenswrapper[7479]: I0308 00:27:23.367848 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:27:23.367947 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:27:23.367947 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:27:23.367947 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:27:23.369182 master-0 kubenswrapper[7479]: I0308 00:27:23.367935 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe 
failed with statuscode: 500" Mar 08 00:27:24.367834 master-0 kubenswrapper[7479]: I0308 00:27:24.367739 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:27:24.367834 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:27:24.367834 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:27:24.367834 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:27:24.368278 master-0 kubenswrapper[7479]: I0308 00:27:24.367847 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:27:25.367907 master-0 kubenswrapper[7479]: I0308 00:27:25.367815 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:27:25.367907 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:27:25.367907 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:27:25.367907 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:27:25.368901 master-0 kubenswrapper[7479]: I0308 00:27:25.367917 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:27:26.368660 master-0 kubenswrapper[7479]: I0308 00:27:26.368560 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:27:26.368660 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:27:26.368660 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:27:26.368660 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:27:26.368660 master-0 kubenswrapper[7479]: I0308 00:27:26.368649 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:27:27.368572 master-0 kubenswrapper[7479]: I0308 00:27:27.368465 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:27:27.368572 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:27:27.368572 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:27:27.368572 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:27:27.368572 master-0 kubenswrapper[7479]: I0308 00:27:27.368586 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:27:28.368808 master-0 kubenswrapper[7479]: I0308 00:27:28.368743 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:27:28.368808 master-0 kubenswrapper[7479]: 
[-]has-synced failed: reason withheld Mar 08 00:27:28.368808 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:27:28.368808 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:27:28.369451 master-0 kubenswrapper[7479]: I0308 00:27:28.368810 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:27:29.367892 master-0 kubenswrapper[7479]: I0308 00:27:29.367828 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:27:29.367892 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:27:29.367892 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:27:29.367892 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:27:29.368288 master-0 kubenswrapper[7479]: I0308 00:27:29.367916 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:27:30.367006 master-0 kubenswrapper[7479]: I0308 00:27:30.366958 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:27:30.367006 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:27:30.367006 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:27:30.367006 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:27:30.367638 master-0 
kubenswrapper[7479]: I0308 00:27:30.367030 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:27:31.367756 master-0 kubenswrapper[7479]: I0308 00:27:31.367693 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:27:31.367756 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:27:31.367756 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:27:31.367756 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:27:31.368752 master-0 kubenswrapper[7479]: I0308 00:27:31.367768 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:27:32.368023 master-0 kubenswrapper[7479]: I0308 00:27:32.367884 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:27:32.368023 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:27:32.368023 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:27:32.368023 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:27:32.368862 master-0 kubenswrapper[7479]: I0308 00:27:32.368060 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:27:33.367718 master-0 kubenswrapper[7479]: I0308 00:27:33.367643 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:27:33.367718 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:27:33.367718 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:27:33.367718 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:27:33.368016 master-0 kubenswrapper[7479]: I0308 00:27:33.367730 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:27:34.317146 master-0 kubenswrapper[7479]: I0308 00:27:34.317087 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-677db989d6-blw5x_4d0b9fbc-a1f8-4a98-99de-758734bd1a5b/ingress-operator/1.log" Mar 08 00:27:34.318662 master-0 kubenswrapper[7479]: I0308 00:27:34.318603 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-677db989d6-blw5x_4d0b9fbc-a1f8-4a98-99de-758734bd1a5b/ingress-operator/0.log" Mar 08 00:27:34.318734 master-0 kubenswrapper[7479]: I0308 00:27:34.318716 7479 generic.go:334] "Generic (PLEG): container finished" podID="4d0b9fbc-a1f8-4a98-99de-758734bd1a5b" containerID="7e2036023e6d81478f29926d6902ad0782c672516fb8dbd568498926e21d680b" exitCode=1 Mar 08 00:27:34.318786 master-0 kubenswrapper[7479]: I0308 00:27:34.318753 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x" 
event={"ID":"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b","Type":"ContainerDied","Data":"7e2036023e6d81478f29926d6902ad0782c672516fb8dbd568498926e21d680b"} Mar 08 00:27:34.318849 master-0 kubenswrapper[7479]: I0308 00:27:34.318812 7479 scope.go:117] "RemoveContainer" containerID="01f4711968edd90a03ce566521bccad3babf877143c30f69324972ce8a8bc2ae" Mar 08 00:27:34.321299 master-0 kubenswrapper[7479]: I0308 00:27:34.320675 7479 scope.go:117] "RemoveContainer" containerID="7e2036023e6d81478f29926d6902ad0782c672516fb8dbd568498926e21d680b" Mar 08 00:27:34.321299 master-0 kubenswrapper[7479]: E0308 00:27:34.321299 7479 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ingress-operator pod=ingress-operator-677db989d6-blw5x_openshift-ingress-operator(4d0b9fbc-a1f8-4a98-99de-758734bd1a5b)\"" pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x" podUID="4d0b9fbc-a1f8-4a98-99de-758734bd1a5b" Mar 08 00:27:34.366750 master-0 kubenswrapper[7479]: I0308 00:27:34.366708 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:27:34.366750 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:27:34.366750 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:27:34.366750 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:27:34.366750 master-0 kubenswrapper[7479]: I0308 00:27:34.366761 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:27:35.328474 master-0 kubenswrapper[7479]: I0308 00:27:35.328417 7479 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-677db989d6-blw5x_4d0b9fbc-a1f8-4a98-99de-758734bd1a5b/ingress-operator/1.log" Mar 08 00:27:35.366830 master-0 kubenswrapper[7479]: I0308 00:27:35.366778 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:27:35.366830 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:27:35.366830 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:27:35.366830 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:27:35.367190 master-0 kubenswrapper[7479]: I0308 00:27:35.366836 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:27:36.367905 master-0 kubenswrapper[7479]: I0308 00:27:36.367827 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:27:36.367905 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:27:36.367905 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:27:36.367905 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:27:36.367905 master-0 kubenswrapper[7479]: I0308 00:27:36.367901 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:27:37.367187 master-0 
kubenswrapper[7479]: I0308 00:27:37.367112 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:27:37.367187 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:27:37.367187 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:27:37.367187 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:27:37.367486 master-0 kubenswrapper[7479]: I0308 00:27:37.367225 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:27:38.366839 master-0 kubenswrapper[7479]: I0308 00:27:38.366755 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:27:38.366839 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:27:38.366839 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:27:38.366839 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:27:38.367553 master-0 kubenswrapper[7479]: I0308 00:27:38.366854 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:27:39.366956 master-0 kubenswrapper[7479]: I0308 00:27:39.366897 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed 
with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:27:39.366956 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:27:39.366956 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:27:39.366956 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:27:39.368266 master-0 kubenswrapper[7479]: I0308 00:27:39.368183 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:27:40.368287 master-0 kubenswrapper[7479]: I0308 00:27:40.368177 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:27:40.368287 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:27:40.368287 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:27:40.368287 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:27:40.368942 master-0 kubenswrapper[7479]: I0308 00:27:40.368329 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:27:41.368144 master-0 kubenswrapper[7479]: I0308 00:27:41.368096 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:27:41.368144 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:27:41.368144 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:27:41.368144 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:27:41.369156 master-0 kubenswrapper[7479]: I0308 00:27:41.368157 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:27:42.367971 master-0 kubenswrapper[7479]: I0308 00:27:42.367899 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:27:42.367971 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:27:42.367971 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:27:42.367971 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:27:42.368265 master-0 kubenswrapper[7479]: I0308 00:27:42.368002 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:27:43.366489 master-0 kubenswrapper[7479]: I0308 00:27:43.366373 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:27:43.366489 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:27:43.366489 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:27:43.366489 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:27:43.366489 master-0 kubenswrapper[7479]: I0308 00:27:43.366434 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:27:44.366807 master-0 kubenswrapper[7479]: I0308 00:27:44.366731 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:27:44.366807 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:27:44.366807 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:27:44.366807 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:27:44.367594 master-0 kubenswrapper[7479]: I0308 00:27:44.366835 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:27:45.369269 master-0 kubenswrapper[7479]: I0308 00:27:45.369132 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:27:45.369269 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:27:45.369269 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:27:45.369269 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:27:45.370378 master-0 kubenswrapper[7479]: I0308 00:27:45.369289 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:27:46.367925 master-0 kubenswrapper[7479]: I0308 00:27:46.367837 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:27:46.367925 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:27:46.367925 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:27:46.367925 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:27:46.368394 master-0 kubenswrapper[7479]: I0308 00:27:46.367933 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:27:47.366657 master-0 kubenswrapper[7479]: I0308 00:27:47.366615 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:27:47.366657 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:27:47.366657 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:27:47.366657 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:27:47.367393 master-0 kubenswrapper[7479]: I0308 00:27:47.366674 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:27:48.366941 master-0 kubenswrapper[7479]: I0308 00:27:48.366904 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:27:48.366941 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:27:48.366941 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:27:48.366941 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:27:48.367584 master-0 kubenswrapper[7479]: I0308 00:27:48.367560 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:27:48.885557 master-0 kubenswrapper[7479]: I0308 00:27:48.885476 7479 scope.go:117] "RemoveContainer" containerID="7e2036023e6d81478f29926d6902ad0782c672516fb8dbd568498926e21d680b"
Mar 08 00:27:49.366524 master-0 kubenswrapper[7479]: I0308 00:27:49.366457 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:27:49.366524 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:27:49.366524 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:27:49.366524 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:27:49.366836 master-0 kubenswrapper[7479]: I0308 00:27:49.366544 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:27:49.437389 master-0 kubenswrapper[7479]: I0308 00:27:49.437340 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-677db989d6-blw5x_4d0b9fbc-a1f8-4a98-99de-758734bd1a5b/ingress-operator/1.log"
Mar 08 00:27:49.438322 master-0 kubenswrapper[7479]: I0308 00:27:49.438279 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x" event={"ID":"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b","Type":"ContainerStarted","Data":"9d40712043dab52958ea0afce9459c44f1ac9aa0390229d73de4eebe33434e94"}
Mar 08 00:27:50.368325 master-0 kubenswrapper[7479]: I0308 00:27:50.368250 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:27:50.368325 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:27:50.368325 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:27:50.368325 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:27:50.369042 master-0 kubenswrapper[7479]: I0308 00:27:50.368993 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:27:51.366320 master-0 kubenswrapper[7479]: I0308 00:27:51.366274 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:27:51.366320 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:27:51.366320 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:27:51.366320 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:27:51.366882 master-0 kubenswrapper[7479]: I0308 00:27:51.366329 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:27:52.370937 master-0 kubenswrapper[7479]: I0308 00:27:52.370883 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:27:52.370937 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:27:52.370937 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:27:52.370937 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:27:52.370937 master-0 kubenswrapper[7479]: I0308 00:27:52.370934 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:27:53.366980 master-0 kubenswrapper[7479]: I0308 00:27:53.366915 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:27:53.366980 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:27:53.366980 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:27:53.366980 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:27:53.367529 master-0 kubenswrapper[7479]: I0308 00:27:53.367001 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:27:54.368314 master-0 kubenswrapper[7479]: I0308 00:27:54.368220 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:27:54.368314 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:27:54.368314 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:27:54.368314 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:27:54.369997 master-0 kubenswrapper[7479]: I0308 00:27:54.368345 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:27:55.367018 master-0 kubenswrapper[7479]: I0308 00:27:55.366951 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:27:55.367018 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:27:55.367018 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:27:55.367018 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:27:55.367018 master-0 kubenswrapper[7479]: I0308 00:27:55.367012 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:27:56.367784 master-0 kubenswrapper[7479]: I0308 00:27:56.367727 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:27:56.367784 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:27:56.367784 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:27:56.367784 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:27:56.368429 master-0 kubenswrapper[7479]: I0308 00:27:56.367833 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:27:57.366462 master-0 kubenswrapper[7479]: I0308 00:27:57.366384 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:27:57.366462 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:27:57.366462 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:27:57.366462 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:27:57.366462 master-0 kubenswrapper[7479]: I0308 00:27:57.366440 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:27:58.367929 master-0 kubenswrapper[7479]: I0308 00:27:58.367851 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:27:58.367929 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:27:58.367929 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:27:58.367929 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:27:58.369219 master-0 kubenswrapper[7479]: I0308 00:27:58.367947 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:27:59.367416 master-0 kubenswrapper[7479]: I0308 00:27:59.367339 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:27:59.367416 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:27:59.367416 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:27:59.367416 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:27:59.369049 master-0 kubenswrapper[7479]: I0308 00:27:59.367423 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:28:00.367278 master-0 kubenswrapper[7479]: I0308 00:28:00.367214 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:28:00.367278 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:28:00.367278 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:28:00.367278 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:28:00.367634 master-0 kubenswrapper[7479]: I0308 00:28:00.367290 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:28:01.366950 master-0 kubenswrapper[7479]: I0308 00:28:01.366887 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:28:01.366950 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:28:01.366950 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:28:01.366950 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:28:01.367667 master-0 kubenswrapper[7479]: I0308 00:28:01.367012 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:28:02.366903 master-0 kubenswrapper[7479]: I0308 00:28:02.366825 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:28:02.366903 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:28:02.366903 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:28:02.366903 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:28:02.367480 master-0 kubenswrapper[7479]: I0308 00:28:02.366923 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:28:02.724381 master-0 kubenswrapper[7479]: I0308 00:28:02.724272 7479 patch_prober.go:28] interesting pod/machine-config-daemon-k7pnc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 00:28:02.724381 master-0 kubenswrapper[7479]: I0308 00:28:02.724336 7479 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7pnc" podUID="ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 00:28:03.367221 master-0 kubenswrapper[7479]: I0308 00:28:03.367160 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:28:03.367221 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:28:03.367221 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:28:03.367221 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:28:03.367844 master-0 kubenswrapper[7479]: I0308 00:28:03.367239 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:28:04.367803 master-0 kubenswrapper[7479]: I0308 00:28:04.367739 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:28:04.367803 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:28:04.367803 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:28:04.367803 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:28:04.368420 master-0 kubenswrapper[7479]: I0308 00:28:04.367818 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:28:05.366708 master-0 kubenswrapper[7479]: I0308 00:28:05.366602 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:28:05.366708 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:28:05.366708 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:28:05.366708 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:28:05.367086 master-0 kubenswrapper[7479]: I0308 00:28:05.366743 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:28:06.368919 master-0 kubenswrapper[7479]: I0308 00:28:06.368861 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:28:06.368919 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:28:06.368919 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:28:06.368919 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:28:06.369454 master-0 kubenswrapper[7479]: I0308 00:28:06.368929 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:28:07.367237 master-0 kubenswrapper[7479]: I0308 00:28:07.367158 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:28:07.367237 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:28:07.367237 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:28:07.367237 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:28:07.367596 master-0 kubenswrapper[7479]: I0308 00:28:07.367282 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:28:08.367617 master-0 kubenswrapper[7479]: I0308 00:28:08.367561 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:28:08.367617 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:28:08.367617 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:28:08.367617 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:28:08.368217 master-0 kubenswrapper[7479]: I0308 00:28:08.367631 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:28:09.367867 master-0 kubenswrapper[7479]: I0308 00:28:09.367793 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:28:09.367867 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:28:09.367867 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:28:09.367867 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:28:09.368395 master-0 kubenswrapper[7479]: I0308 00:28:09.367907 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:28:10.366693 master-0 kubenswrapper[7479]: I0308 00:28:10.366582 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:28:10.366693 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:28:10.366693 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:28:10.366693 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:28:10.367090 master-0 kubenswrapper[7479]: I0308 00:28:10.366697 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:28:11.367258 master-0 kubenswrapper[7479]: I0308 00:28:11.367188 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:28:11.367258 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:28:11.367258 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:28:11.367258 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:28:11.367974 master-0 kubenswrapper[7479]: I0308 00:28:11.367283 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:28:12.367050 master-0 kubenswrapper[7479]: I0308 00:28:12.367000 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:28:12.367050 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:28:12.367050 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:28:12.367050 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:28:12.367839 master-0 kubenswrapper[7479]: I0308 00:28:12.367807 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:28:13.367124 master-0 kubenswrapper[7479]: I0308 00:28:13.367027 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:28:13.367124 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:28:13.367124 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:28:13.367124 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:28:13.368519 master-0 kubenswrapper[7479]: I0308 00:28:13.368461 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:28:14.367720 master-0 kubenswrapper[7479]: I0308 00:28:14.367647 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:28:14.367720 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:28:14.367720 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:28:14.367720 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:28:14.368275 master-0 kubenswrapper[7479]: I0308 00:28:14.367757 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:28:15.367707 master-0 kubenswrapper[7479]: I0308 00:28:15.367603 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:28:15.367707 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:28:15.367707 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:28:15.367707 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:28:15.368309 master-0 kubenswrapper[7479]: I0308 00:28:15.367749 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:28:16.367773 master-0 kubenswrapper[7479]: I0308 00:28:16.367673 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:28:16.367773 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:28:16.367773 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:28:16.367773 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:28:16.368349 master-0 kubenswrapper[7479]: I0308 00:28:16.367812 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:28:17.367632 master-0 kubenswrapper[7479]: I0308 00:28:17.367565 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:28:17.367632 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:28:17.367632 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:28:17.367632 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:28:17.368400 master-0 kubenswrapper[7479]: I0308 00:28:17.367637 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:28:18.367536 master-0 kubenswrapper[7479]: I0308 00:28:18.367459 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:28:18.367536 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:28:18.367536 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:28:18.367536 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:28:18.367862 master-0 kubenswrapper[7479]: I0308 00:28:18.367553 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:28:18.367862 master-0 kubenswrapper[7479]: I0308 00:28:18.367614 7479 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv"
Mar 08 00:28:18.368196 master-0 kubenswrapper[7479]: I0308 00:28:18.368154 7479 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="router" containerStatusID={"Type":"cri-o","ID":"915f71c7c1f314a02b658e63c673b9b34d83af6828634db211d8fa70c691f01b"} pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" containerMessage="Container router failed startup probe, will be restarted"
Mar 08 00:28:18.368592 master-0 kubenswrapper[7479]: I0308 00:28:18.368262 7479 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" containerID="cri-o://915f71c7c1f314a02b658e63c673b9b34d83af6828634db211d8fa70c691f01b" gracePeriod=3600
Mar 08 00:29:04.551262 master-0 kubenswrapper[7479]: E0308 00:29:04.551173 7479 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6aa7c49e_2a6e_4a4c_aa1e_e912eedd81c6.slice/crio-915f71c7c1f314a02b658e63c673b9b34d83af6828634db211d8fa70c691f01b.scope\": RecentStats: unable to find data in memory cache]"
Mar 08 00:29:05.219443 master-0 kubenswrapper[7479]: I0308 00:29:05.219250 7479 generic.go:334] "Generic (PLEG): container finished" podID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerID="915f71c7c1f314a02b658e63c673b9b34d83af6828634db211d8fa70c691f01b" exitCode=0
Mar 08 00:29:05.219443 master-0 kubenswrapper[7479]: I0308 00:29:05.219341 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" event={"ID":"6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6","Type":"ContainerDied","Data":"915f71c7c1f314a02b658e63c673b9b34d83af6828634db211d8fa70c691f01b"}
Mar 08 00:29:05.219443 master-0 kubenswrapper[7479]: I0308 00:29:05.219425 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" event={"ID":"6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6","Type":"ContainerStarted","Data":"1fdc0977a8b34be93d33d2377b4810454b6ad9c4cfeec0c8fce160478572354d"}
Mar 08 00:29:05.365776 master-0 kubenswrapper[7479]: I0308 00:29:05.365652 7479 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv"
Mar 08 00:29:05.369503 master-0 kubenswrapper[7479]: I0308 00:29:05.369440 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with
statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:29:05.369503 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:29:05.369503 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:29:05.369503 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:29:05.369904 master-0 kubenswrapper[7479]: I0308 00:29:05.369518 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:29:06.368884 master-0 kubenswrapper[7479]: I0308 00:29:06.368773 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:29:06.368884 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:29:06.368884 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:29:06.368884 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:29:06.368884 master-0 kubenswrapper[7479]: I0308 00:29:06.368875 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:29:07.368100 master-0 kubenswrapper[7479]: I0308 00:29:07.368002 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:29:07.368100 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:29:07.368100 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:29:07.368100 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:29:07.368639 master-0 kubenswrapper[7479]: I0308 00:29:07.368110 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:29:08.364888 master-0 kubenswrapper[7479]: I0308 00:29:08.364782 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv"
Mar 08 00:29:08.369844 master-0 kubenswrapper[7479]: I0308 00:29:08.369752 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:29:08.369844 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:29:08.369844 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:29:08.369844 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:29:08.370228 master-0 kubenswrapper[7479]: I0308 00:29:08.369869 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:29:09.368755 master-0 kubenswrapper[7479]: I0308 00:29:09.368644 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:29:09.368755 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:29:09.368755 master-0 kubenswrapper[7479]: [+]process-running
ok Mar 08 00:29:09.368755 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:29:09.369578 master-0 kubenswrapper[7479]: I0308 00:29:09.368794 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:29:10.369777 master-0 kubenswrapper[7479]: I0308 00:29:10.369672 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:29:10.369777 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:29:10.369777 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:29:10.369777 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:29:10.371288 master-0 kubenswrapper[7479]: I0308 00:29:10.369799 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:29:11.368315 master-0 kubenswrapper[7479]: I0308 00:29:11.368240 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:29:11.368315 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:29:11.368315 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:29:11.368315 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:29:11.368670 master-0 kubenswrapper[7479]: I0308 00:29:11.368342 7479 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:29:12.369099 master-0 kubenswrapper[7479]: I0308 00:29:12.368982 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:29:12.369099 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:29:12.369099 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:29:12.369099 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:29:12.370285 master-0 kubenswrapper[7479]: I0308 00:29:12.369140 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:29:13.369986 master-0 kubenswrapper[7479]: I0308 00:29:13.369814 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:29:13.369986 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:29:13.369986 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:29:13.369986 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:29:13.369986 master-0 kubenswrapper[7479]: I0308 00:29:13.369908 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:29:14.368002 
master-0 kubenswrapper[7479]: I0308 00:29:14.367932 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:29:14.368002 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:29:14.368002 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:29:14.368002 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:29:14.368516 master-0 kubenswrapper[7479]: I0308 00:29:14.368013 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:29:15.368570 master-0 kubenswrapper[7479]: I0308 00:29:15.368466 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:29:15.368570 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:29:15.368570 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:29:15.368570 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:29:15.369674 master-0 kubenswrapper[7479]: I0308 00:29:15.368587 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:29:16.368922 master-0 kubenswrapper[7479]: I0308 00:29:16.368816 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe 
failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:29:16.368922 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:29:16.368922 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:29:16.368922 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:29:16.370423 master-0 kubenswrapper[7479]: I0308 00:29:16.368918 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:29:17.367898 master-0 kubenswrapper[7479]: I0308 00:29:17.367817 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:29:17.367898 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:29:17.367898 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:29:17.367898 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:29:17.368298 master-0 kubenswrapper[7479]: I0308 00:29:17.367912 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:29:18.371145 master-0 kubenswrapper[7479]: I0308 00:29:18.371073 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:29:18.371145 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:29:18.371145 master-0 
kubenswrapper[7479]: [+]process-running ok Mar 08 00:29:18.371145 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:29:18.371826 master-0 kubenswrapper[7479]: I0308 00:29:18.371186 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:29:19.369102 master-0 kubenswrapper[7479]: I0308 00:29:19.368988 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:29:19.369102 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:29:19.369102 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:29:19.369102 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:29:19.369697 master-0 kubenswrapper[7479]: I0308 00:29:19.369111 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:29:20.368472 master-0 kubenswrapper[7479]: I0308 00:29:20.368331 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:29:20.368472 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:29:20.368472 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:29:20.368472 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:29:20.368472 master-0 kubenswrapper[7479]: I0308 00:29:20.368429 7479 prober.go:107] "Probe 
failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:29:21.374309 master-0 kubenswrapper[7479]: I0308 00:29:21.368640 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:29:21.374309 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:29:21.374309 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:29:21.374309 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:29:21.374309 master-0 kubenswrapper[7479]: I0308 00:29:21.368748 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:29:22.371409 master-0 kubenswrapper[7479]: I0308 00:29:22.369366 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:29:22.371409 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:29:22.371409 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:29:22.371409 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:29:22.371409 master-0 kubenswrapper[7479]: I0308 00:29:22.369457 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 
500"
Mar 08 00:29:23.367897 master-0 kubenswrapper[7479]: I0308 00:29:23.367776 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:29:23.367897 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:29:23.367897 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:29:23.367897 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:29:23.367897 master-0 kubenswrapper[7479]: I0308 00:29:23.367877 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:29:24.370113 master-0 kubenswrapper[7479]: I0308 00:29:24.370045 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:29:24.370113 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:29:24.370113 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:29:24.370113 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:29:24.370113 master-0 kubenswrapper[7479]: I0308 00:29:24.370123 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:29:24.992543 master-0 kubenswrapper[7479]: I0308 00:29:24.992447 7479 scope.go:117] "RemoveContainer" containerID="0fe11e31bc3fff8b9610286a4d61bcdc774b24a696a35e7bd68af0798051cd1f"
Mar 08 00:29:25.366994
master-0 kubenswrapper[7479]: I0308 00:29:25.366896 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:29:25.366994 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:29:25.366994 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:29:25.366994 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:29:25.367469 master-0 kubenswrapper[7479]: I0308 00:29:25.367034 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:29:26.369267 master-0 kubenswrapper[7479]: I0308 00:29:26.369185 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:29:26.369267 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:29:26.369267 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:29:26.369267 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:29:26.370116 master-0 kubenswrapper[7479]: I0308 00:29:26.370079 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:29:27.367878 master-0 kubenswrapper[7479]: I0308 00:29:27.367779 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe 
failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:29:27.367878 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:29:27.367878 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:29:27.367878 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:29:27.368494 master-0 kubenswrapper[7479]: I0308 00:29:27.367899 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:29:28.370065 master-0 kubenswrapper[7479]: I0308 00:29:28.369964 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:29:28.370065 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:29:28.370065 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:29:28.370065 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:29:28.372011 master-0 kubenswrapper[7479]: I0308 00:29:28.370076 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:29:29.368454 master-0 kubenswrapper[7479]: I0308 00:29:29.368379 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:29:29.368454 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:29:29.368454 master-0 
kubenswrapper[7479]: [+]process-running ok Mar 08 00:29:29.368454 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:29:29.368454 master-0 kubenswrapper[7479]: I0308 00:29:29.368455 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:29:30.372175 master-0 kubenswrapper[7479]: I0308 00:29:30.372079 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:29:30.372175 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:29:30.372175 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:29:30.372175 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:29:30.373545 master-0 kubenswrapper[7479]: I0308 00:29:30.372238 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:29:31.369127 master-0 kubenswrapper[7479]: I0308 00:29:31.369045 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:29:31.369127 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:29:31.369127 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:29:31.369127 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:29:31.369673 master-0 kubenswrapper[7479]: I0308 00:29:31.369162 7479 prober.go:107] "Probe 
failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:29:32.368304 master-0 kubenswrapper[7479]: I0308 00:29:32.368173 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:29:32.368304 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:29:32.368304 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:29:32.368304 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:29:32.369090 master-0 kubenswrapper[7479]: I0308 00:29:32.368324 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:29:33.368532 master-0 kubenswrapper[7479]: I0308 00:29:33.368433 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:29:33.368532 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:29:33.368532 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:29:33.368532 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:29:33.369271 master-0 kubenswrapper[7479]: I0308 00:29:33.368572 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 
500"
Mar 08 00:29:34.369404 master-0 kubenswrapper[7479]: I0308 00:29:34.369305 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:29:34.369404 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:29:34.369404 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:29:34.369404 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:29:34.370595 master-0 kubenswrapper[7479]: I0308 00:29:34.369426 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:29:35.368922 master-0 kubenswrapper[7479]: I0308 00:29:35.368774 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:29:35.368922 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:29:35.368922 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:29:35.368922 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:29:35.368922 master-0 kubenswrapper[7479]: I0308 00:29:35.368911 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:29:36.369072 master-0 kubenswrapper[7479]: I0308 00:29:36.368968 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:29:36.369072 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:29:36.369072 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:29:36.369072 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:29:36.370265 master-0 kubenswrapper[7479]: I0308 00:29:36.369089 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:29:37.367402 master-0 kubenswrapper[7479]: I0308 00:29:37.367293 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:29:37.367402 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:29:37.367402 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:29:37.367402 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:29:37.367742 master-0 kubenswrapper[7479]: I0308 00:29:37.367463 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:29:38.368311 master-0 kubenswrapper[7479]: I0308 00:29:38.368258 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:29:38.368311 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:29:38.368311 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:29:38.368311 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:29:38.368903 master-0 kubenswrapper[7479]: I0308 00:29:38.368341 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:29:39.368974 master-0 kubenswrapper[7479]: I0308 00:29:39.368824 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:29:39.368974 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:29:39.368974 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:29:39.368974 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:29:39.368974 master-0 kubenswrapper[7479]: I0308 00:29:39.368939 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:29:40.368074 master-0 kubenswrapper[7479]: I0308 00:29:40.368029 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:29:40.368074 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:29:40.368074 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:29:40.368074 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:29:40.368502 master-0 kubenswrapper[7479]: I0308 00:29:40.368474 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:29:41.369043 master-0 kubenswrapper[7479]: I0308 00:29:41.368931 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:29:41.369043 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:29:41.369043 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:29:41.369043 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:29:41.369967 master-0 kubenswrapper[7479]: I0308 00:29:41.369091 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:29:42.367062 master-0 kubenswrapper[7479]: I0308 00:29:42.366988 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:29:42.367062 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:29:42.367062 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:29:42.367062 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:29:42.367452 master-0 kubenswrapper[7479]: I0308 00:29:42.367093 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:29:43.368785 master-0 kubenswrapper[7479]: I0308 00:29:43.368701 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:29:43.368785 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:29:43.368785 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:29:43.368785 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:29:43.368785 master-0 kubenswrapper[7479]: I0308 00:29:43.368791 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:29:44.368094 master-0 kubenswrapper[7479]: I0308 00:29:44.368022 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:29:44.368094 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:29:44.368094 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:29:44.368094 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:29:44.368498 master-0 kubenswrapper[7479]: I0308 00:29:44.368103 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:29:45.368372 master-0 kubenswrapper[7479]: I0308 00:29:45.368274 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:29:45.368372 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:29:45.368372 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:29:45.368372 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:29:45.369796 master-0 kubenswrapper[7479]: I0308 00:29:45.368403 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:29:46.369100 master-0 kubenswrapper[7479]: I0308 00:29:46.369026 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:29:46.369100 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:29:46.369100 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:29:46.369100 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:29:46.369100 master-0 kubenswrapper[7479]: I0308 00:29:46.369107 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:29:47.368510 master-0 kubenswrapper[7479]: I0308 00:29:47.368432 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:29:47.368510 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:29:47.368510 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:29:47.368510 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:29:47.369135 master-0 kubenswrapper[7479]: I0308 00:29:47.369089 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:29:48.368641 master-0 kubenswrapper[7479]: I0308 00:29:48.368572 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:29:48.368641 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:29:48.368641 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:29:48.368641 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:29:48.369100 master-0 kubenswrapper[7479]: I0308 00:29:48.368666 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:29:49.366906 master-0 kubenswrapper[7479]: I0308 00:29:49.366817 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:29:49.366906 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:29:49.366906 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:29:49.366906 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:29:49.367814 master-0 kubenswrapper[7479]: I0308 00:29:49.366916 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:29:50.368189 master-0 kubenswrapper[7479]: I0308 00:29:50.368020 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:29:50.368189 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:29:50.368189 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:29:50.368189 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:29:50.369444 master-0 kubenswrapper[7479]: I0308 00:29:50.368246 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:29:50.586103 master-0 kubenswrapper[7479]: I0308 00:29:50.586000 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-677db989d6-blw5x_4d0b9fbc-a1f8-4a98-99de-758734bd1a5b/ingress-operator/2.log"
Mar 08 00:29:50.586707 master-0 kubenswrapper[7479]: I0308 00:29:50.586646 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-677db989d6-blw5x_4d0b9fbc-a1f8-4a98-99de-758734bd1a5b/ingress-operator/1.log"
Mar 08 00:29:50.587244 master-0 kubenswrapper[7479]: I0308 00:29:50.587131 7479 generic.go:334] "Generic (PLEG): container finished" podID="4d0b9fbc-a1f8-4a98-99de-758734bd1a5b" containerID="9d40712043dab52958ea0afce9459c44f1ac9aa0390229d73de4eebe33434e94" exitCode=1
Mar 08 00:29:50.587393 master-0 kubenswrapper[7479]: I0308 00:29:50.587237 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x" event={"ID":"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b","Type":"ContainerDied","Data":"9d40712043dab52958ea0afce9459c44f1ac9aa0390229d73de4eebe33434e94"}
Mar 08 00:29:50.587393 master-0 kubenswrapper[7479]: I0308 00:29:50.587359 7479 scope.go:117] "RemoveContainer" containerID="7e2036023e6d81478f29926d6902ad0782c672516fb8dbd568498926e21d680b"
Mar 08 00:29:50.588289 master-0 kubenswrapper[7479]: I0308 00:29:50.588179 7479 scope.go:117] "RemoveContainer" containerID="9d40712043dab52958ea0afce9459c44f1ac9aa0390229d73de4eebe33434e94"
Mar 08 00:29:50.588604 master-0 kubenswrapper[7479]: E0308 00:29:50.588537 7479 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ingress-operator pod=ingress-operator-677db989d6-blw5x_openshift-ingress-operator(4d0b9fbc-a1f8-4a98-99de-758734bd1a5b)\"" pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x" podUID="4d0b9fbc-a1f8-4a98-99de-758734bd1a5b"
Mar 08 00:29:51.367669 master-0 kubenswrapper[7479]: I0308 00:29:51.367536 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:29:51.367669 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:29:51.367669 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:29:51.367669 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:29:51.368283 master-0 kubenswrapper[7479]: I0308 00:29:51.367681 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:29:51.594428 master-0 kubenswrapper[7479]: I0308 00:29:51.594355 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-677db989d6-blw5x_4d0b9fbc-a1f8-4a98-99de-758734bd1a5b/ingress-operator/2.log"
Mar 08 00:29:52.367573 master-0 kubenswrapper[7479]: I0308 00:29:52.367490 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:29:52.367573 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:29:52.367573 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:29:52.367573 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:29:52.367973 master-0 kubenswrapper[7479]: I0308 00:29:52.367580 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:29:53.367311 master-0 kubenswrapper[7479]: I0308 00:29:53.367254 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:29:53.367311 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:29:53.367311 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:29:53.367311 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:29:53.367963 master-0 kubenswrapper[7479]: I0308 00:29:53.367331 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:29:54.367674 master-0 kubenswrapper[7479]: I0308 00:29:54.367569 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:29:54.367674 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:29:54.367674 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:29:54.367674 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:29:54.367674 master-0 kubenswrapper[7479]: I0308 00:29:54.367673 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:29:55.368705 master-0 kubenswrapper[7479]: I0308 00:29:55.368590 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:29:55.368705 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:29:55.368705 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:29:55.368705 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:29:55.369439 master-0 kubenswrapper[7479]: I0308 00:29:55.368732 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:29:56.367302 master-0 kubenswrapper[7479]: I0308 00:29:56.367223 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:29:56.367302 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:29:56.367302 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:29:56.367302 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:29:56.367620 master-0 kubenswrapper[7479]: I0308 00:29:56.367305 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:29:57.367325 master-0 kubenswrapper[7479]: I0308 00:29:57.367257 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:29:57.367325 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:29:57.367325 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:29:57.367325 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:29:57.368054 master-0 kubenswrapper[7479]: I0308 00:29:57.367345 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:29:58.367918 master-0 kubenswrapper[7479]: I0308 00:29:58.367833 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:29:58.367918 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:29:58.367918 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:29:58.367918 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:29:58.369169 master-0 kubenswrapper[7479]: I0308 00:29:58.369117 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:29:59.368314 master-0 kubenswrapper[7479]: I0308 00:29:59.368224 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:29:59.368314 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:29:59.368314 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:29:59.368314 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:29:59.369454 master-0 kubenswrapper[7479]: I0308 00:29:59.368350 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:30:00.366641 master-0 kubenswrapper[7479]: I0308 00:30:00.366582 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:30:00.366641 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:30:00.366641 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:30:00.366641 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:30:00.366986 master-0 kubenswrapper[7479]: I0308 00:30:00.366649 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:30:01.366316 master-0 kubenswrapper[7479]: I0308 00:30:01.366194 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:30:01.366316 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:30:01.366316 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:30:01.366316 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:30:01.366316 master-0 kubenswrapper[7479]: I0308 00:30:01.366326 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:30:02.368831 master-0 kubenswrapper[7479]: I0308 00:30:02.368408 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:30:02.368831 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:30:02.368831 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:30:02.368831 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:30:02.368831 master-0 kubenswrapper[7479]: I0308 00:30:02.368519 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:30:03.367784 master-0 kubenswrapper[7479]: I0308 00:30:03.367707 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:30:03.367784 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:30:03.367784 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:30:03.367784 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:30:03.368319 master-0 kubenswrapper[7479]: I0308 00:30:03.367820 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:30:04.367566 master-0 kubenswrapper[7479]: I0308 00:30:04.367467 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:30:04.367566 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:30:04.367566 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:30:04.367566 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:30:04.367566 master-0 kubenswrapper[7479]: I0308 00:30:04.367563 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:30:04.885561 master-0 kubenswrapper[7479]: I0308 00:30:04.885472 7479 scope.go:117] "RemoveContainer" containerID="9d40712043dab52958ea0afce9459c44f1ac9aa0390229d73de4eebe33434e94"
Mar 08 00:30:04.885912 master-0 kubenswrapper[7479]: E0308 00:30:04.885742 7479 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ingress-operator pod=ingress-operator-677db989d6-blw5x_openshift-ingress-operator(4d0b9fbc-a1f8-4a98-99de-758734bd1a5b)\"" pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x" podUID="4d0b9fbc-a1f8-4a98-99de-758734bd1a5b"
Mar 08 00:30:05.367248 master-0 kubenswrapper[7479]: I0308 00:30:05.367131 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:30:05.367248 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:30:05.367248 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:30:05.367248 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:30:05.367248 master-0 kubenswrapper[7479]: I0308 00:30:05.367230 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:30:06.367126 master-0 kubenswrapper[7479]: I0308 00:30:06.367013 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:30:06.367126 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:30:06.367126 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:30:06.367126 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:30:06.368241 master-0 kubenswrapper[7479]: I0308 00:30:06.367144 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:30:07.367753 master-0 kubenswrapper[7479]: I0308 00:30:07.367659 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:30:07.367753 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:30:07.367753 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:30:07.367753 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:30:07.368915 master-0 kubenswrapper[7479]: I0308 00:30:07.367787 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:30:08.367792 master-0 kubenswrapper[7479]: I0308 00:30:08.367720 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:30:08.367792 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:30:08.367792 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:30:08.367792 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:30:08.368419 master-0 kubenswrapper[7479]: I0308 00:30:08.367816 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:30:09.368192 master-0 kubenswrapper[7479]: I0308 00:30:09.368074 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:30:09.368192 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:30:09.368192 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:30:09.368192 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:30:09.369622 master-0 kubenswrapper[7479]: I0308 00:30:09.368276 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:30:10.368401 master-0 kubenswrapper[7479]: I0308 00:30:10.368151 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:30:10.368401 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:30:10.368401 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:30:10.368401 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:30:10.369062 master-0 kubenswrapper[7479]: I0308 00:30:10.368463 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:30:11.368745 master-0 kubenswrapper[7479]: I0308 00:30:11.368641 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:30:11.368745 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:30:11.368745 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:30:11.368745 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:30:11.370048 master-0 kubenswrapper[7479]: I0308 00:30:11.368764 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:30:12.368853 master-0 kubenswrapper[7479]: I0308 00:30:12.368737 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:30:12.368853 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:30:12.368853 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:30:12.368853 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:30:12.369991 master-0 kubenswrapper[7479]: I0308 00:30:12.368887 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:30:13.367734 master-0 kubenswrapper[7479]: I0308 00:30:13.367595 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:30:13.367734 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:30:13.367734 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:30:13.367734 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:30:13.368379 master-0 kubenswrapper[7479]: I0308 00:30:13.367737 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:30:14.370308 master-0 kubenswrapper[7479]: I0308 00:30:14.370228 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:30:14.370308 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:30:14.370308 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:30:14.370308 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:30:14.370986 master-0 kubenswrapper[7479]: I0308 00:30:14.370347 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:30:15.369261 master-0 kubenswrapper[7479]: I0308 00:30:15.369129 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:30:15.369261 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:30:15.369261 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:30:15.369261 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:30:15.369971 master-0 kubenswrapper[7479]: I0308 00:30:15.369338 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:30:16.368940 master-0 kubenswrapper[7479]: I0308 00:30:16.368796 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:30:16.368940 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:30:16.368940 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:30:16.368940 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:30:16.369883 master-0 kubenswrapper[7479]: I0308 00:30:16.369007 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:30:17.367165 master-0 kubenswrapper[7479]: I0308 00:30:17.367047 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:30:17.367165 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:30:17.367165 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:30:17.367165 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:30:17.367533 master-0 kubenswrapper[7479]: I0308 00:30:17.367173
7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:30:18.368293 master-0 kubenswrapper[7479]: I0308 00:30:18.368145 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:30:18.368293 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:30:18.368293 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:30:18.368293 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:30:18.368912 master-0 kubenswrapper[7479]: I0308 00:30:18.368349 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:30:19.368078 master-0 kubenswrapper[7479]: I0308 00:30:19.367955 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:30:19.368078 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:30:19.368078 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:30:19.368078 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:30:19.368078 master-0 kubenswrapper[7479]: I0308 00:30:19.368049 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP 
probe failed with statuscode: 500" Mar 08 00:30:19.885003 master-0 kubenswrapper[7479]: I0308 00:30:19.884940 7479 scope.go:117] "RemoveContainer" containerID="9d40712043dab52958ea0afce9459c44f1ac9aa0390229d73de4eebe33434e94" Mar 08 00:30:20.369697 master-0 kubenswrapper[7479]: I0308 00:30:20.369583 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:30:20.369697 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:30:20.369697 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:30:20.369697 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:30:20.370669 master-0 kubenswrapper[7479]: I0308 00:30:20.370624 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:30:20.815852 master-0 kubenswrapper[7479]: I0308 00:30:20.815784 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-677db989d6-blw5x_4d0b9fbc-a1f8-4a98-99de-758734bd1a5b/ingress-operator/2.log" Mar 08 00:30:20.816630 master-0 kubenswrapper[7479]: I0308 00:30:20.816588 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x" event={"ID":"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b","Type":"ContainerStarted","Data":"acd8ffd92596b76a588170b227f4d7ab4a872868344965430ac8c8d78ec037e1"} Mar 08 00:30:21.367665 master-0 kubenswrapper[7479]: I0308 00:30:21.367578 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Mar 08 00:30:21.367665 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:30:21.367665 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:30:21.367665 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:30:21.367665 master-0 kubenswrapper[7479]: I0308 00:30:21.367644 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:30:22.368076 master-0 kubenswrapper[7479]: I0308 00:30:22.368016 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:30:22.368076 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:30:22.368076 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:30:22.368076 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:30:22.368850 master-0 kubenswrapper[7479]: I0308 00:30:22.368090 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:30:23.368508 master-0 kubenswrapper[7479]: I0308 00:30:23.368430 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:30:23.368508 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:30:23.368508 master-0 kubenswrapper[7479]: [+]process-running ok 
Mar 08 00:30:23.368508 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:30:23.369218 master-0 kubenswrapper[7479]: I0308 00:30:23.368534 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:30:24.367672 master-0 kubenswrapper[7479]: I0308 00:30:24.367606 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:30:24.367672 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:30:24.367672 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:30:24.367672 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:30:24.367999 master-0 kubenswrapper[7479]: I0308 00:30:24.367762 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:30:25.367384 master-0 kubenswrapper[7479]: I0308 00:30:25.367288 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:30:25.367384 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:30:25.367384 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:30:25.367384 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:30:25.368154 master-0 kubenswrapper[7479]: I0308 00:30:25.367425 7479 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:30:26.368355 master-0 kubenswrapper[7479]: I0308 00:30:26.368270 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:30:26.368355 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:30:26.368355 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:30:26.368355 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:30:26.369006 master-0 kubenswrapper[7479]: I0308 00:30:26.368386 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:30:27.367901 master-0 kubenswrapper[7479]: I0308 00:30:27.367815 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:30:27.367901 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:30:27.367901 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:30:27.367901 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:30:27.368398 master-0 kubenswrapper[7479]: I0308 00:30:27.367907 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:30:28.370715 
master-0 kubenswrapper[7479]: I0308 00:30:28.367430 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:30:28.370715 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:30:28.370715 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:30:28.370715 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:30:28.370715 master-0 kubenswrapper[7479]: I0308 00:30:28.367517 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:30:28.812853 master-0 kubenswrapper[7479]: I0308 00:30:28.812799 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-85ss7"] Mar 08 00:30:28.814099 master-0 kubenswrapper[7479]: I0308 00:30:28.814078 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-85ss7" Mar 08 00:30:28.817737 master-0 kubenswrapper[7479]: I0308 00:30:28.816838 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-8pw5w" Mar 08 00:30:28.823523 master-0 kubenswrapper[7479]: I0308 00:30:28.823468 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Mar 08 00:30:28.907562 master-0 kubenswrapper[7479]: I0308 00:30:28.907498 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/dd44866c-687b-4dc5-a585-6cd610ebd429-ready\") pod \"cni-sysctl-allowlist-ds-85ss7\" (UID: \"dd44866c-687b-4dc5-a585-6cd610ebd429\") " pod="openshift-multus/cni-sysctl-allowlist-ds-85ss7" Mar 08 00:30:28.908009 master-0 kubenswrapper[7479]: I0308 00:30:28.907960 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dd44866c-687b-4dc5-a585-6cd610ebd429-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-85ss7\" (UID: \"dd44866c-687b-4dc5-a585-6cd610ebd429\") " pod="openshift-multus/cni-sysctl-allowlist-ds-85ss7" Mar 08 00:30:28.908172 master-0 kubenswrapper[7479]: I0308 00:30:28.908151 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/dd44866c-687b-4dc5-a585-6cd610ebd429-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-85ss7\" (UID: \"dd44866c-687b-4dc5-a585-6cd610ebd429\") " pod="openshift-multus/cni-sysctl-allowlist-ds-85ss7" Mar 08 00:30:28.908387 master-0 kubenswrapper[7479]: I0308 00:30:28.908361 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbmrr\" (UniqueName: 
\"kubernetes.io/projected/dd44866c-687b-4dc5-a585-6cd610ebd429-kube-api-access-xbmrr\") pod \"cni-sysctl-allowlist-ds-85ss7\" (UID: \"dd44866c-687b-4dc5-a585-6cd610ebd429\") " pod="openshift-multus/cni-sysctl-allowlist-ds-85ss7" Mar 08 00:30:29.010009 master-0 kubenswrapper[7479]: I0308 00:30:29.009945 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/dd44866c-687b-4dc5-a585-6cd610ebd429-ready\") pod \"cni-sysctl-allowlist-ds-85ss7\" (UID: \"dd44866c-687b-4dc5-a585-6cd610ebd429\") " pod="openshift-multus/cni-sysctl-allowlist-ds-85ss7" Mar 08 00:30:29.010689 master-0 kubenswrapper[7479]: I0308 00:30:29.010521 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dd44866c-687b-4dc5-a585-6cd610ebd429-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-85ss7\" (UID: \"dd44866c-687b-4dc5-a585-6cd610ebd429\") " pod="openshift-multus/cni-sysctl-allowlist-ds-85ss7" Mar 08 00:30:29.010989 master-0 kubenswrapper[7479]: I0308 00:30:29.010958 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/dd44866c-687b-4dc5-a585-6cd610ebd429-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-85ss7\" (UID: \"dd44866c-687b-4dc5-a585-6cd610ebd429\") " pod="openshift-multus/cni-sysctl-allowlist-ds-85ss7" Mar 08 00:30:29.011306 master-0 kubenswrapper[7479]: I0308 00:30:29.011255 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbmrr\" (UniqueName: \"kubernetes.io/projected/dd44866c-687b-4dc5-a585-6cd610ebd429-kube-api-access-xbmrr\") pod \"cni-sysctl-allowlist-ds-85ss7\" (UID: \"dd44866c-687b-4dc5-a585-6cd610ebd429\") " pod="openshift-multus/cni-sysctl-allowlist-ds-85ss7" Mar 08 00:30:29.011627 master-0 kubenswrapper[7479]: I0308 00:30:29.011017 7479 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/dd44866c-687b-4dc5-a585-6cd610ebd429-ready\") pod \"cni-sysctl-allowlist-ds-85ss7\" (UID: \"dd44866c-687b-4dc5-a585-6cd610ebd429\") " pod="openshift-multus/cni-sysctl-allowlist-ds-85ss7" Mar 08 00:30:29.011804 master-0 kubenswrapper[7479]: I0308 00:30:29.010646 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dd44866c-687b-4dc5-a585-6cd610ebd429-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-85ss7\" (UID: \"dd44866c-687b-4dc5-a585-6cd610ebd429\") " pod="openshift-multus/cni-sysctl-allowlist-ds-85ss7" Mar 08 00:30:29.012587 master-0 kubenswrapper[7479]: I0308 00:30:29.012518 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/dd44866c-687b-4dc5-a585-6cd610ebd429-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-85ss7\" (UID: \"dd44866c-687b-4dc5-a585-6cd610ebd429\") " pod="openshift-multus/cni-sysctl-allowlist-ds-85ss7" Mar 08 00:30:29.047105 master-0 kubenswrapper[7479]: I0308 00:30:29.046612 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbmrr\" (UniqueName: \"kubernetes.io/projected/dd44866c-687b-4dc5-a585-6cd610ebd429-kube-api-access-xbmrr\") pod \"cni-sysctl-allowlist-ds-85ss7\" (UID: \"dd44866c-687b-4dc5-a585-6cd610ebd429\") " pod="openshift-multus/cni-sysctl-allowlist-ds-85ss7" Mar 08 00:30:29.139869 master-0 kubenswrapper[7479]: I0308 00:30:29.139693 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-85ss7" Mar 08 00:30:29.368715 master-0 kubenswrapper[7479]: I0308 00:30:29.368649 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:30:29.368715 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:30:29.368715 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:30:29.368715 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:30:29.369176 master-0 kubenswrapper[7479]: I0308 00:30:29.368733 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:30:29.903673 master-0 kubenswrapper[7479]: I0308 00:30:29.903573 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-85ss7" event={"ID":"dd44866c-687b-4dc5-a585-6cd610ebd429","Type":"ContainerStarted","Data":"0a044ffc07cd50bd2f121153a8d70cf07a60a5ae2758b09c74d74d0d33264ba7"} Mar 08 00:30:29.903673 master-0 kubenswrapper[7479]: I0308 00:30:29.903629 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-85ss7" event={"ID":"dd44866c-687b-4dc5-a585-6cd610ebd429","Type":"ContainerStarted","Data":"92be0fe5d52bce56af2d4df91e9265657ad1c5f76e4916da41a3e946c24abbfe"} Mar 08 00:30:29.904883 master-0 kubenswrapper[7479]: I0308 00:30:29.904829 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-85ss7" Mar 08 00:30:30.204504 master-0 kubenswrapper[7479]: I0308 00:30:30.204175 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/cni-sysctl-allowlist-ds-85ss7" podStartSLOduration=2.204122936 podStartE2EDuration="2.204122936s" podCreationTimestamp="2026-03-08 00:30:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:30:30.197300689 +0000 UTC m=+546.510209616" watchObservedRunningTime="2026-03-08 00:30:30.204122936 +0000 UTC m=+546.517031883" Mar 08 00:30:30.368247 master-0 kubenswrapper[7479]: I0308 00:30:30.368150 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:30:30.368247 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:30:30.368247 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:30:30.368247 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:30:30.368247 master-0 kubenswrapper[7479]: I0308 00:30:30.368255 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:30:30.940596 master-0 kubenswrapper[7479]: I0308 00:30:30.940450 7479 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-85ss7" Mar 08 00:30:31.368223 master-0 kubenswrapper[7479]: I0308 00:30:31.368140 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:30:31.368223 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:30:31.368223 master-0 
kubenswrapper[7479]: [+]process-running ok Mar 08 00:30:31.368223 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:30:31.369368 master-0 kubenswrapper[7479]: I0308 00:30:31.368251 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:30:31.743604 master-0 kubenswrapper[7479]: I0308 00:30:31.743395 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-85ss7"] Mar 08 00:30:32.368702 master-0 kubenswrapper[7479]: I0308 00:30:32.368609 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:30:32.368702 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:30:32.368702 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:30:32.368702 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:30:32.368702 master-0 kubenswrapper[7479]: I0308 00:30:32.368711 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:30:32.928097 master-0 kubenswrapper[7479]: I0308 00:30:32.928013 7479 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-85ss7" podUID="dd44866c-687b-4dc5-a585-6cd610ebd429" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://0a044ffc07cd50bd2f121153a8d70cf07a60a5ae2758b09c74d74d0d33264ba7" gracePeriod=30 Mar 08 00:30:33.367341 master-0 kubenswrapper[7479]: I0308 00:30:33.367261 7479 
patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:30:33.367341 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:30:33.367341 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:30:33.367341 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:30:33.367788 master-0 kubenswrapper[7479]: I0308 00:30:33.367377 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:30:34.368376 master-0 kubenswrapper[7479]: I0308 00:30:34.368236 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:30:34.368376 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:30:34.368376 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:30:34.368376 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:30:34.368376 master-0 kubenswrapper[7479]: I0308 00:30:34.368365 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:30:35.367778 master-0 kubenswrapper[7479]: I0308 00:30:35.367689 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Mar 08 00:30:35.367778 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:30:35.367778 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:30:35.367778 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:30:35.368319 master-0 kubenswrapper[7479]: I0308 00:30:35.367800 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:30:36.367938 master-0 kubenswrapper[7479]: I0308 00:30:36.367832 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:30:36.367938 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:30:36.367938 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:30:36.367938 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:30:36.368410 master-0 kubenswrapper[7479]: I0308 00:30:36.367952 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:30:37.368689 master-0 kubenswrapper[7479]: I0308 00:30:37.368604 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:30:37.368689 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:30:37.368689 master-0 kubenswrapper[7479]: [+]process-running ok 
Mar 08 00:30:37.368689 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:30:37.368689 master-0 kubenswrapper[7479]: I0308 00:30:37.368667 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:30:38.372031 master-0 kubenswrapper[7479]: I0308 00:30:38.371880 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:30:38.372031 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:30:38.372031 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:30:38.372031 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:30:38.372031 master-0 kubenswrapper[7479]: I0308 00:30:38.371981 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:30:38.377466 master-0 kubenswrapper[7479]: I0308 00:30:38.377406 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-7769569c45-5n69x"] Mar 08 00:30:38.378549 master-0 kubenswrapper[7479]: I0308 00:30:38.378513 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-7769569c45-5n69x" Mar 08 00:30:38.387659 master-0 kubenswrapper[7479]: I0308 00:30:38.387611 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-8nl72" Mar 08 00:30:38.394090 master-0 kubenswrapper[7479]: I0308 00:30:38.394022 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-7769569c45-5n69x"] Mar 08 00:30:38.469703 master-0 kubenswrapper[7479]: I0308 00:30:38.469646 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1da0c222-4b59-424f-9817-48673083df00-webhook-certs\") pod \"multus-admission-controller-7769569c45-5n69x\" (UID: \"1da0c222-4b59-424f-9817-48673083df00\") " pod="openshift-multus/multus-admission-controller-7769569c45-5n69x" Mar 08 00:30:38.470084 master-0 kubenswrapper[7479]: I0308 00:30:38.470056 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txt48\" (UniqueName: \"kubernetes.io/projected/1da0c222-4b59-424f-9817-48673083df00-kube-api-access-txt48\") pod \"multus-admission-controller-7769569c45-5n69x\" (UID: \"1da0c222-4b59-424f-9817-48673083df00\") " pod="openshift-multus/multus-admission-controller-7769569c45-5n69x" Mar 08 00:30:38.571958 master-0 kubenswrapper[7479]: I0308 00:30:38.571909 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txt48\" (UniqueName: \"kubernetes.io/projected/1da0c222-4b59-424f-9817-48673083df00-kube-api-access-txt48\") pod \"multus-admission-controller-7769569c45-5n69x\" (UID: \"1da0c222-4b59-424f-9817-48673083df00\") " pod="openshift-multus/multus-admission-controller-7769569c45-5n69x" Mar 08 00:30:38.572279 master-0 kubenswrapper[7479]: I0308 00:30:38.572264 7479 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1da0c222-4b59-424f-9817-48673083df00-webhook-certs\") pod \"multus-admission-controller-7769569c45-5n69x\" (UID: \"1da0c222-4b59-424f-9817-48673083df00\") " pod="openshift-multus/multus-admission-controller-7769569c45-5n69x" Mar 08 00:30:38.575912 master-0 kubenswrapper[7479]: I0308 00:30:38.575858 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1da0c222-4b59-424f-9817-48673083df00-webhook-certs\") pod \"multus-admission-controller-7769569c45-5n69x\" (UID: \"1da0c222-4b59-424f-9817-48673083df00\") " pod="openshift-multus/multus-admission-controller-7769569c45-5n69x" Mar 08 00:30:38.598301 master-0 kubenswrapper[7479]: I0308 00:30:38.598240 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txt48\" (UniqueName: \"kubernetes.io/projected/1da0c222-4b59-424f-9817-48673083df00-kube-api-access-txt48\") pod \"multus-admission-controller-7769569c45-5n69x\" (UID: \"1da0c222-4b59-424f-9817-48673083df00\") " pod="openshift-multus/multus-admission-controller-7769569c45-5n69x" Mar 08 00:30:38.717869 master-0 kubenswrapper[7479]: I0308 00:30:38.717677 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-7769569c45-5n69x" Mar 08 00:30:39.144562 master-0 kubenswrapper[7479]: E0308 00:30:39.144485 7479 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0a044ffc07cd50bd2f121153a8d70cf07a60a5ae2758b09c74d74d0d33264ba7" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 08 00:30:39.148005 master-0 kubenswrapper[7479]: E0308 00:30:39.147929 7479 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0a044ffc07cd50bd2f121153a8d70cf07a60a5ae2758b09c74d74d0d33264ba7" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 08 00:30:39.155315 master-0 kubenswrapper[7479]: E0308 00:30:39.150121 7479 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0a044ffc07cd50bd2f121153a8d70cf07a60a5ae2758b09c74d74d0d33264ba7" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 08 00:30:39.155315 master-0 kubenswrapper[7479]: E0308 00:30:39.150249 7479 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-85ss7" podUID="dd44866c-687b-4dc5-a585-6cd610ebd429" containerName="kube-multus-additional-cni-plugins" Mar 08 00:30:39.224334 master-0 kubenswrapper[7479]: I0308 00:30:39.223915 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-7769569c45-5n69x"] Mar 08 00:30:39.231865 master-0 kubenswrapper[7479]: W0308 00:30:39.231803 7479 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1da0c222_4b59_424f_9817_48673083df00.slice/crio-3824dde14e6e2df8fdeaf0d3586d91846c024a16aa684e52f4497805143ba494 WatchSource:0}: Error finding container 3824dde14e6e2df8fdeaf0d3586d91846c024a16aa684e52f4497805143ba494: Status 404 returned error can't find the container with id 3824dde14e6e2df8fdeaf0d3586d91846c024a16aa684e52f4497805143ba494 Mar 08 00:30:39.368003 master-0 kubenswrapper[7479]: I0308 00:30:39.367905 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:30:39.368003 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:30:39.368003 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:30:39.368003 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:30:39.368570 master-0 kubenswrapper[7479]: I0308 00:30:39.368015 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:30:39.996948 master-0 kubenswrapper[7479]: I0308 00:30:39.996852 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-7769569c45-5n69x" event={"ID":"1da0c222-4b59-424f-9817-48673083df00","Type":"ContainerStarted","Data":"9e323e1fa7d402b7efb1afca10f5c1139ebd69bd1d0ac77477dbe3652009da9a"} Mar 08 00:30:39.997646 master-0 kubenswrapper[7479]: I0308 00:30:39.996959 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-7769569c45-5n69x" 
event={"ID":"1da0c222-4b59-424f-9817-48673083df00","Type":"ContainerStarted","Data":"9a99ff1aaac045cf6ae25e4ec837d836f7e6fbd938939ae00536d553ed630283"} Mar 08 00:30:39.997646 master-0 kubenswrapper[7479]: I0308 00:30:39.996995 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-7769569c45-5n69x" event={"ID":"1da0c222-4b59-424f-9817-48673083df00","Type":"ContainerStarted","Data":"3824dde14e6e2df8fdeaf0d3586d91846c024a16aa684e52f4497805143ba494"} Mar 08 00:30:40.061411 master-0 kubenswrapper[7479]: I0308 00:30:40.061176 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-7769569c45-5n69x" podStartSLOduration=2.061122726 podStartE2EDuration="2.061122726s" podCreationTimestamp="2026-03-08 00:30:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:30:40.059250088 +0000 UTC m=+556.372159065" watchObservedRunningTime="2026-03-08 00:30:40.061122726 +0000 UTC m=+556.374031683" Mar 08 00:30:40.134760 master-0 kubenswrapper[7479]: I0308 00:30:40.134691 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-jgdmb"] Mar 08 00:30:40.135166 master-0 kubenswrapper[7479]: I0308 00:30:40.135104 7479 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/multus-admission-controller-8d675b596-jgdmb" podUID="d7a0bdcc-92f5-41e6-ab47-ee48a5788bac" containerName="multus-admission-controller" containerID="cri-o://c54ec75e7b215135d97163ba8f315624435a019aae1bb5d4becc779b33de3782" gracePeriod=30 Mar 08 00:30:40.135354 master-0 kubenswrapper[7479]: I0308 00:30:40.135274 7479 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/multus-admission-controller-8d675b596-jgdmb" podUID="d7a0bdcc-92f5-41e6-ab47-ee48a5788bac" 
containerName="kube-rbac-proxy" containerID="cri-o://b268ecbf5509c4d57c3cfb99540508683cf8b0aa47cb26e063002abde0b68768" gracePeriod=30 Mar 08 00:30:40.206103 master-0 kubenswrapper[7479]: E0308 00:30:40.206060 7479 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7a0bdcc_92f5_41e6_ab47_ee48a5788bac.slice/crio-b268ecbf5509c4d57c3cfb99540508683cf8b0aa47cb26e063002abde0b68768.scope\": RecentStats: unable to find data in memory cache]" Mar 08 00:30:40.366868 master-0 kubenswrapper[7479]: I0308 00:30:40.366801 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:30:40.366868 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:30:40.366868 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:30:40.366868 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:30:40.367271 master-0 kubenswrapper[7479]: I0308 00:30:40.366886 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:30:41.005679 master-0 kubenswrapper[7479]: I0308 00:30:41.005605 7479 generic.go:334] "Generic (PLEG): container finished" podID="d7a0bdcc-92f5-41e6-ab47-ee48a5788bac" containerID="b268ecbf5509c4d57c3cfb99540508683cf8b0aa47cb26e063002abde0b68768" exitCode=0 Mar 08 00:30:41.006447 master-0 kubenswrapper[7479]: I0308 00:30:41.005687 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-jgdmb" 
event={"ID":"d7a0bdcc-92f5-41e6-ab47-ee48a5788bac","Type":"ContainerDied","Data":"b268ecbf5509c4d57c3cfb99540508683cf8b0aa47cb26e063002abde0b68768"} Mar 08 00:30:41.368596 master-0 kubenswrapper[7479]: I0308 00:30:41.368395 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:30:41.368596 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:30:41.368596 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:30:41.368596 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:30:41.368596 master-0 kubenswrapper[7479]: I0308 00:30:41.368503 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:30:42.367417 master-0 kubenswrapper[7479]: I0308 00:30:42.367339 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:30:42.367417 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:30:42.367417 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:30:42.367417 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:30:42.368176 master-0 kubenswrapper[7479]: I0308 00:30:42.367481 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:30:43.368729 master-0 kubenswrapper[7479]: I0308 
00:30:43.368582 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:30:43.368729 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:30:43.368729 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:30:43.368729 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:30:43.368729 master-0 kubenswrapper[7479]: I0308 00:30:43.368683 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:30:44.367260 master-0 kubenswrapper[7479]: I0308 00:30:44.367152 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:30:44.367260 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:30:44.367260 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:30:44.367260 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:30:44.367695 master-0 kubenswrapper[7479]: I0308 00:30:44.367667 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:30:45.368302 master-0 kubenswrapper[7479]: I0308 00:30:45.368219 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Mar 08 00:30:45.368302 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:30:45.368302 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:30:45.368302 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:30:45.369121 master-0 kubenswrapper[7479]: I0308 00:30:45.368324 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:30:46.368807 master-0 kubenswrapper[7479]: I0308 00:30:46.368722 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:30:46.368807 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:30:46.368807 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:30:46.368807 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:30:46.370086 master-0 kubenswrapper[7479]: I0308 00:30:46.370035 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:30:47.367885 master-0 kubenswrapper[7479]: I0308 00:30:47.367799 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:30:47.367885 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:30:47.367885 master-0 kubenswrapper[7479]: [+]process-running ok 
Mar 08 00:30:47.367885 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:30:47.367885 master-0 kubenswrapper[7479]: I0308 00:30:47.367880 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:30:48.028309 master-0 kubenswrapper[7479]: I0308 00:30:48.028234 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"] Mar 08 00:30:48.029458 master-0 kubenswrapper[7479]: I0308 00:30:48.029428 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 00:30:48.032263 master-0 kubenswrapper[7479]: I0308 00:30:48.032221 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-4dxb7" Mar 08 00:30:48.032336 master-0 kubenswrapper[7479]: I0308 00:30:48.032226 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 08 00:30:48.049839 master-0 kubenswrapper[7479]: I0308 00:30:48.049764 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"] Mar 08 00:30:48.055029 master-0 kubenswrapper[7479]: I0308 00:30:48.054978 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/66915251-1fdd-40f3-a59b-054776b214df-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"66915251-1fdd-40f3-a59b-054776b214df\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 00:30:48.055189 master-0 kubenswrapper[7479]: I0308 00:30:48.055068 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" 
(UniqueName: \"kubernetes.io/host-path/66915251-1fdd-40f3-a59b-054776b214df-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"66915251-1fdd-40f3-a59b-054776b214df\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 00:30:48.055434 master-0 kubenswrapper[7479]: I0308 00:30:48.055390 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66915251-1fdd-40f3-a59b-054776b214df-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"66915251-1fdd-40f3-a59b-054776b214df\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 00:30:48.157421 master-0 kubenswrapper[7479]: I0308 00:30:48.157334 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66915251-1fdd-40f3-a59b-054776b214df-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"66915251-1fdd-40f3-a59b-054776b214df\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 00:30:48.157680 master-0 kubenswrapper[7479]: I0308 00:30:48.157629 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/66915251-1fdd-40f3-a59b-054776b214df-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"66915251-1fdd-40f3-a59b-054776b214df\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 00:30:48.157717 master-0 kubenswrapper[7479]: I0308 00:30:48.157692 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/66915251-1fdd-40f3-a59b-054776b214df-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"66915251-1fdd-40f3-a59b-054776b214df\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 00:30:48.157826 master-0 kubenswrapper[7479]: I0308 00:30:48.157758 7479 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/66915251-1fdd-40f3-a59b-054776b214df-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"66915251-1fdd-40f3-a59b-054776b214df\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 00:30:48.157883 master-0 kubenswrapper[7479]: I0308 00:30:48.157811 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/66915251-1fdd-40f3-a59b-054776b214df-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"66915251-1fdd-40f3-a59b-054776b214df\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 00:30:48.174772 master-0 kubenswrapper[7479]: I0308 00:30:48.174711 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66915251-1fdd-40f3-a59b-054776b214df-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"66915251-1fdd-40f3-a59b-054776b214df\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 00:30:48.358743 master-0 kubenswrapper[7479]: I0308 00:30:48.355019 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 00:30:48.370856 master-0 kubenswrapper[7479]: I0308 00:30:48.370773 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:30:48.370856 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:30:48.370856 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:30:48.370856 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:30:48.371144 master-0 kubenswrapper[7479]: I0308 00:30:48.370899 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:30:48.828452 master-0 kubenswrapper[7479]: I0308 00:30:48.828375 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"] Mar 08 00:30:48.840930 master-0 kubenswrapper[7479]: W0308 00:30:48.840860 7479 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod66915251_1fdd_40f3_a59b_054776b214df.slice/crio-c753a2a6e010f70aa63ed8c11f23ed59bf96ec555e7e82acdd68bc431c4a37ef WatchSource:0}: Error finding container c753a2a6e010f70aa63ed8c11f23ed59bf96ec555e7e82acdd68bc431c4a37ef: Status 404 returned error can't find the container with id c753a2a6e010f70aa63ed8c11f23ed59bf96ec555e7e82acdd68bc431c4a37ef Mar 08 00:30:49.068646 master-0 kubenswrapper[7479]: I0308 00:30:49.068591 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" 
event={"ID":"66915251-1fdd-40f3-a59b-054776b214df","Type":"ContainerStarted","Data":"c753a2a6e010f70aa63ed8c11f23ed59bf96ec555e7e82acdd68bc431c4a37ef"} Mar 08 00:30:49.142924 master-0 kubenswrapper[7479]: E0308 00:30:49.142851 7479 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0a044ffc07cd50bd2f121153a8d70cf07a60a5ae2758b09c74d74d0d33264ba7" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 08 00:30:49.144708 master-0 kubenswrapper[7479]: E0308 00:30:49.144666 7479 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0a044ffc07cd50bd2f121153a8d70cf07a60a5ae2758b09c74d74d0d33264ba7" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 08 00:30:49.146028 master-0 kubenswrapper[7479]: E0308 00:30:49.145994 7479 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0a044ffc07cd50bd2f121153a8d70cf07a60a5ae2758b09c74d74d0d33264ba7" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 08 00:30:49.146113 master-0 kubenswrapper[7479]: E0308 00:30:49.146030 7479 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-85ss7" podUID="dd44866c-687b-4dc5-a585-6cd610ebd429" containerName="kube-multus-additional-cni-plugins" Mar 08 00:30:49.368789 master-0 kubenswrapper[7479]: I0308 00:30:49.368621 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure 
output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:30:49.368789 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld Mar 08 00:30:49.368789 master-0 kubenswrapper[7479]: [+]process-running ok Mar 08 00:30:49.368789 master-0 kubenswrapper[7479]: healthz check failed Mar 08 00:30:49.368789 master-0 kubenswrapper[7479]: I0308 00:30:49.368769 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:30:50.077093 master-0 kubenswrapper[7479]: I0308 00:30:50.077019 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" event={"ID":"66915251-1fdd-40f3-a59b-054776b214df","Type":"ContainerStarted","Data":"d9e68f104ff64d94c7bc0d96bb172cf910cbd61300635334957f518556f38bfc"} Mar 08 00:30:50.096804 master-0 kubenswrapper[7479]: I0308 00:30:50.096695 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" podStartSLOduration=2.096668436 podStartE2EDuration="2.096668436s" podCreationTimestamp="2026-03-08 00:30:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:30:50.093270727 +0000 UTC m=+566.406179714" watchObservedRunningTime="2026-03-08 00:30:50.096668436 +0000 UTC m=+566.409577343" Mar 08 00:30:50.369248 master-0 kubenswrapper[7479]: I0308 00:30:50.369052 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:30:50.369248 master-0 kubenswrapper[7479]: [-]has-synced 
failed: reason withheld
Mar 08 00:30:50.369248 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:30:50.369248 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:30:50.369248 master-0 kubenswrapper[7479]: I0308 00:30:50.369121 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:30:51.367689 master-0 kubenswrapper[7479]: I0308 00:30:51.367606 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:30:51.367689 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:30:51.367689 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:30:51.367689 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:30:51.368434 master-0 kubenswrapper[7479]: I0308 00:30:51.367708 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:30:52.367911 master-0 kubenswrapper[7479]: I0308 00:30:52.367813 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:30:52.367911 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:30:52.367911 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:30:52.367911 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:30:52.367911 master-0 kubenswrapper[7479]: I0308 00:30:52.367883 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:30:53.367486 master-0 kubenswrapper[7479]: I0308 00:30:53.367400 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:30:53.367486 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:30:53.367486 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:30:53.367486 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:30:53.367486 master-0 kubenswrapper[7479]: I0308 00:30:53.367474 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:30:54.368639 master-0 kubenswrapper[7479]: I0308 00:30:54.368532 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:30:54.368639 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:30:54.368639 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:30:54.368639 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:30:54.368639 master-0 kubenswrapper[7479]: I0308 00:30:54.368610 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:30:55.367921 master-0 kubenswrapper[7479]: I0308 00:30:55.367860 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:30:55.367921 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:30:55.367921 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:30:55.367921 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:30:55.368410 master-0 kubenswrapper[7479]: I0308 00:30:55.367942 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:30:56.367870 master-0 kubenswrapper[7479]: I0308 00:30:56.367783 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:30:56.367870 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:30:56.367870 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:30:56.367870 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:30:56.368809 master-0 kubenswrapper[7479]: I0308 00:30:56.367881 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:30:57.368423 master-0 kubenswrapper[7479]: I0308 00:30:57.368279 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:30:57.368423 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:30:57.368423 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:30:57.368423 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:30:57.369323 master-0 kubenswrapper[7479]: I0308 00:30:57.368484 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:30:58.368020 master-0 kubenswrapper[7479]: I0308 00:30:58.367892 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:30:58.368020 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:30:58.368020 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:30:58.368020 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:30:58.368489 master-0 kubenswrapper[7479]: I0308 00:30:58.368063 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:30:59.143641 master-0 kubenswrapper[7479]: E0308 00:30:59.143538 7479 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0a044ffc07cd50bd2f121153a8d70cf07a60a5ae2758b09c74d74d0d33264ba7" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 08 00:30:59.145142 master-0 kubenswrapper[7479]: E0308 00:30:59.145070 7479 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0a044ffc07cd50bd2f121153a8d70cf07a60a5ae2758b09c74d74d0d33264ba7" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 08 00:30:59.146486 master-0 kubenswrapper[7479]: E0308 00:30:59.146448 7479 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0a044ffc07cd50bd2f121153a8d70cf07a60a5ae2758b09c74d74d0d33264ba7" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 08 00:30:59.146588 master-0 kubenswrapper[7479]: E0308 00:30:59.146489 7479 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-85ss7" podUID="dd44866c-687b-4dc5-a585-6cd610ebd429" containerName="kube-multus-additional-cni-plugins"
Mar 08 00:30:59.368043 master-0 kubenswrapper[7479]: I0308 00:30:59.367952 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:30:59.368043 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:30:59.368043 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:30:59.368043 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:30:59.368460 master-0 kubenswrapper[7479]: I0308 00:30:59.368064 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:31:00.366228 master-0 kubenswrapper[7479]: I0308 00:31:00.366147 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:31:00.366228 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:31:00.366228 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:31:00.366228 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:31:00.367009 master-0 kubenswrapper[7479]: I0308 00:31:00.366262 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:31:01.369419 master-0 kubenswrapper[7479]: I0308 00:31:01.369314 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:31:01.369419 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:31:01.369419 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:31:01.369419 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:31:01.370331 master-0 kubenswrapper[7479]: I0308 00:31:01.369468 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe
failed with statuscode: 500"
Mar 08 00:31:02.369631 master-0 kubenswrapper[7479]: I0308 00:31:02.369518 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:31:02.369631 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:31:02.369631 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:31:02.369631 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:31:02.370425 master-0 kubenswrapper[7479]: I0308 00:31:02.369659 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:31:03.100761 master-0 kubenswrapper[7479]: I0308 00:31:03.100616 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-85ss7_dd44866c-687b-4dc5-a585-6cd610ebd429/kube-multus-additional-cni-plugins/0.log"
Mar 08 00:31:03.100761 master-0 kubenswrapper[7479]: I0308 00:31:03.100725 7479 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-85ss7"
Mar 08 00:31:03.190310 master-0 kubenswrapper[7479]: I0308 00:31:03.190044 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-85ss7_dd44866c-687b-4dc5-a585-6cd610ebd429/kube-multus-additional-cni-plugins/0.log"
Mar 08 00:31:03.190310 master-0 kubenswrapper[7479]: I0308 00:31:03.190147 7479 generic.go:334] "Generic (PLEG): container finished" podID="dd44866c-687b-4dc5-a585-6cd610ebd429" containerID="0a044ffc07cd50bd2f121153a8d70cf07a60a5ae2758b09c74d74d0d33264ba7" exitCode=137
Mar 08 00:31:03.190310 master-0 kubenswrapper[7479]: I0308 00:31:03.190193 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-85ss7" event={"ID":"dd44866c-687b-4dc5-a585-6cd610ebd429","Type":"ContainerDied","Data":"0a044ffc07cd50bd2f121153a8d70cf07a60a5ae2758b09c74d74d0d33264ba7"}
Mar 08 00:31:03.190310 master-0 kubenswrapper[7479]: I0308 00:31:03.190269 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-85ss7" event={"ID":"dd44866c-687b-4dc5-a585-6cd610ebd429","Type":"ContainerDied","Data":"92be0fe5d52bce56af2d4df91e9265657ad1c5f76e4916da41a3e946c24abbfe"}
Mar 08 00:31:03.190310 master-0 kubenswrapper[7479]: I0308 00:31:03.190289 7479 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-85ss7"
Mar 08 00:31:03.191034 master-0 kubenswrapper[7479]: I0308 00:31:03.190297 7479 scope.go:117] "RemoveContainer" containerID="0a044ffc07cd50bd2f121153a8d70cf07a60a5ae2758b09c74d74d0d33264ba7"
Mar 08 00:31:03.214698 master-0 kubenswrapper[7479]: I0308 00:31:03.214661 7479 scope.go:117] "RemoveContainer" containerID="0a044ffc07cd50bd2f121153a8d70cf07a60a5ae2758b09c74d74d0d33264ba7"
Mar 08 00:31:03.215291 master-0 kubenswrapper[7479]: E0308 00:31:03.215235 7479 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a044ffc07cd50bd2f121153a8d70cf07a60a5ae2758b09c74d74d0d33264ba7\": container with ID starting with 0a044ffc07cd50bd2f121153a8d70cf07a60a5ae2758b09c74d74d0d33264ba7 not found: ID does not exist" containerID="0a044ffc07cd50bd2f121153a8d70cf07a60a5ae2758b09c74d74d0d33264ba7"
Mar 08 00:31:03.215291 master-0 kubenswrapper[7479]: I0308 00:31:03.215276 7479 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a044ffc07cd50bd2f121153a8d70cf07a60a5ae2758b09c74d74d0d33264ba7"} err="failed to get container status \"0a044ffc07cd50bd2f121153a8d70cf07a60a5ae2758b09c74d74d0d33264ba7\": rpc error: code = NotFound desc = could not find container \"0a044ffc07cd50bd2f121153a8d70cf07a60a5ae2758b09c74d74d0d33264ba7\": container with ID starting with 0a044ffc07cd50bd2f121153a8d70cf07a60a5ae2758b09c74d74d0d33264ba7 not found: ID does not exist"
Mar 08 00:31:03.232166 master-0 kubenswrapper[7479]: I0308 00:31:03.232087 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbmrr\" (UniqueName: \"kubernetes.io/projected/dd44866c-687b-4dc5-a585-6cd610ebd429-kube-api-access-xbmrr\") pod \"dd44866c-687b-4dc5-a585-6cd610ebd429\" (UID: \"dd44866c-687b-4dc5-a585-6cd610ebd429\") "
Mar 08 00:31:03.232398 master-0 kubenswrapper[7479]: I0308 00:31:03.232321 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/dd44866c-687b-4dc5-a585-6cd610ebd429-ready\") pod \"dd44866c-687b-4dc5-a585-6cd610ebd429\" (UID: \"dd44866c-687b-4dc5-a585-6cd610ebd429\") "
Mar 08 00:31:03.232398 master-0 kubenswrapper[7479]: I0308 00:31:03.232377 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dd44866c-687b-4dc5-a585-6cd610ebd429-tuning-conf-dir\") pod \"dd44866c-687b-4dc5-a585-6cd610ebd429\" (UID: \"dd44866c-687b-4dc5-a585-6cd610ebd429\") "
Mar 08 00:31:03.232526 master-0 kubenswrapper[7479]: I0308 00:31:03.232426 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/dd44866c-687b-4dc5-a585-6cd610ebd429-cni-sysctl-allowlist\") pod \"dd44866c-687b-4dc5-a585-6cd610ebd429\" (UID: \"dd44866c-687b-4dc5-a585-6cd610ebd429\") "
Mar 08 00:31:03.232686 master-0 kubenswrapper[7479]: I0308 00:31:03.232598 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd44866c-687b-4dc5-a585-6cd610ebd429-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "dd44866c-687b-4dc5-a585-6cd610ebd429" (UID: "dd44866c-687b-4dc5-a585-6cd610ebd429"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:31:03.233096 master-0 kubenswrapper[7479]: I0308 00:31:03.233034 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd44866c-687b-4dc5-a585-6cd610ebd429-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "dd44866c-687b-4dc5-a585-6cd610ebd429" (UID: "dd44866c-687b-4dc5-a585-6cd610ebd429"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:31:03.233183 master-0 kubenswrapper[7479]: I0308 00:31:03.233126 7479 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dd44866c-687b-4dc5-a585-6cd610ebd429-tuning-conf-dir\") on node \"master-0\" DevicePath \"\""
Mar 08 00:31:03.233475 master-0 kubenswrapper[7479]: I0308 00:31:03.233434 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd44866c-687b-4dc5-a585-6cd610ebd429-ready" (OuterVolumeSpecName: "ready") pod "dd44866c-687b-4dc5-a585-6cd610ebd429" (UID: "dd44866c-687b-4dc5-a585-6cd610ebd429"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:31:03.240721 master-0 kubenswrapper[7479]: I0308 00:31:03.240643 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd44866c-687b-4dc5-a585-6cd610ebd429-kube-api-access-xbmrr" (OuterVolumeSpecName: "kube-api-access-xbmrr") pod "dd44866c-687b-4dc5-a585-6cd610ebd429" (UID: "dd44866c-687b-4dc5-a585-6cd610ebd429"). InnerVolumeSpecName "kube-api-access-xbmrr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:31:03.334755 master-0 kubenswrapper[7479]: I0308 00:31:03.334646 7479 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/dd44866c-687b-4dc5-a585-6cd610ebd429-ready\") on node \"master-0\" DevicePath \"\""
Mar 08 00:31:03.334755 master-0 kubenswrapper[7479]: I0308 00:31:03.334702 7479 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/dd44866c-687b-4dc5-a585-6cd610ebd429-cni-sysctl-allowlist\") on node \"master-0\" DevicePath \"\""
Mar 08 00:31:03.334755 master-0 kubenswrapper[7479]: I0308 00:31:03.334729 7479 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbmrr\" (UniqueName: \"kubernetes.io/projected/dd44866c-687b-4dc5-a585-6cd610ebd429-kube-api-access-xbmrr\") on node \"master-0\" DevicePath \"\""
Mar 08 00:31:03.369048 master-0 kubenswrapper[7479]: I0308 00:31:03.368933 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:31:03.369048 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:31:03.369048 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:31:03.369048 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:31:03.369048 master-0 kubenswrapper[7479]: I0308 00:31:03.369034 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:31:03.562974 master-0 kubenswrapper[7479]: I0308 00:31:03.562872 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-85ss7"]
Mar 08
00:31:03.569995 master-0 kubenswrapper[7479]: I0308 00:31:03.569896 7479 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-85ss7"]
Mar 08 00:31:03.899790 master-0 kubenswrapper[7479]: I0308 00:31:03.899556 7479 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd44866c-687b-4dc5-a585-6cd610ebd429" path="/var/lib/kubelet/pods/dd44866c-687b-4dc5-a585-6cd610ebd429/volumes"
Mar 08 00:31:04.369724 master-0 kubenswrapper[7479]: I0308 00:31:04.369621 7479 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:31:04.369724 master-0 kubenswrapper[7479]: [-]has-synced failed: reason withheld
Mar 08 00:31:04.369724 master-0 kubenswrapper[7479]: [+]process-running ok
Mar 08 00:31:04.369724 master-0 kubenswrapper[7479]: healthz check failed
Mar 08 00:31:04.370418 master-0 kubenswrapper[7479]: I0308 00:31:04.369736 7479 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:31:04.370418 master-0 kubenswrapper[7479]: I0308 00:31:04.369828 7479 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv"
Mar 08 00:31:04.370983 master-0 kubenswrapper[7479]: I0308 00:31:04.370916 7479 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="router" containerStatusID={"Type":"cri-o","ID":"1fdc0977a8b34be93d33d2377b4810454b6ad9c4cfeec0c8fce160478572354d"} pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" containerMessage="Container router failed startup probe, will be restarted"
Mar 08 00:31:04.371113 master-0 kubenswrapper[7479]: I0308 00:31:04.370992 7479 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" containerID="cri-o://1fdc0977a8b34be93d33d2377b4810454b6ad9c4cfeec0c8fce160478572354d" gracePeriod=3600
Mar 08 00:31:10.256565 master-0 kubenswrapper[7479]: I0308 00:31:10.256515 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-8d675b596-jgdmb_d7a0bdcc-92f5-41e6-ab47-ee48a5788bac/multus-admission-controller/0.log"
Mar 08 00:31:10.257230 master-0 kubenswrapper[7479]: I0308 00:31:10.256575 7479 generic.go:334] "Generic (PLEG): container finished" podID="d7a0bdcc-92f5-41e6-ab47-ee48a5788bac" containerID="c54ec75e7b215135d97163ba8f315624435a019aae1bb5d4becc779b33de3782" exitCode=137
Mar 08 00:31:10.257230 master-0 kubenswrapper[7479]: I0308 00:31:10.256618 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-jgdmb" event={"ID":"d7a0bdcc-92f5-41e6-ab47-ee48a5788bac","Type":"ContainerDied","Data":"c54ec75e7b215135d97163ba8f315624435a019aae1bb5d4becc779b33de3782"}
Mar 08 00:31:11.048865 master-0 kubenswrapper[7479]: I0308 00:31:11.048764 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-8d675b596-jgdmb_d7a0bdcc-92f5-41e6-ab47-ee48a5788bac/multus-admission-controller/0.log"
Mar 08 00:31:11.049385 master-0 kubenswrapper[7479]: I0308 00:31:11.049362 7479 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-8d675b596-jgdmb"
Mar 08 00:31:11.190498 master-0 kubenswrapper[7479]: I0308 00:31:11.190390 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fljc9\" (UniqueName: \"kubernetes.io/projected/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-kube-api-access-fljc9\") pod \"d7a0bdcc-92f5-41e6-ab47-ee48a5788bac\" (UID: \"d7a0bdcc-92f5-41e6-ab47-ee48a5788bac\") "
Mar 08 00:31:11.190762 master-0 kubenswrapper[7479]: I0308 00:31:11.190537 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-webhook-certs\") pod \"d7a0bdcc-92f5-41e6-ab47-ee48a5788bac\" (UID: \"d7a0bdcc-92f5-41e6-ab47-ee48a5788bac\") "
Mar 08 00:31:11.194050 master-0 kubenswrapper[7479]: I0308 00:31:11.194018 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-kube-api-access-fljc9" (OuterVolumeSpecName: "kube-api-access-fljc9") pod "d7a0bdcc-92f5-41e6-ab47-ee48a5788bac" (UID: "d7a0bdcc-92f5-41e6-ab47-ee48a5788bac"). InnerVolumeSpecName "kube-api-access-fljc9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:31:11.196459 master-0 kubenswrapper[7479]: I0308 00:31:11.196380 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "d7a0bdcc-92f5-41e6-ab47-ee48a5788bac" (UID: "d7a0bdcc-92f5-41e6-ab47-ee48a5788bac"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:31:11.269464 master-0 kubenswrapper[7479]: I0308 00:31:11.269064 7479 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-8d675b596-jgdmb_d7a0bdcc-92f5-41e6-ab47-ee48a5788bac/multus-admission-controller/0.log"
Mar 08 00:31:11.269464 master-0 kubenswrapper[7479]: I0308 00:31:11.269168 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-jgdmb" event={"ID":"d7a0bdcc-92f5-41e6-ab47-ee48a5788bac","Type":"ContainerDied","Data":"85f5347214316bafcae54d5f353ea4dd103edcad8e44bd59e26d7ef740d7221a"}
Mar 08 00:31:11.269464 master-0 kubenswrapper[7479]: I0308 00:31:11.269291 7479 scope.go:117] "RemoveContainer" containerID="b268ecbf5509c4d57c3cfb99540508683cf8b0aa47cb26e063002abde0b68768"
Mar 08 00:31:11.270747 master-0 kubenswrapper[7479]: I0308 00:31:11.269545 7479 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-8d675b596-jgdmb"
Mar 08 00:31:11.288416 master-0 kubenswrapper[7479]: I0308 00:31:11.288357 7479 scope.go:117] "RemoveContainer" containerID="c54ec75e7b215135d97163ba8f315624435a019aae1bb5d4becc779b33de3782"
Mar 08 00:31:11.293408 master-0 kubenswrapper[7479]: I0308 00:31:11.293360 7479 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-webhook-certs\") on node \"master-0\" DevicePath \"\""
Mar 08 00:31:11.293408 master-0 kubenswrapper[7479]: I0308 00:31:11.293406 7479 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fljc9\" (UniqueName: \"kubernetes.io/projected/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac-kube-api-access-fljc9\") on node \"master-0\" DevicePath \"\""
Mar 08 00:31:11.314639 master-0 kubenswrapper[7479]: I0308 00:31:11.314556 7479 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-jgdmb"]
Mar 08 00:31:11.318393 master-0 kubenswrapper[7479]: I0308 00:31:11.318329 7479 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-jgdmb"]
Mar 08 00:31:11.896338 master-0 kubenswrapper[7479]: I0308 00:31:11.896265 7479 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7a0bdcc-92f5-41e6-ab47-ee48a5788bac" path="/var/lib/kubelet/pods/d7a0bdcc-92f5-41e6-ab47-ee48a5788bac/volumes"
Mar 08 00:31:24.398833 master-0 kubenswrapper[7479]: I0308 00:31:24.398731 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"]
Mar 08 00:31:24.399887 master-0 kubenswrapper[7479]: E0308 00:31:24.399352 7479 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7a0bdcc-92f5-41e6-ab47-ee48a5788bac" containerName="kube-rbac-proxy"
Mar 08 00:31:24.399887 master-0 kubenswrapper[7479]: I0308 00:31:24.399386 7479 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7a0bdcc-92f5-41e6-ab47-ee48a5788bac" containerName="kube-rbac-proxy"
Mar 08 00:31:24.399887 master-0 kubenswrapper[7479]: E0308 00:31:24.399423 7479 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd44866c-687b-4dc5-a585-6cd610ebd429" containerName="kube-multus-additional-cni-plugins"
Mar 08 00:31:24.399887 master-0 kubenswrapper[7479]: I0308 00:31:24.399441 7479 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd44866c-687b-4dc5-a585-6cd610ebd429" containerName="kube-multus-additional-cni-plugins"
Mar 08 00:31:24.399887 master-0 kubenswrapper[7479]: E0308 00:31:24.399483 7479 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7a0bdcc-92f5-41e6-ab47-ee48a5788bac" containerName="multus-admission-controller"
Mar 08 00:31:24.399887 master-0 kubenswrapper[7479]: I0308 00:31:24.399501 7479 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7a0bdcc-92f5-41e6-ab47-ee48a5788bac" containerName="multus-admission-controller"
Mar 08 00:31:24.399887 master-0 kubenswrapper[7479]: I0308 00:31:24.399795 7479 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7a0bdcc-92f5-41e6-ab47-ee48a5788bac" containerName="multus-admission-controller"
Mar 08 00:31:24.399887 master-0 kubenswrapper[7479]: I0308 00:31:24.399829 7479 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7a0bdcc-92f5-41e6-ab47-ee48a5788bac" containerName="kube-rbac-proxy"
Mar 08 00:31:24.399887 master-0 kubenswrapper[7479]: I0308 00:31:24.399864 7479 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd44866c-687b-4dc5-a585-6cd610ebd429" containerName="kube-multus-additional-cni-plugins"
Mar 08 00:31:24.401448 master-0 kubenswrapper[7479]: I0308 00:31:24.401409 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 08 00:31:24.405337 master-0 kubenswrapper[7479]: I0308 00:31:24.405131 7479 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Mar 08 00:31:24.405337 master-0 kubenswrapper[7479]: I0308 00:31:24.405167 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-7rml7"
Mar 08 00:31:24.417857 master-0 kubenswrapper[7479]: I0308 00:31:24.417750 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"]
Mar 08 00:31:24.523004 master-0 kubenswrapper[7479]: I0308 00:31:24.522898 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4a829558-a672-4dc5-ae20-69884213482f-kube-api-access\") pod \"installer-2-master-0\" (UID: \"4a829558-a672-4dc5-ae20-69884213482f\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 08 00:31:24.523331 master-0
kubenswrapper[7479]: I0308 00:31:24.523061 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4a829558-a672-4dc5-ae20-69884213482f-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"4a829558-a672-4dc5-ae20-69884213482f\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 08 00:31:24.523331 master-0 kubenswrapper[7479]: I0308 00:31:24.523103 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4a829558-a672-4dc5-ae20-69884213482f-var-lock\") pod \"installer-2-master-0\" (UID: \"4a829558-a672-4dc5-ae20-69884213482f\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 08 00:31:24.624607 master-0 kubenswrapper[7479]: I0308 00:31:24.624416 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4a829558-a672-4dc5-ae20-69884213482f-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"4a829558-a672-4dc5-ae20-69884213482f\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 08 00:31:24.624607 master-0 kubenswrapper[7479]: I0308 00:31:24.624513 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4a829558-a672-4dc5-ae20-69884213482f-var-lock\") pod \"installer-2-master-0\" (UID: \"4a829558-a672-4dc5-ae20-69884213482f\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 08 00:31:24.624959 master-0 kubenswrapper[7479]: I0308 00:31:24.624681 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4a829558-a672-4dc5-ae20-69884213482f-kube-api-access\") pod \"installer-2-master-0\" (UID: \"4a829558-a672-4dc5-ae20-69884213482f\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 08 00:31:24.625432 master-0 kubenswrapper[7479]: I0308 00:31:24.625389 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4a829558-a672-4dc5-ae20-69884213482f-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"4a829558-a672-4dc5-ae20-69884213482f\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 08 00:31:24.625524 master-0 kubenswrapper[7479]: I0308 00:31:24.625476 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4a829558-a672-4dc5-ae20-69884213482f-var-lock\") pod \"installer-2-master-0\" (UID: \"4a829558-a672-4dc5-ae20-69884213482f\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 08 00:31:24.648488 master-0 kubenswrapper[7479]: I0308 00:31:24.648419 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4a829558-a672-4dc5-ae20-69884213482f-kube-api-access\") pod \"installer-2-master-0\" (UID: \"4a829558-a672-4dc5-ae20-69884213482f\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 08 00:31:24.746703 master-0 kubenswrapper[7479]: I0308 00:31:24.746629 7479 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-7rml7"
Mar 08 00:31:24.754517 master-0 kubenswrapper[7479]: I0308 00:31:24.754451 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 08 00:31:25.298132 master-0 kubenswrapper[7479]: I0308 00:31:25.298062 7479 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"]
Mar 08 00:31:25.301071 master-0 kubenswrapper[7479]: W0308 00:31:25.300941 7479 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4a829558_a672_4dc5_ae20_69884213482f.slice/crio-388b509d4fc31b4d0508a9d9464942cef558c545f646f2395c6df6984fdeb45b WatchSource:0}: Error finding container 388b509d4fc31b4d0508a9d9464942cef558c545f646f2395c6df6984fdeb45b: Status 404 returned error can't find the container with id 388b509d4fc31b4d0508a9d9464942cef558c545f646f2395c6df6984fdeb45b
Mar 08 00:31:25.390603 master-0 kubenswrapper[7479]: I0308 00:31:25.388641 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"4a829558-a672-4dc5-ae20-69884213482f","Type":"ContainerStarted","Data":"388b509d4fc31b4d0508a9d9464942cef558c545f646f2395c6df6984fdeb45b"}
Mar 08 00:31:26.403427 master-0 kubenswrapper[7479]: I0308 00:31:26.403342 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"4a829558-a672-4dc5-ae20-69884213482f","Type":"ContainerStarted","Data":"75e221d268f8334bee9d063ac79605ca72f10402851cefdf7624001eae8cbb17"}
Mar 08 00:31:26.423557 master-0 kubenswrapper[7479]: I0308 00:31:26.423424 7479 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-2-master-0" podStartSLOduration=2.423387636 podStartE2EDuration="2.423387636s" podCreationTimestamp="2026-03-08 00:31:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:31:26.418817436 +0000 UTC m=+602.731726363" watchObservedRunningTime="2026-03-08 00:31:26.423387636 +0000 UTC m=+602.736296573"
Mar 08 00:31:26.968263 master-0 kubenswrapper[7479]: I0308 00:31:26.968192 7479 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"]
Mar 08 00:31:26.968514 master-0 kubenswrapper[7479]: I0308 00:31:26.968481 7479 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver" containerID="cri-o://a58a50d55f092d1761d8dfb057eba161b2adfc3672c9c7a2e15f19538478c7ef" gracePeriod=15
Mar 08 00:31:26.968559 master-0 kubenswrapper[7479]: I0308 00:31:26.968514 7479 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://2e9133d4477bb44d83a396e80738171a7ba17de22760faabb67c1d5a203fddcc" gracePeriod=15
Mar 08 00:31:26.969805 master-0 kubenswrapper[7479]: I0308 00:31:26.969774 7479 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 08 00:31:26.970150 master-0 kubenswrapper[7479]: E0308 00:31:26.970126 7479 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="setup"
Mar 08 00:31:26.970188 master-0 kubenswrapper[7479]: I0308 00:31:26.970158 7479 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="setup"
Mar 08 00:31:26.970266 master-0 kubenswrapper[7479]: E0308 00:31:26.970192 7479 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver-insecure-readyz"
Mar 08 00:31:26.970266 master-0 kubenswrapper[7479]: I0308 00:31:26.970223 7479 state_mem.go:107] "Deleted CPUSet
assignment" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver-insecure-readyz" Mar 08 00:31:26.970266 master-0 kubenswrapper[7479]: E0308 00:31:26.970244 7479 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver" Mar 08 00:31:26.970266 master-0 kubenswrapper[7479]: I0308 00:31:26.970252 7479 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver" Mar 08 00:31:26.970417 master-0 kubenswrapper[7479]: I0308 00:31:26.970390 7479 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="setup" Mar 08 00:31:26.970417 master-0 kubenswrapper[7479]: I0308 00:31:26.970410 7479 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver" Mar 08 00:31:26.970472 master-0 kubenswrapper[7479]: I0308 00:31:26.970421 7479 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver-insecure-readyz" Mar 08 00:31:26.972275 master-0 kubenswrapper[7479]: I0308 00:31:26.972247 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:31:27.021341 master-0 kubenswrapper[7479]: I0308 00:31:27.020685 7479 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 08 00:31:27.030465 master-0 kubenswrapper[7479]: I0308 00:31:27.030433 7479 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 08 00:31:27.031485 master-0 kubenswrapper[7479]: I0308 00:31:27.031469 7479 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:31:27.032627 master-0 kubenswrapper[7479]: I0308 00:31:27.032572 7479 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:31:27.061590 master-0 kubenswrapper[7479]: E0308 00:31:27.061515 7479 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:31:27.090366 master-0 kubenswrapper[7479]: I0308 00:31:27.090321 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:31:27.090498 master-0 kubenswrapper[7479]: I0308 00:31:27.090443 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:31:27.090498 master-0 kubenswrapper[7479]: I0308 00:31:27.090479 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") pod \"kube-apiserver-master-0\" 
(UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:31:27.192085 master-0 kubenswrapper[7479]: I0308 00:31:27.192021 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:31:27.192085 master-0 kubenswrapper[7479]: I0308 00:31:27.192085 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:31:27.192394 master-0 kubenswrapper[7479]: I0308 00:31:27.192110 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:31:27.192394 master-0 kubenswrapper[7479]: I0308 00:31:27.192137 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:31:27.192394 master-0 kubenswrapper[7479]: I0308 00:31:27.192166 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:31:27.192394 master-0 kubenswrapper[7479]: I0308 00:31:27.192253 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:31:27.192394 master-0 kubenswrapper[7479]: I0308 00:31:27.192316 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:31:27.192795 master-0 kubenswrapper[7479]: I0308 00:31:27.192719 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:31:27.192926 master-0 kubenswrapper[7479]: I0308 00:31:27.192884 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:31:27.192985 master-0 kubenswrapper[7479]: I0308 00:31:27.192935 7479 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:31:27.193034 master-0 kubenswrapper[7479]: I0308 00:31:27.192989 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:31:27.294358 master-0 kubenswrapper[7479]: I0308 00:31:27.294295 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:31:27.294575 master-0 kubenswrapper[7479]: I0308 00:31:27.294433 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:31:27.294575 master-0 kubenswrapper[7479]: I0308 00:31:27.294521 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:31:27.294713 master-0 kubenswrapper[7479]: I0308 00:31:27.294687 7479 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:31:27.294765 master-0 kubenswrapper[7479]: I0308 00:31:27.294752 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:31:27.294912 master-0 kubenswrapper[7479]: I0308 00:31:27.294849 7479 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:31:27.294912 master-0 kubenswrapper[7479]: I0308 00:31:27.294857 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:31:27.294912 master-0 kubenswrapper[7479]: I0308 00:31:27.294867 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 
00:31:27.294912 master-0 kubenswrapper[7479]: I0308 00:31:27.294907 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:31:27.295084 master-0 kubenswrapper[7479]: I0308 00:31:27.295033 7479 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:31:27.309047 master-0 kubenswrapper[7479]: I0308 00:31:27.308978 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:31:27.328427 master-0 kubenswrapper[7479]: W0308 00:31:27.327953 7479 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdcecc61ff5eeb08bd2a3ac12599e4f9.slice/crio-a2271776808f809754ea9910dbf17284aca2a88f19582f5163627216da7a3ba8 WatchSource:0}: Error finding container a2271776808f809754ea9910dbf17284aca2a88f19582f5163627216da7a3ba8: Status 404 returned error can't find the container with id a2271776808f809754ea9910dbf17284aca2a88f19582f5163627216da7a3ba8 Mar 08 00:31:27.331806 master-0 kubenswrapper[7479]: E0308 00:31:27.331631 7479 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-master-0.189ab656d20379e5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 
UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-master-0,UID:cdcecc61ff5eeb08bd2a3ac12599e4f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:31:27.330597349 +0000 UTC m=+603.643506266,LastTimestamp:2026-03-08 00:31:27.330597349 +0000 UTC m=+603.643506266,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:31:27.362486 master-0 kubenswrapper[7479]: I0308 00:31:27.362445 7479 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:31:27.390799 master-0 kubenswrapper[7479]: W0308 00:31:27.390716 7479 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf417e14665db2ffffa887ce21c9ff0ed.slice/crio-fea4d98f3d9db64dd863f1c17ed52c6613cd3bc9028a466c54e0fb69e9d3b0a8 WatchSource:0}: Error finding container fea4d98f3d9db64dd863f1c17ed52c6613cd3bc9028a466c54e0fb69e9d3b0a8: Status 404 returned error can't find the container with id fea4d98f3d9db64dd863f1c17ed52c6613cd3bc9028a466c54e0fb69e9d3b0a8 Mar 08 00:31:27.413701 master-0 kubenswrapper[7479]: I0308 00:31:27.413646 7479 generic.go:334] "Generic (PLEG): container finished" podID="5f77c8e18b751d90bc0dfe2d4e304050" containerID="2e9133d4477bb44d83a396e80738171a7ba17de22760faabb67c1d5a203fddcc" exitCode=0 Mar 08 00:31:27.415346 master-0 kubenswrapper[7479]: I0308 00:31:27.415306 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"f417e14665db2ffffa887ce21c9ff0ed","Type":"ContainerStarted","Data":"fea4d98f3d9db64dd863f1c17ed52c6613cd3bc9028a466c54e0fb69e9d3b0a8"} Mar 08 00:31:27.416782 master-0 kubenswrapper[7479]: I0308 00:31:27.416743 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"a2271776808f809754ea9910dbf17284aca2a88f19582f5163627216da7a3ba8"} Mar 08 00:31:27.419798 master-0 kubenswrapper[7479]: I0308 00:31:27.419756 7479 generic.go:334] "Generic (PLEG): container finished" podID="66915251-1fdd-40f3-a59b-054776b214df" containerID="d9e68f104ff64d94c7bc0d96bb172cf910cbd61300635334957f518556f38bfc" exitCode=0 Mar 08 00:31:27.420377 master-0 kubenswrapper[7479]: I0308 00:31:27.420346 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" event={"ID":"66915251-1fdd-40f3-a59b-054776b214df","Type":"ContainerDied","Data":"d9e68f104ff64d94c7bc0d96bb172cf910cbd61300635334957f518556f38bfc"} Mar 08 00:31:27.421194 master-0 kubenswrapper[7479]: I0308 00:31:27.421156 7479 status_manager.go:851] "Failed to get status for pod" podUID="66915251-1fdd-40f3-a59b-054776b214df" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-1-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:31:27.422275 master-0 kubenswrapper[7479]: I0308 00:31:27.422235 7479 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection 
refused" Mar 08 00:31:27.720663 master-0 kubenswrapper[7479]: E0308 00:31:27.720517 7479 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-master-0.189ab656d20379e5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-master-0,UID:cdcecc61ff5eeb08bd2a3ac12599e4f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:31:27.330597349 +0000 UTC m=+603.643506266,LastTimestamp:2026-03-08 00:31:27.330597349 +0000 UTC m=+603.643506266,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:31:28.428806 master-0 kubenswrapper[7479]: I0308 00:31:28.428653 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"f417e14665db2ffffa887ce21c9ff0ed","Type":"ContainerStarted","Data":"fa423e54fafba3982d7bb2d5466fcee2c23cbdcb2db478a9c800bb36094dd0d1"} Mar 08 00:31:28.429985 master-0 kubenswrapper[7479]: E0308 00:31:28.429931 7479 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:31:28.430062 master-0 kubenswrapper[7479]: I0308 00:31:28.429980 7479 
status_manager.go:851] "Failed to get status for pod" podUID="66915251-1fdd-40f3-a59b-054776b214df" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-1-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:31:28.430827 master-0 kubenswrapper[7479]: I0308 00:31:28.430717 7479 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:31:28.432547 master-0 kubenswrapper[7479]: I0308 00:31:28.431468 7479 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerID="f73d55f2e8434f88a6be502a595c0bcf07e53cfb094b52a7ac92890beaa91d58" exitCode=0 Mar 08 00:31:28.432624 master-0 kubenswrapper[7479]: I0308 00:31:28.431549 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerDied","Data":"f73d55f2e8434f88a6be502a595c0bcf07e53cfb094b52a7ac92890beaa91d58"} Mar 08 00:31:28.432624 master-0 kubenswrapper[7479]: I0308 00:31:28.432481 7479 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:31:28.433263 master-0 kubenswrapper[7479]: I0308 00:31:28.433102 7479 status_manager.go:851] "Failed to get status for pod" podUID="66915251-1fdd-40f3-a59b-054776b214df" 
pod="openshift-kube-apiserver/installer-1-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-1-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:31:28.766812 master-0 kubenswrapper[7479]: I0308 00:31:28.766760 7479 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 00:31:28.767892 master-0 kubenswrapper[7479]: I0308 00:31:28.767825 7479 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:31:28.768532 master-0 kubenswrapper[7479]: I0308 00:31:28.768458 7479 status_manager.go:851] "Failed to get status for pod" podUID="66915251-1fdd-40f3-a59b-054776b214df" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-1-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:31:28.918272 master-0 kubenswrapper[7479]: I0308 00:31:28.918134 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/66915251-1fdd-40f3-a59b-054776b214df-var-lock\") pod \"66915251-1fdd-40f3-a59b-054776b214df\" (UID: \"66915251-1fdd-40f3-a59b-054776b214df\") " Mar 08 00:31:28.918516 master-0 kubenswrapper[7479]: I0308 00:31:28.918291 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66915251-1fdd-40f3-a59b-054776b214df-kube-api-access\") pod \"66915251-1fdd-40f3-a59b-054776b214df\" (UID: 
\"66915251-1fdd-40f3-a59b-054776b214df\") " Mar 08 00:31:28.918516 master-0 kubenswrapper[7479]: I0308 00:31:28.918301 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/66915251-1fdd-40f3-a59b-054776b214df-var-lock" (OuterVolumeSpecName: "var-lock") pod "66915251-1fdd-40f3-a59b-054776b214df" (UID: "66915251-1fdd-40f3-a59b-054776b214df"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:31:28.918516 master-0 kubenswrapper[7479]: I0308 00:31:28.918416 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/66915251-1fdd-40f3-a59b-054776b214df-kubelet-dir\") pod \"66915251-1fdd-40f3-a59b-054776b214df\" (UID: \"66915251-1fdd-40f3-a59b-054776b214df\") " Mar 08 00:31:28.918516 master-0 kubenswrapper[7479]: I0308 00:31:28.918450 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/66915251-1fdd-40f3-a59b-054776b214df-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "66915251-1fdd-40f3-a59b-054776b214df" (UID: "66915251-1fdd-40f3-a59b-054776b214df"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:31:28.918904 master-0 kubenswrapper[7479]: I0308 00:31:28.918843 7479 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/66915251-1fdd-40f3-a59b-054776b214df-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 00:31:28.918964 master-0 kubenswrapper[7479]: I0308 00:31:28.918945 7479 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/66915251-1fdd-40f3-a59b-054776b214df-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 00:31:28.921313 master-0 kubenswrapper[7479]: I0308 00:31:28.921267 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66915251-1fdd-40f3-a59b-054776b214df-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "66915251-1fdd-40f3-a59b-054776b214df" (UID: "66915251-1fdd-40f3-a59b-054776b214df"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:31:29.020480 master-0 kubenswrapper[7479]: I0308 00:31:29.020426 7479 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66915251-1fdd-40f3-a59b-054776b214df-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 00:31:29.352664 master-0 kubenswrapper[7479]: I0308 00:31:29.352616 7479 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 00:31:29.447655 master-0 kubenswrapper[7479]: I0308 00:31:29.447591 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" event={"ID":"66915251-1fdd-40f3-a59b-054776b214df","Type":"ContainerDied","Data":"c753a2a6e010f70aa63ed8c11f23ed59bf96ec555e7e82acdd68bc431c4a37ef"} Mar 08 00:31:29.447655 master-0 kubenswrapper[7479]: I0308 00:31:29.447634 7479 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 00:31:29.447655 master-0 kubenswrapper[7479]: I0308 00:31:29.447646 7479 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c753a2a6e010f70aa63ed8c11f23ed59bf96ec555e7e82acdd68bc431c4a37ef" Mar 08 00:31:29.451406 master-0 kubenswrapper[7479]: I0308 00:31:29.451370 7479 generic.go:334] "Generic (PLEG): container finished" podID="5f77c8e18b751d90bc0dfe2d4e304050" containerID="a58a50d55f092d1761d8dfb057eba161b2adfc3672c9c7a2e15f19538478c7ef" exitCode=0 Mar 08 00:31:29.451491 master-0 kubenswrapper[7479]: I0308 00:31:29.451468 7479 scope.go:117] "RemoveContainer" containerID="2e9133d4477bb44d83a396e80738171a7ba17de22760faabb67c1d5a203fddcc" Mar 08 00:31:29.451618 master-0 kubenswrapper[7479]: I0308 00:31:29.451563 7479 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 00:31:29.456779 master-0 kubenswrapper[7479]: I0308 00:31:29.456725 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"6bc55a348461d3cbf163ebf709ddec0e4c002365488c110e26f97e8640a91aac"} Mar 08 00:31:29.456850 master-0 kubenswrapper[7479]: I0308 00:31:29.456783 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"f5a01a96f572cf6cdc2165118b1618cfc34c74c159113a86d01ad4567971ce7c"} Mar 08 00:31:29.456850 master-0 kubenswrapper[7479]: I0308 00:31:29.456808 7479 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"b772417e1e99d8ea0e7f16b30732d2d8fa0d59084df9326d11ee8f293502bf15"} Mar 08 00:31:29.480875 master-0 kubenswrapper[7479]: I0308 00:31:29.480720 7479 scope.go:117] "RemoveContainer" containerID="a58a50d55f092d1761d8dfb057eba161b2adfc3672c9c7a2e15f19538478c7ef" Mar 08 00:31:29.523326 master-0 kubenswrapper[7479]: I0308 00:31:29.522887 7479 scope.go:117] "RemoveContainer" containerID="876b4d78a3cb9c09c79646fc0feaa904c1b8712b38b4870f4f9e07763c94bfe0" Mar 08 00:31:29.528064 master-0 kubenswrapper[7479]: I0308 00:31:29.527646 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"5f77c8e18b751d90bc0dfe2d4e304050\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " Mar 08 00:31:29.528064 master-0 kubenswrapper[7479]: I0308 00:31:29.527713 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: 
\"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"5f77c8e18b751d90bc0dfe2d4e304050\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " Mar 08 00:31:29.528064 master-0 kubenswrapper[7479]: I0308 00:31:29.527752 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"5f77c8e18b751d90bc0dfe2d4e304050\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " Mar 08 00:31:29.528064 master-0 kubenswrapper[7479]: I0308 00:31:29.527777 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"5f77c8e18b751d90bc0dfe2d4e304050\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " Mar 08 00:31:29.528064 master-0 kubenswrapper[7479]: I0308 00:31:29.527850 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"5f77c8e18b751d90bc0dfe2d4e304050\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " Mar 08 00:31:29.528064 master-0 kubenswrapper[7479]: I0308 00:31:29.527937 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets" (OuterVolumeSpecName: "secrets") pod "5f77c8e18b751d90bc0dfe2d4e304050" (UID: "5f77c8e18b751d90bc0dfe2d4e304050"). InnerVolumeSpecName "secrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:31:29.528064 master-0 kubenswrapper[7479]: I0308 00:31:29.527961 7479 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"5f77c8e18b751d90bc0dfe2d4e304050\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " Mar 08 00:31:29.528064 master-0 kubenswrapper[7479]: I0308 00:31:29.527989 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs" (OuterVolumeSpecName: "logs") pod "5f77c8e18b751d90bc0dfe2d4e304050" (UID: "5f77c8e18b751d90bc0dfe2d4e304050"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:31:29.528064 master-0 kubenswrapper[7479]: I0308 00:31:29.528012 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud" (OuterVolumeSpecName: "etc-kubernetes-cloud") pod "5f77c8e18b751d90bc0dfe2d4e304050" (UID: "5f77c8e18b751d90bc0dfe2d4e304050"). InnerVolumeSpecName "etc-kubernetes-cloud". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:31:29.528064 master-0 kubenswrapper[7479]: I0308 00:31:29.528029 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "5f77c8e18b751d90bc0dfe2d4e304050" (UID: "5f77c8e18b751d90bc0dfe2d4e304050"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:31:29.528064 master-0 kubenswrapper[7479]: I0308 00:31:29.528044 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config" (OuterVolumeSpecName: "config") pod "5f77c8e18b751d90bc0dfe2d4e304050" (UID: "5f77c8e18b751d90bc0dfe2d4e304050"). InnerVolumeSpecName "config". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:31:29.528460 master-0 kubenswrapper[7479]: I0308 00:31:29.528296 7479 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") on node \"master-0\" DevicePath \"\"" Mar 08 00:31:29.528460 master-0 kubenswrapper[7479]: I0308 00:31:29.528317 7479 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") on node \"master-0\" DevicePath \"\"" Mar 08 00:31:29.528460 master-0 kubenswrapper[7479]: I0308 00:31:29.528329 7479 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 00:31:29.528460 master-0 kubenswrapper[7479]: I0308 00:31:29.528340 7479 reconciler_common.go:293] "Volume detached for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") on node \"master-0\" DevicePath \"\"" Mar 08 00:31:29.528460 master-0 kubenswrapper[7479]: I0308 00:31:29.528354 7479 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") on node \"master-0\" DevicePath \"\"" Mar 08 00:31:29.528598 master-0 kubenswrapper[7479]: I0308 00:31:29.528459 7479 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host" (OuterVolumeSpecName: "ssl-certs-host") pod "5f77c8e18b751d90bc0dfe2d4e304050" (UID: "5f77c8e18b751d90bc0dfe2d4e304050"). InnerVolumeSpecName "ssl-certs-host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:31:29.577911 master-0 kubenswrapper[7479]: I0308 00:31:29.577786 7479 scope.go:117] "RemoveContainer" containerID="2e9133d4477bb44d83a396e80738171a7ba17de22760faabb67c1d5a203fddcc" Mar 08 00:31:29.578432 master-0 kubenswrapper[7479]: E0308 00:31:29.578329 7479 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e9133d4477bb44d83a396e80738171a7ba17de22760faabb67c1d5a203fddcc\": container with ID starting with 2e9133d4477bb44d83a396e80738171a7ba17de22760faabb67c1d5a203fddcc not found: ID does not exist" containerID="2e9133d4477bb44d83a396e80738171a7ba17de22760faabb67c1d5a203fddcc" Mar 08 00:31:29.578432 master-0 kubenswrapper[7479]: I0308 00:31:29.578364 7479 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e9133d4477bb44d83a396e80738171a7ba17de22760faabb67c1d5a203fddcc"} err="failed to get container status \"2e9133d4477bb44d83a396e80738171a7ba17de22760faabb67c1d5a203fddcc\": rpc error: code = NotFound desc = could not find container \"2e9133d4477bb44d83a396e80738171a7ba17de22760faabb67c1d5a203fddcc\": container with ID starting with 2e9133d4477bb44d83a396e80738171a7ba17de22760faabb67c1d5a203fddcc not found: ID does not exist" Mar 08 00:31:29.578432 master-0 kubenswrapper[7479]: I0308 00:31:29.578386 7479 scope.go:117] "RemoveContainer" containerID="a58a50d55f092d1761d8dfb057eba161b2adfc3672c9c7a2e15f19538478c7ef" Mar 08 00:31:29.578744 master-0 kubenswrapper[7479]: E0308 00:31:29.578667 7479 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a58a50d55f092d1761d8dfb057eba161b2adfc3672c9c7a2e15f19538478c7ef\": container with ID starting with a58a50d55f092d1761d8dfb057eba161b2adfc3672c9c7a2e15f19538478c7ef not found: ID does not exist" containerID="a58a50d55f092d1761d8dfb057eba161b2adfc3672c9c7a2e15f19538478c7ef" Mar 08 00:31:29.578744 master-0 kubenswrapper[7479]: I0308 00:31:29.578685 7479 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a58a50d55f092d1761d8dfb057eba161b2adfc3672c9c7a2e15f19538478c7ef"} err="failed to get container status \"a58a50d55f092d1761d8dfb057eba161b2adfc3672c9c7a2e15f19538478c7ef\": rpc error: code = NotFound desc = could not find container \"a58a50d55f092d1761d8dfb057eba161b2adfc3672c9c7a2e15f19538478c7ef\": container with ID starting with a58a50d55f092d1761d8dfb057eba161b2adfc3672c9c7a2e15f19538478c7ef not found: ID does not exist" Mar 08 00:31:29.578744 master-0 kubenswrapper[7479]: I0308 00:31:29.578701 7479 scope.go:117] "RemoveContainer" containerID="876b4d78a3cb9c09c79646fc0feaa904c1b8712b38b4870f4f9e07763c94bfe0" Mar 08 00:31:29.578971 master-0 kubenswrapper[7479]: E0308 00:31:29.578940 7479 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"876b4d78a3cb9c09c79646fc0feaa904c1b8712b38b4870f4f9e07763c94bfe0\": container with ID starting with 876b4d78a3cb9c09c79646fc0feaa904c1b8712b38b4870f4f9e07763c94bfe0 not found: ID does not exist" containerID="876b4d78a3cb9c09c79646fc0feaa904c1b8712b38b4870f4f9e07763c94bfe0" Mar 08 00:31:29.578971 master-0 kubenswrapper[7479]: I0308 00:31:29.578956 7479 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"876b4d78a3cb9c09c79646fc0feaa904c1b8712b38b4870f4f9e07763c94bfe0"} err="failed to get container status \"876b4d78a3cb9c09c79646fc0feaa904c1b8712b38b4870f4f9e07763c94bfe0\": rpc error: code = NotFound desc = could not find container 
\"876b4d78a3cb9c09c79646fc0feaa904c1b8712b38b4870f4f9e07763c94bfe0\": container with ID starting with 876b4d78a3cb9c09c79646fc0feaa904c1b8712b38b4870f4f9e07763c94bfe0 not found: ID does not exist" Mar 08 00:31:29.629272 master-0 kubenswrapper[7479]: I0308 00:31:29.629218 7479 reconciler_common.go:293] "Volume detached for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") on node \"master-0\" DevicePath \"\"" Mar 08 00:31:29.894274 master-0 kubenswrapper[7479]: I0308 00:31:29.894106 7479 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f77c8e18b751d90bc0dfe2d4e304050" path="/var/lib/kubelet/pods/5f77c8e18b751d90bc0dfe2d4e304050/volumes" Mar 08 00:31:29.894605 master-0 kubenswrapper[7479]: I0308 00:31:29.894530 7479 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="" Mar 08 00:31:34.361378 master-0 systemd[1]: Stopping Kubernetes Kubelet... Mar 08 00:31:34.394785 master-0 systemd[1]: kubelet.service: Deactivated successfully. Mar 08 00:31:34.395296 master-0 systemd[1]: Stopped Kubernetes Kubelet. Mar 08 00:31:34.398378 master-0 systemd[1]: kubelet.service: Consumed 1min 20.818s CPU time. Mar 08 00:31:34.443225 master-0 systemd[1]: Starting Kubernetes Kubelet... Mar 08 00:31:34.650532 master-0 kubenswrapper[23041]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 08 00:31:34.650532 master-0 kubenswrapper[23041]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. 
Mar 08 00:31:34.650532 master-0 kubenswrapper[23041]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 08 00:31:34.650532 master-0 kubenswrapper[23041]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 08 00:31:34.650532 master-0 kubenswrapper[23041]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 08 00:31:34.650532 master-0 kubenswrapper[23041]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 08 00:31:34.650532 master-0 kubenswrapper[23041]: I0308 00:31:34.650468 23041 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 08 00:31:34.653330 master-0 kubenswrapper[23041]: W0308 00:31:34.653286 23041 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 08 00:31:34.653330 master-0 kubenswrapper[23041]: W0308 00:31:34.653314 23041 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 08 00:31:34.653330 master-0 kubenswrapper[23041]: W0308 00:31:34.653323 23041 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 08 00:31:34.653330 master-0 kubenswrapper[23041]: W0308 00:31:34.653331 23041 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 08 00:31:34.653330 master-0 kubenswrapper[23041]: W0308 00:31:34.653338 23041 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 08 00:31:34.653330 master-0 kubenswrapper[23041]: W0308 00:31:34.653346 23041 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 08 00:31:34.653704 master-0 kubenswrapper[23041]: W0308 00:31:34.653354 23041 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 08 00:31:34.653704 master-0 kubenswrapper[23041]: W0308 00:31:34.653363 23041 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 08 00:31:34.653704 master-0 kubenswrapper[23041]: W0308 00:31:34.653369 23041 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 08 00:31:34.653704 master-0 kubenswrapper[23041]: W0308 00:31:34.653376 23041 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 08 00:31:34.653704 master-0 kubenswrapper[23041]: W0308 00:31:34.653382 23041 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 08 00:31:34.653704 master-0 kubenswrapper[23041]: W0308 00:31:34.653389 23041 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 08 00:31:34.653704 master-0 kubenswrapper[23041]: W0308 00:31:34.653395 23041 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 08 00:31:34.653704 master-0 kubenswrapper[23041]: W0308 00:31:34.653411 23041 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 08 00:31:34.653704 master-0 kubenswrapper[23041]: W0308 00:31:34.653417 23041 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 08 00:31:34.653704 master-0 
kubenswrapper[23041]: W0308 00:31:34.653423 23041 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 08 00:31:34.653704 master-0 kubenswrapper[23041]: W0308 00:31:34.653428 23041 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 08 00:31:34.653704 master-0 kubenswrapper[23041]: W0308 00:31:34.653433 23041 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 08 00:31:34.653704 master-0 kubenswrapper[23041]: W0308 00:31:34.653438 23041 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 08 00:31:34.653704 master-0 kubenswrapper[23041]: W0308 00:31:34.653443 23041 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 08 00:31:34.653704 master-0 kubenswrapper[23041]: W0308 00:31:34.653448 23041 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 08 00:31:34.653704 master-0 kubenswrapper[23041]: W0308 00:31:34.653453 23041 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 08 00:31:34.653704 master-0 kubenswrapper[23041]: W0308 00:31:34.653458 23041 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 08 00:31:34.653704 master-0 kubenswrapper[23041]: W0308 00:31:34.653463 23041 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 08 00:31:34.653704 master-0 kubenswrapper[23041]: W0308 00:31:34.653468 23041 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 08 00:31:34.653704 master-0 kubenswrapper[23041]: W0308 00:31:34.653473 23041 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 08 00:31:34.655134 master-0 kubenswrapper[23041]: W0308 00:31:34.653478 23041 feature_gate.go:330] unrecognized feature gate: Example Mar 08 00:31:34.655134 master-0 kubenswrapper[23041]: W0308 00:31:34.653483 23041 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 08 00:31:34.655134 master-0 kubenswrapper[23041]: W0308 
00:31:34.653488 23041 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 08 00:31:34.655134 master-0 kubenswrapper[23041]: W0308 00:31:34.653494 23041 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 08 00:31:34.655134 master-0 kubenswrapper[23041]: W0308 00:31:34.653499 23041 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 08 00:31:34.655134 master-0 kubenswrapper[23041]: W0308 00:31:34.653505 23041 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 08 00:31:34.655134 master-0 kubenswrapper[23041]: W0308 00:31:34.653510 23041 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 08 00:31:34.655134 master-0 kubenswrapper[23041]: W0308 00:31:34.653515 23041 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 08 00:31:34.655134 master-0 kubenswrapper[23041]: W0308 00:31:34.653522 23041 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 08 00:31:34.655134 master-0 kubenswrapper[23041]: W0308 00:31:34.653528 23041 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 08 00:31:34.655134 master-0 kubenswrapper[23041]: W0308 00:31:34.653533 23041 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 08 00:31:34.655134 master-0 kubenswrapper[23041]: W0308 00:31:34.653540 23041 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 08 00:31:34.655134 master-0 kubenswrapper[23041]: W0308 00:31:34.653545 23041 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 08 00:31:34.655134 master-0 kubenswrapper[23041]: W0308 00:31:34.653550 23041 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 08 00:31:34.655134 master-0 kubenswrapper[23041]: W0308 00:31:34.653555 23041 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 08 00:31:34.655134 master-0 kubenswrapper[23041]: W0308 00:31:34.653560 23041 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 08 00:31:34.655134 master-0 kubenswrapper[23041]: W0308 00:31:34.653566 23041 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 08 00:31:34.655134 master-0 kubenswrapper[23041]: W0308 00:31:34.653574 23041 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 08 00:31:34.655134 master-0 kubenswrapper[23041]: W0308 00:31:34.653580 23041 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 08 00:31:34.656780 master-0 kubenswrapper[23041]: W0308 00:31:34.653585 23041 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 08 00:31:34.656780 master-0 kubenswrapper[23041]: W0308 00:31:34.653589 23041 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 08 00:31:34.656780 master-0 kubenswrapper[23041]: W0308 00:31:34.653594 23041 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 08 00:31:34.656780 master-0 kubenswrapper[23041]: W0308 00:31:34.653599 23041 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 08 00:31:34.656780 master-0 kubenswrapper[23041]: W0308 00:31:34.653604 23041 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 08 00:31:34.656780 master-0 kubenswrapper[23041]: W0308 00:31:34.653609 23041 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 08 00:31:34.656780 master-0 kubenswrapper[23041]: W0308 00:31:34.653614 23041 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 08 00:31:34.656780 master-0 kubenswrapper[23041]: W0308 00:31:34.653619 23041 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 08 00:31:34.656780 master-0 kubenswrapper[23041]: W0308 00:31:34.653624 23041 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 08 00:31:34.656780 master-0 kubenswrapper[23041]: W0308 00:31:34.653629 23041 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 08 00:31:34.656780 master-0 kubenswrapper[23041]: W0308 00:31:34.653634 23041 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 08 00:31:34.656780 master-0 kubenswrapper[23041]: W0308 00:31:34.653639 23041 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 08 
00:31:34.656780 master-0 kubenswrapper[23041]: W0308 00:31:34.653644 23041 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 08 00:31:34.656780 master-0 kubenswrapper[23041]: W0308 00:31:34.653649 23041 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 08 00:31:34.656780 master-0 kubenswrapper[23041]: W0308 00:31:34.653654 23041 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 08 00:31:34.656780 master-0 kubenswrapper[23041]: W0308 00:31:34.653660 23041 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 08 00:31:34.656780 master-0 kubenswrapper[23041]: W0308 00:31:34.653664 23041 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 08 00:31:34.656780 master-0 kubenswrapper[23041]: W0308 00:31:34.653669 23041 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 08 00:31:34.656780 master-0 kubenswrapper[23041]: W0308 00:31:34.653675 23041 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 08 00:31:34.656780 master-0 kubenswrapper[23041]: W0308 00:31:34.653680 23041 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 08 00:31:34.658324 master-0 kubenswrapper[23041]: W0308 00:31:34.653686 23041 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 08 00:31:34.658324 master-0 kubenswrapper[23041]: W0308 00:31:34.653691 23041 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 08 00:31:34.658324 master-0 kubenswrapper[23041]: W0308 00:31:34.653696 23041 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 08 00:31:34.658324 master-0 kubenswrapper[23041]: W0308 00:31:34.653700 23041 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 08 00:31:34.658324 master-0 kubenswrapper[23041]: W0308 00:31:34.653707 23041 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 08 00:31:34.658324 master-0 kubenswrapper[23041]: W0308 00:31:34.653715 23041 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 08 00:31:34.658324 master-0 kubenswrapper[23041]: W0308 00:31:34.653721 23041 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 08 00:31:34.658324 master-0 kubenswrapper[23041]: I0308 00:31:34.653836 23041 flags.go:64] FLAG: --address="0.0.0.0" Mar 08 00:31:34.658324 master-0 kubenswrapper[23041]: I0308 00:31:34.653848 23041 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 08 00:31:34.658324 master-0 kubenswrapper[23041]: I0308 00:31:34.653859 23041 flags.go:64] FLAG: --anonymous-auth="true" Mar 08 00:31:34.658324 master-0 kubenswrapper[23041]: I0308 00:31:34.653867 23041 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 08 00:31:34.658324 master-0 kubenswrapper[23041]: I0308 00:31:34.653875 23041 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 08 00:31:34.658324 master-0 kubenswrapper[23041]: I0308 00:31:34.653881 23041 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 08 00:31:34.658324 master-0 kubenswrapper[23041]: I0308 00:31:34.653889 23041 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 08 00:31:34.658324 master-0 kubenswrapper[23041]: I0308 00:31:34.653896 23041 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 08 00:31:34.658324 master-0 kubenswrapper[23041]: I0308 00:31:34.653902 23041 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 08 00:31:34.658324 master-0 kubenswrapper[23041]: I0308 00:31:34.653908 23041 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 08 00:31:34.658324 master-0 kubenswrapper[23041]: I0308 00:31:34.653915 23041 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 08 00:31:34.658324 master-0 kubenswrapper[23041]: I0308 00:31:34.653921 23041 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 08 
00:31:34.658324 master-0 kubenswrapper[23041]: I0308 00:31:34.653927 23041 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 08 00:31:34.658324 master-0 kubenswrapper[23041]: I0308 00:31:34.653933 23041 flags.go:64] FLAG: --cgroup-root="" Mar 08 00:31:34.658324 master-0 kubenswrapper[23041]: I0308 00:31:34.653939 23041 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 08 00:31:34.659808 master-0 kubenswrapper[23041]: I0308 00:31:34.653945 23041 flags.go:64] FLAG: --client-ca-file="" Mar 08 00:31:34.659808 master-0 kubenswrapper[23041]: I0308 00:31:34.653950 23041 flags.go:64] FLAG: --cloud-config="" Mar 08 00:31:34.659808 master-0 kubenswrapper[23041]: I0308 00:31:34.653956 23041 flags.go:64] FLAG: --cloud-provider="" Mar 08 00:31:34.659808 master-0 kubenswrapper[23041]: I0308 00:31:34.653962 23041 flags.go:64] FLAG: --cluster-dns="[]" Mar 08 00:31:34.659808 master-0 kubenswrapper[23041]: I0308 00:31:34.653969 23041 flags.go:64] FLAG: --cluster-domain="" Mar 08 00:31:34.659808 master-0 kubenswrapper[23041]: I0308 00:31:34.653975 23041 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 08 00:31:34.659808 master-0 kubenswrapper[23041]: I0308 00:31:34.653981 23041 flags.go:64] FLAG: --config-dir="" Mar 08 00:31:34.659808 master-0 kubenswrapper[23041]: I0308 00:31:34.653987 23041 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 08 00:31:34.659808 master-0 kubenswrapper[23041]: I0308 00:31:34.653993 23041 flags.go:64] FLAG: --container-log-max-files="5" Mar 08 00:31:34.659808 master-0 kubenswrapper[23041]: I0308 00:31:34.654001 23041 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 08 00:31:34.659808 master-0 kubenswrapper[23041]: I0308 00:31:34.654007 23041 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 08 00:31:34.659808 master-0 kubenswrapper[23041]: I0308 00:31:34.654013 23041 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 08 00:31:34.659808 master-0 
kubenswrapper[23041]: I0308 00:31:34.654019 23041 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 08 00:31:34.659808 master-0 kubenswrapper[23041]: I0308 00:31:34.654025 23041 flags.go:64] FLAG: --contention-profiling="false" Mar 08 00:31:34.659808 master-0 kubenswrapper[23041]: I0308 00:31:34.654033 23041 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 08 00:31:34.659808 master-0 kubenswrapper[23041]: I0308 00:31:34.654039 23041 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 08 00:31:34.659808 master-0 kubenswrapper[23041]: I0308 00:31:34.654045 23041 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 08 00:31:34.659808 master-0 kubenswrapper[23041]: I0308 00:31:34.654051 23041 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 08 00:31:34.659808 master-0 kubenswrapper[23041]: I0308 00:31:34.654058 23041 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 08 00:31:34.659808 master-0 kubenswrapper[23041]: I0308 00:31:34.654064 23041 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 08 00:31:34.659808 master-0 kubenswrapper[23041]: I0308 00:31:34.654070 23041 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 08 00:31:34.659808 master-0 kubenswrapper[23041]: I0308 00:31:34.654076 23041 flags.go:64] FLAG: --enable-load-reader="false" Mar 08 00:31:34.659808 master-0 kubenswrapper[23041]: I0308 00:31:34.654082 23041 flags.go:64] FLAG: --enable-server="true" Mar 08 00:31:34.659808 master-0 kubenswrapper[23041]: I0308 00:31:34.654087 23041 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 08 00:31:34.659808 master-0 kubenswrapper[23041]: I0308 00:31:34.654095 23041 flags.go:64] FLAG: --event-burst="100" Mar 08 00:31:34.661855 master-0 kubenswrapper[23041]: I0308 00:31:34.654101 23041 flags.go:64] FLAG: --event-qps="50" Mar 08 00:31:34.661855 master-0 kubenswrapper[23041]: I0308 00:31:34.654107 23041 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 08 00:31:34.661855 master-0 kubenswrapper[23041]: I0308 
00:31:34.654113 23041 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 08 00:31:34.661855 master-0 kubenswrapper[23041]: I0308 00:31:34.654119 23041 flags.go:64] FLAG: --eviction-hard="" Mar 08 00:31:34.661855 master-0 kubenswrapper[23041]: I0308 00:31:34.654126 23041 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 08 00:31:34.661855 master-0 kubenswrapper[23041]: I0308 00:31:34.654132 23041 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 08 00:31:34.661855 master-0 kubenswrapper[23041]: I0308 00:31:34.654138 23041 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 08 00:31:34.661855 master-0 kubenswrapper[23041]: I0308 00:31:34.654144 23041 flags.go:64] FLAG: --eviction-soft="" Mar 08 00:31:34.661855 master-0 kubenswrapper[23041]: I0308 00:31:34.654150 23041 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 08 00:31:34.661855 master-0 kubenswrapper[23041]: I0308 00:31:34.654155 23041 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 08 00:31:34.661855 master-0 kubenswrapper[23041]: I0308 00:31:34.654163 23041 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 08 00:31:34.661855 master-0 kubenswrapper[23041]: I0308 00:31:34.654168 23041 flags.go:64] FLAG: --experimental-mounter-path="" Mar 08 00:31:34.661855 master-0 kubenswrapper[23041]: I0308 00:31:34.654174 23041 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 08 00:31:34.661855 master-0 kubenswrapper[23041]: I0308 00:31:34.654181 23041 flags.go:64] FLAG: --fail-swap-on="true" Mar 08 00:31:34.661855 master-0 kubenswrapper[23041]: I0308 00:31:34.654186 23041 flags.go:64] FLAG: --feature-gates="" Mar 08 00:31:34.661855 master-0 kubenswrapper[23041]: I0308 00:31:34.654194 23041 flags.go:64] FLAG: --file-check-frequency="20s" Mar 08 00:31:34.661855 master-0 kubenswrapper[23041]: I0308 00:31:34.654225 23041 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 08 00:31:34.661855 master-0 kubenswrapper[23041]: I0308 00:31:34.654234 
23041 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 08 00:31:34.661855 master-0 kubenswrapper[23041]: I0308 00:31:34.654268 23041 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 08 00:31:34.661855 master-0 kubenswrapper[23041]: I0308 00:31:34.654277 23041 flags.go:64] FLAG: --healthz-port="10248"
Mar 08 00:31:34.661855 master-0 kubenswrapper[23041]: I0308 00:31:34.654284 23041 flags.go:64] FLAG: --help="false"
Mar 08 00:31:34.661855 master-0 kubenswrapper[23041]: I0308 00:31:34.654291 23041 flags.go:64] FLAG: --hostname-override=""
Mar 08 00:31:34.661855 master-0 kubenswrapper[23041]: I0308 00:31:34.654297 23041 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 08 00:31:34.661855 master-0 kubenswrapper[23041]: I0308 00:31:34.654303 23041 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 08 00:31:34.661855 master-0 kubenswrapper[23041]: I0308 00:31:34.654310 23041 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 08 00:31:34.663438 master-0 kubenswrapper[23041]: I0308 00:31:34.654317 23041 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 08 00:31:34.663438 master-0 kubenswrapper[23041]: I0308 00:31:34.654323 23041 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 08 00:31:34.663438 master-0 kubenswrapper[23041]: I0308 00:31:34.654330 23041 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 08 00:31:34.663438 master-0 kubenswrapper[23041]: I0308 00:31:34.654335 23041 flags.go:64] FLAG: --image-service-endpoint=""
Mar 08 00:31:34.663438 master-0 kubenswrapper[23041]: I0308 00:31:34.654341 23041 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 08 00:31:34.663438 master-0 kubenswrapper[23041]: I0308 00:31:34.654347 23041 flags.go:64] FLAG: --kube-api-burst="100"
Mar 08 00:31:34.663438 master-0 kubenswrapper[23041]: I0308 00:31:34.654353 23041 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 08 00:31:34.663438 master-0 kubenswrapper[23041]: I0308 00:31:34.654359 23041 flags.go:64] FLAG: --kube-api-qps="50"
Mar 08 00:31:34.663438 master-0 kubenswrapper[23041]: I0308 00:31:34.654365 23041 flags.go:64] FLAG: --kube-reserved=""
Mar 08 00:31:34.663438 master-0 kubenswrapper[23041]: I0308 00:31:34.654372 23041 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 08 00:31:34.663438 master-0 kubenswrapper[23041]: I0308 00:31:34.654378 23041 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 08 00:31:34.663438 master-0 kubenswrapper[23041]: I0308 00:31:34.654384 23041 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 08 00:31:34.663438 master-0 kubenswrapper[23041]: I0308 00:31:34.654390 23041 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 08 00:31:34.663438 master-0 kubenswrapper[23041]: I0308 00:31:34.654396 23041 flags.go:64] FLAG: --lock-file=""
Mar 08 00:31:34.663438 master-0 kubenswrapper[23041]: I0308 00:31:34.654401 23041 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 08 00:31:34.663438 master-0 kubenswrapper[23041]: I0308 00:31:34.654407 23041 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 08 00:31:34.663438 master-0 kubenswrapper[23041]: I0308 00:31:34.654414 23041 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 08 00:31:34.663438 master-0 kubenswrapper[23041]: I0308 00:31:34.654423 23041 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 08 00:31:34.663438 master-0 kubenswrapper[23041]: I0308 00:31:34.654430 23041 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 08 00:31:34.663438 master-0 kubenswrapper[23041]: I0308 00:31:34.654436 23041 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 08 00:31:34.663438 master-0 kubenswrapper[23041]: I0308 00:31:34.654441 23041 flags.go:64] FLAG: --logging-format="text"
Mar 08 00:31:34.663438 master-0 kubenswrapper[23041]: I0308 00:31:34.654447 23041 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 08 00:31:34.663438 master-0 kubenswrapper[23041]: I0308 00:31:34.654454 23041 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 08 00:31:34.663438 master-0 kubenswrapper[23041]: I0308 00:31:34.654460 23041 flags.go:64] FLAG: --manifest-url=""
Mar 08 00:31:34.663438 master-0 kubenswrapper[23041]: I0308 00:31:34.654465 23041 flags.go:64] FLAG: --manifest-url-header=""
Mar 08 00:31:34.664940 master-0 kubenswrapper[23041]: I0308 00:31:34.654473 23041 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 08 00:31:34.664940 master-0 kubenswrapper[23041]: I0308 00:31:34.654479 23041 flags.go:64] FLAG: --max-open-files="1000000"
Mar 08 00:31:34.664940 master-0 kubenswrapper[23041]: I0308 00:31:34.654486 23041 flags.go:64] FLAG: --max-pods="110"
Mar 08 00:31:34.664940 master-0 kubenswrapper[23041]: I0308 00:31:34.654492 23041 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 08 00:31:34.664940 master-0 kubenswrapper[23041]: I0308 00:31:34.654498 23041 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 08 00:31:34.664940 master-0 kubenswrapper[23041]: I0308 00:31:34.654503 23041 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 08 00:31:34.664940 master-0 kubenswrapper[23041]: I0308 00:31:34.654510 23041 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 08 00:31:34.664940 master-0 kubenswrapper[23041]: I0308 00:31:34.654516 23041 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 08 00:31:34.664940 master-0 kubenswrapper[23041]: I0308 00:31:34.654522 23041 flags.go:64] FLAG: --node-ip="192.168.32.10"
Mar 08 00:31:34.664940 master-0 kubenswrapper[23041]: I0308 00:31:34.654528 23041 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 08 00:31:34.664940 master-0 kubenswrapper[23041]: I0308 00:31:34.654542 23041 flags.go:64] FLAG: --node-status-max-images="50"
Mar 08 00:31:34.664940 master-0 kubenswrapper[23041]: I0308 00:31:34.654556 23041 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 08 00:31:34.664940 master-0 kubenswrapper[23041]: I0308 00:31:34.654564 23041 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 08 00:31:34.664940 master-0 kubenswrapper[23041]: I0308 00:31:34.654570 23041 flags.go:64] FLAG: --pod-cidr=""
Mar 08 00:31:34.664940 master-0 kubenswrapper[23041]: I0308 00:31:34.654575 23041 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1d605384f31a8085f78a96145c2c3dc51afe22721144196140a2699b7c07ebe3"
Mar 08 00:31:34.664940 master-0 kubenswrapper[23041]: I0308 00:31:34.654584 23041 flags.go:64] FLAG: --pod-manifest-path=""
Mar 08 00:31:34.664940 master-0 kubenswrapper[23041]: I0308 00:31:34.654590 23041 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 08 00:31:34.664940 master-0 kubenswrapper[23041]: I0308 00:31:34.654596 23041 flags.go:64] FLAG: --pods-per-core="0"
Mar 08 00:31:34.664940 master-0 kubenswrapper[23041]: I0308 00:31:34.654601 23041 flags.go:64] FLAG: --port="10250"
Mar 08 00:31:34.664940 master-0 kubenswrapper[23041]: I0308 00:31:34.654607 23041 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 08 00:31:34.664940 master-0 kubenswrapper[23041]: I0308 00:31:34.654613 23041 flags.go:64] FLAG: --provider-id=""
Mar 08 00:31:34.664940 master-0 kubenswrapper[23041]: I0308 00:31:34.654619 23041 flags.go:64] FLAG: --qos-reserved=""
Mar 08 00:31:34.664940 master-0 kubenswrapper[23041]: I0308 00:31:34.654625 23041 flags.go:64] FLAG: --read-only-port="10255"
Mar 08 00:31:34.664940 master-0 kubenswrapper[23041]: I0308 00:31:34.654631 23041 flags.go:64] FLAG: --register-node="true"
Mar 08 00:31:34.666385 master-0 kubenswrapper[23041]: I0308 00:31:34.654637 23041 flags.go:64] FLAG: --register-schedulable="true"
Mar 08 00:31:34.666385 master-0 kubenswrapper[23041]: I0308 00:31:34.654643 23041 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 08 00:31:34.666385 master-0 kubenswrapper[23041]: I0308 00:31:34.654652 23041 flags.go:64] FLAG: --registry-burst="10"
Mar 08 00:31:34.666385 master-0 kubenswrapper[23041]: I0308 00:31:34.654658 23041 flags.go:64] FLAG: --registry-qps="5"
Mar 08 00:31:34.666385 master-0 kubenswrapper[23041]: I0308 00:31:34.654664 23041 flags.go:64] FLAG: --reserved-cpus=""
Mar 08 00:31:34.666385 master-0 kubenswrapper[23041]: I0308 00:31:34.654669 23041 flags.go:64] FLAG: --reserved-memory=""
Mar 08 00:31:34.666385 master-0 kubenswrapper[23041]: I0308 00:31:34.654677 23041 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 08 00:31:34.666385 master-0 kubenswrapper[23041]: I0308 00:31:34.654683 23041 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 08 00:31:34.666385 master-0 kubenswrapper[23041]: I0308 00:31:34.654689 23041 flags.go:64] FLAG: --rotate-certificates="false"
Mar 08 00:31:34.666385 master-0 kubenswrapper[23041]: I0308 00:31:34.654694 23041 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 08 00:31:34.666385 master-0 kubenswrapper[23041]: I0308 00:31:34.654700 23041 flags.go:64] FLAG: --runonce="false"
Mar 08 00:31:34.666385 master-0 kubenswrapper[23041]: I0308 00:31:34.654706 23041 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 08 00:31:34.666385 master-0 kubenswrapper[23041]: I0308 00:31:34.654711 23041 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 08 00:31:34.666385 master-0 kubenswrapper[23041]: I0308 00:31:34.654717 23041 flags.go:64] FLAG: --seccomp-default="false"
Mar 08 00:31:34.666385 master-0 kubenswrapper[23041]: I0308 00:31:34.654723 23041 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 08 00:31:34.666385 master-0 kubenswrapper[23041]: I0308 00:31:34.654729 23041 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 08 00:31:34.666385 master-0 kubenswrapper[23041]: I0308 00:31:34.654735 23041 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 08 00:31:34.666385 master-0 kubenswrapper[23041]: I0308 00:31:34.654741 23041 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 08 00:31:34.666385 master-0 kubenswrapper[23041]: I0308 00:31:34.654746 23041 flags.go:64] FLAG: --storage-driver-password="root"
Mar 08 00:31:34.666385 master-0 kubenswrapper[23041]: I0308 00:31:34.654752 23041 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 08 00:31:34.666385 master-0 kubenswrapper[23041]: I0308 00:31:34.654758 23041 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 08 00:31:34.666385 master-0 kubenswrapper[23041]: I0308 00:31:34.654763 23041 flags.go:64] FLAG: --storage-driver-user="root"
Mar 08 00:31:34.666385 master-0 kubenswrapper[23041]: I0308 00:31:34.654769 23041 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 08 00:31:34.666385 master-0 kubenswrapper[23041]: I0308 00:31:34.654776 23041 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 08 00:31:34.666385 master-0 kubenswrapper[23041]: I0308 00:31:34.654782 23041 flags.go:64] FLAG: --system-cgroups=""
Mar 08 00:31:34.667830 master-0 kubenswrapper[23041]: I0308 00:31:34.654788 23041 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Mar 08 00:31:34.667830 master-0 kubenswrapper[23041]: I0308 00:31:34.654797 23041 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 08 00:31:34.667830 master-0 kubenswrapper[23041]: I0308 00:31:34.654809 23041 flags.go:64] FLAG: --tls-cert-file=""
Mar 08 00:31:34.667830 master-0 kubenswrapper[23041]: I0308 00:31:34.654815 23041 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 08 00:31:34.667830 master-0 kubenswrapper[23041]: I0308 00:31:34.654821 23041 flags.go:64] FLAG: --tls-min-version=""
Mar 08 00:31:34.667830 master-0 kubenswrapper[23041]: I0308 00:31:34.654827 23041 flags.go:64] FLAG: --tls-private-key-file=""
Mar 08 00:31:34.667830 master-0 kubenswrapper[23041]: I0308 00:31:34.654833 23041 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 08 00:31:34.667830 master-0 kubenswrapper[23041]: I0308 00:31:34.654839 23041 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 08 00:31:34.667830 master-0 kubenswrapper[23041]: I0308 00:31:34.654844 23041 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 08 00:31:34.667830 master-0 kubenswrapper[23041]: I0308 00:31:34.654850 23041 flags.go:64] FLAG: --v="2"
Mar 08 00:31:34.667830 master-0 kubenswrapper[23041]: I0308 00:31:34.654858 23041 flags.go:64] FLAG: --version="false"
Mar 08 00:31:34.667830 master-0 kubenswrapper[23041]: I0308 00:31:34.654866 23041 flags.go:64] FLAG: --vmodule=""
Mar 08 00:31:34.667830 master-0 kubenswrapper[23041]: I0308 00:31:34.654873 23041 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 08 00:31:34.667830 master-0 kubenswrapper[23041]: I0308 00:31:34.654879 23041 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 08 00:31:34.667830 master-0 kubenswrapper[23041]: W0308 00:31:34.655045 23041 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 08 00:31:34.667830 master-0 kubenswrapper[23041]: W0308 00:31:34.655052 23041 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 08 00:31:34.667830 master-0 kubenswrapper[23041]: W0308 00:31:34.655058 23041 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 08 00:31:34.667830 master-0 kubenswrapper[23041]: W0308 00:31:34.655063 23041 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 08 00:31:34.667830 master-0 kubenswrapper[23041]: W0308 00:31:34.655069 23041 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 08 00:31:34.667830 master-0 kubenswrapper[23041]: W0308 00:31:34.655074 23041 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 08 00:31:34.667830 master-0 kubenswrapper[23041]: W0308 00:31:34.655081 23041 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 08 00:31:34.667830 master-0 kubenswrapper[23041]: W0308 00:31:34.655087 23041 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 08 00:31:34.667830 master-0 kubenswrapper[23041]: W0308 00:31:34.655093 23041 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 08 00:31:34.669481 master-0 kubenswrapper[23041]: W0308 00:31:34.655100 23041 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 08 00:31:34.669481 master-0 kubenswrapper[23041]: W0308 00:31:34.655110 23041 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 08 00:31:34.669481 master-0 kubenswrapper[23041]: W0308 00:31:34.655116 23041 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 08 00:31:34.669481 master-0 kubenswrapper[23041]: W0308 00:31:34.655123 23041 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 08 00:31:34.669481 master-0 kubenswrapper[23041]: W0308 00:31:34.655128 23041 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 08 00:31:34.669481 master-0 kubenswrapper[23041]: W0308 00:31:34.655134 23041 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 08 00:31:34.669481 master-0 kubenswrapper[23041]: W0308 00:31:34.655140 23041 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 08 00:31:34.669481 master-0 kubenswrapper[23041]: W0308 00:31:34.655145 23041 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 08 00:31:34.669481 master-0 kubenswrapper[23041]: W0308 00:31:34.655152 23041 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 08 00:31:34.669481 master-0 kubenswrapper[23041]: W0308 00:31:34.655159 23041 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 08 00:31:34.669481 master-0 kubenswrapper[23041]: W0308 00:31:34.655166 23041 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 08 00:31:34.669481 master-0 kubenswrapper[23041]: W0308 00:31:34.655175 23041 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 08 00:31:34.669481 master-0 kubenswrapper[23041]: W0308 00:31:34.655181 23041 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 08 00:31:34.669481 master-0 kubenswrapper[23041]: W0308 00:31:34.655187 23041 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 08 00:31:34.669481 master-0 kubenswrapper[23041]: W0308 00:31:34.655193 23041 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 08 00:31:34.669481 master-0 kubenswrapper[23041]: W0308 00:31:34.655217 23041 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 08 00:31:34.669481 master-0 kubenswrapper[23041]: W0308 00:31:34.655222 23041 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 08 00:31:34.669481 master-0 kubenswrapper[23041]: W0308 00:31:34.655227 23041 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 08 00:31:34.669481 master-0 kubenswrapper[23041]: W0308 00:31:34.655234 23041 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 08 00:31:34.670638 master-0 kubenswrapper[23041]: W0308 00:31:34.655240 23041 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 08 00:31:34.670638 master-0 kubenswrapper[23041]: W0308 00:31:34.655246 23041 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 08 00:31:34.670638 master-0 kubenswrapper[23041]: W0308 00:31:34.655252 23041 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 08 00:31:34.670638 master-0 kubenswrapper[23041]: W0308 00:31:34.655257 23041 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 08 00:31:34.670638 master-0 kubenswrapper[23041]: W0308 00:31:34.655262 23041 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 08 00:31:34.670638 master-0 kubenswrapper[23041]: W0308 00:31:34.655268 23041 feature_gate.go:330] unrecognized feature gate: Example
Mar 08 00:31:34.670638 master-0 kubenswrapper[23041]: W0308 00:31:34.655273 23041 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 08 00:31:34.670638 master-0 kubenswrapper[23041]: W0308 00:31:34.655279 23041 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 08 00:31:34.670638 master-0 kubenswrapper[23041]: W0308 00:31:34.655284 23041 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 08 00:31:34.670638 master-0 kubenswrapper[23041]: W0308 00:31:34.655289 23041 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 08 00:31:34.670638 master-0 kubenswrapper[23041]: W0308 00:31:34.655294 23041 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 08 00:31:34.670638 master-0 kubenswrapper[23041]: W0308 00:31:34.655299 23041 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 08 00:31:34.670638 master-0 kubenswrapper[23041]: W0308 00:31:34.655304 23041 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 08 00:31:34.670638 master-0 kubenswrapper[23041]: W0308 00:31:34.655309 23041 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 08 00:31:34.670638 master-0 kubenswrapper[23041]: W0308 00:31:34.655316 23041 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 08 00:31:34.670638 master-0 kubenswrapper[23041]: W0308 00:31:34.655323 23041 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 08 00:31:34.670638 master-0 kubenswrapper[23041]: W0308 00:31:34.655329 23041 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 08 00:31:34.670638 master-0 kubenswrapper[23041]: W0308 00:31:34.655334 23041 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 08 00:31:34.670638 master-0 kubenswrapper[23041]: W0308 00:31:34.655340 23041 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 08 00:31:34.670638 master-0 kubenswrapper[23041]: W0308 00:31:34.655345 23041 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 08 00:31:34.671974 master-0 kubenswrapper[23041]: W0308 00:31:34.655351 23041 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 08 00:31:34.671974 master-0 kubenswrapper[23041]: W0308 00:31:34.655356 23041 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 08 00:31:34.671974 master-0 kubenswrapper[23041]: W0308 00:31:34.655361 23041 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 08 00:31:34.671974 master-0 kubenswrapper[23041]: W0308 00:31:34.655366 23041 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 08 00:31:34.671974 master-0 kubenswrapper[23041]: W0308 00:31:34.655373 23041 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 08 00:31:34.671974 master-0 kubenswrapper[23041]: W0308 00:31:34.655378 23041 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 08 00:31:34.671974 master-0 kubenswrapper[23041]: W0308 00:31:34.655383 23041 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 08 00:31:34.671974 master-0 kubenswrapper[23041]: W0308 00:31:34.655388 23041 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 08 00:31:34.671974 master-0 kubenswrapper[23041]: W0308 00:31:34.655455 23041 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 08 00:31:34.671974 master-0 kubenswrapper[23041]: W0308 00:31:34.656042 23041 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 08 00:31:34.671974 master-0 kubenswrapper[23041]: W0308 00:31:34.656049 23041 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 08 00:31:34.671974 master-0 kubenswrapper[23041]: W0308 00:31:34.656055 23041 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 08 00:31:34.671974 master-0 kubenswrapper[23041]: W0308 00:31:34.656060 23041 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 08 00:31:34.671974 master-0 kubenswrapper[23041]: W0308 00:31:34.656066 23041 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 08 00:31:34.671974 master-0 kubenswrapper[23041]: W0308 00:31:34.656071 23041 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 08 00:31:34.671974 master-0 kubenswrapper[23041]: W0308 00:31:34.656076 23041 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 08 00:31:34.671974 master-0 kubenswrapper[23041]: W0308 00:31:34.656081 23041 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 08 00:31:34.671974 master-0 kubenswrapper[23041]: W0308 00:31:34.656087 23041 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 08 00:31:34.671974 master-0 kubenswrapper[23041]: W0308 00:31:34.656092 23041 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 08 00:31:34.671974 master-0 kubenswrapper[23041]: W0308 00:31:34.656097 23041 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 08 00:31:34.673481 master-0 kubenswrapper[23041]: W0308 00:31:34.656102 23041 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 08 00:31:34.673481 master-0 kubenswrapper[23041]: W0308 00:31:34.656108 23041 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 08 00:31:34.673481 master-0 kubenswrapper[23041]: W0308 00:31:34.656117 23041 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 08 00:31:34.673481 master-0 kubenswrapper[23041]: W0308 00:31:34.656123 23041 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 08 00:31:34.673481 master-0 kubenswrapper[23041]: I0308 00:31:34.656132 23041 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 08 00:31:34.673481 master-0 kubenswrapper[23041]: I0308 00:31:34.664800 23041 server.go:491] "Kubelet version" kubeletVersion="v1.31.14"
Mar 08 00:31:34.673481 master-0 kubenswrapper[23041]: I0308 00:31:34.664839 23041 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 08 00:31:34.673481 master-0 kubenswrapper[23041]: W0308 00:31:34.664979 23041 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 08 00:31:34.673481 master-0 kubenswrapper[23041]: W0308 00:31:34.664994 23041 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 08 00:31:34.673481 master-0 kubenswrapper[23041]: W0308 00:31:34.665003 23041 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 08 00:31:34.673481 master-0 kubenswrapper[23041]: W0308 00:31:34.665013 23041 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 08 00:31:34.673481 master-0 kubenswrapper[23041]: W0308 00:31:34.665021 23041 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 08 00:31:34.673481 master-0 kubenswrapper[23041]: W0308 00:31:34.665030 23041 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 08 00:31:34.673481 master-0 kubenswrapper[23041]: W0308 00:31:34.665039 23041 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 08 00:31:34.673481 master-0 kubenswrapper[23041]: W0308 00:31:34.665048 23041 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 08 00:31:34.674852 master-0 kubenswrapper[23041]: W0308 00:31:34.665059 23041 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 08 00:31:34.674852 master-0 kubenswrapper[23041]: W0308 00:31:34.665068 23041 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 08 00:31:34.674852 master-0 kubenswrapper[23041]: W0308 00:31:34.665088 23041 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 08 00:31:34.674852 master-0 kubenswrapper[23041]: W0308 00:31:34.665097 23041 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 08 00:31:34.674852 master-0 kubenswrapper[23041]: W0308 00:31:34.665105 23041 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 08 00:31:34.674852 master-0 kubenswrapper[23041]: W0308 00:31:34.665110 23041 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 08 00:31:34.674852 master-0 kubenswrapper[23041]: W0308 00:31:34.665116 23041 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 08 00:31:34.674852 master-0 kubenswrapper[23041]: W0308 00:31:34.665123 23041 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 08 00:31:34.674852 master-0 kubenswrapper[23041]: W0308 00:31:34.665130 23041 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 08 00:31:34.674852 master-0 kubenswrapper[23041]: W0308 00:31:34.665136 23041 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 08 00:31:34.674852 master-0 kubenswrapper[23041]: W0308 00:31:34.665141 23041 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 08 00:31:34.674852 master-0 kubenswrapper[23041]: W0308 00:31:34.665146 23041 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 08 00:31:34.674852 master-0 kubenswrapper[23041]: W0308 00:31:34.665151 23041 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 08 00:31:34.674852 master-0 kubenswrapper[23041]: W0308 00:31:34.665156 23041 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 08 00:31:34.674852 master-0 kubenswrapper[23041]: W0308 00:31:34.665161 23041 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 08 00:31:34.674852 master-0 kubenswrapper[23041]: W0308 00:31:34.665166 23041 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 08 00:31:34.674852 master-0 kubenswrapper[23041]: W0308 00:31:34.665171 23041 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 08 00:31:34.674852 master-0 kubenswrapper[23041]: W0308 00:31:34.665176 23041 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 08 00:31:34.674852 master-0 kubenswrapper[23041]: W0308 00:31:34.665181 23041 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 08 00:31:34.674852 master-0 kubenswrapper[23041]: W0308 00:31:34.665186 23041 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 08 00:31:34.676528 master-0 kubenswrapper[23041]: W0308 00:31:34.665191 23041 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 08 00:31:34.676528 master-0 kubenswrapper[23041]: W0308 00:31:34.665196 23041 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 08 00:31:34.676528 master-0 kubenswrapper[23041]: W0308 00:31:34.665234 23041 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 08 00:31:34.676528 master-0 kubenswrapper[23041]: W0308 00:31:34.665246 23041 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 08 00:31:34.676528 master-0 kubenswrapper[23041]: W0308 00:31:34.665251 23041 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 08 00:31:34.676528 master-0 kubenswrapper[23041]: W0308 00:31:34.665256 23041 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 08 00:31:34.676528 master-0 kubenswrapper[23041]: W0308 00:31:34.665261 23041 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 08 00:31:34.676528 master-0 kubenswrapper[23041]: W0308 00:31:34.665266 23041 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 08 00:31:34.676528 master-0 kubenswrapper[23041]: W0308 00:31:34.665271 23041 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 08 00:31:34.676528 master-0 kubenswrapper[23041]: W0308 00:31:34.665277 23041 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 08 00:31:34.676528 master-0 kubenswrapper[23041]: W0308 00:31:34.665281 23041 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 08 00:31:34.676528 master-0 kubenswrapper[23041]: W0308 00:31:34.665286 23041 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 08 00:31:34.676528 master-0 kubenswrapper[23041]: W0308 00:31:34.665291 23041 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 08 00:31:34.676528 master-0 kubenswrapper[23041]: W0308 00:31:34.665296 23041 feature_gate.go:330] unrecognized feature gate: Example
Mar 08 00:31:34.676528 master-0 kubenswrapper[23041]: W0308 00:31:34.665301 23041 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 08 00:31:34.676528 master-0 kubenswrapper[23041]: W0308 00:31:34.665306 23041 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 08 00:31:34.676528 master-0 kubenswrapper[23041]: W0308 00:31:34.665311 23041 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 08 00:31:34.676528 master-0 kubenswrapper[23041]: W0308 00:31:34.665318 23041 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 08 00:31:34.676528 master-0 kubenswrapper[23041]: W0308 00:31:34.665323 23041 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 08 00:31:34.676528 master-0 kubenswrapper[23041]: W0308 00:31:34.665327 23041 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 08 00:31:34.678396 master-0 kubenswrapper[23041]: W0308 00:31:34.665332 23041 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 08 00:31:34.678396 master-0 kubenswrapper[23041]: W0308 00:31:34.665337 23041 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 08 00:31:34.678396 master-0 kubenswrapper[23041]: W0308 00:31:34.665342 23041 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 08 00:31:34.678396 master-0 kubenswrapper[23041]: W0308 00:31:34.665347 23041 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 08 00:31:34.678396 master-0 kubenswrapper[23041]: W0308 00:31:34.665352 23041 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 08 00:31:34.678396 master-0 kubenswrapper[23041]: W0308 00:31:34.665357 23041 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 08 00:31:34.678396 master-0 kubenswrapper[23041]: W0308 00:31:34.665364 23041 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 08 00:31:34.678396 master-0 kubenswrapper[23041]: W0308 00:31:34.665371 23041 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 08 00:31:34.678396 master-0 kubenswrapper[23041]: W0308 00:31:34.665377 23041 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 08 00:31:34.678396 master-0 kubenswrapper[23041]: W0308 00:31:34.665383 23041 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 08 00:31:34.678396 master-0 kubenswrapper[23041]: W0308 00:31:34.665390 23041 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 08 00:31:34.678396 master-0 kubenswrapper[23041]: W0308 00:31:34.665395 23041 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 08 00:31:34.678396 master-0 kubenswrapper[23041]: W0308 00:31:34.665401 23041 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 08 00:31:34.678396 master-0 kubenswrapper[23041]: W0308 00:31:34.665406 23041 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 08 00:31:34.678396 master-0 kubenswrapper[23041]: W0308 00:31:34.665412 23041 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 08 00:31:34.678396 master-0 kubenswrapper[23041]: W0308 00:31:34.665420 23041 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 08 00:31:34.678396 master-0 kubenswrapper[23041]: W0308 00:31:34.665427 23041 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 08 00:31:34.678396 master-0 kubenswrapper[23041]: W0308 00:31:34.665434 23041 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 08 00:31:34.679607 master-0 kubenswrapper[23041]: W0308 00:31:34.665439 23041 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 08 00:31:34.679607 master-0 kubenswrapper[23041]: W0308 00:31:34.665444 23041 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 08 00:31:34.679607 master-0 kubenswrapper[23041]: W0308 00:31:34.665450 23041 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 08 00:31:34.679607 master-0 kubenswrapper[23041]: W0308 00:31:34.665455 23041 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 08 00:31:34.679607 master-0 kubenswrapper[23041]: W0308 00:31:34.665460 23041 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 08 00:31:34.679607 master-0 kubenswrapper[23041]: W0308 00:31:34.665466 23041 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 08 00:31:34.679607 master-0 kubenswrapper[23041]: I0308 00:31:34.665475 23041 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 08 00:31:34.679607 master-0 kubenswrapper[23041]: W0308 00:31:34.665638 23041 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 08 00:31:34.679607 master-0 kubenswrapper[23041]: W0308 00:31:34.665648 23041 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 08 00:31:34.679607 master-0 kubenswrapper[23041]: W0308 00:31:34.665655 23041 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 08 00:31:34.679607 master-0 kubenswrapper[23041]: W0308 00:31:34.665662 23041 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 08 00:31:34.679607 master-0 kubenswrapper[23041]: W0308 00:31:34.665673 23041 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 08 00:31:34.679607 master-0 kubenswrapper[23041]: W0308 00:31:34.665686 23041 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 08 00:31:34.679607 master-0 kubenswrapper[23041]: W0308 00:31:34.665694 23041 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 08 00:31:34.679607 master-0 kubenswrapper[23041]: W0308 00:31:34.665701 23041 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 08 00:31:34.680870 master-0 kubenswrapper[23041]: W0308 00:31:34.665709 23041 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 08 00:31:34.680870 master-0 kubenswrapper[23041]: W0308 00:31:34.665717 23041 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 08 00:31:34.680870 master-0 kubenswrapper[23041]: W0308 00:31:34.665726 23041 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 08 00:31:34.680870 master-0 kubenswrapper[23041]: W0308 00:31:34.665732 23041 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 08 00:31:34.680870 master-0 kubenswrapper[23041]: W0308 00:31:34.665737 23041 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 08 00:31:34.680870 master-0 kubenswrapper[23041]: W0308 00:31:34.665742 23041 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 08 00:31:34.680870 master-0 kubenswrapper[23041]: W0308 00:31:34.665747 23041 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 08 00:31:34.680870 master-0 kubenswrapper[23041]: W0308 00:31:34.665752 23041 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 08 00:31:34.680870 master-0 kubenswrapper[23041]: W0308 00:31:34.665760 23041 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 08 00:31:34.680870 master-0 kubenswrapper[23041]: W0308 00:31:34.665766 23041 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 08 00:31:34.680870 master-0 kubenswrapper[23041]: W0308 00:31:34.665772 23041 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 08 00:31:34.680870 master-0 kubenswrapper[23041]: W0308 00:31:34.665781 23041 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 08 00:31:34.680870 master-0 kubenswrapper[23041]: W0308 00:31:34.665787 23041 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 08 00:31:34.680870 master-0 kubenswrapper[23041]: W0308 00:31:34.665795 23041 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 08 00:31:34.680870 master-0 kubenswrapper[23041]: W0308 00:31:34.665802 23041 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 08 00:31:34.680870 master-0 kubenswrapper[23041]: W0308 00:31:34.665809 23041 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 08 00:31:34.680870 master-0 kubenswrapper[23041]: W0308 00:31:34.665815 23041 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 08 00:31:34.680870 master-0 kubenswrapper[23041]: W0308 00:31:34.665822 23041 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 08 00:31:34.680870 master-0 kubenswrapper[23041]: W0308 00:31:34.665830 23041 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 08 00:31:34.680870 master-0 kubenswrapper[23041]: W0308 00:31:34.665837 23041 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 08 00:31:34.682469 master-0 kubenswrapper[23041]: W0308 00:31:34.665843 23041 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 08 00:31:34.682469 master-0 kubenswrapper[23041]: W0308 00:31:34.665850 23041 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 08 00:31:34.682469 master-0 kubenswrapper[23041]: W0308 00:31:34.665855 23041 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 08 00:31:34.682469 master-0 kubenswrapper[23041]: W0308 00:31:34.665860 23041 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 08 00:31:34.682469 master-0 kubenswrapper[23041]: W0308 00:31:34.665866 23041 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 08 00:31:34.682469 master-0 kubenswrapper[23041]: W0308 00:31:34.665871 23041 feature_gate.go:330] unrecognized feature gate: Example
Mar 08 00:31:34.682469 master-0 kubenswrapper[23041]: W0308 00:31:34.665877 23041 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 08 00:31:34.682469 master-0 kubenswrapper[23041]: W0308 00:31:34.665883 23041 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 08 00:31:34.682469 master-0 kubenswrapper[23041]: W0308 00:31:34.665892 23041 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 08 00:31:34.682469 master-0 kubenswrapper[23041]: W0308 00:31:34.665901 23041 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 08 00:31:34.682469 master-0 kubenswrapper[23041]: W0308 00:31:34.665909 23041 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 08 00:31:34.682469 master-0 kubenswrapper[23041]: W0308 00:31:34.665918 23041 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 08 00:31:34.682469 master-0 kubenswrapper[23041]: W0308 00:31:34.665926 23041 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 08 00:31:34.682469 master-0 kubenswrapper[23041]: W0308 00:31:34.665933 23041 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 08 00:31:34.682469 master-0 kubenswrapper[23041]: W0308 00:31:34.665940 23041 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 08 00:31:34.682469 master-0 kubenswrapper[23041]: W0308 00:31:34.665946 23041 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 08 00:31:34.682469 master-0 kubenswrapper[23041]: W0308 00:31:34.665951 23041 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 08 00:31:34.682469 master-0 kubenswrapper[23041]: W0308 00:31:34.665957 23041 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 08 00:31:34.682469 master-0 kubenswrapper[23041]: W0308 00:31:34.665962 23041 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 08 00:31:34.683624 master-0 kubenswrapper[23041]: W0308 00:31:34.665969 23041 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 08 00:31:34.683624 master-0 kubenswrapper[23041]: W0308 00:31:34.665976 23041 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 08 00:31:34.683624 master-0 kubenswrapper[23041]: W0308 00:31:34.665982 23041 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 08 00:31:34.683624 master-0 kubenswrapper[23041]: W0308 00:31:34.665989 23041 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 08 00:31:34.683624 master-0 kubenswrapper[23041]: W0308 00:31:34.665997 23041 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 08 00:31:34.683624 master-0 kubenswrapper[23041]: W0308 00:31:34.666004 23041 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 08 00:31:34.683624 master-0 kubenswrapper[23041]: W0308 00:31:34.666012 23041 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 08 00:31:34.683624 master-0 kubenswrapper[23041]: W0308 00:31:34.666020 23041 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 08 00:31:34.683624 master-0 kubenswrapper[23041]: W0308 00:31:34.666029 23041 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 08 00:31:34.683624 master-0 kubenswrapper[23041]: W0308 00:31:34.666037 23041 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 08 00:31:34.683624 master-0 kubenswrapper[23041]: W0308 00:31:34.666044 23041 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 08 00:31:34.683624 master-0 kubenswrapper[23041]: W0308 00:31:34.666051 23041 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 08 00:31:34.683624 master-0 kubenswrapper[23041]: W0308 00:31:34.666058 23041 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 08 00:31:34.683624 master-0 kubenswrapper[23041]: W0308 00:31:34.666064 23041 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 08 00:31:34.683624 master-0 kubenswrapper[23041]: W0308 00:31:34.666071 23041 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 08 00:31:34.683624 master-0 kubenswrapper[23041]: W0308 00:31:34.666079 23041 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 08 00:31:34.683624 master-0 kubenswrapper[23041]: W0308 00:31:34.666086 23041 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 08 00:31:34.683624 master-0 kubenswrapper[23041]: W0308 00:31:34.666093 23041 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 08 00:31:34.683624 master-0 kubenswrapper[23041]: W0308 00:31:34.666100 23041 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 08 00:31:34.683624 master-0 kubenswrapper[23041]: W0308 00:31:34.666109 23041 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 08 00:31:34.684836 master-0 kubenswrapper[23041]: W0308 00:31:34.666118 23041 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 08 00:31:34.684836 master-0 kubenswrapper[23041]: W0308 00:31:34.666126 23041 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 08 00:31:34.684836 master-0 kubenswrapper[23041]: W0308 00:31:34.666133 23041 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 08 00:31:34.684836 master-0 kubenswrapper[23041]: W0308 00:31:34.666140 23041 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 08 00:31:34.684836 master-0 kubenswrapper[23041]: W0308 00:31:34.666148 23041 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 08 00:31:34.684836 master-0 kubenswrapper[23041]: I0308 00:31:34.666156 23041 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 08 00:31:34.684836 master-0 kubenswrapper[23041]: I0308 00:31:34.666440 23041 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 08 00:31:34.684836 master-0 kubenswrapper[23041]: I0308 00:31:34.668556 23041 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Mar 08 00:31:34.684836 master-0 kubenswrapper[23041]: I0308 00:31:34.668665 23041 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 08 00:31:34.684836 master-0 kubenswrapper[23041]: I0308 00:31:34.668965 23041 server.go:997] "Starting client certificate rotation"
Mar 08 00:31:34.684836 master-0 kubenswrapper[23041]: I0308 00:31:34.668979 23041 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 08 00:31:34.684836 master-0 kubenswrapper[23041]: I0308 00:31:34.669183 23041 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-03-09 00:11:49 +0000 UTC, rotation deadline is 2026-03-08 19:27:38.757120786 +0000 UTC
Mar 08 00:31:34.684836 master-0 kubenswrapper[23041]: I0308 00:31:34.669330 23041 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 18h56m4.087795658s for next certificate rotation
Mar 08 00:31:34.685618 master-0 kubenswrapper[23041]: I0308 00:31:34.669725 23041 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 08 00:31:34.685618 master-0 kubenswrapper[23041]: I0308 00:31:34.671376 23041 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 08 00:31:34.685618 master-0 kubenswrapper[23041]: I0308 00:31:34.674004 23041 log.go:25] "Validated CRI v1 runtime API"
Mar 08 00:31:34.685618 master-0 kubenswrapper[23041]: I0308 00:31:34.681071 23041 log.go:25] "Validated CRI v1 image API"
Mar 08 00:31:34.685618 master-0 kubenswrapper[23041]: I0308 00:31:34.682859 23041 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 08 00:31:34.694892 master-0 kubenswrapper[23041]: I0308 00:31:34.694806 23041 fs.go:135] Filesystem UUIDs:
map[39fc8acc-7a4c-4a2a-a305-ed25849d8805:/dev/vda3 7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4] Mar 08 00:31:34.695935 master-0 kubenswrapper[23041]: I0308 00:31:34.694873 23041 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/036c8d5e00b57ec77b752ae2bc46eb3d7ff2904d9ebc488665656ab787ecd5a5/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/036c8d5e00b57ec77b752ae2bc46eb3d7ff2904d9ebc488665656ab787ecd5a5/userdata/shm major:0 minor:566 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0bdf70a6acef734c900a623db8a8cd37b2a2e6c50fe84f9293c0fc0c5705c71d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0bdf70a6acef734c900a623db8a8cd37b2a2e6c50fe84f9293c0fc0c5705c71d/userdata/shm major:0 minor:58 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0ce2140e8d5f4ac383fcfe274d59d3771538ece4764c91b8cb4e301d3fe26bbf/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0ce2140e8d5f4ac383fcfe274d59d3771538ece4764c91b8cb4e301d3fe26bbf/userdata/shm major:0 minor:350 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/11fc2d0ea92ac8231758b019e771de66de17673da31d79a4aab6fc0b796373e6/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/11fc2d0ea92ac8231758b019e771de66de17673da31d79a4aab6fc0b796373e6/userdata/shm major:0 minor:438 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/1647ce1acf481d17be37f6cfd515be4f74eaddbda6620f025db77860f5acbd00/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/1647ce1acf481d17be37f6cfd515be4f74eaddbda6620f025db77860f5acbd00/userdata/shm major:0 minor:730 fsType:tmpfs 
blockSize:0} /run/containers/storage/overlay-containers/16a0ef8737c1e2416e14cc076fc6b1d7ef645b2043e268561b096173dd7a6b0e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/16a0ef8737c1e2416e14cc076fc6b1d7ef645b2043e268561b096173dd7a6b0e/userdata/shm major:0 minor:266 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/1cc242574263ef7c849076452db10d6f32fa75aeb983a9e0f9150bc85db0911a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/1cc242574263ef7c849076452db10d6f32fa75aeb983a9e0f9150bc85db0911a/userdata/shm major:0 minor:843 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/1cddeda960c60a71faf688d26e861f0212c8666ffc3672e89502d43761b93cd2/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/1cddeda960c60a71faf688d26e861f0212c8666ffc3672e89502d43761b93cd2/userdata/shm major:0 minor:799 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/27f4354a5f2d519381a516d1dc4209edc63d8a7a92b44222c7f0143dbf2a908f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/27f4354a5f2d519381a516d1dc4209edc63d8a7a92b44222c7f0143dbf2a908f/userdata/shm major:0 minor:1012 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/28355b7f7227fe6a0abd3c3085ac0299e8c24ec4f49691a081d1fe68b8bde287/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/28355b7f7227fe6a0abd3c3085ac0299e8c24ec4f49691a081d1fe68b8bde287/userdata/shm major:0 minor:817 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2e47d8d2ffbca29135c63c0ec58db9d105e81fa73da896958637e9f0815629eb/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2e47d8d2ffbca29135c63c0ec58db9d105e81fa73da896958637e9f0815629eb/userdata/shm major:0 minor:570 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/302cab9bf3dbf255daeb9370ab65a4f19b214019a7009e2da9e307530afd287e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/302cab9bf3dbf255daeb9370ab65a4f19b214019a7009e2da9e307530afd287e/userdata/shm major:0 minor:376 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/31406fc5b2c5472ac716e4c8cdca7909539075e5cc335f68e4b469dfc56a38f1/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/31406fc5b2c5472ac716e4c8cdca7909539075e5cc335f68e4b469dfc56a38f1/userdata/shm major:0 minor:256 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3824dde14e6e2df8fdeaf0d3586d91846c024a16aa684e52f4497805143ba494/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3824dde14e6e2df8fdeaf0d3586d91846c024a16aa684e52f4497805143ba494/userdata/shm major:0 minor:1162 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/388b509d4fc31b4d0508a9d9464942cef558c545f646f2395c6df6984fdeb45b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/388b509d4fc31b4d0508a9d9464942cef558c545f646f2395c6df6984fdeb45b/userdata/shm major:0 minor:230 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3c8994f66c1270da68fac1ff2499afd806b950d0568c9f85327b0714473db68c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3c8994f66c1270da68fac1ff2499afd806b950d0568c9f85327b0714473db68c/userdata/shm major:0 minor:214 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3fcfcac3d94a68502eedf27bec2a63baba722b253947b783bc8a405ac2ab5cd7/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3fcfcac3d94a68502eedf27bec2a63baba722b253947b783bc8a405ac2ab5cd7/userdata/shm major:0 minor:855 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/4297b6122cd668a28e80b28ce2f18556120772700fd7e586762ab1c6f70eea07/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4297b6122cd668a28e80b28ce2f18556120772700fd7e586762ab1c6f70eea07/userdata/shm major:0 minor:442 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4ba757467f3e4fadf37ce1d9a907a1771ea5751b999a31bf5bb5f0ab9351aa7f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4ba757467f3e4fadf37ce1d9a907a1771ea5751b999a31bf5bb5f0ab9351aa7f/userdata/shm major:0 minor:815 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4e0af367cee5aa7ace0374f562c3ebde99ff63afaf075a5612625be33276de36/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4e0af367cee5aa7ace0374f562c3ebde99ff63afaf075a5612625be33276de36/userdata/shm major:0 minor:591 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/55b01a8834cc0e66e80c4742dda9dcd76cc7d21fc646a73322aabbcb9e7a815d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/55b01a8834cc0e66e80c4742dda9dcd76cc7d21fc646a73322aabbcb9e7a815d/userdata/shm major:0 minor:1081 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5993f0db8eb571541ffd45db324c8f25d80729c838e2d7b2910b9b88c3eb3de6/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5993f0db8eb571541ffd45db324c8f25d80729c838e2d7b2910b9b88c3eb3de6/userdata/shm major:0 minor:476 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/60f1d2698bbdc9af90765d1ef46cd020d376aa4c007400334c8fc83e64d3d86f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/60f1d2698bbdc9af90765d1ef46cd020d376aa4c007400334c8fc83e64d3d86f/userdata/shm major:0 minor:973 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/635c9c2985fac1a14beab73539e4661fa51cd796fbfb9d8b1faa5701a4b68e88/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/635c9c2985fac1a14beab73539e4661fa51cd796fbfb9d8b1faa5701a4b68e88/userdata/shm major:0 minor:65 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/733e43352408d7f83022f1e2789901cb1e3830089ecad3dc5ac2ffbae10f60ad/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/733e43352408d7f83022f1e2789901cb1e3830089ecad3dc5ac2ffbae10f60ad/userdata/shm major:0 minor:272 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/773f19015576d673121563aa615f577b8c93848d40403e9cc4d2c3a87bec1183/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/773f19015576d673121563aa615f577b8c93848d40403e9cc4d2c3a87bec1183/userdata/shm major:0 minor:612 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/78f167041d0e1e5dfadee1e9a27a600120c1dc54a22d62ff9910e1942faef008/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/78f167041d0e1e5dfadee1e9a27a600120c1dc54a22d62ff9910e1942faef008/userdata/shm major:0 minor:782 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/79898c1495b01b774fa3705ded4d271b0617e5b224dd28c48dac5c9a238260f3/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/79898c1495b01b774fa3705ded4d271b0617e5b224dd28c48dac5c9a238260f3/userdata/shm major:0 minor:955 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/79a6fb0d44533a4c06691dbc28101325df1e65724145bd5bed4068656b402865/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/79a6fb0d44533a4c06691dbc28101325df1e65724145bd5bed4068656b402865/userdata/shm major:0 minor:384 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/7a5857552aa1339fd1907b2666246b77b57ec97f6cccfaf339c644659664d85c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7a5857552aa1339fd1907b2666246b77b57ec97f6cccfaf339c644659664d85c/userdata/shm major:0 minor:119 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7bcc330c034a7032e8bd43ea29408b50fdad12339c2d89f6fc2a01fc9d43af95/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7bcc330c034a7032e8bd43ea29408b50fdad12339c2d89f6fc2a01fc9d43af95/userdata/shm major:0 minor:263 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7f2851a3eb6c41b727b5c53073d970f5dd84de3034b2055a355a0ab0bcf3b48d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7f2851a3eb6c41b727b5c53073d970f5dd84de3034b2055a355a0ab0bcf3b48d/userdata/shm major:0 minor:845 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7f9bd3b95fa9a96d599ef5d38ab2c65bfd39d0c75616669dcd2a59a811c0de79/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7f9bd3b95fa9a96d599ef5d38ab2c65bfd39d0c75616669dcd2a59a811c0de79/userdata/shm major:0 minor:258 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/813c8ed04b18f307078b38a00cf3865fc1feedea034a383e0342d8429ae20e6b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/813c8ed04b18f307078b38a00cf3865fc1feedea034a383e0342d8429ae20e6b/userdata/shm major:0 minor:252 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8493b96f9e2317bb2258ca024aff023f604de77234681da55a05bccbc932bc9a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8493b96f9e2317bb2258ca024aff023f604de77234681da55a05bccbc932bc9a/userdata/shm major:0 minor:254 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/874da80b3858b9b5a8a2258c3b83f19f5f0c80010ec82d07a7dc18d61c4292fa/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/874da80b3858b9b5a8a2258c3b83f19f5f0c80010ec82d07a7dc18d61c4292fa/userdata/shm major:0 minor:576 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/88364d0cec48d65744e1beec8c11b2e217cd014d5b9879cec4ffa6513fb0fe68/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/88364d0cec48d65744e1beec8c11b2e217cd014d5b9879cec4ffa6513fb0fe68/userdata/shm major:0 minor:1084 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8d0d8e23ae25ced02b7cdc0775a6f94c8fcc52f337331a56804c82208fb25ced/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8d0d8e23ae25ced02b7cdc0775a6f94c8fcc52f337331a56804c82208fb25ced/userdata/shm major:0 minor:1017 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8e70531b1dbd5c8e6c17416c362305f1eea7b7b018f96a22eb1f0bb98b78a034/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8e70531b1dbd5c8e6c17416c362305f1eea7b7b018f96a22eb1f0bb98b78a034/userdata/shm major:0 minor:46 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8f1055f3dc7c655a333a3fa311c8f94b2ceda0b473d7673f490a6875c1158919/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8f1055f3dc7c655a333a3fa311c8f94b2ceda0b473d7673f490a6875c1158919/userdata/shm major:0 minor:260 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8ff474153830a652e4ddb7aadf249d8bcfad8aa4e41fc72213e841bb0817ffeb/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8ff474153830a652e4ddb7aadf249d8bcfad8aa4e41fc72213e841bb0817ffeb/userdata/shm major:0 minor:786 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/90c63e0b66f405ad9ba1342c113ed69565fb8227cabd7f3b8504079a44ce002c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/90c63e0b66f405ad9ba1342c113ed69565fb8227cabd7f3b8504079a44ce002c/userdata/shm major:0 minor:262 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/96c247b918e2a9450964a3ea1162342c6ccc7c2330777e8d76f1128e74c9ecae/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/96c247b918e2a9450964a3ea1162342c6ccc7c2330777e8d76f1128e74c9ecae/userdata/shm major:0 minor:129 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/99e304c6af03a3e08278f5797ee6f99e79aaff1289a963780c8099a86643591f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/99e304c6af03a3e08278f5797ee6f99e79aaff1289a963780c8099a86643591f/userdata/shm major:0 minor:114 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9da3ea5c4393051eef91cb7af969405949bc3c6b97f5782d6bc10af29a80c30d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9da3ea5c4393051eef91cb7af969405949bc3c6b97f5782d6bc10af29a80c30d/userdata/shm major:0 minor:437 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a0d7955b7085045599d0a7ea45ff20f907bc225ec27c46ed3dcc33b59207b912/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a0d7955b7085045599d0a7ea45ff20f907bc225ec27c46ed3dcc33b59207b912/userdata/shm major:0 minor:631 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a2271776808f809754ea9910dbf17284aca2a88f19582f5163627216da7a3ba8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a2271776808f809754ea9910dbf17284aca2a88f19582f5163627216da7a3ba8/userdata/shm major:0 minor:89 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/a2af0127ad556015336cd256817276cc9d6a8a08dbbf295a1bf7821d7309d19c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a2af0127ad556015336cd256817276cc9d6a8a08dbbf295a1bf7821d7309d19c/userdata/shm major:0 minor:573 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a68be094b9128e17cfcb273f66f3867ebf81ebb395668f57f098ee489c8a0035/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a68be094b9128e17cfcb273f66f3867ebf81ebb395668f57f098ee489c8a0035/userdata/shm major:0 minor:728 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/aaafa12a616f7369af11bbeebe18962338e3a83e1b72c0a692864a7176225e0a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/aaafa12a616f7369af11bbeebe18962338e3a83e1b72c0a692864a7176225e0a/userdata/shm major:0 minor:391 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b24cb8b6e833d760382f41e5306d191f11027b327de5f975b19e63833c3ea28b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b24cb8b6e833d760382f41e5306d191f11027b327de5f975b19e63833c3ea28b/userdata/shm major:0 minor:92 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b3e77cc21c0092533e2573fd7bc828eb1f314192461aa4ad0d7a1a79afb2a5b9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b3e77cc21c0092533e2573fd7bc828eb1f314192461aa4ad0d7a1a79afb2a5b9/userdata/shm major:0 minor:128 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/bb8dfd749824585a5971cc6ceb0409c06052a233c71d6156a9b5d20725022dcf/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/bb8dfd749824585a5971cc6ceb0409c06052a233c71d6156a9b5d20725022dcf/userdata/shm major:0 minor:477 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/c0511cfa10b44562c51d17ac29eccf8315f318be9fcd77f37c978f1bbeeb8000/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c0511cfa10b44562c51d17ac29eccf8315f318be9fcd77f37c978f1bbeeb8000/userdata/shm major:0 minor:568 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c4b3dad7b177ddc417477ab1f0d5f78969f5ec394aa11addfa7a3ce44aa14aed/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c4b3dad7b177ddc417477ab1f0d5f78969f5ec394aa11addfa7a3ce44aa14aed/userdata/shm major:0 minor:41 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c6dfb6a757149a4059a400948a504adf47ce562d49ab223062b37eafa8275000/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c6dfb6a757149a4059a400948a504adf47ce562d49ab223062b37eafa8275000/userdata/shm major:0 minor:1088 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c70a49bdf7ce76b550fe89e6bb326288e459f3c83c699e27a995807b0355a90e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c70a49bdf7ce76b550fe89e6bb326288e459f3c83c699e27a995807b0355a90e/userdata/shm major:0 minor:142 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c7b839bc1440105484eefd605ce2dd49ac3adf1072ca232cf569d9cfecdcc1f4/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c7b839bc1440105484eefd605ce2dd49ac3adf1072ca232cf569d9cfecdcc1f4/userdata/shm major:0 minor:810 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c9dc377ca2fdac8594f81d6df8e7c069a1b5189bee06d288ed063183ce36a834/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c9dc377ca2fdac8594f81d6df8e7c069a1b5189bee06d288ed063183ce36a834/userdata/shm major:0 minor:270 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/cd06e32b994481471c1008a22765ea8fb7d4c0eac4c1085f974725068e466db7/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/cd06e32b994481471c1008a22765ea8fb7d4c0eac4c1085f974725068e466db7/userdata/shm major:0 minor:567 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ce67cd1e37e90c976b5eb1d98a8adbdd3c36380a0d4d75edb38584db8eeda1f5/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ce67cd1e37e90c976b5eb1d98a8adbdd3c36380a0d4d75edb38584db8eeda1f5/userdata/shm major:0 minor:545 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/dc6431dd72c27cd0cc50f525ef4684b1138ca71254e30382dcc7425a8c604797/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/dc6431dd72c27cd0cc50f525ef4684b1138ca71254e30382dcc7425a8c604797/userdata/shm major:0 minor:765 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ddbc9d4d3c5ffe04f1f188d461103a088e60e8f552f5a7337527098fe0216d97/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ddbc9d4d3c5ffe04f1f188d461103a088e60e8f552f5a7337527098fe0216d97/userdata/shm major:0 minor:734 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e0a85ed7ebd2e07f65048b3255f6189a3d4d65a56d9c1df41b7b05764ef3bd29/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e0a85ed7ebd2e07f65048b3255f6189a3d4d65a56d9c1df41b7b05764ef3bd29/userdata/shm major:0 minor:821 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e21ecaa295b51fd30f3e30feccdaaffb5d26d81a05305635fb9f903bb9b8a90e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e21ecaa295b51fd30f3e30feccdaaffb5d26d81a05305635fb9f903bb9b8a90e/userdata/shm major:0 minor:475 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/e4c3df22ea5b25cdb4fb25d7746e4d1c319e0fa007db70463be2670c88f00662/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e4c3df22ea5b25cdb4fb25d7746e4d1c319e0fa007db70463be2670c88f00662/userdata/shm major:0 minor:275 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e690a192a3d0aa0e87e9cbde66640402b6c73d23b93fc09f09a46f66f560f7c6/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e690a192a3d0aa0e87e9cbde66640402b6c73d23b93fc09f09a46f66f560f7c6/userdata/shm major:0 minor:1140 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f0660a52e90ffa7a2326892a3e2cda1d66d0d4aba0e60527ee906109c288f588/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f0660a52e90ffa7a2326892a3e2cda1d66d0d4aba0e60527ee906109c288f588/userdata/shm major:0 minor:441 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f37ac8237d1707faf128fbd37cb4fc4383ed09260c056f6f33db8e0a42308015/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f37ac8237d1707faf128fbd37cb4fc4383ed09260c056f6f33db8e0a42308015/userdata/shm major:0 minor:267 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f3cab32904f1f3dc9eae1dc7b47ec8d51b63661baeb9517ad66b59248d52dfef/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f3cab32904f1f3dc9eae1dc7b47ec8d51b63661baeb9517ad66b59248d52dfef/userdata/shm major:0 minor:990 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f5e085e04fcec71a7384a042b53e9f6db9dd0fc0eed95804aa4550ea011dc40a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f5e085e04fcec71a7384a042b53e9f6db9dd0fc0eed95804aa4550ea011dc40a/userdata/shm major:0 minor:480 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/f661c7de8e4aded6ffb76b6f77c2ac0e5ed6e7e0e7ebfcafe40f9c953ec5ee63/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f661c7de8e4aded6ffb76b6f77c2ac0e5ed6e7e0e7ebfcafe40f9c953ec5ee63/userdata/shm major:0 minor:541 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f8120e57311950fccd1253a23002276e099126c35ade58bd1fc3115f27615d8d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f8120e57311950fccd1253a23002276e099126c35ade58bd1fc3115f27615d8d/userdata/shm major:0 minor:592 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f860ea80aed55d2d8aefcd014e94c8e07b481ea1bac54429f957dafad3d193dc/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f860ea80aed55d2d8aefcd014e94c8e07b481ea1bac54429f957dafad3d193dc/userdata/shm major:0 minor:1014 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/fd2c01cdd304d39e575ca69d83c243fee0060006da5d42ff4d10f498f54d4b60/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/fd2c01cdd304d39e575ca69d83c243fee0060006da5d42ff4d10f498f54d4b60/userdata/shm major:0 minor:338 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/fea4d98f3d9db64dd863f1c17ed52c6613cd3bc9028a466c54e0fb69e9d3b0a8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/fea4d98f3d9db64dd863f1c17ed52c6613cd3bc9028a466c54e0fb69e9d3b0a8/userdata/shm major:0 minor:679 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ff2ce08940304b5b606944a45d5884c507d106440aae4429902a0d2f21368070/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ff2ce08940304b5b606944a45d5884c507d106440aae4429902a0d2f21368070/userdata/shm major:0 minor:473 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb/volumes/kubernetes.io~projected/kube-api-access-b66xq:{mountpoint:/var/lib/kubelet/pods/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb/volumes/kubernetes.io~projected/kube-api-access-b66xq major:0 minor:1139 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb/volumes/kubernetes.io~secret/client-ca-bundle:{mountpoint:/var/lib/kubelet/pods/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb/volumes/kubernetes.io~secret/client-ca-bundle major:0 minor:1133 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb/volumes/kubernetes.io~secret/secret-metrics-client-certs:{mountpoint:/var/lib/kubelet/pods/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb/volumes/kubernetes.io~secret/secret-metrics-client-certs major:0 minor:1138 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb/volumes/kubernetes.io~secret/secret-metrics-server-tls:{mountpoint:/var/lib/kubelet/pods/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb/volumes/kubernetes.io~secret/secret-metrics-server-tls major:0 minor:1137 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/03f4bafb-c270-428a-bacf-8a424b3d1a05/volumes/kubernetes.io~projected/kube-api-access-pfdxc:{mountpoint:/var/lib/kubelet/pods/03f4bafb-c270-428a-bacf-8a424b3d1a05/volumes/kubernetes.io~projected/kube-api-access-pfdxc major:0 minor:231 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/03f4bafb-c270-428a-bacf-8a424b3d1a05/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/03f4bafb-c270-428a-bacf-8a424b3d1a05/volumes/kubernetes.io~secret/metrics-tls major:0 minor:427 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0d0cb126-341c-4215-ad2e-a008193cc0b5/volumes/kubernetes.io~secret/tls-certificates:{mountpoint:/var/lib/kubelet/pods/0d0cb126-341c-4215-ad2e-a008193cc0b5/volumes/kubernetes.io~secret/tls-certificates major:0 minor:1003 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/0e52cbdc-1d46-4cc9-85ee-535aa449992f/volumes/kubernetes.io~projected/kube-api-access-xqkqn:{mountpoint:/var/lib/kubelet/pods/0e52cbdc-1d46-4cc9-85ee-535aa449992f/volumes/kubernetes.io~projected/kube-api-access-xqkqn major:0 minor:274 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0f496486-70d5-4c5c-b4f3-6cc19427762f/volumes/kubernetes.io~projected/kube-api-access-l22cn:{mountpoint:/var/lib/kubelet/pods/0f496486-70d5-4c5c-b4f3-6cc19427762f/volumes/kubernetes.io~projected/kube-api-access-l22cn major:0 minor:544 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0f496486-70d5-4c5c-b4f3-6cc19427762f/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/0f496486-70d5-4c5c-b4f3-6cc19427762f/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert major:0 minor:522 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1751db13-b792-43e2-8459-d1d4a0164dfb/volumes/kubernetes.io~projected/kube-api-access-6qshd:{mountpoint:/var/lib/kubelet/pods/1751db13-b792-43e2-8459-d1d4a0164dfb/volumes/kubernetes.io~projected/kube-api-access-6qshd major:0 minor:469 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1751db13-b792-43e2-8459-d1d4a0164dfb/volumes/kubernetes.io~secret/encryption-config:{mountpoint:/var/lib/kubelet/pods/1751db13-b792-43e2-8459-d1d4a0164dfb/volumes/kubernetes.io~secret/encryption-config major:0 minor:431 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1751db13-b792-43e2-8459-d1d4a0164dfb/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/1751db13-b792-43e2-8459-d1d4a0164dfb/volumes/kubernetes.io~secret/etcd-client major:0 minor:468 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1751db13-b792-43e2-8459-d1d4a0164dfb/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/1751db13-b792-43e2-8459-d1d4a0164dfb/volumes/kubernetes.io~secret/serving-cert major:0 minor:398 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/1abf904b-0b8d-4d61-8231-0e8d00933192/volumes/kubernetes.io~projected/kube-api-access-dbdd4:{mountpoint:/var/lib/kubelet/pods/1abf904b-0b8d-4d61-8231-0e8d00933192/volumes/kubernetes.io~projected/kube-api-access-dbdd4 major:0 minor:251 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1abf904b-0b8d-4d61-8231-0e8d00933192/volumes/kubernetes.io~secret/apiservice-cert:{mountpoint:/var/lib/kubelet/pods/1abf904b-0b8d-4d61-8231-0e8d00933192/volumes/kubernetes.io~secret/apiservice-cert major:0 minor:436 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1abf904b-0b8d-4d61-8231-0e8d00933192/volumes/kubernetes.io~secret/node-tuning-operator-tls:{mountpoint:/var/lib/kubelet/pods/1abf904b-0b8d-4d61-8231-0e8d00933192/volumes/kubernetes.io~secret/node-tuning-operator-tls major:0 minor:435 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1bad9e63-1aa2-44a7-aaf8-a0e82f33ad6e/volumes/kubernetes.io~projected/kube-api-access-gkh52:{mountpoint:/var/lib/kubelet/pods/1bad9e63-1aa2-44a7-aaf8-a0e82f33ad6e/volumes/kubernetes.io~projected/kube-api-access-gkh52 major:0 minor:429 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1bb8fea7-71ca-43a3-839d-9c1459bf8dfa/volumes/kubernetes.io~projected/ca-certs:{mountpoint:/var/lib/kubelet/pods/1bb8fea7-71ca-43a3-839d-9c1459bf8dfa/volumes/kubernetes.io~projected/ca-certs major:0 minor:535 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1bb8fea7-71ca-43a3-839d-9c1459bf8dfa/volumes/kubernetes.io~projected/kube-api-access-gh2h6:{mountpoint:/var/lib/kubelet/pods/1bb8fea7-71ca-43a3-839d-9c1459bf8dfa/volumes/kubernetes.io~projected/kube-api-access-gh2h6 major:0 minor:542 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1da0c222-4b59-424f-9817-48673083df00/volumes/kubernetes.io~projected/kube-api-access-txt48:{mountpoint:/var/lib/kubelet/pods/1da0c222-4b59-424f-9817-48673083df00/volumes/kubernetes.io~projected/kube-api-access-txt48 major:0 minor:1161 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/1da0c222-4b59-424f-9817-48673083df00/volumes/kubernetes.io~secret/webhook-certs:{mountpoint:/var/lib/kubelet/pods/1da0c222-4b59-424f-9817-48673083df00/volumes/kubernetes.io~secret/webhook-certs major:0 minor:1157 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1f63cb2f-779f-4fde-bf92-cf0414844a77/volumes/kubernetes.io~projected/kube-api-access-wh9cz:{mountpoint:/var/lib/kubelet/pods/1f63cb2f-779f-4fde-bf92-cf0414844a77/volumes/kubernetes.io~projected/kube-api-access-wh9cz major:0 minor:328 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25/volumes/kubernetes.io~projected/kube-api-access-wllt8:{mountpoint:/var/lib/kubelet/pods/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25/volumes/kubernetes.io~projected/kube-api-access-wllt8 major:0 minor:1078 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25/volumes/kubernetes.io~secret/node-exporter-kube-rbac-proxy-config:{mountpoint:/var/lib/kubelet/pods/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25/volumes/kubernetes.io~secret/node-exporter-kube-rbac-proxy-config major:0 minor:1075 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25/volumes/kubernetes.io~secret/node-exporter-tls:{mountpoint:/var/lib/kubelet/pods/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25/volumes/kubernetes.io~secret/node-exporter-tls major:0 minor:1074 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2ac55f03-dd6f-4ead-bacc-c69aeca146dc/volumes/kubernetes.io~projected/kube-api-access-8d4xz:{mountpoint:/var/lib/kubelet/pods/2ac55f03-dd6f-4ead-bacc-c69aeca146dc/volumes/kubernetes.io~projected/kube-api-access-8d4xz major:0 minor:385 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2b1a69b5-c946-495d-ae02-c56f788279e8/volumes/kubernetes.io~projected/kube-api-access-chnhh:{mountpoint:/var/lib/kubelet/pods/2b1a69b5-c946-495d-ae02-c56f788279e8/volumes/kubernetes.io~projected/kube-api-access-chnhh major:0 minor:247 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/2b1a69b5-c946-495d-ae02-c56f788279e8/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/2b1a69b5-c946-495d-ae02-c56f788279e8/volumes/kubernetes.io~secret/serving-cert major:0 minor:221 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0/volumes/kubernetes.io~projected/kube-api-access-h65c2:{mountpoint:/var/lib/kubelet/pods/2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0/volumes/kubernetes.io~projected/kube-api-access-h65c2 major:0 minor:842 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0/volumes/kubernetes.io~secret/proxy-tls:{mountpoint:/var/lib/kubelet/pods/2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0/volumes/kubernetes.io~secret/proxy-tls major:0 minor:841 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f/volumes/kubernetes.io~projected/kube-api-access-d5knc:{mountpoint:/var/lib/kubelet/pods/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f/volumes/kubernetes.io~projected/kube-api-access-d5knc major:0 minor:244 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f/volumes/kubernetes.io~secret/etcd-client major:0 minor:218 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f/volumes/kubernetes.io~secret/serving-cert major:0 minor:223 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/365dc4ac-fbc8-4589-a799-8327b3ebd0a5/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/365dc4ac-fbc8-4589-a799-8327b3ebd0a5/volumes/kubernetes.io~projected/kube-api-access major:0 minor:245 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/365dc4ac-fbc8-4589-a799-8327b3ebd0a5/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/365dc4ac-fbc8-4589-a799-8327b3ebd0a5/volumes/kubernetes.io~secret/serving-cert major:0 minor:216 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3b4f8517-1e54-4b41-ba6b-6c56fe66831a/volumes/kubernetes.io~projected/kube-api-access-vb4n9:{mountpoint:/var/lib/kubelet/pods/3b4f8517-1e54-4b41-ba6b-6c56fe66831a/volumes/kubernetes.io~projected/kube-api-access-vb4n9 major:0 minor:758 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3b4f8517-1e54-4b41-ba6b-6c56fe66831a/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls:{mountpoint:/var/lib/kubelet/pods/3b4f8517-1e54-4b41-ba6b-6c56fe66831a/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls major:0 minor:81 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab/volumes/kubernetes.io~projected/kube-api-access-44jml:{mountpoint:/var/lib/kubelet/pods/3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab/volumes/kubernetes.io~projected/kube-api-access-44jml major:0 minor:250 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab/volumes/kubernetes.io~secret/serving-cert major:0 minor:217 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3d2e1686-3a30-4021-9c03-02e472bc6ff3/volumes/kubernetes.io~projected/kube-api-access-qv5kd:{mountpoint:/var/lib/kubelet/pods/3d2e1686-3a30-4021-9c03-02e472bc6ff3/volumes/kubernetes.io~projected/kube-api-access-qv5kd major:0 minor:806 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3d2e1686-3a30-4021-9c03-02e472bc6ff3/volumes/kubernetes.io~secret/cert:{mountpoint:/var/lib/kubelet/pods/3d2e1686-3a30-4021-9c03-02e472bc6ff3/volumes/kubernetes.io~secret/cert major:0 minor:795 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/3fee96d7-75a7-46e4-9707-7bd292f10b84/volumes/kubernetes.io~projected/kube-api-access-ntks9:{mountpoint:/var/lib/kubelet/pods/3fee96d7-75a7-46e4-9707-7bd292f10b84/volumes/kubernetes.io~projected/kube-api-access-ntks9 major:0 minor:125 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3fee96d7-75a7-46e4-9707-7bd292f10b84/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert:{mountpoint:/var/lib/kubelet/pods/3fee96d7-75a7-46e4-9707-7bd292f10b84/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert major:0 minor:124 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/401bbef2-684c-4f55-b2c7-e6184c789e40/volumes/kubernetes.io~empty-dir/etc-tuned:{mountpoint:/var/lib/kubelet/pods/401bbef2-684c-4f55-b2c7-e6184c789e40/volumes/kubernetes.io~empty-dir/etc-tuned major:0 minor:540 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/401bbef2-684c-4f55-b2c7-e6184c789e40/volumes/kubernetes.io~empty-dir/tmp:{mountpoint:/var/lib/kubelet/pods/401bbef2-684c-4f55-b2c7-e6184c789e40/volumes/kubernetes.io~empty-dir/tmp major:0 minor:554 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/401bbef2-684c-4f55-b2c7-e6184c789e40/volumes/kubernetes.io~projected/kube-api-access-mcqn9:{mountpoint:/var/lib/kubelet/pods/401bbef2-684c-4f55-b2c7-e6184c789e40/volumes/kubernetes.io~projected/kube-api-access-mcqn9 major:0 minor:555 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/460f09d8-a143-48d2-9db0-be247386984a/volumes/kubernetes.io~projected/kube-api-access-vj8sl:{mountpoint:/var/lib/kubelet/pods/460f09d8-a143-48d2-9db0-be247386984a/volumes/kubernetes.io~projected/kube-api-access-vj8sl major:0 minor:768 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/460f09d8-a143-48d2-9db0-be247386984a/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls:{mountpoint:/var/lib/kubelet/pods/460f09d8-a143-48d2-9db0-be247386984a/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls major:0 minor:767 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/4a829558-a672-4dc5-ae20-69884213482f/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/4a829558-a672-4dc5-ae20-69884213482f/volumes/kubernetes.io~projected/kube-api-access major:0 minor:109 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4ad37f40-c533-4a1e-882a-2e0973eff86d/volumes/kubernetes.io~projected/kube-api-access-6wrq9:{mountpoint:/var/lib/kubelet/pods/4ad37f40-c533-4a1e-882a-2e0973eff86d/volumes/kubernetes.io~projected/kube-api-access-6wrq9 major:0 minor:239 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4ad37f40-c533-4a1e-882a-2e0973eff86d/volumes/kubernetes.io~secret/srv-cert:{mountpoint:/var/lib/kubelet/pods/4ad37f40-c533-4a1e-882a-2e0973eff86d/volumes/kubernetes.io~secret/srv-cert major:0 minor:562 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:238 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b/volumes/kubernetes.io~projected/kube-api-access-z9l64:{mountpoint:/var/lib/kubelet/pods/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b/volumes/kubernetes.io~projected/kube-api-access-z9l64 major:0 minor:235 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b/volumes/kubernetes.io~secret/metrics-tls major:0 minor:433 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4f5539c1-fb87-42d6-b735-6de53421bb6b/volumes/kubernetes.io~projected/kube-api-access-bcl7q:{mountpoint:/var/lib/kubelet/pods/4f5539c1-fb87-42d6-b735-6de53421bb6b/volumes/kubernetes.io~projected/kube-api-access-bcl7q major:0 minor:383 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/4f5539c1-fb87-42d6-b735-6de53421bb6b/volumes/kubernetes.io~secret/signing-key:{mountpoint:/var/lib/kubelet/pods/4f5539c1-fb87-42d6-b735-6de53421bb6b/volumes/kubernetes.io~secret/signing-key major:0 minor:402 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/531e9339-968c-47bf-b8ea-c44d9ceef4b3/volumes/kubernetes.io~projected/kube-api-access-crfg9:{mountpoint:/var/lib/kubelet/pods/531e9339-968c-47bf-b8ea-c44d9ceef4b3/volumes/kubernetes.io~projected/kube-api-access-crfg9 major:0 minor:464 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/531e9339-968c-47bf-b8ea-c44d9ceef4b3/volumes/kubernetes.io~secret/encryption-config:{mountpoint:/var/lib/kubelet/pods/531e9339-968c-47bf-b8ea-c44d9ceef4b3/volumes/kubernetes.io~secret/encryption-config major:0 minor:459 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/531e9339-968c-47bf-b8ea-c44d9ceef4b3/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/531e9339-968c-47bf-b8ea-c44d9ceef4b3/volumes/kubernetes.io~secret/etcd-client major:0 minor:458 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/531e9339-968c-47bf-b8ea-c44d9ceef4b3/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/531e9339-968c-47bf-b8ea-c44d9ceef4b3/volumes/kubernetes.io~secret/serving-cert major:0 minor:457 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/55c8d406-5448-4056-ab3c-c8399217c024/volumes/kubernetes.io~projected/kube-api-access-nljwf:{mountpoint:/var/lib/kubelet/pods/55c8d406-5448-4056-ab3c-c8399217c024/volumes/kubernetes.io~projected/kube-api-access-nljwf major:0 minor:798 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/56e11e7e-6946-4e11-bce9-e91a721fe4a7/volumes/kubernetes.io~projected/kube-api-access-kmxq9:{mountpoint:/var/lib/kubelet/pods/56e11e7e-6946-4e11-bce9-e91a721fe4a7/volumes/kubernetes.io~projected/kube-api-access-kmxq9 major:0 minor:750 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/58333089-2456-4a25-8ba7-6d557eefa177/volumes/kubernetes.io~projected/kube-api-access-hhckc:{mountpoint:/var/lib/kubelet/pods/58333089-2456-4a25-8ba7-6d557eefa177/volumes/kubernetes.io~projected/kube-api-access-hhckc major:0 minor:242 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/58333089-2456-4a25-8ba7-6d557eefa177/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/58333089-2456-4a25-8ba7-6d557eefa177/volumes/kubernetes.io~secret/serving-cert major:0 minor:225 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5a229b84-65bd-493b-90dd-b8194f842dc8/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/5a229b84-65bd-493b-90dd-b8194f842dc8/volumes/kubernetes.io~projected/kube-api-access major:0 minor:424 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5a229b84-65bd-493b-90dd-b8194f842dc8/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/5a229b84-65bd-493b-90dd-b8194f842dc8/volumes/kubernetes.io~secret/serving-cert major:0 minor:471 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65/volumes/kubernetes.io~projected/kube-api-access-stxt7:{mountpoint:/var/lib/kubelet/pods/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65/volumes/kubernetes.io~projected/kube-api-access-stxt7 major:0 minor:1077 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65/volumes/kubernetes.io~secret/openshift-state-metrics-kube-rbac-proxy-config:{mountpoint:/var/lib/kubelet/pods/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65/volumes/kubernetes.io~secret/openshift-state-metrics-kube-rbac-proxy-config major:0 minor:1070 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65/volumes/kubernetes.io~secret/openshift-state-metrics-tls:{mountpoint:/var/lib/kubelet/pods/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65/volumes/kubernetes.io~secret/openshift-state-metrics-tls major:0 minor:1076 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/5cf5a2ef-2498-40a0-a189-0753076fd3b6/volumes/kubernetes.io~projected/kube-api-access-k88m9:{mountpoint:/var/lib/kubelet/pods/5cf5a2ef-2498-40a0-a189-0753076fd3b6/volumes/kubernetes.io~projected/kube-api-access-k88m9 major:0 minor:246 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5cf5a2ef-2498-40a0-a189-0753076fd3b6/volumes/kubernetes.io~secret/marketplace-operator-metrics:{mountpoint:/var/lib/kubelet/pods/5cf5a2ef-2498-40a0-a189-0753076fd3b6/volumes/kubernetes.io~secret/marketplace-operator-metrics major:0 minor:531 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/614f0a0f-5853-4cf6-bd3d-174141f0f1e2/volumes/kubernetes.io~projected/kube-api-access-8v5hl:{mountpoint:/var/lib/kubelet/pods/614f0a0f-5853-4cf6-bd3d-174141f0f1e2/volumes/kubernetes.io~projected/kube-api-access-8v5hl major:0 minor:759 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/614f0a0f-5853-4cf6-bd3d-174141f0f1e2/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/614f0a0f-5853-4cf6-bd3d-174141f0f1e2/volumes/kubernetes.io~secret/serving-cert major:0 minor:735 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6999cf38-e317-4727-98c9-d4e348e9e16a/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/6999cf38-e317-4727-98c9-d4e348e9e16a/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:229 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6999cf38-e317-4727-98c9-d4e348e9e16a/volumes/kubernetes.io~projected/kube-api-access-pwsqr:{mountpoint:/var/lib/kubelet/pods/6999cf38-e317-4727-98c9-d4e348e9e16a/volumes/kubernetes.io~projected/kube-api-access-pwsqr major:0 minor:248 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6999cf38-e317-4727-98c9-d4e348e9e16a/volumes/kubernetes.io~secret/image-registry-operator-tls:{mountpoint:/var/lib/kubelet/pods/6999cf38-e317-4727-98c9-d4e348e9e16a/volumes/kubernetes.io~secret/image-registry-operator-tls major:0 minor:432 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6/volumes/kubernetes.io~projected/kube-api-access-x9fv4:{mountpoint:/var/lib/kubelet/pods/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6/volumes/kubernetes.io~projected/kube-api-access-x9fv4 major:0 minor:1006 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6/volumes/kubernetes.io~secret/default-certificate:{mountpoint:/var/lib/kubelet/pods/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6/volumes/kubernetes.io~secret/default-certificate major:0 minor:1002 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6/volumes/kubernetes.io~secret/metrics-certs:{mountpoint:/var/lib/kubelet/pods/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6/volumes/kubernetes.io~secret/metrics-certs major:0 minor:1005 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6/volumes/kubernetes.io~secret/stats-auth:{mountpoint:/var/lib/kubelet/pods/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6/volumes/kubernetes.io~secret/stats-auth major:0 minor:1004 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6d770808-d390-41c1-a9d9-fc12b99fa9a9/volumes/kubernetes.io~projected/kube-api-access-6rfqt:{mountpoint:/var/lib/kubelet/pods/6d770808-d390-41c1-a9d9-fc12b99fa9a9/volumes/kubernetes.io~projected/kube-api-access-6rfqt major:0 minor:233 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6d770808-d390-41c1-a9d9-fc12b99fa9a9/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls:{mountpoint:/var/lib/kubelet/pods/6d770808-d390-41c1-a9d9-fc12b99fa9a9/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls major:0 minor:561 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/70892c23-554d-466c-a526-90a799439fe0/volumes/kubernetes.io~projected/kube-api-access-kqjt7:{mountpoint:/var/lib/kubelet/pods/70892c23-554d-466c-a526-90a799439fe0/volumes/kubernetes.io~projected/kube-api-access-kqjt7 major:0 minor:729 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/70892c23-554d-466c-a526-90a799439fe0/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/70892c23-554d-466c-a526-90a799439fe0/volumes/kubernetes.io~secret/serving-cert major:0 minor:725 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7317ceda-df6f-4826-aa1a-15304c2b0fcd/volumes/kubernetes.io~projected/kube-api-access-cw6xw:{mountpoint:/var/lib/kubelet/pods/7317ceda-df6f-4826-aa1a-15304c2b0fcd/volumes/kubernetes.io~projected/kube-api-access-cw6xw major:0 minor:995 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7317ceda-df6f-4826-aa1a-15304c2b0fcd/volumes/kubernetes.io~secret/machine-approver-tls:{mountpoint:/var/lib/kubelet/pods/7317ceda-df6f-4826-aa1a-15304c2b0fcd/volumes/kubernetes.io~secret/machine-approver-tls major:0 minor:994 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/786e30f1-d30a-43e1-85cb-d8ea1495422e/volumes/kubernetes.io~projected/kube-api-access-dvglb:{mountpoint:/var/lib/kubelet/pods/786e30f1-d30a-43e1-85cb-d8ea1495422e/volumes/kubernetes.io~projected/kube-api-access-dvglb major:0 minor:1007 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7ad8b9ea-ba1c-4507-9b70-ce2da170d480/volumes/kubernetes.io~projected/kube-api-access-bxk5x:{mountpoint:/var/lib/kubelet/pods/7ad8b9ea-ba1c-4507-9b70-ce2da170d480/volumes/kubernetes.io~projected/kube-api-access-bxk5x major:0 minor:118 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7da68e85-9170-499d-8050-139ecfac4600/volumes/kubernetes.io~projected/kube-api-access-bg5d9:{mountpoint:/var/lib/kubelet/pods/7da68e85-9170-499d-8050-139ecfac4600/volumes/kubernetes.io~projected/kube-api-access-bg5d9 major:0 minor:105 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/815fd565-0609-4d8f-ac05-8656f198b008/volumes/kubernetes.io~projected/kube-api-access-sh6nz:{mountpoint:/var/lib/kubelet/pods/815fd565-0609-4d8f-ac05-8656f198b008/volumes/kubernetes.io~projected/kube-api-access-sh6nz major:0 minor:123 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/815fd565-0609-4d8f-ac05-8656f198b008/volumes/kubernetes.io~secret/metrics-certs:{mountpoint:/var/lib/kubelet/pods/815fd565-0609-4d8f-ac05-8656f198b008/volumes/kubernetes.io~secret/metrics-certs major:0 minor:563 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/84522c03-fd7b-4be7-9413-84e510b9dc5a/volumes/kubernetes.io~projected/kube-api-access-ht8zb:{mountpoint:/var/lib/kubelet/pods/84522c03-fd7b-4be7-9413-84e510b9dc5a/volumes/kubernetes.io~projected/kube-api-access-ht8zb major:0 minor:809 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/84522c03-fd7b-4be7-9413-84e510b9dc5a/volumes/kubernetes.io~secret/cert:{mountpoint:/var/lib/kubelet/pods/84522c03-fd7b-4be7-9413-84e510b9dc5a/volumes/kubernetes.io~secret/cert major:0 minor:808 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/84522c03-fd7b-4be7-9413-84e510b9dc5a/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls:{mountpoint:/var/lib/kubelet/pods/84522c03-fd7b-4be7-9413-84e510b9dc5a/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls major:0 minor:807 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2/volumes/kubernetes.io~projected/kube-api-access-2f9kl:{mountpoint:/var/lib/kubelet/pods/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2/volumes/kubernetes.io~projected/kube-api-access-2f9kl major:0 minor:240 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2/volumes/kubernetes.io~secret/package-server-manager-serving-cert:{mountpoint:/var/lib/kubelet/pods/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2/volumes/kubernetes.io~secret/package-server-manager-serving-cert major:0 minor:559 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9d810f7f-258a-47ce-9f99-7b1d93388aee/volumes/kubernetes.io~projected/kube-api-access-dz874:{mountpoint:/var/lib/kubelet/pods/9d810f7f-258a-47ce-9f99-7b1d93388aee/volumes/kubernetes.io~projected/kube-api-access-dz874 major:0 minor:836 fsType:tmpfs 
blockSize:0} /var/lib/kubelet/pods/9d810f7f-258a-47ce-9f99-7b1d93388aee/volumes/kubernetes.io~secret/proxy-tls:{mountpoint:/var/lib/kubelet/pods/9d810f7f-258a-47ce-9f99-7b1d93388aee/volumes/kubernetes.io~secret/proxy-tls major:0 minor:835 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b/volumes/kubernetes.io~projected/kube-api-access-ll99v:{mountpoint:/var/lib/kubelet/pods/a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b/volumes/kubernetes.io~projected/kube-api-access-ll99v major:0 minor:313 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a68ad726-392e-4a7a-a384-409108df9c8b/volumes/kubernetes.io~projected/kube-api-access-ncncc:{mountpoint:/var/lib/kubelet/pods/a68ad726-392e-4a7a-a384-409108df9c8b/volumes/kubernetes.io~projected/kube-api-access-ncncc major:0 minor:793 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a68ad726-392e-4a7a-a384-409108df9c8b/volumes/kubernetes.io~secret/certs:{mountpoint:/var/lib/kubelet/pods/a68ad726-392e-4a7a-a384-409108df9c8b/volumes/kubernetes.io~secret/certs major:0 minor:784 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a68ad726-392e-4a7a-a384-409108df9c8b/volumes/kubernetes.io~secret/node-bootstrap-token:{mountpoint:/var/lib/kubelet/pods/a68ad726-392e-4a7a-a384-409108df9c8b/volumes/kubernetes.io~secret/node-bootstrap-token major:0 minor:785 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ac523956-c8a3-4794-a1fa-660cd14966bb/volumes/kubernetes.io~projected/kube-api-access-wjcjb:{mountpoint:/var/lib/kubelet/pods/ac523956-c8a3-4794-a1fa-660cd14966bb/volumes/kubernetes.io~projected/kube-api-access-wjcjb major:0 minor:234 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ac523956-c8a3-4794-a1fa-660cd14966bb/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/ac523956-c8a3-4794-a1fa-660cd14966bb/volumes/kubernetes.io~secret/serving-cert major:0 minor:222 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/ae061e84-5e6a-415c-a735-fa14add7318a/volumes/kubernetes.io~projected/kube-api-access-qznbf:{mountpoint:/var/lib/kubelet/pods/ae061e84-5e6a-415c-a735-fa14add7318a/volumes/kubernetes.io~projected/kube-api-access-qznbf major:0 minor:1083 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ae061e84-5e6a-415c-a735-fa14add7318a/volumes/kubernetes.io~secret/kube-state-metrics-kube-rbac-proxy-config:{mountpoint:/var/lib/kubelet/pods/ae061e84-5e6a-415c-a735-fa14add7318a/volumes/kubernetes.io~secret/kube-state-metrics-kube-rbac-proxy-config major:0 minor:1080 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ae061e84-5e6a-415c-a735-fa14add7318a/volumes/kubernetes.io~secret/kube-state-metrics-tls:{mountpoint:/var/lib/kubelet/pods/ae061e84-5e6a-415c-a735-fa14add7318a/volumes/kubernetes.io~secret/kube-state-metrics-tls major:0 minor:1079 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/af391724-079a-4bac-a89e-978ffd471763/volumes/kubernetes.io~projected/kube-api-access-gkl4m:{mountpoint:/var/lib/kubelet/pods/af391724-079a-4bac-a89e-978ffd471763/volumes/kubernetes.io~projected/kube-api-access-gkl4m major:0 minor:138 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/af391724-079a-4bac-a89e-978ffd471763/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/af391724-079a-4bac-a89e-978ffd471763/volumes/kubernetes.io~secret/webhook-cert major:0 minor:139 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b100ce12-965e-409e-8cdb-8f99ef51a82b/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/b100ce12-965e-409e-8cdb-8f99ef51a82b/volumes/kubernetes.io~projected/kube-api-access major:0 minor:232 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b100ce12-965e-409e-8cdb-8f99ef51a82b/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/b100ce12-965e-409e-8cdb-8f99ef51a82b/volumes/kubernetes.io~secret/serving-cert major:0 minor:220 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/b2548aca-4a9d-4670-a60a-0d6361d1c441/volumes/kubernetes.io~projected/kube-api-access-dvvvn:{mountpoint:/var/lib/kubelet/pods/b2548aca-4a9d-4670-a60a-0d6361d1c441/volumes/kubernetes.io~projected/kube-api-access-dvvvn major:0 minor:812 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b94acad3-cf4e-443d-80fb-5e68a4074336/volumes/kubernetes.io~projected/kube-api-access-7tml5:{mountpoint:/var/lib/kubelet/pods/b94acad3-cf4e-443d-80fb-5e68a4074336/volumes/kubernetes.io~projected/kube-api-access-7tml5 major:0 minor:237 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b94acad3-cf4e-443d-80fb-5e68a4074336/volumes/kubernetes.io~secret/srv-cert:{mountpoint:/var/lib/kubelet/pods/b94acad3-cf4e-443d-80fb-5e68a4074336/volumes/kubernetes.io~secret/srv-cert major:0 minor:560 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c1abfb79-2c86-4ccb-bf91-7c48ad8c78d8/volumes/kubernetes.io~projected/kube-api-access-5q6hn:{mountpoint:/var/lib/kubelet/pods/c1abfb79-2c86-4ccb-bf91-7c48ad8c78d8/volumes/kubernetes.io~projected/kube-api-access-5q6hn major:0 minor:243 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c2ce2ea7-bd25-4294-8f3a-11ce53577830/volumes/kubernetes.io~projected/kube-api-access-9qpkj:{mountpoint:/var/lib/kubelet/pods/c2ce2ea7-bd25-4294-8f3a-11ce53577830/volumes/kubernetes.io~projected/kube-api-access-9qpkj major:0 minor:236 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c2ce2ea7-bd25-4294-8f3a-11ce53577830/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/c2ce2ea7-bd25-4294-8f3a-11ce53577830/volumes/kubernetes.io~secret/serving-cert major:0 minor:226 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c7097f64-1709-4f76-a725-5a6c6cc5919b/volumes/kubernetes.io~projected/kube-api-access-zvhx4:{mountpoint:/var/lib/kubelet/pods/c7097f64-1709-4f76-a725-5a6c6cc5919b/volumes/kubernetes.io~projected/kube-api-access-zvhx4 major:0 minor:854 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/c7097f64-1709-4f76-a725-5a6c6cc5919b/volumes/kubernetes.io~secret/machine-api-operator-tls:{mountpoint:/var/lib/kubelet/pods/c7097f64-1709-4f76-a725-5a6c6cc5919b/volumes/kubernetes.io~secret/machine-api-operator-tls major:0 minor:840 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/cbcb0196-be5c-44a4-9749-5df9fbeaa718/volumes/kubernetes.io~projected/kube-api-access-4t8np:{mountpoint:/var/lib/kubelet/pods/cbcb0196-be5c-44a4-9749-5df9fbeaa718/volumes/kubernetes.io~projected/kube-api-access-4t8np major:0 minor:727 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/cbcb0196-be5c-44a4-9749-5df9fbeaa718/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/cbcb0196-be5c-44a4-9749-5df9fbeaa718/volumes/kubernetes.io~secret/serving-cert major:0 minor:724 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7/volumes/kubernetes.io~projected/kube-api-access-2ggmz:{mountpoint:/var/lib/kubelet/pods/ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7/volumes/kubernetes.io~projected/kube-api-access-2ggmz major:0 minor:972 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7/volumes/kubernetes.io~secret/proxy-tls:{mountpoint:/var/lib/kubelet/pods/ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7/volumes/kubernetes.io~secret/proxy-tls major:0 minor:967 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d01c21a1-6c2c-49a7-9d85-254662851838/volumes/kubernetes.io~projected/ca-certs:{mountpoint:/var/lib/kubelet/pods/d01c21a1-6c2c-49a7-9d85-254662851838/volumes/kubernetes.io~projected/ca-certs major:0 minor:543 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d01c21a1-6c2c-49a7-9d85-254662851838/volumes/kubernetes.io~projected/kube-api-access-rt9pm:{mountpoint:/var/lib/kubelet/pods/d01c21a1-6c2c-49a7-9d85-254662851838/volumes/kubernetes.io~projected/kube-api-access-rt9pm major:0 minor:611 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/d01c21a1-6c2c-49a7-9d85-254662851838/volumes/kubernetes.io~secret/catalogserver-certs:{mountpoint:/var/lib/kubelet/pods/d01c21a1-6c2c-49a7-9d85-254662851838/volumes/kubernetes.io~secret/catalogserver-certs major:0 minor:610 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d70f4efb-e61a-4e88-a271-2f4af21ecdf3/volumes/kubernetes.io~projected/kube-api-access-pt6w4:{mountpoint:/var/lib/kubelet/pods/d70f4efb-e61a-4e88-a271-2f4af21ecdf3/volumes/kubernetes.io~projected/kube-api-access-pt6w4 major:0 minor:837 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d70f4efb-e61a-4e88-a271-2f4af21ecdf3/volumes/kubernetes.io~secret/apiservice-cert:{mountpoint:/var/lib/kubelet/pods/d70f4efb-e61a-4e88-a271-2f4af21ecdf3/volumes/kubernetes.io~secret/apiservice-cert major:0 minor:726 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d70f4efb-e61a-4e88-a271-2f4af21ecdf3/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/d70f4efb-e61a-4e88-a271-2f4af21ecdf3/volumes/kubernetes.io~secret/webhook-cert major:0 minor:420 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/db164b32-e20e-4d07-a9ae-98720321621d/volumes/kubernetes.io~projected/kube-api-access-89wj5:{mountpoint:/var/lib/kubelet/pods/db164b32-e20e-4d07-a9ae-98720321621d/volumes/kubernetes.io~projected/kube-api-access-89wj5 major:0 minor:249 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/db164b32-e20e-4d07-a9ae-98720321621d/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/db164b32-e20e-4d07-a9ae-98720321621d/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert major:0 minor:224 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e237ed52-5561-44c5-bcb1-de62691d6431/volumes/kubernetes.io~projected/kube-api-access-t99pg:{mountpoint:/var/lib/kubelet/pods/e237ed52-5561-44c5-bcb1-de62691d6431/volumes/kubernetes.io~projected/kube-api-access-t99pg major:0 minor:69 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/e237ed52-5561-44c5-bcb1-de62691d6431/volumes/kubernetes.io~secret/prometheus-operator-kube-rbac-proxy-config:{mountpoint:/var/lib/kubelet/pods/e237ed52-5561-44c5-bcb1-de62691d6431/volumes/kubernetes.io~secret/prometheus-operator-kube-rbac-proxy-config major:0 minor:56 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e237ed52-5561-44c5-bcb1-de62691d6431/volumes/kubernetes.io~secret/prometheus-operator-tls:{mountpoint:/var/lib/kubelet/pods/e237ed52-5561-44c5-bcb1-de62691d6431/volumes/kubernetes.io~secret/prometheus-operator-tls major:0 minor:309 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e302bc0b-7560-4f84-813f-d966c2dbe47c/volumes/kubernetes.io~projected/kube-api-access-9bmgb:{mountpoint:/var/lib/kubelet/pods/e302bc0b-7560-4f84-813f-d966c2dbe47c/volumes/kubernetes.io~projected/kube-api-access-9bmgb major:0 minor:565 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e302bc0b-7560-4f84-813f-d966c2dbe47c/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/e302bc0b-7560-4f84-813f-d966c2dbe47c/volumes/kubernetes.io~secret/metrics-tls major:0 minor:583 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e3f42081-387d-4798-b981-ac232e851bb4/volumes/kubernetes.io~projected/kube-api-access-smnrc:{mountpoint:/var/lib/kubelet/pods/e3f42081-387d-4798-b981-ac232e851bb4/volumes/kubernetes.io~projected/kube-api-access-smnrc major:0 minor:764 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e3f42081-387d-4798-b981-ac232e851bb4/volumes/kubernetes.io~secret/samples-operator-tls:{mountpoint:/var/lib/kubelet/pods/e3f42081-387d-4798-b981-ac232e851bb4/volumes/kubernetes.io~secret/samples-operator-tls major:0 minor:763 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e76bc134-2a88-4f92-9aa7-f6854941b98f/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/e76bc134-2a88-4f92-9aa7-f6854941b98f/volumes/kubernetes.io~projected/kube-api-access major:0 minor:241 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/e76bc134-2a88-4f92-9aa7-f6854941b98f/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/e76bc134-2a88-4f92-9aa7-f6854941b98f/volumes/kubernetes.io~secret/serving-cert major:0 minor:219 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e78057cd-5120-4a12-934d-9fed51e1bdc0/volumes/kubernetes.io~projected/kube-api-access-zgqmb:{mountpoint:/var/lib/kubelet/pods/e78057cd-5120-4a12-934d-9fed51e1bdc0/volumes/kubernetes.io~projected/kube-api-access-zgqmb major:0 minor:805 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e78057cd-5120-4a12-934d-9fed51e1bdc0/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/e78057cd-5120-4a12-934d-9fed51e1bdc0/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert major:0 minor:776 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e97435ee-522e-427d-9efc-40bc3d2b0d02/volumes/kubernetes.io~projected/kube-api-access-bv9fl:{mountpoint:/var/lib/kubelet/pods/e97435ee-522e-427d-9efc-40bc3d2b0d02/volumes/kubernetes.io~projected/kube-api-access-bv9fl major:0 minor:375 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ec2d22f2-c260-42a6-a9da-ee0f44f42303/volumes/kubernetes.io~projected/kube-api-access-xlzcz:{mountpoint:/var/lib/kubelet/pods/ec2d22f2-c260-42a6-a9da-ee0f44f42303/volumes/kubernetes.io~projected/kube-api-access-xlzcz major:0 minor:91 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ec2d22f2-c260-42a6-a9da-ee0f44f42303/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/ec2d22f2-c260-42a6-a9da-ee0f44f42303/volumes/kubernetes.io~secret/metrics-tls major:0 minor:43 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ef0a3c84-98bb-4915-9010-d66fcbeafe09/volumes/kubernetes.io~projected/kube-api-access-8fstf:{mountpoint:/var/lib/kubelet/pods/ef0a3c84-98bb-4915-9010-d66fcbeafe09/volumes/kubernetes.io~projected/kube-api-access-8fstf major:0 minor:213 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/ef0a3c84-98bb-4915-9010-d66fcbeafe09/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/ef0a3c84-98bb-4915-9010-d66fcbeafe09/volumes/kubernetes.io~secret/serving-cert major:0 minor:209 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9/volume-subpaths/run-systemd/ovnkube-controller/6:{mountpoint:/var/lib/kubelet/pods/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9/volume-subpaths/run-systemd/ovnkube-controller/6 major:0 minor:24 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9/volumes/kubernetes.io~projected/kube-api-access-s99rr:{mountpoint:/var/lib/kubelet/pods/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9/volumes/kubernetes.io~projected/kube-api-access-s99rr major:0 minor:127 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9/volumes/kubernetes.io~secret/ovn-node-metrics-cert:{mountpoint:/var/lib/kubelet/pods/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9/volumes/kubernetes.io~secret/ovn-node-metrics-cert major:0 minor:126 fsType:tmpfs blockSize:0} overlay_0-100:{mountpoint:/var/lib/containers/storage/overlay/1d6841e836f2ca030fbe4707f0c45b8976755ac6da9c8ab6125ae7b4f50b5571/merged major:0 minor:100 fsType:overlay blockSize:0} overlay_0-1001:{mountpoint:/var/lib/containers/storage/overlay/1b165ff378334410d1477b0a9ecfc6f6ffc02757441260a6f23438d8027f506a/merged major:0 minor:1001 fsType:overlay blockSize:0} overlay_0-1016:{mountpoint:/var/lib/containers/storage/overlay/85ea3910c6e2e60d9c6525a4287b5bc6b094a170c918e102fd1d66f858f0efaa/merged major:0 minor:1016 fsType:overlay blockSize:0} overlay_0-1020:{mountpoint:/var/lib/containers/storage/overlay/7ebc3ad9b03bd13de5649e10f32b00da5163491002524b8e34608f5887ec468f/merged major:0 minor:1020 fsType:overlay blockSize:0} overlay_0-1025:{mountpoint:/var/lib/containers/storage/overlay/73177c91bd657dda75f767d2cedd87871d148635731cc9b0483eb2d69b73cdaf/merged major:0 minor:1025 fsType:overlay blockSize:0} 
overlay_0-1027:{mountpoint:/var/lib/containers/storage/overlay/8e19d9339a9e06a5bedd63bf8c341cb2ca8b87190875e237b99cb577e3c8d13b/merged major:0 minor:1027 fsType:overlay blockSize:0} overlay_0-103:{mountpoint:/var/lib/containers/storage/overlay/40c14ef85373f3e6ce41db46b340eabdd8632d207053a144365fea1467e9f497/merged major:0 minor:103 fsType:overlay blockSize:0} overlay_0-1037:{mountpoint:/var/lib/containers/storage/overlay/39d3abba9f91d5223f9d2ba8c95ef4bc1072c58254a266b3f1bdff2e1bf17e36/merged major:0 minor:1037 fsType:overlay blockSize:0} overlay_0-1039:{mountpoint:/var/lib/containers/storage/overlay/e781e4c3cdcf4431601ae3563cb49c06fa38ce6d2f3607047058440e33c57491/merged major:0 minor:1039 fsType:overlay blockSize:0} overlay_0-1047:{mountpoint:/var/lib/containers/storage/overlay/6bc1bf607b2b1466368b848d9925df7bf54682261e43231b6bc477a064c408c7/merged major:0 minor:1047 fsType:overlay blockSize:0} overlay_0-1049:{mountpoint:/var/lib/containers/storage/overlay/5a16616aad590ee95207045a381c9c2775413057e648ca02c53b0e1a0588653a/merged major:0 minor:1049 fsType:overlay blockSize:0} overlay_0-1062:{mountpoint:/var/lib/containers/storage/overlay/37cf176ecd8035d1e91b6f54a553209cd51e2d5384465a205b2c0a823c93a0e3/merged major:0 minor:1062 fsType:overlay blockSize:0} overlay_0-1064:{mountpoint:/var/lib/containers/storage/overlay/a812e7cec33acfab9e424b0fa12a1817e20f77cafe9189edc9ba5f04728b51c1/merged major:0 minor:1064 fsType:overlay blockSize:0} overlay_0-1086:{mountpoint:/var/lib/containers/storage/overlay/e433551c81e1e3e536b5b533f767f52dcb1039405398bde9989f279820e418a6/merged major:0 minor:1086 fsType:overlay blockSize:0} overlay_0-1090:{mountpoint:/var/lib/containers/storage/overlay/85b882d9c2a8ee7d28c53d934ddf31517ab03022886c29292415203d9f913a5a/merged major:0 minor:1090 fsType:overlay blockSize:0} overlay_0-1092:{mountpoint:/var/lib/containers/storage/overlay/a3ccbb39b7f68221c4888efa78ea3f1168930650ee4e317bc56d1a35d0501ac5/merged major:0 minor:1092 fsType:overlay blockSize:0} 
overlay_0-1094:{mountpoint:/var/lib/containers/storage/overlay/48012d31a392baac4af1e62f468085eeb7b449c901b64ab24d3913f51d95283f/merged major:0 minor:1094 fsType:overlay blockSize:0} overlay_0-1100:{mountpoint:/var/lib/containers/storage/overlay/a77d5e3bc1e7896e8b28c82810b1f62825c6627b2436735e41b168287b5c69b5/merged major:0 minor:1100 fsType:overlay blockSize:0} overlay_0-1102:{mountpoint:/var/lib/containers/storage/overlay/09b670b4cc67e7c5b7e0944fda8edba4f54636d2208866cd4465d42fe29a2689/merged major:0 minor:1102 fsType:overlay blockSize:0} overlay_0-1107:{mountpoint:/var/lib/containers/storage/overlay/ba9f5b4211ad51dc56f42b3352efe4b6b129dd525cf24ed2176dfdd9992eb1ca/merged major:0 minor:1107 fsType:overlay blockSize:0} overlay_0-111:{mountpoint:/var/lib/containers/storage/overlay/4277375bbe0ce61b7367dadd18e9b0d805798ece18a9e9bba7862f63a4f3b441/merged major:0 minor:111 fsType:overlay blockSize:0} overlay_0-1112:{mountpoint:/var/lib/containers/storage/overlay/3084b38e5060e6706d37675b129879527ccc0e13e9ef82db1397d695db4bae4e/merged major:0 minor:1112 fsType:overlay blockSize:0} overlay_0-1117:{mountpoint:/var/lib/containers/storage/overlay/de4aef2bb7f5f7b82a4da6dd95750e9cbf3af93ee628adb1a5a85e5b0615913c/merged major:0 minor:1117 fsType:overlay blockSize:0} overlay_0-1119:{mountpoint:/var/lib/containers/storage/overlay/2ea71dbfcd9fd486aad7fcb287bb0ab027819ca2b7dacbb9ad0f2f79d800f623/merged major:0 minor:1119 fsType:overlay blockSize:0} overlay_0-1128:{mountpoint:/var/lib/containers/storage/overlay/d275be61e3e29202c54f6c8e776eb41c367e8127fca45529584609fe33b1b909/merged major:0 minor:1128 fsType:overlay blockSize:0} overlay_0-113:{mountpoint:/var/lib/containers/storage/overlay/84b0224236097ab13e450896930058af3171bf486ae1aed18c97b720b33287d5/merged major:0 minor:113 fsType:overlay blockSize:0} overlay_0-1142:{mountpoint:/var/lib/containers/storage/overlay/063317728e1fead798ff1b45c6eaa73ca023b87faaf84c54f937b3da8bd9e0a9/merged major:0 minor:1142 fsType:overlay blockSize:0} 
overlay_0-1144:{mountpoint:/var/lib/containers/storage/overlay/a8a1586c485aae143625668ad67aa730e82925010ab3e33627e7428990a10597/merged major:0 minor:1144 fsType:overlay blockSize:0} overlay_0-1152:{mountpoint:/var/lib/containers/storage/overlay/2067579f84a7219b568238793dfeb041c55059875d7f6bda14d4b3d5340b6eff/merged major:0 minor:1152 fsType:overlay blockSize:0} overlay_0-1154:{mountpoint:/var/lib/containers/storage/overlay/09f9b60cd0b8984fa75907675e582f2eca823dbdecaf45c1bf31084c969444ec/merged major:0 minor:1154 fsType:overlay blockSize:0} overlay_0-116:{mountpoint:/var/lib/containers/storage/overlay/6caac3bac4436bbd19a3fbc3a9e0414eba6b0e83129413cf63f2bcbd3a0b0c12/merged major:0 minor:116 fsType:overlay blockSize:0} overlay_0-1164:{mountpoint:/var/lib/containers/storage/overlay/2835949b70206c95141e6aa476c9bfc6e0b5107b727703a4d38d8e82c1f1397e/merged major:0 minor:1164 fsType:overlay blockSize:0} overlay_0-1166:{mountpoint:/var/lib/containers/storage/overlay/bbbc5049f602ae9927e32f0b4e9324424cc0f401e0fd778a0b79e9db29866412/merged major:0 minor:1166 fsType:overlay blockSize:0} overlay_0-1168:{mountpoint:/var/lib/containers/storage/overlay/26367880a246dbc47ddee13f7ff9fc35a0c17054f69206dec57920ebfdae619a/merged major:0 minor:1168 fsType:overlay blockSize:0} overlay_0-121:{mountpoint:/var/lib/containers/storage/overlay/7f20b97a50fe58de9ca60734ae0f68a9d31077e4cdad0e3f8fd5762d0d7acd59/merged major:0 minor:121 fsType:overlay blockSize:0} overlay_0-131:{mountpoint:/var/lib/containers/storage/overlay/ee5ac5b49b2fc01de6e2817953974019ef102832ee570cf6a8907eaac81a7257/merged major:0 minor:131 fsType:overlay blockSize:0} overlay_0-134:{mountpoint:/var/lib/containers/storage/overlay/a9606da570bf7bc470010daacf00c94358ed6a5c0a3dabc879817fe291b847ae/merged major:0 minor:134 fsType:overlay blockSize:0} overlay_0-136:{mountpoint:/var/lib/containers/storage/overlay/2fcdc0fba11171fc0a45e0fc3205a3c6480a96ae9c0b723b0e8276e1fdaa3550/merged major:0 minor:136 fsType:overlay blockSize:0} 
overlay_0-140:{mountpoint:/var/lib/containers/storage/overlay/493d1ebd3aa61321335aecdb8f7621ba334eca73215c1b476883adf33c90ca3f/merged major:0 minor:140 fsType:overlay blockSize:0} overlay_0-150:{mountpoint:/var/lib/containers/storage/overlay/f351e59d29694c1c618aa8fabfc5ca5dcdd2abf60e9cbc42f8b30e186ab1aa4f/merged major:0 minor:150 fsType:overlay blockSize:0} overlay_0-152:{mountpoint:/var/lib/containers/storage/overlay/ccb02884071582e9882e6276b7dede458b855adeb8a868b9f96c4c25b89dda64/merged major:0 minor:152 fsType:overlay blockSize:0} overlay_0-154:{mountpoint:/var/lib/containers/storage/overlay/76a75e52e32a86bc6b0cbe99f5756ea92bfcdc810601d4d2f2df8563231b1d0a/merged major:0 minor:154 fsType:overlay blockSize:0} overlay_0-156:{mountpoint:/var/lib/containers/storage/overlay/6ca277f9f39154392193b18ea0e2f89bd8df5cf180b9c411a3c98050a99f4ab4/merged major:0 minor:156 fsType:overlay blockSize:0} overlay_0-157:{mountpoint:/var/lib/containers/storage/overlay/4e80de1fe9faea9dbba1e13f19a260fd6f23d9e795cd4802047f7a954b12dde0/merged major:0 minor:157 fsType:overlay blockSize:0} overlay_0-162:{mountpoint:/var/lib/containers/storage/overlay/0f51ebe2176b0a5c8502eb5ccefc6225e5d1fb90bfb252d305760413b2d830df/merged major:0 minor:162 fsType:overlay blockSize:0} overlay_0-166:{mountpoint:/var/lib/containers/storage/overlay/1eaa7a0ac67d53479f098279e439d1e633f088d68966ad30c6611190e02057de/merged major:0 minor:166 fsType:overlay blockSize:0} overlay_0-171:{mountpoint:/var/lib/containers/storage/overlay/ef4b7fb5200e967d75850d9c3d4a66bcc6cf719e2a7baebcc14059d766eceed8/merged major:0 minor:171 fsType:overlay blockSize:0} overlay_0-174:{mountpoint:/var/lib/containers/storage/overlay/336b4e6785d0623172d86118f5404257b04a82615d08723fe286f1ac97a9069b/merged major:0 minor:174 fsType:overlay blockSize:0} overlay_0-179:{mountpoint:/var/lib/containers/storage/overlay/82c150753f44271a108ef61377e9af687118c4096ecd307c4461ec71abf50940/merged major:0 minor:179 fsType:overlay blockSize:0} 
overlay_0-184:{mountpoint:/var/lib/containers/storage/overlay/009a95d07e5675e796b37311f4e796f76ab20edd6ddead69ce2b5b6be06caba0/merged major:0 minor:184 fsType:overlay blockSize:0} overlay_0-189:{mountpoint:/var/lib/containers/storage/overlay/8dcbe071fe2e30ad6d3b67d17d46f78d6dca145a1a04aeae6d8bb8a229095b49/merged major:0 minor:189 fsType:overlay blockSize:0} overlay_0-194:{mountpoint:/var/lib/containers/storage/overlay/c724356aca1e8f5f9da95f0308e28e74519a2ddae2c7d654e2272b7233ab3a5b/merged major:0 minor:194 fsType:overlay blockSize:0} overlay_0-195:{mountpoint:/var/lib/containers/storage/overlay/3fafcdb8a90b24c3ecf9e183b62d1e882b4a782547351a2bb13ae53b5948d5b1/merged major:0 minor:195 fsType:overlay blockSize:0} overlay_0-204:{mountpoint:/var/lib/containers/storage/overlay/e4e97cf8e78293e4e1517301c44179463e10ad30a2615838979215e46ca4ad31/merged major:0 minor:204 fsType:overlay blockSize:0} overlay_0-227:{mountpoint:/var/lib/containers/storage/overlay/8c81f9d4cc7a8b8b9d0ce5e446ba4ef8f5ead28044ad67f0c0487ebcc831a7d0/merged major:0 minor:227 fsType:overlay blockSize:0} overlay_0-277:{mountpoint:/var/lib/containers/storage/overlay/590371c6acd7d9b35e6940c1fa3e224ea8b1f2415b86492f8a0c210bda289471/merged major:0 minor:277 fsType:overlay blockSize:0} overlay_0-279:{mountpoint:/var/lib/containers/storage/overlay/78fdfaad447a4c8cc0c1ab588df9cba56b432bf7f456c61db1904eee8cf7f5cf/merged major:0 minor:279 fsType:overlay blockSize:0} overlay_0-281:{mountpoint:/var/lib/containers/storage/overlay/898e9d7a038ebbdd876055c07b3313cdd824a48f79e70c24729b49592fe24cb9/merged major:0 minor:281 fsType:overlay blockSize:0} overlay_0-283:{mountpoint:/var/lib/containers/storage/overlay/0e9df9befa6dcc6396f6e84e5453bc78694609435cc9b5eb0bb5704a9aa677c0/merged major:0 minor:283 fsType:overlay blockSize:0} overlay_0-285:{mountpoint:/var/lib/containers/storage/overlay/96b17ea955812c320c4082272f8615ba756ef2a8a8613b82a765b99a568e4f33/merged major:0 minor:285 fsType:overlay blockSize:0} 
overlay_0-287:{mountpoint:/var/lib/containers/storage/overlay/848976b7e3bce10fd05acc59b0503244207440508a5fa13b758d58ebe94d7769/merged major:0 minor:287 fsType:overlay blockSize:0} overlay_0-289:{mountpoint:/var/lib/containers/storage/overlay/f0f93abd77eaed83efd65183a638b4dd15c4482efd8f9622c17ce1c8be9e3b27/merged major:0 minor:289 fsType:overlay blockSize:0} overlay_0-291:{mountpoint:/var/lib/containers/storage/overlay/e79c5ea5da9f2aafa3183e940ba72e1dd5ab17a148fc1429a23441784411b53c/merged major:0 minor:291 fsType:overlay blockSize:0} overlay_0-293:{mountpoint:/var/lib/containers/storage/overlay/ef13e7cfc28e06cd979113de900ddabc3d40fbbba7bb273ede1c73e6c2274db4/merged major:0 minor:293 fsType:overlay blockSize:0} overlay_0-295:{mountpoint:/var/lib/containers/storage/overlay/6ac13c9cbbec1989d5b7fed65fb097ef697af8ace91a1156831ac485a42b80fa/merged major:0 minor:295 fsType:overlay blockSize:0} overlay_0-297:{mountpoint:/var/lib/containers/storage/overlay/c535dc0b54f1b137c93969c2de24f6b3c2fa73b5094dc4442b7994052d9fb86c/merged major:0 minor:297 fsType:overlay blockSize:0} overlay_0-299:{mountpoint:/var/lib/containers/storage/overlay/9789645af9097533eece7d492c39db240e9660b27d08c1bfcb97828f272b7f21/merged major:0 minor:299 fsType:overlay blockSize:0} overlay_0-303:{mountpoint:/var/lib/containers/storage/overlay/a73b0f81296daf71702ff7a86f4afa7ff55025c44ebbefde0aaedb7153254448/merged major:0 minor:303 fsType:overlay blockSize:0} overlay_0-305:{mountpoint:/var/lib/containers/storage/overlay/1a50c05ea3c552bcd1cbc8d4f887d664275d7c7df645b51b947bf5cf02067ebd/merged major:0 minor:305 fsType:overlay blockSize:0} overlay_0-307:{mountpoint:/var/lib/containers/storage/overlay/4f4807ed4ae9f8ca3c168913db500043bf19ed214a857faa76ac1f6ab7f8b4b0/merged major:0 minor:307 fsType:overlay blockSize:0} overlay_0-311:{mountpoint:/var/lib/containers/storage/overlay/436ca6144b12c0a620aa6323fb79a8dc4238a10fe6d627c1508defeddacb9cd1/merged major:0 minor:311 fsType:overlay blockSize:0} 
overlay_0-327:{mountpoint:/var/lib/containers/storage/overlay/636f371dfb49b0a76c03ea0edb4962842b8e96ef6770c4ae894b90a9b9f59c95/merged major:0 minor:327 fsType:overlay blockSize:0} overlay_0-330:{mountpoint:/var/lib/containers/storage/overlay/9ef745e0143a4e949192ce596dd0584563bc2f0fa509691f265eddf44ffb237e/merged major:0 minor:330 fsType:overlay blockSize:0} overlay_0-342:{mountpoint:/var/lib/containers/storage/overlay/e8b5ef73b11c57554b59164c80e59d8effc5776bfbd87ebd6c707d78d926c0d7/merged major:0 minor:342 fsType:overlay blockSize:0} overlay_0-343:{mountpoint:/var/lib/containers/storage/overlay/608f0ee615b3b49fa941ae74b27a4c937132c6e7ffa735385a384ff1fc7d2827/merged major:0 minor:343 fsType:overlay blockSize:0} overlay_0-348:{mountpoint:/var/lib/containers/storage/overlay/fd115db9ef6ce44590152230feb8480af46a9e50bdca0935c6318819d6673f2c/merged major:0 minor:348 fsType:overlay blockSize:0} overlay_0-357:{mountpoint:/var/lib/containers/storage/overlay/afe7f3fe9514b53e41cc430758bc893971760daeb611ee74f5f333bab6d89e68/merged major:0 minor:357 fsType:overlay blockSize:0} overlay_0-378:{mountpoint:/var/lib/containers/storage/overlay/b576a0286800361eb315521a37d131d5edc6a1d5c11894a29a0ec70a9725ed5c/merged major:0 minor:378 fsType:overlay blockSize:0} overlay_0-388:{mountpoint:/var/lib/containers/storage/overlay/c71210e1824e5167fd93dee3673297db96ed7e630fb19d0b9cdd1bf8729a9e54/merged major:0 minor:388 fsType:overlay blockSize:0} overlay_0-394:{mountpoint:/var/lib/containers/storage/overlay/c4d6a2fd3a4acb9ba8316fdc550bfe29f1ba0b1607b42f735ca756b5f5187840/merged major:0 minor:394 fsType:overlay blockSize:0} overlay_0-396:{mountpoint:/var/lib/containers/storage/overlay/4ff2ae9a640888a1d57e302f469dd43dd1b3ec87c8af8e0aa4b98de0724236f1/merged major:0 minor:396 fsType:overlay blockSize:0} overlay_0-412:{mountpoint:/var/lib/containers/storage/overlay/236c7e40f8243a934ad2c21edcb390bda0751a27dcf7ca8a36f3e2983fc18802/merged major:0 minor:412 fsType:overlay blockSize:0} 
overlay_0-414:{mountpoint:/var/lib/containers/storage/overlay/e6a74e2e162a98e28af21880511ce62906cd54da5845f7bc03ceb358ba65a69d/merged major:0 minor:414 fsType:overlay blockSize:0} overlay_0-425:{mountpoint:/var/lib/containers/storage/overlay/e5283acfd6026eee6d977c7e479d61f1e427b9c603e82f622086e3669d24886b/merged major:0 minor:425 fsType:overlay blockSize:0} overlay_0-426:{mountpoint:/var/lib/containers/storage/overlay/a3062ca96a6e772be6724dd714ecbb3fa66e8cb5f9fb432487cd5e98e072e466/merged major:0 minor:426 fsType:overlay blockSize:0} overlay_0-44:{mountpoint:/var/lib/containers/storage/overlay/f797e90aaf92e52afe82bbd0c41eb56461ff17b12299db05412b540544e632b9/merged major:0 minor:44 fsType:overlay blockSize:0} overlay_0-443:{mountpoint:/var/lib/containers/storage/overlay/bb9b27b36ea4e9a397fcedf61ce7194f6e13f5e7db63bc843ea3eb6021399216/merged major:0 minor:443 fsType:overlay blockSize:0} overlay_0-447:{mountpoint:/var/lib/containers/storage/overlay/5cb30953344bc4cc849ac4b8f725708e0b9aaddf4486644abe748b01f884c5f8/merged major:0 minor:447 fsType:overlay blockSize:0} overlay_0-449:{mountpoint:/var/lib/containers/storage/overlay/9099db5eaa7028414911338bce8e04390eccb9d262249dad226e38a49dd969db/merged major:0 minor:449 fsType:overlay blockSize:0} overlay_0-451:{mountpoint:/var/lib/containers/storage/overlay/78d71c71a15cc908a9ca30c99297d05c654f11850cf6f40d17e1f5e4f9a0d4e0/merged major:0 minor:451 fsType:overlay blockSize:0} overlay_0-453:{mountpoint:/var/lib/containers/storage/overlay/37fe4cbcb634e51681287c5353f92c68b1f64837c89f05012e53ed8fd785712d/merged major:0 minor:453 fsType:overlay blockSize:0} overlay_0-455:{mountpoint:/var/lib/containers/storage/overlay/8e19d9388e0fcbfe7347b63ee6e4cd6288e99ef1c3697a150d0b8b505acddb58/merged major:0 minor:455 fsType:overlay blockSize:0} overlay_0-466:{mountpoint:/var/lib/containers/storage/overlay/3a731e24984eb036ec9d3c1bbd79f49397abd7cdefc262ed9947923c899cfbaa/merged major:0 minor:466 fsType:overlay blockSize:0} 
overlay_0-474:{mountpoint:/var/lib/containers/storage/overlay/a628051b78dcb3fff01fa93f77da0e616402b61e557d88eade7993958f26656a/merged major:0 minor:474 fsType:overlay blockSize:0} overlay_0-481:{mountpoint:/var/lib/containers/storage/overlay/c2309543f96767ce7bc1e666d9fd61d8e86da29148099eec0e36411ef650a283/merged major:0 minor:481 fsType:overlay blockSize:0} overlay_0-483:{mountpoint:/var/lib/containers/storage/overlay/f6e25545bcd67cb867eaebe47b27da53aa36b1aa20975f63b4168d1f79104672/merged major:0 minor:483 fsType:overlay blockSize:0} overlay_0-487:{mountpoint:/var/lib/containers/storage/overlay/361288b49c228546b90e944749855ff4a848f35b5798d2bf52711d555724f025/merged major:0 minor:487 fsType:overlay blockSize:0} overlay_0-489:{mountpoint:/var/lib/containers/storage/overlay/ee57779fcbe9fdaae7459ef68b2f32fc02bd9f096bf8511d1af3aa0f63114eef/merged major:0 minor:489 fsType:overlay blockSize:0} overlay_0-49:{mountpoint:/var/lib/containers/storage/overlay/f7740c38df9e5a16e18abc1311fec5a98ccfbc8b054f6d9286543a0dc9874c6b/merged major:0 minor:49 fsType:overlay blockSize:0} overlay_0-490:{mountpoint:/var/lib/containers/storage/overlay/0fa69a335f4e8effff9e4a320dd6335890c40f5ad6faaa9263ebbc4f05f52111/merged major:0 minor:490 fsType:overlay blockSize:0} overlay_0-492:{mountpoint:/var/lib/containers/storage/overlay/828f7e1b4d1cd6fee9fd7c8c6f8cdfbc51c711bb744a938168e6e94405f1c02e/merged major:0 minor:492 fsType:overlay blockSize:0} overlay_0-493:{mountpoint:/var/lib/containers/storage/overlay/d02eede380c86495d6824e128e061842e07db018b775bf68ff64a993732d2995/merged major:0 minor:493 fsType:overlay blockSize:0} overlay_0-496:{mountpoint:/var/lib/containers/storage/overlay/c2cbc5fa36d84fd5dcc24b243563e4e789efcdeb3c8eb7b52864d26139e792b6/merged major:0 minor:496 fsType:overlay blockSize:0} overlay_0-512:{mountpoint:/var/lib/containers/storage/overlay/eaf59ee7452d6a5b424a03de191891ff329c03465bb526fa3ebd03d2e93fad9e/merged major:0 minor:512 fsType:overlay blockSize:0} 
overlay_0-518:{mountpoint:/var/lib/containers/storage/overlay/0a5e792ab652777fedb77bf7771fe10d692ecf1f78ac689ed293cb351382bd68/merged major:0 minor:518 fsType:overlay blockSize:0} overlay_0-523:{mountpoint:/var/lib/containers/storage/overlay/67a1b4cf357fa095d0ee506c59a6c354a7bb5e2d5cbfa305c4d65ccbf6d1718e/merged major:0 minor:523 fsType:overlay blockSize:0} overlay_0-527:{mountpoint:/var/lib/containers/storage/overlay/3c32edfa39e3d21ea647c7e83bdb1809e94507290d29a41bbdd9204689e7a2e7/merged major:0 minor:527 fsType:overlay blockSize:0} overlay_0-529:{mountpoint:/var/lib/containers/storage/overlay/2d243e766d647eee81781ba70e991bf6b2572fab78ffe7f0218575a5916ae29e/merged major:0 minor:529 fsType:overlay blockSize:0} overlay_0-53:{mountpoint:/var/lib/containers/storage/overlay/e66d0635d6371a028754efe854365a85e4f76e5ba6c1a6a827a5ee64a74bbfb4/merged major:0 minor:53 fsType:overlay blockSize:0} overlay_0-533:{mountpoint:/var/lib/containers/storage/overlay/8b188b851641d92ca048b40f7715e55176598e3443766480387ed6a47a134ca4/merged major:0 minor:533 fsType:overlay blockSize:0} overlay_0-54:{mountpoint:/var/lib/containers/storage/overlay/2c6c9528f460788a2f23113aec61c1a0b7b824db93a9dd16a5cff35983c31062/merged major:0 minor:54 fsType:overlay blockSize:0} overlay_0-547:{mountpoint:/var/lib/containers/storage/overlay/c524dd40acc95175f2bd60e7ea76a351fd21383547006d01eeb2cec7d370fcef/merged major:0 minor:547 fsType:overlay blockSize:0} overlay_0-548:{mountpoint:/var/lib/containers/storage/overlay/e5b8c6ac9a0660b235f3609b91706dea98ab58a847b4d50f23f54427681a0e12/merged major:0 minor:548 fsType:overlay blockSize:0} overlay_0-57:{mountpoint:/var/lib/containers/storage/overlay/95e6a7cd1801f1736fb828827e9ff2edf55411eab805888b7c0cf257e2b3b6fb/merged major:0 minor:57 fsType:overlay blockSize:0} overlay_0-578:{mountpoint:/var/lib/containers/storage/overlay/7a1fac24f68a32e1c495456b10d2adafff8a92e2c92ea450730d1a5294ac5cbe/merged major:0 minor:578 fsType:overlay blockSize:0} 
overlay_0-580:{mountpoint:/var/lib/containers/storage/overlay/34bd7dc8a5692c28f8373b5546fa802d9148331145992e68f844bb82ccb440cf/merged major:0 minor:580 fsType:overlay blockSize:0} overlay_0-582:{mountpoint:/var/lib/containers/storage/overlay/f321ffe789b9fb491e5cf498669d9cf38efb698e61c77f8c59266d71f71fa8b0/merged major:0 minor:582 fsType:overlay blockSize:0} overlay_0-585:{mountpoint:/var/lib/containers/storage/overlay/6b04d64560e22f313453f513a626767908049abc56495c070c5538a058d6518c/merged major:0 minor:585 fsType:overlay blockSize:0} overlay_0-587:{mountpoint:/var/lib/containers/storage/overlay/0e524b00edec0e3e9ac3380527fffcec1ac2a88d60fda70440ead3c99bc86fc1/merged major:0 minor:587 fsType:overlay blockSize:0} overlay_0-589:{mountpoint:/var/lib/containers/storage/overlay/81f9479df5dce29688d773cfcdebc7ffe65bb76ab52edfd40e20000db295af68/merged major:0 minor:589 fsType:overlay blockSize:0} overlay_0-594:{mountpoint:/var/lib/containers/storage/overlay/af64c76b1c2ed5818c1a52b4703192cf513c214465fc44e13a11eeac18111df8/merged major:0 minor:594 fsType:overlay blockSize:0} overlay_0-596:{mountpoint:/var/lib/containers/storage/overlay/4dfb2556c2ea480b86fa90355346fa2bc56673cde81ecd1b3c5875977bba9d7a/merged major:0 minor:596 fsType:overlay blockSize:0} overlay_0-60:{mountpoint:/var/lib/containers/storage/overlay/693ff179785d8bcb514b5cce1cd8fa72974066f2a2ebce268e2417e84a82d330/merged major:0 minor:60 fsType:overlay blockSize:0} overlay_0-602:{mountpoint:/var/lib/containers/storage/overlay/7b50599412bf99df764bc5127c74e74c6dcbfc9127957578395cf3fe36d9914b/merged major:0 minor:602 fsType:overlay blockSize:0} overlay_0-605:{mountpoint:/var/lib/containers/storage/overlay/bd7af0e35940b901e9a5b8eb481bfa1884c1ad92abeaf0461f4c0b73b378f0ed/merged major:0 minor:605 fsType:overlay blockSize:0} overlay_0-618:{mountpoint:/var/lib/containers/storage/overlay/4e94d96c15ecf9225dab45f62dd84409e9a9eda2827c3d631492398d9f764dec/merged major:0 minor:618 fsType:overlay blockSize:0} 
overlay_0-626:{mountpoint:/var/lib/containers/storage/overlay/ceb99293bb8921a26652d573b40971b8ecbff482572130aaa98c5b274f37408f/merged major:0 minor:626 fsType:overlay blockSize:0} overlay_0-629:{mountpoint:/var/lib/containers/storage/overlay/1929eb9c53b45623f3b22071758a5ec45bda819bfe08a9cb244e097772c9751f/merged major:0 minor:629 fsType:overlay blockSize:0} overlay_0-63:{mountpoint:/var/lib/containers/storage/overlay/6c8aab46755bb1d52d89ca6bec8cb5b832f6cc27bed2250c822da08fbf80892d/merged major:0 minor:63 fsType:overlay blockSize:0} overlay_0-632:{mountpoint:/var/lib/containers/storage/overlay/ca17bde0efe6be0ea0cdb87ac861701cef04b6cc64373de5aa724147586c05ff/merged major:0 minor:632 fsType:overlay blockSize:0} overlay_0-633:{mountpoint:/var/lib/containers/storage/overlay/df522d39c739c322bcf78c326ec6b7c8a561d6d44468e92e6a41f9f8c6ffd0e1/merged major:0 minor:633 fsType:overlay blockSize:0} overlay_0-635:{mountpoint:/var/lib/containers/storage/overlay/29a5709078f5c2fdbc30987b4b0adf0f13eb5e41344064cb3d1c1fb7b975c3a4/merged major:0 minor:635 fsType:overlay blockSize:0} overlay_0-637:{mountpoint:/var/lib/containers/storage/overlay/b39962df1eb7b8903d7953a89aeaad19064a59e6c00676543dc78e08d52a7f5c/merged major:0 minor:637 fsType:overlay blockSize:0} overlay_0-64:{mountpoint:/var/lib/containers/storage/overlay/6536b00f55f27d1522704ade4fcab24bb3acd8f831623c1b2e972bba9cd3b3a2/merged major:0 minor:64 fsType:overlay blockSize:0} overlay_0-640:{mountpoint:/var/lib/containers/storage/overlay/e0ec97407efdfdbbbf696ac2948ad55ff2b0698ca1d5377cae74c3e3b6f40c81/merged major:0 minor:640 fsType:overlay blockSize:0} overlay_0-648:{mountpoint:/var/lib/containers/storage/overlay/cd95d24981e9ee0be871288e617f504f20b29c01ede10dab865ba79c9e82166a/merged major:0 minor:648 fsType:overlay blockSize:0} overlay_0-650:{mountpoint:/var/lib/containers/storage/overlay/1a403bfe93850593025719bfdf5be79f679584e0354945bea30310b4bed3ba3f/merged major:0 minor:650 fsType:overlay blockSize:0} 
overlay_0-652:{mountpoint:/var/lib/containers/storage/overlay/2febd73b6ab06dade4d0a1acc8b40ab930581de174a0eb032840d5b4aa236482/merged major:0 minor:652 fsType:overlay blockSize:0} overlay_0-659:{mountpoint:/var/lib/containers/storage/overlay/ce2c1c6b713efd29728f1a4a02f7fbafccd75c598db95e93c51cbac4bad5853f/merged major:0 minor:659 fsType:overlay blockSize:0} overlay_0-662:{mountpoint:/var/lib/containers/storage/overlay/7c0ea916ea3e25831c3c7952a742126c02379cad0427ec829e2f5639627b2895/merged major:0 minor:662 fsType:overlay blockSize:0} overlay_0-68:{mountpoint:/var/lib/containers/storage/overlay/74a460aac747b9740851df5f8ae881bec5fc4a601a559c2294e492e8daf03629/merged major:0 minor:68 fsType:overlay blockSize:0} overlay_0-686:{mountpoint:/var/lib/containers/storage/overlay/1858e5e62f5a53c8f25ad0cf1e4c682fd9eae2cd62c928c877d4f2405e79c20e/merged major:0 minor:686 fsType:overlay blockSize:0} overlay_0-70:{mountpoint:/var/lib/containers/storage/overlay/5798cc04dcff2b97c8f1a60964ee826dba33066864d64bfaa6f23ecd9da9d2ca/merged major:0 minor:70 fsType:overlay blockSize:0} overlay_0-702:{mountpoint:/var/lib/containers/storage/overlay/941491e8c3dae092088f8701a4ba1cd8ea2328314a8ea01193088a3c0156e652/merged major:0 minor:702 fsType:overlay blockSize:0} overlay_0-72:{mountpoint:/var/lib/containers/storage/overlay/f1c3e4f6e8945db9ffd326342d2ff61a7a4b810051db700b0762a45197001f0e/merged major:0 minor:72 fsType:overlay blockSize:0} overlay_0-720:{mountpoint:/var/lib/containers/storage/overlay/5ce5468d0c738cdb76ab7a6998ba944837be669a4d3ce006807a0c34562f6429/merged major:0 minor:720 fsType:overlay blockSize:0} overlay_0-722:{mountpoint:/var/lib/containers/storage/overlay/a07c568e3f94679debb56be091f27d11e856a74873a3600bdb2e386ef1ca2c83/merged major:0 minor:722 fsType:overlay blockSize:0} overlay_0-74:{mountpoint:/var/lib/containers/storage/overlay/76e451b7612ea57247c5e86e6495c8948b4bd21401283664a6cc2e9046b6aa8d/merged major:0 minor:74 fsType:overlay blockSize:0} 
overlay_0-742:{mountpoint:/var/lib/containers/storage/overlay/170706ef7971f39a93d1ce97c620dfc043b156313bc3636996129f803b72f574/merged major:0 minor:742 fsType:overlay blockSize:0} overlay_0-744:{mountpoint:/var/lib/containers/storage/overlay/9568a0b57b43f75ed2512f997d1214ace99d7034f32adb4519363c7dae8875b0/merged major:0 minor:744 fsType:overlay blockSize:0} overlay_0-745:{mountpoint:/var/lib/containers/storage/overlay/57955660b82a52de2e37f5a7efeeb16493618917223bd20225242b2670e0db8a/merged major:0 minor:745 fsType:overlay blockSize:0} overlay_0-753:{mountpoint:/var/lib/containers/storage/overlay/d87b25b3ecdca4dac9469d78f661a280baebf73cc779fd04d79cd0acee97a346/merged major:0 minor:753 fsType:overlay blockSize:0} overlay_0-76:{mountpoint:/var/lib/containers/storage/overlay/55f94783fc2c668f413df1742319f6976358b9ea16f2437378d840f58923324f/merged major:0 minor:76 fsType:overlay blockSize:0} overlay_0-770:{mountpoint:/var/lib/containers/storage/overlay/b92f2324ebdc729f066fe4e659afcfc179eec8259d05c38b1eca822fd8cd058a/merged major:0 minor:770 fsType:overlay blockSize:0} overlay_0-772:{mountpoint:/var/lib/containers/storage/overlay/6ec9c8966cb6212d8a91c27cf35cff104752c01bd9d1fb6ea2f6092b1f1e4097/merged major:0 minor:772 fsType:overlay blockSize:0} overlay_0-774:{mountpoint:/var/lib/containers/storage/overlay/4332e001b43b2469b74ac2effc0852c77a13608c643f59542801e285f8cbeee0/merged major:0 minor:774 fsType:overlay blockSize:0} overlay_0-778:{mountpoint:/var/lib/containers/storage/overlay/c8b2caa08eb275dd50512e398406d7d02bca411d590c6a31509c9dc74a776efc/merged major:0 minor:778 fsType:overlay blockSize:0} overlay_0-78:{mountpoint:/var/lib/containers/storage/overlay/a5228a6a216820692deafab6ddb318de5065a3ebf898a7a811df6d23dc04e53b/merged major:0 minor:78 fsType:overlay blockSize:0} overlay_0-780:{mountpoint:/var/lib/containers/storage/overlay/660b7f6f80c373afd12ce281ae11919f789c7c1b25a5e53039799b9b9501c09a/merged major:0 minor:780 fsType:overlay blockSize:0} 
overlay_0-787:{mountpoint:/var/lib/containers/storage/overlay/fc3b870c72a186781102ec269a435fc7a72b5a93eb3a0da1dc672f74d144e545/merged major:0 minor:787 fsType:overlay blockSize:0} overlay_0-789:{mountpoint:/var/lib/containers/storage/overlay/8f19963c7c352b2be4433df376169d2c334fb6ce8c748253617503693a5d1449/merged major:0 minor:789 fsType:overlay blockSize:0} overlay_0-79:{mountpoint:/var/lib/containers/storage/overlay/212d55de77eb88aea4b50dd2e803930ef0dc6089d94f893987db2b215cbcc159/merged major:0 minor:79 fsType:overlay blockSize:0} overlay_0-794:{mountpoint:/var/lib/containers/storage/overlay/53db59a9ed9168d31376bb4d22a6a131526a3ea8c764fc6a059f9f3d1fa30dcf/merged major:0 minor:794 fsType:overlay blockSize:0} overlay_0-796:{mountpoint:/var/lib/containers/storage/overlay/b731b99d94460816b375db1c112357472207e10a64df35b74a9fca2bcddd964d/merged major:0 minor:796 fsType:overlay blockSize:0} overlay_0-803:{mountpoint:/var/lib/containers/storage/overlay/183ec9b74ebb2a9f5d80b574dbb5ebaffec0f95180cd4ff634fa4f101f34ae9c/merged major:0 minor:803 fsType:overlay blockSize:0} overlay_0-813:{mountpoint:/var/lib/containers/storage/overlay/e22d3efe07f476de96a913fcec9821029fe968c154265258fd6691cfb5e089a6/merged major:0 minor:813 fsType:overlay blockSize:0} overlay_0-819:{mountpoint:/var/lib/containers/storage/overlay/085180fe07e460f596a682e420db389e9d90927741b5a1a5bcfbe4ea621e3ec9/merged major:0 minor:819 fsType:overlay blockSize:0} overlay_0-82:{mountpoint:/var/lib/containers/storage/overlay/bbfc423ac6ab7da163cca65130c7888634de291dcc4b3016162a735e3100ea0d/merged major:0 minor:82 fsType:overlay blockSize:0} overlay_0-826:{mountpoint:/var/lib/containers/storage/overlay/8fc872e5454101e7af10480d734e9477821deadb8d7262a43a9aa66c33fbd82b/merged major:0 minor:826 fsType:overlay blockSize:0} overlay_0-831:{mountpoint:/var/lib/containers/storage/overlay/91a7de04fc431142dad99d9da0aa456e903c2b9762643d5944fbb45bc333da24/merged major:0 minor:831 fsType:overlay blockSize:0} 
overlay_0-833:{mountpoint:/var/lib/containers/storage/overlay/4757660ed79780a70d96bb23dbb5cadd5c84268255f146f01dcc1d531d9b4bcc/merged major:0 minor:833 fsType:overlay blockSize:0} overlay_0-847:{mountpoint:/var/lib/containers/storage/overlay/0f38c7303db7e446083e1dbef6be359d82e33fe406e5353e6edfbd5b1a959fe3/merged major:0 minor:847 fsType:overlay blockSize:0} overlay_0-852:{mountpoint:/var/lib/containers/storage/overlay/5c857782bc60f6da5d53c3e6ce8c909367b912c27eae7a2ac88016ec04fc2876/merged major:0 minor:852 fsType:overlay blockSize:0} overlay_0-857:{mountpoint:/var/lib/containers/storage/overlay/5eee8dfe6fdc21e58ba7c3e9f26b02a860a2b3263f64e363491db1607e265751/merged major:0 minor:857 fsType:overlay blockSize:0} overlay_0-859:{mountpoint:/var/lib/containers/storage/overlay/f7994d2756c38e71b56c5a0187f9b9eaaf20a93623f4baa42b3153d9da28c2c2/merged major:0 minor:859 fsType:overlay blockSize:0} overlay_0-861:{mountpoint:/var/lib/containers/storage/overlay/e5f0774775fcd7e84ca317768e20c860dfeb09a5a7b8186c48908d5773957bdd/merged major:0 minor:861 fsType:overlay blockSize:0} overlay_0-862:{mountpoint:/var/lib/containers/storage/overlay/2e393a9e31e1a59ec778dc802d94cb662d15b390dcbf4b4ba38d40e1d7961740/merged major:0 minor:862 fsType:overlay blockSize:0} overlay_0-868:{mountpoint:/var/lib/containers/storage/overlay/c95a778028b731c405409773e94bcefb2892d2c079f6ad9f90c8fcb60ffbf8ed/merged major:0 minor:868 fsType:overlay blockSize:0} overlay_0-870:{mountpoint:/var/lib/containers/storage/overlay/c397f53229fdb39161318efebca50275f828f2cab8579e83a9424d227de55e4c/merged major:0 minor:870 fsType:overlay blockSize:0} overlay_0-872:{mountpoint:/var/lib/containers/storage/overlay/13173da442d41b7e5d19c30d12e4688cd7e66c19c206b06a168762f071a2b40f/merged major:0 minor:872 fsType:overlay blockSize:0} overlay_0-874:{mountpoint:/var/lib/containers/storage/overlay/223fde0ab39ade989849e64937ae5adb2b90d4e5a5f76df6f1c505758112a438/merged major:0 minor:874 fsType:overlay blockSize:0} 
overlay_0-878:{mountpoint:/var/lib/containers/storage/overlay/ec951537d2c7800a0844b041c3b456b91ba02cf1a5896a357120bbeddb108271/merged major:0 minor:878 fsType:overlay blockSize:0} overlay_0-88:{mountpoint:/var/lib/containers/storage/overlay/5afcca13bf78840e23bf6f6a2beff3bc4f2e81a9594f5b67efed270986934119/merged major:0 minor:88 fsType:overlay blockSize:0} overlay_0-880:{mountpoint:/var/lib/containers/storage/overlay/471eafa4e09fd030c38cb08586f10b5853a8fa4f27d1a9be40dc6af645615acd/merged major:0 minor:880 fsType:overlay blockSize:0} overlay_0-882:{mountpoint:/var/lib/containers/storage/overlay/85899bc24c0802739b7fd8562bb8c81f3dea3a468a31d8c36d5b0f7ae66a99b2/merged major:0 minor:882 fsType:overlay blockSize:0} overlay_0-884:{mountpoint:/var/lib/containers/storage/overlay/218684ae9dfa7fbaa09fc573b206f4c79f84c9024d856f6aff5d925d8feb29ca/merged major:0 minor:884 fsType:overlay blockSize:0} overlay_0-886:{mountpoint:/var/lib/containers/storage/overlay/e54cbf763fb3e3946bd3e5d60c6456cc29094b714a20a8dc684936d50f221a8b/merged major:0 minor:886 fsType:overlay blockSize:0} overlay_0-888:{mountpoint:/var/lib/containers/storage/overlay/6b6f9dc69fa610f9c1380d7313934ce66e8dcfb97a379615f3855c3c3500bc27/merged major:0 minor:888 fsType:overlay blockSize:0} overlay_0-897:{mountpoint:/var/lib/containers/storage/overlay/fffde398465d933e2534dde07aec1c50c5de09a371cdad2f68264d083d014de1/merged major:0 minor:897 fsType:overlay blockSize:0} overlay_0-899:{mountpoint:/var/lib/containers/storage/overlay/f902cfeba67d6aea6a45b7502fdfc8a470caab37b537bd53bbae6531333cd9ca/merged major:0 minor:899 fsType:overlay blockSize:0} overlay_0-916:{mountpoint:/var/lib/containers/storage/overlay/79b25ddbf8de4148c2f3f0c653bad1ef463b2d5a99def50e6c2e2b4f5e762857/merged major:0 minor:916 fsType:overlay blockSize:0} overlay_0-919:{mountpoint:/var/lib/containers/storage/overlay/8b11ecc4d7b08706faf14a733045d043cbddb6e502bc4d0030d541e202bdac2e/merged major:0
minor:919 fsType:overlay blockSize:0} overlay_0-93:{mountpoint:/var/lib/containers/storage/overlay/0e4c7902b5d9084156815693887a57d848e167eadbd1ff9acd57bd2fa22e83f1/merged major:0 minor:93 fsType:overlay blockSize:0} overlay_0-95:{mountpoint:/var/lib/containers/storage/overlay/da426362378ef154546353982f8492b82cf0f06377f653d511ff687c32d9f046/merged major:0 minor:95 fsType:overlay blockSize:0} overlay_0-953:{mountpoint:/var/lib/containers/storage/overlay/e4d484cdb4cb6f15940fed602a5447fad5f908489cd380cc4497d5f33d780175/merged major:0 minor:953 fsType:overlay blockSize:0} overlay_0-958:{mountpoint:/var/lib/containers/storage/overlay/8f8dcf6575b4ce4a8c3fa9cfa258af53600392aa68275df56fe0ca6cc50bd97c/merged major:0 minor:958 fsType:overlay blockSize:0} overlay_0-961:{mountpoint:/var/lib/containers/storage/overlay/c081c0d7db312518e928448079b32f9612cfd26a5f78c88160d8b95c46c5262f/merged major:0 minor:961 fsType:overlay blockSize:0} overlay_0-975:{mountpoint:/var/lib/containers/storage/overlay/cf76adc14e5d9598a0c4c45bfb7ceef7851037ae7c6d74c15d13678957df842f/merged major:0 minor:975 fsType:overlay blockSize:0} overlay_0-977:{mountpoint:/var/lib/containers/storage/overlay/c5bd5157820f98dd3786886c87ae833a65d265b8ee5c8f3ca438255074a2b09f/merged major:0 minor:977 fsType:overlay blockSize:0} overlay_0-979:{mountpoint:/var/lib/containers/storage/overlay/439efcf6dfee569f2630dcc25f7124e0da66bf96e2e4fee66c6c408fc0683f80/merged major:0 minor:979 fsType:overlay blockSize:0} overlay_0-997:{mountpoint:/var/lib/containers/storage/overlay/4b8896b0360277a7e1c5b5721fc4a53836b6f90d8822035156a4ed1de2eaa772/merged major:0 minor:997 fsType:overlay blockSize:0} overlay_0-999:{mountpoint:/var/lib/containers/storage/overlay/e7ff96048717cd94140d732912e50776c82b346b74dede6743ee55699984671b/merged major:0 minor:999 fsType:overlay blockSize:0}] Mar 08 00:31:34.739622 master-0 kubenswrapper[23041]: I0308 00:31:34.736899 23041 manager.go:217] Machine: {Timestamp:2026-03-08 00:31:34.735307366 +0000 UTC 
m=+0.208144010 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:3fb2a1568fb24853b5e4190e9ed87031 SystemUUID:3fb2a156-8fb2-4853-b5e4-190e9ed87031 BootID:ae637101-d6c8-4837-b1bb-2909ed5c1c9d Filesystems:[{Device:overlay_0-195 DeviceMajor:0 DeviceMinor:195 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3c8994f66c1270da68fac1ff2499afd806b950d0568c9f85327b0714473db68c/userdata/shm DeviceMajor:0 DeviceMinor:214 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/5cf5a2ef-2498-40a0-a189-0753076fd3b6/volumes/kubernetes.io~projected/kube-api-access-k88m9 DeviceMajor:0 DeviceMinor:246 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/531e9339-968c-47bf-b8ea-c44d9ceef4b3/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:458 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/531e9339-968c-47bf-b8ea-c44d9ceef4b3/volumes/kubernetes.io~secret/encryption-config DeviceMajor:0 DeviceMinor:459 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/79898c1495b01b774fa3705ded4d271b0617e5b224dd28c48dac5c9a238260f3/userdata/shm DeviceMajor:0 DeviceMinor:955 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-44 DeviceMajor:0 DeviceMinor:44 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/874da80b3858b9b5a8a2258c3b83f19f5f0c80010ec82d07a7dc18d61c4292fa/userdata/shm DeviceMajor:0 DeviceMinor:576 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-780 DeviceMajor:0 
DeviceMinor:780 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-861 DeviceMajor:0 DeviceMinor:861 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-227 DeviceMajor:0 DeviceMinor:227 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/733e43352408d7f83022f1e2789901cb1e3830089ecad3dc5ac2ffbae10f60ad/userdata/shm DeviceMajor:0 DeviceMinor:272 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-396 DeviceMajor:0 DeviceMinor:396 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-880 DeviceMajor:0 DeviceMinor:880 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1092 DeviceMajor:0 DeviceMinor:1092 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6999cf38-e317-4727-98c9-d4e348e9e16a/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:229 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/bb8dfd749824585a5971cc6ceb0409c06052a233c71d6156a9b5d20725022dcf/userdata/shm DeviceMajor:0 DeviceMinor:477 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7f2851a3eb6c41b727b5c53073d970f5dd84de3034b2055a355a0ab0bcf3b48d/userdata/shm DeviceMajor:0 DeviceMinor:845 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7/volumes/kubernetes.io~projected/kube-api-access-2ggmz DeviceMajor:0 DeviceMinor:972 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1001 DeviceMajor:0 DeviceMinor:1001 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-874 DeviceMajor:0 DeviceMinor:874 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/e690a192a3d0aa0e87e9cbde66640402b6c73d23b93fc09f09a46f66f560f7c6/userdata/shm DeviceMajor:0 DeviceMinor:1140 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-453 DeviceMajor:0 DeviceMinor:453 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3b4f8517-1e54-4b41-ba6b-6c56fe66831a/volumes/kubernetes.io~projected/kube-api-access-vb4n9 DeviceMajor:0 DeviceMinor:758 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/03f4bafb-c270-428a-bacf-8a424b3d1a05/volumes/kubernetes.io~projected/kube-api-access-pfdxc DeviceMajor:0 DeviceMinor:231 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/2b1a69b5-c946-495d-ae02-c56f788279e8/volumes/kubernetes.io~projected/kube-api-access-chnhh DeviceMajor:0 DeviceMinor:247 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-886 DeviceMajor:0 DeviceMinor:886 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d70f4efb-e61a-4e88-a271-2f4af21ecdf3/volumes/kubernetes.io~secret/apiservice-cert DeviceMajor:0 DeviceMinor:726 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9/volumes/kubernetes.io~secret/ovn-node-metrics-cert DeviceMajor:0 DeviceMinor:126 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-632 DeviceMajor:0 DeviceMinor:632 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:overlay_0-466 DeviceMajor:0 DeviceMinor:466 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-443 DeviceMajor:0 DeviceMinor:443 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-307 DeviceMajor:0 
DeviceMinor:307 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-872 DeviceMajor:0 DeviceMinor:872 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-204 DeviceMajor:0 DeviceMinor:204 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:218 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f37ac8237d1707faf128fbd37cb4fc4383ed09260c056f6f33db8e0a42308015/userdata/shm DeviceMajor:0 DeviceMinor:267 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-582 DeviceMajor:0 DeviceMinor:582 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e78057cd-5120-4a12-934d-9fed51e1bdc0/volumes/kubernetes.io~projected/kube-api-access-zgqmb DeviceMajor:0 DeviceMinor:805 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ef0a3c84-98bb-4915-9010-d66fcbeafe09/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:209 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/1bad9e63-1aa2-44a7-aaf8-a0e82f33ad6e/volumes/kubernetes.io~projected/kube-api-access-gkh52 DeviceMajor:0 DeviceMinor:429 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/d01c21a1-6c2c-49a7-9d85-254662851838/volumes/kubernetes.io~projected/kube-api-access-rt9pm DeviceMajor:0 DeviceMinor:611 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b/volumes/kubernetes.io~projected/kube-api-access-ll99v DeviceMajor:0 DeviceMinor:313 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1064 DeviceMajor:0 DeviceMinor:1064 Capacity:214143315968 Type:vfs Inodes:104594880 
HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-179 DeviceMajor:0 DeviceMinor:179 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-635 DeviceMajor:0 DeviceMinor:635 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/7317ceda-df6f-4826-aa1a-15304c2b0fcd/volumes/kubernetes.io~projected/kube-api-access-cw6xw DeviceMajor:0 DeviceMinor:995 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/1abf904b-0b8d-4d61-8231-0e8d00933192/volumes/kubernetes.io~secret/apiservice-cert DeviceMajor:0 DeviceMinor:436 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-455 DeviceMajor:0 DeviceMinor:455 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/dc6431dd72c27cd0cc50f525ef4684b1138ca71254e30382dcc7425a8c604797/userdata/shm DeviceMajor:0 DeviceMinor:765 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/78f167041d0e1e5dfadee1e9a27a600120c1dc54a22d62ff9910e1942faef008/userdata/shm DeviceMajor:0 DeviceMinor:782 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/56e11e7e-6946-4e11-bce9-e91a721fe4a7/volumes/kubernetes.io~projected/kube-api-access-kmxq9 DeviceMajor:0 DeviceMinor:750 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a68ad726-392e-4a7a-a384-409108df9c8b/volumes/kubernetes.io~projected/kube-api-access-ncncc DeviceMajor:0 DeviceMinor:793 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-152 DeviceMajor:0 DeviceMinor:152 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1027 DeviceMajor:0 DeviceMinor:1027 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/635c9c2985fac1a14beab73539e4661fa51cd796fbfb9d8b1faa5701a4b68e88/userdata/shm DeviceMajor:0 DeviceMinor:65 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/815fd565-0609-4d8f-ac05-8656f198b008/volumes/kubernetes.io~projected/kube-api-access-sh6nz DeviceMajor:0 DeviceMinor:123 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-789 DeviceMajor:0 DeviceMinor:789 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/55c8d406-5448-4056-ab3c-c8399217c024/volumes/kubernetes.io~projected/kube-api-access-nljwf DeviceMajor:0 DeviceMinor:798 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/88364d0cec48d65744e1beec8c11b2e217cd014d5b9879cec4ffa6513fb0fe68/userdata/shm DeviceMajor:0 DeviceMinor:1084 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/4ad37f40-c533-4a1e-882a-2e0973eff86d/volumes/kubernetes.io~secret/srv-cert DeviceMajor:0 DeviceMinor:562 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1154 DeviceMajor:0 DeviceMinor:1154 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b/volumes/kubernetes.io~projected/kube-api-access-z9l64 DeviceMajor:0 DeviceMinor:235 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1086 DeviceMajor:0 DeviceMinor:1086 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e237ed52-5561-44c5-bcb1-de62691d6431/volumes/kubernetes.io~secret/prometheus-operator-kube-rbac-proxy-config DeviceMajor:0 DeviceMinor:56 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-596 DeviceMajor:0 DeviceMinor:596 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-72 DeviceMajor:0 
DeviceMinor:72 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/27f4354a5f2d519381a516d1dc4209edc63d8a7a92b44222c7f0143dbf2a908f/userdata/shm DeviceMajor:0 DeviceMinor:1012 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-103 DeviceMajor:0 DeviceMinor:103 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/79a6fb0d44533a4c06691dbc28101325df1e65724145bd5bed4068656b402865/userdata/shm DeviceMajor:0 DeviceMinor:384 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-648 DeviceMajor:0 DeviceMinor:648 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-662 DeviceMajor:0 DeviceMinor:662 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/9d810f7f-258a-47ce-9f99-7b1d93388aee/volumes/kubernetes.io~secret/proxy-tls DeviceMajor:0 DeviceMinor:835 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/e237ed52-5561-44c5-bcb1-de62691d6431/volumes/kubernetes.io~secret/prometheus-operator-tls DeviceMajor:0 DeviceMinor:309 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c4b3dad7b177ddc417477ab1f0d5f78969f5ec394aa11addfa7a3ce44aa14aed/userdata/shm DeviceMajor:0 DeviceMinor:41 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-297 DeviceMajor:0 DeviceMinor:297 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ae061e84-5e6a-415c-a735-fa14add7318a/volumes/kubernetes.io~secret/kube-state-metrics-kube-rbac-proxy-config DeviceMajor:0 DeviceMinor:1080 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6/volumes/kubernetes.io~secret/metrics-certs DeviceMajor:0 DeviceMinor:1005 Capacity:32475529216 
Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ef0a3c84-98bb-4915-9010-d66fcbeafe09/volumes/kubernetes.io~projected/kube-api-access-8fstf DeviceMajor:0 DeviceMinor:213 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:223 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-289 DeviceMajor:0 DeviceMinor:289 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6999cf38-e317-4727-98c9-d4e348e9e16a/volumes/kubernetes.io~secret/image-registry-operator-tls DeviceMajor:0 DeviceMinor:432 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-629 DeviceMajor:0 DeviceMinor:629 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-884 DeviceMajor:0 DeviceMinor:884 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6/volumes/kubernetes.io~secret/stats-auth DeviceMajor:0 DeviceMinor:1004 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/1da0c222-4b59-424f-9817-48673083df00/volumes/kubernetes.io~secret/webhook-certs DeviceMajor:0 DeviceMinor:1157 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/e3f42081-387d-4798-b981-ac232e851bb4/volumes/kubernetes.io~secret/samples-operator-tls DeviceMajor:0 DeviceMinor:763 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1037 DeviceMajor:0 DeviceMinor:1037 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/db164b32-e20e-4d07-a9ae-98720321621d/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert DeviceMajor:0 DeviceMinor:224 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1168 DeviceMajor:0 
DeviceMinor:1168 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-82 DeviceMajor:0 DeviceMinor:82 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-882 DeviceMajor:0 DeviceMinor:882 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0f496486-70d5-4c5c-b4f3-6cc19427762f/volumes/kubernetes.io~projected/kube-api-access-l22cn DeviceMajor:0 DeviceMinor:544 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-787 DeviceMajor:0 DeviceMinor:787 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1abf904b-0b8d-4d61-8231-0e8d00933192/volumes/kubernetes.io~projected/kube-api-access-dbdd4 DeviceMajor:0 DeviceMinor:251 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/d01c21a1-6c2c-49a7-9d85-254662851838/volumes/kubernetes.io~projected/ca-certs DeviceMajor:0 DeviceMinor:543 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ddbc9d4d3c5ffe04f1f188d461103a088e60e8f552f5a7337527098fe0216d97/userdata/shm DeviceMajor:0 DeviceMinor:734 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-311 DeviceMajor:0 DeviceMinor:311 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/773f19015576d673121563aa615f577b8c93848d40403e9cc4d2c3a87bec1183/userdata/shm DeviceMajor:0 DeviceMinor:612 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-303 DeviceMajor:0 DeviceMinor:303 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1164 DeviceMajor:0 DeviceMinor:1164 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-602 DeviceMajor:0 DeviceMinor:602 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/614f0a0f-5853-4cf6-bd3d-174141f0f1e2/volumes/kubernetes.io~projected/kube-api-access-8v5hl DeviceMajor:0 DeviceMinor:759 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/1751db13-b792-43e2-8459-d1d4a0164dfb/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:468 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-425 DeviceMajor:0 DeviceMinor:425 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-702 DeviceMajor:0 DeviceMinor:702 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-49 DeviceMajor:0 DeviceMinor:49 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/11fc2d0ea92ac8231758b019e771de66de17673da31d79a4aab6fc0b796373e6/userdata/shm DeviceMajor:0 DeviceMinor:438 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8d0d8e23ae25ced02b7cdc0775a6f94c8fcc52f337331a56804c82208fb25ced/userdata/shm DeviceMajor:0 DeviceMinor:1017 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25/volumes/kubernetes.io~secret/node-exporter-kube-rbac-proxy-config DeviceMajor:0 DeviceMinor:1075 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1107 DeviceMajor:0 DeviceMinor:1107 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1da0c222-4b59-424f-9817-48673083df00/volumes/kubernetes.io~projected/kube-api-access-txt48 DeviceMajor:0 DeviceMinor:1161 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0bdf70a6acef734c900a623db8a8cd37b2a2e6c50fe84f9293c0fc0c5705c71d/userdata/shm DeviceMajor:0 DeviceMinor:58 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/365dc4ac-fbc8-4589-a799-8327b3ebd0a5/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:245 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-523 DeviceMajor:0 DeviceMinor:523 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1bb8fea7-71ca-43a3-839d-9c1459bf8dfa/volumes/kubernetes.io~projected/ca-certs DeviceMajor:0 DeviceMinor:535 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1119 DeviceMajor:0 DeviceMinor:1119 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-888 DeviceMajor:0 DeviceMinor:888 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/af391724-079a-4bac-a89e-978ffd471763/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:139 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-388 DeviceMajor:0 DeviceMinor:388 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-720 DeviceMajor:0 DeviceMinor:720 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-916 DeviceMajor:0 DeviceMinor:916 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-975 DeviceMajor:0 DeviceMinor:975 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-580 DeviceMajor:0 DeviceMinor:580 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-512 DeviceMajor:0 DeviceMinor:512 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a68be094b9128e17cfcb273f66f3867ebf81ebb395668f57f098ee489c8a0035/userdata/shm DeviceMajor:0 DeviceMinor:728 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/531e9339-968c-47bf-b8ea-c44d9ceef4b3/volumes/kubernetes.io~projected/kube-api-access-crfg9 
DeviceMajor:0 DeviceMinor:464 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-171 DeviceMajor:0 DeviceMinor:171 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7/volumes/kubernetes.io~secret/proxy-tls DeviceMajor:0 DeviceMinor:967 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/16a0ef8737c1e2416e14cc076fc6b1d7ef645b2043e268561b096173dd7a6b0e/userdata/shm DeviceMajor:0 DeviceMinor:266 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/531e9339-968c-47bf-b8ea-c44d9ceef4b3/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:457 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e0a85ed7ebd2e07f65048b3255f6189a3d4d65a56d9c1df41b7b05764ef3bd29/userdata/shm DeviceMajor:0 DeviceMinor:821 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-194 DeviceMajor:0 DeviceMinor:194 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-659 DeviceMajor:0 DeviceMinor:659 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c7097f64-1709-4f76-a725-5a6c6cc5919b/volumes/kubernetes.io~projected/kube-api-access-zvhx4 DeviceMajor:0 DeviceMinor:854 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/60f1d2698bbdc9af90765d1ef46cd020d376aa4c007400334c8fc83e64d3d86f/userdata/shm DeviceMajor:0 DeviceMinor:973 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1166 DeviceMajor:0 DeviceMinor:1166 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/2b1a69b5-c946-495d-ae02-c56f788279e8/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:221 Capacity:32475529216 Type:vfs 
Inodes:4108170 HasInodes:true} {Device:overlay_0-283 DeviceMajor:0 DeviceMinor:283 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e97435ee-522e-427d-9efc-40bc3d2b0d02/volumes/kubernetes.io~projected/kube-api-access-bv9fl DeviceMajor:0 DeviceMinor:375 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/28355b7f7227fe6a0abd3c3085ac0299e8c24ec4f49691a081d1fe68b8bde287/userdata/shm DeviceMajor:0 DeviceMinor:817 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-953 DeviceMajor:0 DeviceMinor:953 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-140 DeviceMajor:0 DeviceMinor:140 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a68ad726-392e-4a7a-a384-409108df9c8b/volumes/kubernetes.io~secret/node-bootstrap-token DeviceMajor:0 DeviceMinor:785 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-131 DeviceMajor:0 DeviceMinor:131 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-174 DeviceMajor:0 DeviceMinor:174 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9/volume-subpaths/run-systemd/ovnkube-controller/6 DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/var/lib/kubelet/pods/3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab/volumes/kubernetes.io~projected/kube-api-access-44jml DeviceMajor:0 DeviceMinor:250 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-449 DeviceMajor:0 DeviceMinor:449 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-111 DeviceMajor:0 DeviceMinor:111 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/f8120e57311950fccd1253a23002276e099126c35ade58bd1fc3115f27615d8d/userdata/shm DeviceMajor:0 DeviceMinor:592 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-589 DeviceMajor:0 DeviceMinor:589 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/run/containers/storage/overlay-containers/99e304c6af03a3e08278f5797ee6f99e79aaff1289a963780c8099a86643591f/userdata/shm DeviceMajor:0 DeviceMinor:114 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/365dc4ac-fbc8-4589-a799-8327b3ebd0a5/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:216 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-487 DeviceMajor:0 DeviceMinor:487 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4e0af367cee5aa7ace0374f562c3ebde99ff63afaf075a5612625be33276de36/userdata/shm DeviceMajor:0 DeviceMinor:591 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-772 DeviceMajor:0 DeviceMinor:772 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-794 DeviceMajor:0 DeviceMinor:794 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/813c8ed04b18f307078b38a00cf3865fc1feedea034a383e0342d8429ae20e6b/userdata/shm DeviceMajor:0 DeviceMinor:252 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-295 DeviceMajor:0 DeviceMinor:295 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3824dde14e6e2df8fdeaf0d3586d91846c024a16aa684e52f4497805143ba494/userdata/shm DeviceMajor:0 DeviceMinor:1162 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-136 
DeviceMajor:0 DeviceMinor:136 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-451 DeviceMajor:0 DeviceMinor:451 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-585 DeviceMajor:0 DeviceMinor:585 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-121 DeviceMajor:0 DeviceMinor:121 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f5e085e04fcec71a7384a042b53e9f6db9dd0fc0eed95804aa4550ea011dc40a/userdata/shm DeviceMajor:0 DeviceMinor:480 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-54 DeviceMajor:0 DeviceMinor:54 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/aaafa12a616f7369af11bbeebe18962338e3a83e1b72c0a692864a7176225e0a/userdata/shm DeviceMajor:0 DeviceMinor:391 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7f9bd3b95fa9a96d599ef5d38ab2c65bfd39d0c75616669dcd2a59a811c0de79/userdata/shm DeviceMajor:0 DeviceMinor:258 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-878 DeviceMajor:0 DeviceMinor:878 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/96c247b918e2a9450964a3ea1162342c6ccc7c2330777e8d76f1128e74c9ecae/userdata/shm DeviceMajor:0 DeviceMinor:129 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-293 DeviceMajor:0 DeviceMinor:293 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-70 DeviceMajor:0 DeviceMinor:70 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-116 DeviceMajor:0 DeviceMinor:116 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab/volumes/kubernetes.io~secret/serving-cert 
DeviceMajor:0 DeviceMinor:217 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-897 DeviceMajor:0 DeviceMinor:897 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f860ea80aed55d2d8aefcd014e94c8e07b481ea1bac54429f957dafad3d193dc/userdata/shm DeviceMajor:0 DeviceMinor:1014 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2/volumes/kubernetes.io~projected/kube-api-access-2f9kl DeviceMajor:0 DeviceMinor:240 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2/volumes/kubernetes.io~secret/package-server-manager-serving-cert DeviceMajor:0 DeviceMinor:559 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ce67cd1e37e90c976b5eb1d98a8adbdd3c36380a0d4d75edb38584db8eeda1f5/userdata/shm DeviceMajor:0 DeviceMinor:545 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/d70f4efb-e61a-4e88-a271-2f4af21ecdf3/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:420 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-979 DeviceMajor:0 DeviceMinor:979 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/4f5539c1-fb87-42d6-b735-6de53421bb6b/volumes/kubernetes.io~projected/kube-api-access-bcl7q DeviceMajor:0 DeviceMinor:383 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/3d2e1686-3a30-4021-9c03-02e472bc6ff3/volumes/kubernetes.io~projected/kube-api-access-qv5kd DeviceMajor:0 DeviceMinor:806 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-770 DeviceMajor:0 DeviceMinor:770 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1152 DeviceMajor:0 DeviceMinor:1152 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-150 DeviceMajor:0 DeviceMinor:150 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/401bbef2-684c-4f55-b2c7-e6184c789e40/volumes/kubernetes.io~empty-dir/etc-tuned DeviceMajor:0 DeviceMinor:540 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-184 DeviceMajor:0 DeviceMinor:184 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1144 DeviceMajor:0 DeviceMinor:1144 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ae061e84-5e6a-415c-a735-fa14add7318a/volumes/kubernetes.io~secret/kube-state-metrics-tls DeviceMajor:0 DeviceMinor:1079 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8f1055f3dc7c655a333a3fa311c8f94b2ceda0b473d7673f490a6875c1158919/userdata/shm DeviceMajor:0 DeviceMinor:260 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-426 DeviceMajor:0 DeviceMinor:426 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1751db13-b792-43e2-8459-d1d4a0164dfb/volumes/kubernetes.io~secret/encryption-config DeviceMajor:0 DeviceMinor:431 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/e302bc0b-7560-4f84-813f-d966c2dbe47c/volumes/kubernetes.io~projected/kube-api-access-9bmgb DeviceMajor:0 DeviceMinor:565 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-493 DeviceMajor:0 DeviceMinor:493 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/1cddeda960c60a71faf688d26e861f0212c8666ffc3672e89502d43761b93cd2/userdata/shm DeviceMajor:0 DeviceMinor:799 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-412 DeviceMajor:0 DeviceMinor:412 Capacity:214143315968 Type:vfs 
Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/1cc242574263ef7c849076452db10d6f32fa75aeb983a9e0f9150bc85db0911a/userdata/shm DeviceMajor:0 DeviceMinor:843 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1062 DeviceMajor:0 DeviceMinor:1062 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-547 DeviceMajor:0 DeviceMinor:547 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-330 DeviceMajor:0 DeviceMinor:330 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-803 DeviceMajor:0 DeviceMinor:803 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c6dfb6a757149a4059a400948a504adf47ce562d49ab223062b37eafa8275000/userdata/shm DeviceMajor:0 DeviceMinor:1088 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/2ac55f03-dd6f-4ead-bacc-c69aeca146dc/volumes/kubernetes.io~projected/kube-api-access-8d4xz DeviceMajor:0 DeviceMinor:385 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-548 DeviceMajor:0 DeviceMinor:548 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-57 DeviceMajor:0 DeviceMinor:57 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ae061e84-5e6a-415c-a735-fa14add7318a/volumes/kubernetes.io~projected/kube-api-access-qznbf DeviceMajor:0 DeviceMinor:1083 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/388b509d4fc31b4d0508a9d9464942cef558c545f646f2395c6df6984fdeb45b/userdata/shm DeviceMajor:0 DeviceMinor:230 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-870 DeviceMajor:0 DeviceMinor:870 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0/volumes/kubernetes.io~secret/proxy-tls DeviceMajor:0 DeviceMinor:841 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6/volumes/kubernetes.io~projected/kube-api-access-x9fv4 DeviceMajor:0 DeviceMinor:1006 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/6d770808-d390-41c1-a9d9-fc12b99fa9a9/volumes/kubernetes.io~projected/kube-api-access-6rfqt DeviceMajor:0 DeviceMinor:233 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/0e52cbdc-1d46-4cc9-85ee-535aa449992f/volumes/kubernetes.io~projected/kube-api-access-xqkqn DeviceMajor:0 DeviceMinor:274 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/e302bc0b-7560-4f84-813f-d966c2dbe47c/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:583 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-862 DeviceMajor:0 DeviceMinor:862 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-93 DeviceMajor:0 DeviceMinor:93 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-53 DeviceMajor:0 DeviceMinor:53 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e76bc134-2a88-4f92-9aa7-f6854941b98f/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:219 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/58333089-2456-4a25-8ba7-6d557eefa177/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:225 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a2af0127ad556015336cd256817276cc9d6a8a08dbbf295a1bf7821d7309d19c/userdata/shm DeviceMajor:0 DeviceMinor:573 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:overlay_0-686 DeviceMajor:0 DeviceMinor:686 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-831 DeviceMajor:0 DeviceMinor:831 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1142 DeviceMajor:0 DeviceMinor:1142 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-166 DeviceMajor:0 DeviceMinor:166 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-753 DeviceMajor:0 DeviceMinor:753 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-847 DeviceMajor:0 DeviceMinor:847 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-279 DeviceMajor:0 DeviceMinor:279 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-529 DeviceMajor:0 DeviceMinor:529 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2e47d8d2ffbca29135c63c0ec58db9d105e81fa73da896958637e9f0815629eb/userdata/shm DeviceMajor:0 DeviceMinor:570 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-819 DeviceMajor:0 DeviceMinor:819 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-868 DeviceMajor:0 DeviceMinor:868 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c2ce2ea7-bd25-4294-8f3a-11ce53577830/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:226 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/1751db13-b792-43e2-8459-d1d4a0164dfb/volumes/kubernetes.io~projected/kube-api-access-6qshd DeviceMajor:0 DeviceMinor:469 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/614f0a0f-5853-4cf6-bd3d-174141f0f1e2/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:735 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb/volumes/kubernetes.io~secret/secret-metrics-server-tls DeviceMajor:0 DeviceMinor:1137 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1112 DeviceMajor:0 DeviceMinor:1112 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1751db13-b792-43e2-8459-d1d4a0164dfb/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:398 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/1bb8fea7-71ca-43a3-839d-9c1459bf8dfa/volumes/kubernetes.io~projected/kube-api-access-gh2h6 DeviceMajor:0 DeviceMinor:542 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-997 DeviceMajor:0 DeviceMinor:997 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1102 DeviceMajor:0 DeviceMinor:1102 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-154 DeviceMajor:0 DeviceMinor:154 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-496 DeviceMajor:0 DeviceMinor:496 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-640 DeviceMajor:0 DeviceMinor:640 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0d0cb126-341c-4215-ad2e-a008193cc0b5/volumes/kubernetes.io~secret/tls-certificates DeviceMajor:0 DeviceMinor:1003 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-474 DeviceMajor:0 DeviceMinor:474 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c2ce2ea7-bd25-4294-8f3a-11ce53577830/volumes/kubernetes.io~projected/kube-api-access-9qpkj DeviceMajor:0 DeviceMinor:236 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/90c63e0b66f405ad9ba1342c113ed69565fb8227cabd7f3b8504079a44ce002c/userdata/shm 
DeviceMajor:0 DeviceMinor:262 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-533 DeviceMajor:0 DeviceMinor:533 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-722 DeviceMajor:0 DeviceMinor:722 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d70f4efb-e61a-4e88-a271-2f4af21ecdf3/volumes/kubernetes.io~projected/kube-api-access-pt6w4 DeviceMajor:0 DeviceMinor:837 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/3fee96d7-75a7-46e4-9707-7bd292f10b84/volumes/kubernetes.io~projected/kube-api-access-ntks9 DeviceMajor:0 DeviceMinor:125 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/4ad37f40-c533-4a1e-882a-2e0973eff86d/volumes/kubernetes.io~projected/kube-api-access-6wrq9 DeviceMajor:0 DeviceMinor:239 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-999 DeviceMajor:0 DeviceMinor:999 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/db164b32-e20e-4d07-a9ae-98720321621d/volumes/kubernetes.io~projected/kube-api-access-89wj5 DeviceMajor:0 DeviceMinor:249 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/5cf5a2ef-2498-40a0-a189-0753076fd3b6/volumes/kubernetes.io~secret/marketplace-operator-metrics DeviceMajor:0 DeviceMinor:531 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-742 DeviceMajor:0 DeviceMinor:742 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/9d810f7f-258a-47ce-9f99-7b1d93388aee/volumes/kubernetes.io~projected/kube-api-access-dz874 DeviceMajor:0 DeviceMinor:836 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f/volumes/kubernetes.io~projected/kube-api-access-d5knc DeviceMajor:0 DeviceMinor:244 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/e78057cd-5120-4a12-934d-9fed51e1bdc0/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert DeviceMajor:0 DeviceMinor:776 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3fcfcac3d94a68502eedf27bec2a63baba722b253947b783bc8a405ac2ab5cd7/userdata/shm DeviceMajor:0 DeviceMinor:855 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-394 DeviceMajor:0 DeviceMinor:394 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-618 DeviceMajor:0 DeviceMinor:618 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1020 DeviceMajor:0 DeviceMinor:1020 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1117 DeviceMajor:0 DeviceMinor:1117 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb/volumes/kubernetes.io~secret/secret-metrics-client-certs DeviceMajor:0 DeviceMinor:1138 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/c7097f64-1709-4f76-a725-5a6c6cc5919b/volumes/kubernetes.io~secret/machine-api-operator-tls DeviceMajor:0 DeviceMinor:840 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-79 DeviceMajor:0 DeviceMinor:79 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-299 DeviceMajor:0 DeviceMinor:299 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb/volumes/kubernetes.io~secret/client-ca-bundle DeviceMajor:0 DeviceMinor:1133 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:overlay_0-60 DeviceMajor:0 DeviceMinor:60 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-100 DeviceMajor:0 DeviceMinor:100 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e21ecaa295b51fd30f3e30feccdaaffb5d26d81a05305635fb9f903bb9b8a90e/userdata/shm DeviceMajor:0 DeviceMinor:475 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-633 DeviceMajor:0 DeviceMinor:633 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-157 DeviceMajor:0 DeviceMinor:157 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a0d7955b7085045599d0a7ea45ff20f907bc225ec27c46ed3dcc33b59207b912/userdata/shm DeviceMajor:0 DeviceMinor:631 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/af391724-079a-4bac-a89e-978ffd471763/volumes/kubernetes.io~projected/kube-api-access-gkl4m DeviceMajor:0 DeviceMinor:138 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/55b01a8834cc0e66e80c4742dda9dcd76cc7d21fc646a73322aabbcb9e7a815d/userdata/shm DeviceMajor:0 DeviceMinor:1081 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a2271776808f809754ea9910dbf17284aca2a88f19582f5163627216da7a3ba8/userdata/shm DeviceMajor:0 DeviceMinor:89 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/03f4bafb-c270-428a-bacf-8a424b3d1a05/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:427 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-852 DeviceMajor:0 DeviceMinor:852 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f3cab32904f1f3dc9eae1dc7b47ec8d51b63661baeb9517ad66b59248d52dfef/userdata/shm DeviceMajor:0 
DeviceMinor:990 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1025 DeviceMajor:0 DeviceMinor:1025 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a68ad726-392e-4a7a-a384-409108df9c8b/volumes/kubernetes.io~secret/certs DeviceMajor:0 DeviceMinor:784 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1094 DeviceMajor:0 DeviceMinor:1094 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1abf904b-0b8d-4d61-8231-0e8d00933192/volumes/kubernetes.io~secret/node-tuning-operator-tls DeviceMajor:0 DeviceMinor:435 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-63 DeviceMajor:0 DeviceMinor:63 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/84522c03-fd7b-4be7-9413-84e510b9dc5a/volumes/kubernetes.io~secret/cert DeviceMajor:0 DeviceMinor:808 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-833 DeviceMajor:0 DeviceMinor:833 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-919 DeviceMajor:0 DeviceMinor:919 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/84522c03-fd7b-4be7-9413-84e510b9dc5a/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls DeviceMajor:0 DeviceMinor:807 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65/volumes/kubernetes.io~projected/kube-api-access-stxt7 DeviceMajor:0 DeviceMinor:1077 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/b100ce12-965e-409e-8cdb-8f99ef51a82b/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:232 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-342 DeviceMajor:0 DeviceMinor:342 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/7317ceda-df6f-4826-aa1a-15304c2b0fcd/volumes/kubernetes.io~secret/machine-approver-tls DeviceMajor:0 DeviceMinor:994 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-745 DeviceMajor:0 DeviceMinor:745 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e76bc134-2a88-4f92-9aa7-f6854941b98f/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:241 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/fd2c01cdd304d39e575ca69d83c243fee0060006da5d42ff4d10f498f54d4b60/userdata/shm DeviceMajor:0 DeviceMinor:338 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c0511cfa10b44562c51d17ac29eccf8315f318be9fcd77f37c978f1bbeeb8000/userdata/shm DeviceMajor:0 DeviceMinor:568 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e4c3df22ea5b25cdb4fb25d7746e4d1c319e0fa007db70463be2670c88f00662/userdata/shm DeviceMajor:0 DeviceMinor:275 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/b94acad3-cf4e-443d-80fb-5e68a4074336/volumes/kubernetes.io~secret/srv-cert DeviceMajor:0 DeviceMinor:560 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-483 DeviceMajor:0 DeviceMinor:483 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9/volumes/kubernetes.io~projected/kube-api-access-s99rr DeviceMajor:0 DeviceMinor:127 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-857 DeviceMajor:0 DeviceMinor:857 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1016 DeviceMajor:0 DeviceMinor:1016 Capacity:214143315968 Type:vfs Inodes:104594880 
HasInodes:true} {Device:/var/lib/kubelet/pods/e237ed52-5561-44c5-bcb1-de62691d6431/volumes/kubernetes.io~projected/kube-api-access-t99pg DeviceMajor:0 DeviceMinor:69 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f0660a52e90ffa7a2326892a3e2cda1d66d0d4aba0e60527ee906109c288f588/userdata/shm DeviceMajor:0 DeviceMinor:441 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/6999cf38-e317-4727-98c9-d4e348e9e16a/volumes/kubernetes.io~projected/kube-api-access-pwsqr DeviceMajor:0 DeviceMinor:248 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-285 DeviceMajor:0 DeviceMinor:285 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/4f5539c1-fb87-42d6-b735-6de53421bb6b/volumes/kubernetes.io~secret/signing-key DeviceMajor:0 DeviceMinor:402 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f661c7de8e4aded6ffb76b6f77c2ac0e5ed6e7e0e7ebfcafe40f9c953ec5ee63/userdata/shm DeviceMajor:0 DeviceMinor:541 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0/volumes/kubernetes.io~projected/kube-api-access-h65c2 DeviceMajor:0 DeviceMinor:842 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/036c8d5e00b57ec77b752ae2bc46eb3d7ff2904d9ebc488665656ab787ecd5a5/userdata/shm DeviceMajor:0 DeviceMinor:566 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-977 DeviceMajor:0 DeviceMinor:977 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-64 DeviceMajor:0 DeviceMinor:64 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c70a49bdf7ce76b550fe89e6bb326288e459f3c83c699e27a995807b0355a90e/userdata/shm DeviceMajor:0 
DeviceMinor:142 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-378 DeviceMajor:0 DeviceMinor:378 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-813 DeviceMajor:0 DeviceMinor:813 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-489 DeviceMajor:0 DeviceMinor:489 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1090 DeviceMajor:0 DeviceMinor:1090 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-76 DeviceMajor:0 DeviceMinor:76 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/4a829558-a672-4dc5-ae20-69884213482f/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:109 Capacity:200003584 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ac523956-c8a3-4794-a1fa-660cd14966bb/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:222 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4297b6122cd668a28e80b28ce2f18556120772700fd7e586762ab1c6f70eea07/userdata/shm DeviceMajor:0 DeviceMinor:442 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5993f0db8eb571541ffd45db324c8f25d80729c838e2d7b2910b9b88c3eb3de6/userdata/shm DeviceMajor:0 DeviceMinor:476 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/401bbef2-684c-4f55-b2c7-e6184c789e40/volumes/kubernetes.io~projected/kube-api-access-mcqn9 DeviceMajor:0 DeviceMinor:555 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25/volumes/kubernetes.io~secret/node-exporter-tls DeviceMajor:0 DeviceMinor:1074 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1100 DeviceMajor:0 DeviceMinor:1100 Capacity:214143315968 Type:vfs 
Inodes:104594880 HasInodes:true} {Device:overlay_0-305 DeviceMajor:0 DeviceMinor:305 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-774 DeviceMajor:0 DeviceMinor:774 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b24cb8b6e833d760382f41e5306d191f11027b327de5f975b19e63833c3ea28b/userdata/shm DeviceMajor:0 DeviceMinor:92 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/460f09d8-a143-48d2-9db0-be247386984a/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls DeviceMajor:0 DeviceMinor:767 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:238 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-414 DeviceMajor:0 DeviceMinor:414 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/cbcb0196-be5c-44a4-9749-5df9fbeaa718/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:724 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/d01c21a1-6c2c-49a7-9d85-254662851838/volumes/kubernetes.io~secret/catalogserver-certs DeviceMajor:0 DeviceMinor:610 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-162 DeviceMajor:0 DeviceMinor:162 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-327 DeviceMajor:0 DeviceMinor:327 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/58333089-2456-4a25-8ba7-6d557eefa177/volumes/kubernetes.io~projected/kube-api-access-hhckc DeviceMajor:0 DeviceMinor:242 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-447 DeviceMajor:0 DeviceMinor:447 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:overlay_0-587 DeviceMajor:0 DeviceMinor:587 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb/volumes/kubernetes.io~projected/kube-api-access-b66xq DeviceMajor:0 DeviceMinor:1139 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-189 DeviceMajor:0 DeviceMinor:189 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-74 DeviceMajor:0 DeviceMinor:74 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-113 DeviceMajor:0 DeviceMinor:113 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ec2d22f2-c260-42a6-a9da-ee0f44f42303/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:43 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8493b96f9e2317bb2258ca024aff023f604de77234681da55a05bccbc932bc9a/userdata/shm DeviceMajor:0 DeviceMinor:254 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-527 DeviceMajor:0 DeviceMinor:527 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-481 DeviceMajor:0 DeviceMinor:481 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5a229b84-65bd-493b-90dd-b8194f842dc8/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:471 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65/volumes/kubernetes.io~secret/openshift-state-metrics-tls DeviceMajor:0 DeviceMinor:1076 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-744 DeviceMajor:0 DeviceMinor:744 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-88 DeviceMajor:0 DeviceMinor:88 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/ff2ce08940304b5b606944a45d5884c507d106440aae4429902a0d2f21368070/userdata/shm DeviceMajor:0 DeviceMinor:473 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8e70531b1dbd5c8e6c17416c362305f1eea7b7b018f96a22eb1f0bb98b78a034/userdata/shm DeviceMajor:0 DeviceMinor:46 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-95 DeviceMajor:0 DeviceMinor:95 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/cd06e32b994481471c1008a22765ea8fb7d4c0eac4c1085f974725068e466db7/userdata/shm DeviceMajor:0 DeviceMinor:567 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8ff474153830a652e4ddb7aadf249d8bcfad8aa4e41fc72213e841bb0817ffeb/userdata/shm DeviceMajor:0 DeviceMinor:786 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c7b839bc1440105484eefd605ce2dd49ac3adf1072ca232cf569d9cfecdcc1f4/userdata/shm DeviceMajor:0 DeviceMinor:810 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65/volumes/kubernetes.io~secret/openshift-state-metrics-kube-rbac-proxy-config DeviceMajor:0 DeviceMinor:1070 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/b100ce12-965e-409e-8cdb-8f99ef51a82b/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:220 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-490 DeviceMajor:0 DeviceMinor:490 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-637 DeviceMajor:0 DeviceMinor:637 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/70892c23-554d-466c-a526-90a799439fe0/volumes/kubernetes.io~projected/kube-api-access-kqjt7 DeviceMajor:0 
DeviceMinor:729 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/3d2e1686-3a30-4021-9c03-02e472bc6ff3/volumes/kubernetes.io~secret/cert DeviceMajor:0 DeviceMinor:795 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-518 DeviceMajor:0 DeviceMinor:518 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-826 DeviceMajor:0 DeviceMinor:826 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:overlay_0-281 DeviceMajor:0 DeviceMinor:281 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1f63cb2f-779f-4fde-bf92-cf0414844a77/volumes/kubernetes.io~projected/kube-api-access-wh9cz DeviceMajor:0 DeviceMinor:328 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7a5857552aa1339fd1907b2666246b77b57ec97f6cccfaf339c644659664d85c/userdata/shm DeviceMajor:0 DeviceMinor:119 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/3fee96d7-75a7-46e4-9707-7bd292f10b84/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert DeviceMajor:0 DeviceMinor:124 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-626 DeviceMajor:0 DeviceMinor:626 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/1647ce1acf481d17be37f6cfd515be4f74eaddbda6620f025db77860f5acbd00/userdata/shm DeviceMajor:0 DeviceMinor:730 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-156 DeviceMajor:0 DeviceMinor:156 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5a229b84-65bd-493b-90dd-b8194f842dc8/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:424 Capacity:32475529216 Type:vfs 
Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/460f09d8-a143-48d2-9db0-be247386984a/volumes/kubernetes.io~projected/kube-api-access-vj8sl DeviceMajor:0 DeviceMinor:768 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-958 DeviceMajor:0 DeviceMinor:958 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ec2d22f2-c260-42a6-a9da-ee0f44f42303/volumes/kubernetes.io~projected/kube-api-access-xlzcz DeviceMajor:0 DeviceMinor:91 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1047 DeviceMajor:0 DeviceMinor:1047 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1128 DeviceMajor:0 DeviceMinor:1128 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/302cab9bf3dbf255daeb9370ab65a4f19b214019a7009e2da9e307530afd287e/userdata/shm DeviceMajor:0 DeviceMinor:376 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/b2548aca-4a9d-4670-a60a-0d6361d1c441/volumes/kubernetes.io~projected/kube-api-access-dvvvn DeviceMajor:0 DeviceMinor:812 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0ce2140e8d5f4ac383fcfe274d59d3771538ece4764c91b8cb4e301d3fe26bbf/userdata/shm DeviceMajor:0 DeviceMinor:350 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-899 DeviceMajor:0 DeviceMinor:899 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1039 DeviceMajor:0 DeviceMinor:1039 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-652 DeviceMajor:0 DeviceMinor:652 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-650 DeviceMajor:0 DeviceMinor:650 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/e3f42081-387d-4798-b981-ac232e851bb4/volumes/kubernetes.io~projected/kube-api-access-smnrc DeviceMajor:0 DeviceMinor:764 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-68 DeviceMajor:0 DeviceMinor:68 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/401bbef2-684c-4f55-b2c7-e6184c789e40/volumes/kubernetes.io~empty-dir/tmp DeviceMajor:0 DeviceMinor:554 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/7da68e85-9170-499d-8050-139ecfac4600/volumes/kubernetes.io~projected/kube-api-access-bg5d9 DeviceMajor:0 DeviceMinor:105 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/6d770808-d390-41c1-a9d9-fc12b99fa9a9/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls DeviceMajor:0 DeviceMinor:561 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-859 DeviceMajor:0 DeviceMinor:859 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/7ad8b9ea-ba1c-4507-9b70-ce2da170d480/volumes/kubernetes.io~projected/kube-api-access-bxk5x DeviceMajor:0 DeviceMinor:118 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/84522c03-fd7b-4be7-9413-84e510b9dc5a/volumes/kubernetes.io~projected/kube-api-access-ht8zb DeviceMajor:0 DeviceMinor:809 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/0f496486-70d5-4c5c-b4f3-6cc19427762f/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert DeviceMajor:0 DeviceMinor:522 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/3b4f8517-1e54-4b41-ba6b-6c56fe66831a/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls DeviceMajor:0 DeviceMinor:81 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/b94acad3-cf4e-443d-80fb-5e68a4074336/volumes/kubernetes.io~projected/kube-api-access-7tml5 DeviceMajor:0 DeviceMinor:237 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c9dc377ca2fdac8594f81d6df8e7c069a1b5189bee06d288ed063183ce36a834/userdata/shm DeviceMajor:0 DeviceMinor:270 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-291 DeviceMajor:0 DeviceMinor:291 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/cbcb0196-be5c-44a4-9749-5df9fbeaa718/volumes/kubernetes.io~projected/kube-api-access-4t8np DeviceMajor:0 DeviceMinor:727 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7bcc330c034a7032e8bd43ea29408b50fdad12339c2d89f6fc2a01fc9d43af95/userdata/shm DeviceMajor:0 DeviceMinor:263 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/815fd565-0609-4d8f-ac05-8656f198b008/volumes/kubernetes.io~secret/metrics-certs DeviceMajor:0 DeviceMinor:563 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-605 DeviceMajor:0 DeviceMinor:605 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-78 DeviceMajor:0 DeviceMinor:78 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6/volumes/kubernetes.io~secret/default-certificate DeviceMajor:0 DeviceMinor:1002 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25/volumes/kubernetes.io~projected/kube-api-access-wllt8 DeviceMajor:0 DeviceMinor:1078 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-348 DeviceMajor:0 DeviceMinor:348 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-594 DeviceMajor:0 DeviceMinor:594 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-961 DeviceMajor:0 DeviceMinor:961 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b3e77cc21c0092533e2573fd7bc828eb1f314192461aa4ad0d7a1a79afb2a5b9/userdata/shm DeviceMajor:0 DeviceMinor:128 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ac523956-c8a3-4794-a1fa-660cd14966bb/volumes/kubernetes.io~projected/kube-api-access-wjcjb DeviceMajor:0 DeviceMinor:234 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4ba757467f3e4fadf37ce1d9a907a1771ea5751b999a31bf5bb5f0ab9351aa7f/userdata/shm DeviceMajor:0 DeviceMinor:815 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1049 DeviceMajor:0 DeviceMinor:1049 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/fea4d98f3d9db64dd863f1c17ed52c6613cd3bc9028a466c54e0fb69e9d3b0a8/userdata/shm DeviceMajor:0 DeviceMinor:679 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9da3ea5c4393051eef91cb7af969405949bc3c6b97f5782d6bc10af29a80c30d/userdata/shm DeviceMajor:0 DeviceMinor:437 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-492 DeviceMajor:0 DeviceMinor:492 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-134 DeviceMajor:0 DeviceMinor:134 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/31406fc5b2c5472ac716e4c8cdca7909539075e5cc335f68e4b469dfc56a38f1/userdata/shm DeviceMajor:0 DeviceMinor:256 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-277 DeviceMajor:0 DeviceMinor:277 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-343 DeviceMajor:0 DeviceMinor:343 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/70892c23-554d-466c-a526-90a799439fe0/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:725 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-796 DeviceMajor:0 DeviceMinor:796 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-578 DeviceMajor:0 DeviceMinor:578 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c1abfb79-2c86-4ccb-bf91-7c48ad8c78d8/volumes/kubernetes.io~projected/kube-api-access-5q6hn DeviceMajor:0 DeviceMinor:243 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-287 DeviceMajor:0 DeviceMinor:287 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-357 DeviceMajor:0 DeviceMinor:357 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/786e30f1-d30a-43e1-85cb-d8ea1495422e/volumes/kubernetes.io~projected/kube-api-access-dvglb DeviceMajor:0 DeviceMinor:1007 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-778 DeviceMajor:0 DeviceMinor:778 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:433 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:036c8d5e00b57ec MacAddress:2e:10:0f:65:08:12 Speed:10000 Mtu:8900} {Name:0ce2140e8d5f4ac MacAddress:86:13:67:23:f4:96 Speed:10000 Mtu:8900} 
{Name:11fc2d0ea92ac82 MacAddress:5a:b2:6b:a2:e8:16 Speed:10000 Mtu:8900} {Name:1647ce1acf481d1 MacAddress:d6:d6:ee:d3:c9:39 Speed:10000 Mtu:8900} {Name:16a0ef8737c1e24 MacAddress:3e:38:c5:93:a5:16 Speed:10000 Mtu:8900} {Name:1cc242574263ef7 MacAddress:86:43:c6:91:48:3f Speed:10000 Mtu:8900} {Name:1cddeda960c60a7 MacAddress:b6:10:1d:ab:13:93 Speed:10000 Mtu:8900} {Name:27f4354a5f2d519 MacAddress:8e:e4:3c:3b:5a:3f Speed:10000 Mtu:8900} {Name:28355b7f7227fe6 MacAddress:12:53:b4:3e:6f:a3 Speed:10000 Mtu:8900} {Name:2e47d8d2ffbca29 MacAddress:9e:d4:7d:44:0e:b9 Speed:10000 Mtu:8900} {Name:302cab9bf3dbf25 MacAddress:da:83:e6:11:b6:cc Speed:10000 Mtu:8900} {Name:31406fc5b2c5472 MacAddress:52:38:09:1b:34:fb Speed:10000 Mtu:8900} {Name:3824dde14e6e2df MacAddress:3a:63:f9:9b:11:bb Speed:10000 Mtu:8900} {Name:388b509d4fc31b4 MacAddress:62:3c:57:0d:78:25 Speed:10000 Mtu:8900} {Name:3c8994f66c1270d MacAddress:c2:59:ba:f2:c5:74 Speed:10000 Mtu:8900} {Name:3fcfcac3d94a685 MacAddress:5e:85:63:1d:9e:d2 Speed:10000 Mtu:8900} {Name:4297b6122cd668a MacAddress:06:36:f3:99:6c:64 Speed:10000 Mtu:8900} {Name:4ba757467f3e4fa MacAddress:66:03:39:58:ea:a0 Speed:10000 Mtu:8900} {Name:4e0af367cee5aa7 MacAddress:6e:59:b5:78:3a:b1 Speed:10000 Mtu:8900} {Name:55b01a8834cc0e6 MacAddress:32:e4:32:ae:9d:80 Speed:10000 Mtu:8900} {Name:5993f0db8eb5715 MacAddress:a6:61:cd:df:b1:c0 Speed:10000 Mtu:8900} {Name:733e43352408d7f MacAddress:0e:2a:f7:48:43:31 Speed:10000 Mtu:8900} {Name:773f19015576d67 MacAddress:66:ef:6a:d8:e7:05 Speed:10000 Mtu:8900} {Name:78f167041d0e1e5 MacAddress:be:80:19:a7:74:1c Speed:10000 Mtu:8900} {Name:79a6fb0d44533a4 MacAddress:ce:9f:11:c8:86:dc Speed:10000 Mtu:8900} {Name:7bcc330c034a703 MacAddress:da:c0:90:17:ee:c0 Speed:10000 Mtu:8900} {Name:7f2851a3eb6c41b MacAddress:02:b3:70:01:37:a4 Speed:10000 Mtu:8900} {Name:7f9bd3b95fa9a96 MacAddress:72:00:fc:c6:87:63 Speed:10000 Mtu:8900} {Name:813c8ed04b18f30 MacAddress:16:21:d9:74:39:8c Speed:10000 Mtu:8900} {Name:8493b96f9e2317b 
MacAddress:c6:29:de:1d:f9:78 Speed:10000 Mtu:8900} {Name:874da80b3858b9b MacAddress:3a:fc:26:65:c3:44 Speed:10000 Mtu:8900} {Name:8d0d8e23ae25ced MacAddress:8e:a1:74:b2:a6:67 Speed:10000 Mtu:8900} {Name:8f1055f3dc7c655 MacAddress:0e:11:fb:cf:46:63 Speed:10000 Mtu:8900} {Name:8ff474153830a65 MacAddress:e2:c8:c6:f8:78:fd Speed:10000 Mtu:8900} {Name:90c63e0b66f405a MacAddress:3e:0b:09:42:a5:9d Speed:10000 Mtu:8900} {Name:9da3ea5c4393051 MacAddress:ee:df:94:94:82:0e Speed:10000 Mtu:8900} {Name:a0d7955b7085045 MacAddress:e2:8a:3d:41:b1:7b Speed:10000 Mtu:8900} {Name:a2af0127ad55601 MacAddress:0a:13:49:32:09:d9 Speed:10000 Mtu:8900} {Name:a68be094b9128e1 MacAddress:2e:b2:fb:cf:49:7a Speed:10000 Mtu:8900} {Name:aaafa12a616f736 MacAddress:f6:ef:44:8b:65:1b Speed:10000 Mtu:8900} {Name:bb8dfd749824585 MacAddress:96:41:6d:f5:bf:90 Speed:10000 Mtu:8900} {Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:br-int MacAddress:22:2c:d1:1b:4a:52 Speed:0 Mtu:8900} {Name:c0511cfa10b4456 MacAddress:96:a1:28:f9:dd:3d Speed:10000 Mtu:8900} {Name:c6dfb6a757149a4 MacAddress:7e:4d:c1:ef:bf:5c Speed:10000 Mtu:8900} {Name:c7b839bc1440105 MacAddress:4e:01:8e:08:22:11 Speed:10000 Mtu:8900} {Name:c9dc377ca2fdac8 MacAddress:aa:a9:a6:46:20:1d Speed:10000 Mtu:8900} {Name:cd06e32b9944814 MacAddress:42:c2:1b:39:8f:02 Speed:10000 Mtu:8900} {Name:dc6431dd72c27cd MacAddress:b6:22:3d:f2:a9:cd Speed:10000 Mtu:8900} {Name:ddbc9d4d3c5ffe0 MacAddress:4a:87:2c:32:af:ae Speed:10000 Mtu:8900} {Name:e0a85ed7ebd2e07 MacAddress:12:5c:85:ab:81:2b Speed:10000 Mtu:8900} {Name:e21ecaa295b51fd MacAddress:7a:32:08:05:d5:96 Speed:10000 Mtu:8900} {Name:e690a192a3d0aa0 MacAddress:0e:41:49:67:b1:69 Speed:10000 Mtu:8900} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:0f:fb:26 Speed:-1 Mtu:9000} {Name:eth2 MacAddress:fa:16:3e:73:5d:56 Speed:-1 Mtu:9000} {Name:f0660a52e90ffa7 MacAddress:12:c6:29:44:8f:ac Speed:10000 Mtu:8900} {Name:f37ac8237d1707f 
MacAddress:a2:23:7e:7e:25:d9 Speed:10000 Mtu:8900} {Name:f661c7de8e4aded MacAddress:32:e4:d9:03:ed:ef Speed:10000 Mtu:8900} {Name:fd2c01cdd304d39 MacAddress:22:5b:91:d9:54:ce Speed:10000 Mtu:8900} {Name:ff2ce08940304b5 MacAddress:d2:73:30:fb:7c:31 Speed:10000 Mtu:8900} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:80:00:02 Speed:0 Mtu:8900} {Name:ovs-system MacAddress:ee:64:ec:05:bf:ed Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 
Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 08 00:31:34.740146 master-0 kubenswrapper[23041]: I0308 00:31:34.739615 23041 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 08 00:31:34.740146 master-0 kubenswrapper[23041]: I0308 00:31:34.739744 23041 manager.go:233] Version: {KernelVersion:5.14.0-427.111.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202602172219-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 08 00:31:34.740550 master-0 kubenswrapper[23041]: I0308 00:31:34.740163 23041 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 08 00:31:34.740817 master-0 kubenswrapper[23041]: I0308 00:31:34.740745 23041 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 08 00:31:34.741454 master-0 kubenswrapper[23041]: I0308 00:31:34.740850 23041 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"P
ercentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 08 00:31:34.741528 master-0 kubenswrapper[23041]: I0308 00:31:34.741497 23041 topology_manager.go:138] "Creating topology manager with none policy" Mar 08 00:31:34.741528 master-0 kubenswrapper[23041]: I0308 00:31:34.741519 23041 container_manager_linux.go:303] "Creating device plugin manager" Mar 08 00:31:34.741605 master-0 kubenswrapper[23041]: I0308 00:31:34.741536 23041 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 08 00:31:34.741605 master-0 kubenswrapper[23041]: I0308 00:31:34.741578 23041 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 08 00:31:34.741677 master-0 kubenswrapper[23041]: I0308 00:31:34.741641 23041 state_mem.go:36] "Initialized new in-memory state store" Mar 08 00:31:34.741818 master-0 kubenswrapper[23041]: I0308 00:31:34.741782 23041 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 08 00:31:34.741939 master-0 kubenswrapper[23041]: I0308 00:31:34.741897 23041 kubelet.go:418] "Attempting to sync node with API server" Mar 08 00:31:34.741990 master-0 kubenswrapper[23041]: I0308 00:31:34.741951 23041 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 08 00:31:34.742027 master-0 kubenswrapper[23041]: I0308 00:31:34.741986 23041 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 08 00:31:34.742027 master-0 kubenswrapper[23041]: I0308 00:31:34.742011 23041 kubelet.go:324] "Adding apiserver pod source" Mar 
08 00:31:34.742215 master-0 kubenswrapper[23041]: I0308 00:31:34.742049 23041 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 08 00:31:34.744411 master-0 kubenswrapper[23041]: I0308 00:31:34.743892 23041 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-8.rhaos4.18.gitd78977c.el9" apiVersion="v1" Mar 08 00:31:34.744792 master-0 kubenswrapper[23041]: I0308 00:31:34.744743 23041 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Mar 08 00:31:34.745560 master-0 kubenswrapper[23041]: I0308 00:31:34.745515 23041 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 08 00:31:34.745891 master-0 kubenswrapper[23041]: I0308 00:31:34.745848 23041 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 08 00:31:34.745934 master-0 kubenswrapper[23041]: I0308 00:31:34.745902 23041 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 08 00:31:34.745934 master-0 kubenswrapper[23041]: I0308 00:31:34.745923 23041 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 08 00:31:34.746024 master-0 kubenswrapper[23041]: I0308 00:31:34.745943 23041 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 08 00:31:34.746024 master-0 kubenswrapper[23041]: I0308 00:31:34.745974 23041 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 08 00:31:34.746024 master-0 kubenswrapper[23041]: I0308 00:31:34.745995 23041 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 08 00:31:34.746130 master-0 kubenswrapper[23041]: I0308 00:31:34.746049 23041 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 08 00:31:34.746130 master-0 kubenswrapper[23041]: I0308 00:31:34.746071 23041 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/downward-api" Mar 08 00:31:34.746130 master-0 kubenswrapper[23041]: I0308 00:31:34.746097 23041 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 08 00:31:34.746130 master-0 kubenswrapper[23041]: I0308 00:31:34.746117 23041 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 08 00:31:34.746299 master-0 kubenswrapper[23041]: I0308 00:31:34.746167 23041 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 08 00:31:34.746299 master-0 kubenswrapper[23041]: I0308 00:31:34.746232 23041 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 08 00:31:34.746368 master-0 kubenswrapper[23041]: I0308 00:31:34.746304 23041 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 08 00:31:34.747626 master-0 kubenswrapper[23041]: I0308 00:31:34.747578 23041 server.go:1280] "Started kubelet" Mar 08 00:31:34.747868 master-0 kubenswrapper[23041]: I0308 00:31:34.747733 23041 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 08 00:31:34.748283 master-0 kubenswrapper[23041]: I0308 00:31:34.747945 23041 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 08 00:31:34.748405 master-0 kubenswrapper[23041]: I0308 00:31:34.748283 23041 server_v1.go:47] "podresources" method="list" useActivePods=true Mar 08 00:31:34.749343 master-0 systemd[1]: Started Kubernetes Kubelet. 
Mar 08 00:31:34.750627 master-0 kubenswrapper[23041]: I0308 00:31:34.749640 23041 server.go:449] "Adding debug handlers to kubelet server" Mar 08 00:31:34.750627 master-0 kubenswrapper[23041]: I0308 00:31:34.749712 23041 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 08 00:31:34.763455 master-0 kubenswrapper[23041]: I0308 00:31:34.763039 23041 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 08 00:31:34.764786 master-0 kubenswrapper[23041]: I0308 00:31:34.764749 23041 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 08 00:31:34.784372 master-0 kubenswrapper[23041]: E0308 00:31:34.784283 23041 kubelet.go:1495] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Mar 08 00:31:34.784970 master-0 kubenswrapper[23041]: I0308 00:31:34.784928 23041 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 08 00:31:34.785036 master-0 kubenswrapper[23041]: I0308 00:31:34.784974 23041 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 08 00:31:34.785284 master-0 kubenswrapper[23041]: I0308 00:31:34.785235 23041 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 08 00:31:34.785284 master-0 kubenswrapper[23041]: I0308 00:31:34.785272 23041 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 08 00:31:34.785453 master-0 kubenswrapper[23041]: I0308 00:31:34.785423 23041 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Mar 08 00:31:34.785453 master-0 kubenswrapper[23041]: I0308 00:31:34.785183 23041 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-09 00:11:49 +0000 UTC, rotation deadline is 2026-03-08 19:07:34.134143206 +0000 UTC Mar 08 00:31:34.789514 
master-0 kubenswrapper[23041]: I0308 00:31:34.785456 23041 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 18h35m59.348695263s for next certificate rotation Mar 08 00:31:34.789514 master-0 kubenswrapper[23041]: I0308 00:31:34.785972 23041 factory.go:55] Registering systemd factory Mar 08 00:31:34.789514 master-0 kubenswrapper[23041]: I0308 00:31:34.786093 23041 factory.go:221] Registration of the systemd container factory successfully Mar 08 00:31:34.791122 master-0 kubenswrapper[23041]: I0308 00:31:34.791083 23041 factory.go:153] Registering CRI-O factory Mar 08 00:31:34.791122 master-0 kubenswrapper[23041]: I0308 00:31:34.791123 23041 factory.go:221] Registration of the crio container factory successfully Mar 08 00:31:34.791304 master-0 kubenswrapper[23041]: I0308 00:31:34.791243 23041 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 08 00:31:34.791304 master-0 kubenswrapper[23041]: I0308 00:31:34.791269 23041 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 08 00:31:34.791795 master-0 kubenswrapper[23041]: I0308 00:31:34.791272 23041 factory.go:103] Registering Raw factory Mar 08 00:31:34.791795 master-0 kubenswrapper[23041]: I0308 00:31:34.791561 23041 manager.go:1196] Started watching for new ooms in manager Mar 08 00:31:34.792262 master-0 kubenswrapper[23041]: I0308 00:31:34.792191 23041 manager.go:319] Starting recovery of all containers Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.802863 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a68ad726-392e-4a7a-a384-409108df9c8b" volumeName="kubernetes.io/secret/a68ad726-392e-4a7a-a384-409108df9c8b-node-bootstrap-token" seLinuxMountContext="" Mar 08 00:31:34.804902 
master-0 kubenswrapper[23041]: I0308 00:31:34.802925 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c2ce2ea7-bd25-4294-8f3a-11ce53577830" volumeName="kubernetes.io/secret/c2ce2ea7-bd25-4294-8f3a-11ce53577830-serving-cert" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.802939 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cbcb0196-be5c-44a4-9749-5df9fbeaa718" volumeName="kubernetes.io/secret/cbcb0196-be5c-44a4-9749-5df9fbeaa718-serving-cert" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.802951 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0f496486-70d5-4c5c-b4f3-6cc19427762f" volumeName="kubernetes.io/projected/0f496486-70d5-4c5c-b4f3-6cc19427762f-kube-api-access-l22cn" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.802963 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="614f0a0f-5853-4cf6-bd3d-174141f0f1e2" volumeName="kubernetes.io/empty-dir/614f0a0f-5853-4cf6-bd3d-174141f0f1e2-snapshots" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.802990 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="84522c03-fd7b-4be7-9413-84e510b9dc5a" volumeName="kubernetes.io/secret/84522c03-fd7b-4be7-9413-84e510b9dc5a-cluster-baremetal-operator-tls" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.803002 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4ad37f40-c533-4a1e-882a-2e0973eff86d" volumeName="kubernetes.io/projected/4ad37f40-c533-4a1e-882a-2e0973eff86d-kube-api-access-6wrq9" seLinuxMountContext="" Mar 08 
00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.803015 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5a229b84-65bd-493b-90dd-b8194f842dc8" volumeName="kubernetes.io/projected/5a229b84-65bd-493b-90dd-b8194f842dc8-kube-api-access" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.803029 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d810f7f-258a-47ce-9f99-7b1d93388aee" volumeName="kubernetes.io/configmap/9d810f7f-258a-47ce-9f99-7b1d93388aee-auth-proxy-config" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.803041 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f71fd39-a16b-47d2-b781-c8ce37bcb9b2" volumeName="kubernetes.io/secret/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-package-server-manager-serving-cert" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.803053 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b100ce12-965e-409e-8cdb-8f99ef51a82b" volumeName="kubernetes.io/projected/b100ce12-965e-409e-8cdb-8f99ef51a82b-kube-api-access" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.803066 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b94acad3-cf4e-443d-80fb-5e68a4074336" volumeName="kubernetes.io/secret/b94acad3-cf4e-443d-80fb-5e68a4074336-srv-cert" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.803077 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cbcb0196-be5c-44a4-9749-5df9fbeaa718" volumeName="kubernetes.io/configmap/cbcb0196-be5c-44a4-9749-5df9fbeaa718-config" seLinuxMountContext="" Mar 08 
00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.803091 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4d0b9fbc-a1f8-4a98-99de-758734bd1a5b" volumeName="kubernetes.io/configmap/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-trusted-ca" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.803102 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="58333089-2456-4a25-8ba7-6d557eefa177" volumeName="kubernetes.io/secret/58333089-2456-4a25-8ba7-6d557eefa177-serving-cert" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.803114 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="614f0a0f-5853-4cf6-bd3d-174141f0f1e2" volumeName="kubernetes.io/configmap/614f0a0f-5853-4cf6-bd3d-174141f0f1e2-service-ca-bundle" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.803124 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1abf904b-0b8d-4d61-8231-0e8d00933192" volumeName="kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-apiservice-cert" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.803136 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="786e30f1-d30a-43e1-85cb-d8ea1495422e" volumeName="kubernetes.io/projected/786e30f1-d30a-43e1-85cb-d8ea1495422e-kube-api-access-dvglb" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.803148 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e3f42081-387d-4798-b981-ac232e851bb4" volumeName="kubernetes.io/projected/e3f42081-387d-4798-b981-ac232e851bb4-kube-api-access-smnrc" seLinuxMountContext="" Mar 08 
00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.803188 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="365dc4ac-fbc8-4589-a799-8327b3ebd0a5" volumeName="kubernetes.io/projected/365dc4ac-fbc8-4589-a799-8327b3ebd0a5-kube-api-access" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.803220 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4f5539c1-fb87-42d6-b735-6de53421bb6b" volumeName="kubernetes.io/secret/4f5539c1-fb87-42d6-b735-6de53421bb6b-signing-key" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.803233 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="84522c03-fd7b-4be7-9413-84e510b9dc5a" volumeName="kubernetes.io/projected/84522c03-fd7b-4be7-9413-84e510b9dc5a-kube-api-access-ht8zb" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.803245 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a68ad726-392e-4a7a-a384-409108df9c8b" volumeName="kubernetes.io/projected/a68ad726-392e-4a7a-a384-409108df9c8b-kube-api-access-ncncc" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.803259 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7" volumeName="kubernetes.io/projected/ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7-kube-api-access-2ggmz" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.803272 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7" volumeName="kubernetes.io/secret/ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7-proxy-tls" seLinuxMountContext="" Mar 08 
00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.803284 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="614f0a0f-5853-4cf6-bd3d-174141f0f1e2" volumeName="kubernetes.io/projected/614f0a0f-5853-4cf6-bd3d-174141f0f1e2-kube-api-access-8v5hl" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.803297 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="db164b32-e20e-4d07-a9ae-98720321621d" volumeName="kubernetes.io/secret/db164b32-e20e-4d07-a9ae-98720321621d-cluster-olm-operator-serving-cert" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.803311 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e76bc134-2a88-4f92-9aa7-f6854941b98f" volumeName="kubernetes.io/projected/e76bc134-2a88-4f92-9aa7-f6854941b98f-kube-api-access" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.803325 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ec2d22f2-c260-42a6-a9da-ee0f44f42303" volumeName="kubernetes.io/secret/ec2d22f2-c260-42a6-a9da-ee0f44f42303-metrics-tls" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.803337 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="24ef1fb7-c8a1-4b50-b89f-2a81848ebb25" volumeName="kubernetes.io/configmap/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-metrics-client-ca" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.803374 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="56e11e7e-6946-4e11-bce9-e91a721fe4a7" volumeName="kubernetes.io/empty-dir/56e11e7e-6946-4e11-bce9-e91a721fe4a7-utilities" 
seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.803389 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="58333089-2456-4a25-8ba7-6d557eefa177" volumeName="kubernetes.io/configmap/58333089-2456-4a25-8ba7-6d557eefa177-service-ca-bundle" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.803401 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ec2d22f2-c260-42a6-a9da-ee0f44f42303" volumeName="kubernetes.io/projected/ec2d22f2-c260-42a6-a9da-ee0f44f42303-kube-api-access-xlzcz" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.803413 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2fbed2b8-f4c5-4f52-b29c-1907a2034f6f" volumeName="kubernetes.io/configmap/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-etcd-ca" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.803424 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="84522c03-fd7b-4be7-9413-84e510b9dc5a" volumeName="kubernetes.io/configmap/84522c03-fd7b-4be7-9413-84e510b9dc5a-config" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.803439 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c7097f64-1709-4f76-a725-5a6c6cc5919b" volumeName="kubernetes.io/secret/c7097f64-1709-4f76-a725-5a6c6cc5919b-machine-api-operator-tls" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.803450 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="84522c03-fd7b-4be7-9413-84e510b9dc5a" volumeName="kubernetes.io/secret/84522c03-fd7b-4be7-9413-84e510b9dc5a-cert" 
seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.803464 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b100ce12-965e-409e-8cdb-8f99ef51a82b" volumeName="kubernetes.io/configmap/b100ce12-965e-409e-8cdb-8f99ef51a82b-config" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.803476 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d70f4efb-e61a-4e88-a271-2f4af21ecdf3" volumeName="kubernetes.io/empty-dir/d70f4efb-e61a-4e88-a271-2f4af21ecdf3-tmpfs" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.803489 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4f5539c1-fb87-42d6-b735-6de53421bb6b" volumeName="kubernetes.io/projected/4f5539c1-fb87-42d6-b735-6de53421bb6b-kube-api-access-bcl7q" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.803503 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af391724-079a-4bac-a89e-978ffd471763" volumeName="kubernetes.io/configmap/af391724-079a-4bac-a89e-978ffd471763-env-overrides" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804071 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3fee96d7-75a7-46e4-9707-7bd292f10b84" volumeName="kubernetes.io/configmap/3fee96d7-75a7-46e4-9707-7bd292f10b84-env-overrides" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804091 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4d0b9fbc-a1f8-4a98-99de-758734bd1a5b" volumeName="kubernetes.io/projected/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-bound-sa-token" 
seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804105 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0e52cbdc-1d46-4cc9-85ee-535aa449992f" volumeName="kubernetes.io/projected/0e52cbdc-1d46-4cc9-85ee-535aa449992f-kube-api-access-xqkqn" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804118 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cbcb0196-be5c-44a4-9749-5df9fbeaa718" volumeName="kubernetes.io/projected/cbcb0196-be5c-44a4-9749-5df9fbeaa718-kube-api-access-4t8np" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804131 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1f63cb2f-779f-4fde-bf92-cf0414844a77" volumeName="kubernetes.io/projected/1f63cb2f-779f-4fde-bf92-cf0414844a77-kube-api-access-wh9cz" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804142 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b9f4db1-3ba9-49a5-9a65-1d770ee59a65" volumeName="kubernetes.io/configmap/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65-metrics-client-ca" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804157 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9" volumeName="kubernetes.io/configmap/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-ovnkube-script-lib" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804170 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2fbed2b8-f4c5-4f52-b29c-1907a2034f6f" 
volumeName="kubernetes.io/projected/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-kube-api-access-d5knc" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804184 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="56e11e7e-6946-4e11-bce9-e91a721fe4a7" volumeName="kubernetes.io/empty-dir/56e11e7e-6946-4e11-bce9-e91a721fe4a7-catalog-content" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804290 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="db164b32-e20e-4d07-a9ae-98720321621d" volumeName="kubernetes.io/empty-dir/db164b32-e20e-4d07-a9ae-98720321621d-operand-assets" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804319 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4d0b9fbc-a1f8-4a98-99de-758734bd1a5b" volumeName="kubernetes.io/projected/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-kube-api-access-z9l64" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804338 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1751db13-b792-43e2-8459-d1d4a0164dfb" volumeName="kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-audit" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804351 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1751db13-b792-43e2-8459-d1d4a0164dfb" volumeName="kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-config" seLinuxMountContext="" Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804366 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="365dc4ac-fbc8-4589-a799-8327b3ebd0a5" 
volumeName="kubernetes.io/secret/365dc4ac-fbc8-4589-a799-8327b3ebd0a5-serving-cert" seLinuxMountContext=""
Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804382 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1abf904b-0b8d-4d61-8231-0e8d00933192" volumeName="kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-node-tuning-operator-tls" seLinuxMountContext=""
Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804397 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d810f7f-258a-47ce-9f99-7b1d93388aee" volumeName="kubernetes.io/secret/9d810f7f-258a-47ce-9f99-7b1d93388aee-proxy-tls" seLinuxMountContext=""
Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804409 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c2ce2ea7-bd25-4294-8f3a-11ce53577830" volumeName="kubernetes.io/projected/c2ce2ea7-bd25-4294-8f3a-11ce53577830-kube-api-access-9qpkj" seLinuxMountContext=""
Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804421 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1751db13-b792-43e2-8459-d1d4a0164dfb" volumeName="kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-trusted-ca-bundle" seLinuxMountContext=""
Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804434 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6999cf38-e317-4727-98c9-d4e348e9e16a" volumeName="kubernetes.io/projected/6999cf38-e317-4727-98c9-d4e348e9e16a-kube-api-access-pwsqr" seLinuxMountContext=""
Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804448 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="815fd565-0609-4d8f-ac05-8656f198b008" volumeName="kubernetes.io/projected/815fd565-0609-4d8f-ac05-8656f198b008-kube-api-access-sh6nz" seLinuxMountContext=""
Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804461 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="24ef1fb7-c8a1-4b50-b89f-2a81848ebb25" volumeName="kubernetes.io/projected/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-kube-api-access-wllt8" seLinuxMountContext=""
Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804472 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="365dc4ac-fbc8-4589-a799-8327b3ebd0a5" volumeName="kubernetes.io/configmap/365dc4ac-fbc8-4589-a799-8327b3ebd0a5-config" seLinuxMountContext=""
Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804485 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b" volumeName="kubernetes.io/projected/a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b-kube-api-access-ll99v" seLinuxMountContext=""
Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804496 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9" volumeName="kubernetes.io/projected/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-kube-api-access-s99rr" seLinuxMountContext=""
Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804509 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="531e9339-968c-47bf-b8ea-c44d9ceef4b3" volumeName="kubernetes.io/projected/531e9339-968c-47bf-b8ea-c44d9ceef4b3-kube-api-access-crfg9" seLinuxMountContext=""
Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804521 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f71fd39-a16b-47d2-b781-c8ce37bcb9b2" volumeName="kubernetes.io/projected/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-kube-api-access-2f9kl" seLinuxMountContext=""
Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804533 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d810f7f-258a-47ce-9f99-7b1d93388aee" volumeName="kubernetes.io/projected/9d810f7f-258a-47ce-9f99-7b1d93388aee-kube-api-access-dz874" seLinuxMountContext=""
Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804546 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ae061e84-5e6a-415c-a735-fa14add7318a" volumeName="kubernetes.io/configmap/ae061e84-5e6a-415c-a735-fa14add7318a-kube-state-metrics-custom-resource-state-configmap" seLinuxMountContext=""
Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804559 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b100ce12-965e-409e-8cdb-8f99ef51a82b" volumeName="kubernetes.io/secret/b100ce12-965e-409e-8cdb-8f99ef51a82b-serving-cert" seLinuxMountContext=""
Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804570 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1751db13-b792-43e2-8459-d1d4a0164dfb" volumeName="kubernetes.io/projected/1751db13-b792-43e2-8459-d1d4a0164dfb-kube-api-access-6qshd" seLinuxMountContext=""
Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804583 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1abf904b-0b8d-4d61-8231-0e8d00933192" volumeName="kubernetes.io/projected/1abf904b-0b8d-4d61-8231-0e8d00933192-kube-api-access-dbdd4" seLinuxMountContext=""
Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804595 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7da68e85-9170-499d-8050-139ecfac4600" volumeName="kubernetes.io/configmap/7da68e85-9170-499d-8050-139ecfac4600-multus-daemon-config" seLinuxMountContext=""
Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804614 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2fbed2b8-f4c5-4f52-b29c-1907a2034f6f" volumeName="kubernetes.io/configmap/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-config" seLinuxMountContext=""
Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804626 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="531e9339-968c-47bf-b8ea-c44d9ceef4b3" volumeName="kubernetes.io/secret/531e9339-968c-47bf-b8ea-c44d9ceef4b3-encryption-config" seLinuxMountContext=""
Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804640 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="55c8d406-5448-4056-ab3c-c8399217c024" volumeName="kubernetes.io/empty-dir/55c8d406-5448-4056-ab3c-c8399217c024-utilities" seLinuxMountContext=""
Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804653 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="815fd565-0609-4d8f-ac05-8656f198b008" volumeName="kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs" seLinuxMountContext=""
Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804667 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b" volumeName="kubernetes.io/empty-dir/a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b-catalog-content" seLinuxMountContext=""
Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804682 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5cf5a2ef-2498-40a0-a189-0753076fd3b6" volumeName="kubernetes.io/configmap/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-trusted-ca" seLinuxMountContext=""
Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804696 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7317ceda-df6f-4826-aa1a-15304c2b0fcd" volumeName="kubernetes.io/configmap/7317ceda-df6f-4826-aa1a-15304c2b0fcd-auth-proxy-config" seLinuxMountContext=""
Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804707 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7317ceda-df6f-4826-aa1a-15304c2b0fcd" volumeName="kubernetes.io/configmap/7317ceda-df6f-4826-aa1a-15304c2b0fcd-config" seLinuxMountContext=""
Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804720 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ac523956-c8a3-4794-a1fa-660cd14966bb" volumeName="kubernetes.io/secret/ac523956-c8a3-4794-a1fa-660cd14966bb-serving-cert" seLinuxMountContext=""
Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804731 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef0a3c84-98bb-4915-9010-d66fcbeafe09" volumeName="kubernetes.io/projected/ef0a3c84-98bb-4915-9010-d66fcbeafe09-kube-api-access-8fstf" seLinuxMountContext=""
Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804743 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b4f8517-1e54-4b41-ba6b-6c56fe66831a" volumeName="kubernetes.io/secret/3b4f8517-1e54-4b41-ba6b-6c56fe66831a-cloud-controller-manager-operator-tls" seLinuxMountContext=""
Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804758 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="58333089-2456-4a25-8ba7-6d557eefa177" volumeName="kubernetes.io/configmap/58333089-2456-4a25-8ba7-6d557eefa177-trusted-ca-bundle" seLinuxMountContext=""
Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804770 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="70892c23-554d-466c-a526-90a799439fe0" volumeName="kubernetes.io/secret/70892c23-554d-466c-a526-90a799439fe0-serving-cert" seLinuxMountContext=""
Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804781 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bb8fea7-71ca-43a3-839d-9c1459bf8dfa" volumeName="kubernetes.io/projected/1bb8fea7-71ca-43a3-839d-9c1459bf8dfa-ca-certs" seLinuxMountContext=""
Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804795 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="614f0a0f-5853-4cf6-bd3d-174141f0f1e2" volumeName="kubernetes.io/configmap/614f0a0f-5853-4cf6-bd3d-174141f0f1e2-trusted-ca-bundle" seLinuxMountContext=""
Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804810 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b2548aca-4a9d-4670-a60a-0d6361d1c441" volumeName="kubernetes.io/projected/b2548aca-4a9d-4670-a60a-0d6361d1c441-kube-api-access-dvvvn" seLinuxMountContext=""
Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804825 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0101c4ce-fd58-4ddb-94f7-abb8b2293cdb" volumeName="kubernetes.io/empty-dir/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-audit-log" seLinuxMountContext=""
Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804837 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2b1a69b5-c946-495d-ae02-c56f788279e8" volumeName="kubernetes.io/projected/2b1a69b5-c946-495d-ae02-c56f788279e8-kube-api-access-chnhh" seLinuxMountContext=""
Mar 08 00:31:34.804902 master-0 kubenswrapper[23041]: I0308 00:31:34.804852 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b4f8517-1e54-4b41-ba6b-6c56fe66831a" volumeName="kubernetes.io/projected/3b4f8517-1e54-4b41-ba6b-6c56fe66831a-kube-api-access-vb4n9" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.804824 23041 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.804864 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3fee96d7-75a7-46e4-9707-7bd292f10b84" volumeName="kubernetes.io/secret/3fee96d7-75a7-46e4-9707-7bd292f10b84-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.805314 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="401bbef2-684c-4f55-b2c7-e6184c789e40" volumeName="kubernetes.io/empty-dir/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-tuned" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.805355 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af391724-079a-4bac-a89e-978ffd471763" volumeName="kubernetes.io/projected/af391724-079a-4bac-a89e-978ffd471763-kube-api-access-gkl4m" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.805378 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5cf5a2ef-2498-40a0-a189-0753076fd3b6" volumeName="kubernetes.io/projected/5cf5a2ef-2498-40a0-a189-0753076fd3b6-kube-api-access-k88m9" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.805398 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7317ceda-df6f-4826-aa1a-15304c2b0fcd" volumeName="kubernetes.io/projected/7317ceda-df6f-4826-aa1a-15304c2b0fcd-kube-api-access-cw6xw" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.805416 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7317ceda-df6f-4826-aa1a-15304c2b0fcd" volumeName="kubernetes.io/secret/7317ceda-df6f-4826-aa1a-15304c2b0fcd-machine-approver-tls" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.805435 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0101c4ce-fd58-4ddb-94f7-abb8b2293cdb" volumeName="kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-secret-metrics-client-certs" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.805455 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab" volumeName="kubernetes.io/secret/3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab-serving-cert" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.805479 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="531e9339-968c-47bf-b8ea-c44d9ceef4b3" volumeName="kubernetes.io/secret/531e9339-968c-47bf-b8ea-c44d9ceef4b3-serving-cert" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.805498 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cbcb0196-be5c-44a4-9749-5df9fbeaa718" volumeName="kubernetes.io/configmap/cbcb0196-be5c-44a4-9749-5df9fbeaa718-proxy-ca-bundles" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.805516 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="401bbef2-684c-4f55-b2c7-e6184c789e40" volumeName="kubernetes.io/empty-dir/401bbef2-684c-4f55-b2c7-e6184c789e40-tmp" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.805533 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" volumeName="kubernetes.io/configmap/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6-service-ca-bundle" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.805566 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ac523956-c8a3-4794-a1fa-660cd14966bb" volumeName="kubernetes.io/projected/ac523956-c8a3-4794-a1fa-660cd14966bb-kube-api-access-wjcjb" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.805591 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6999cf38-e317-4727-98c9-d4e348e9e16a" volumeName="kubernetes.io/configmap/6999cf38-e317-4727-98c9-d4e348e9e16a-trusted-ca" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.805618 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7ad8b9ea-ba1c-4507-9b70-ce2da170d480" volumeName="kubernetes.io/projected/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-kube-api-access-bxk5x" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.805642 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ae061e84-5e6a-415c-a735-fa14add7318a" volumeName="kubernetes.io/secret/ae061e84-5e6a-415c-a735-fa14add7318a-kube-state-metrics-kube-rbac-proxy-config" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.805665 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e302bc0b-7560-4f84-813f-d966c2dbe47c" volumeName="kubernetes.io/projected/e302bc0b-7560-4f84-813f-d966c2dbe47c-kube-api-access-9bmgb" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.805687 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1751db13-b792-43e2-8459-d1d4a0164dfb" volumeName="kubernetes.io/secret/1751db13-b792-43e2-8459-d1d4a0164dfb-serving-cert" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.805711 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1da0c222-4b59-424f-9817-48673083df00" volumeName="kubernetes.io/projected/1da0c222-4b59-424f-9817-48673083df00-kube-api-access-txt48" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.805732 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d01c21a1-6c2c-49a7-9d85-254662851838" volumeName="kubernetes.io/projected/d01c21a1-6c2c-49a7-9d85-254662851838-ca-certs" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.805752 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e237ed52-5561-44c5-bcb1-de62691d6431" volumeName="kubernetes.io/configmap/e237ed52-5561-44c5-bcb1-de62691d6431-metrics-client-ca" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.805772 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e78057cd-5120-4a12-934d-9fed51e1bdc0" volumeName="kubernetes.io/secret/e78057cd-5120-4a12-934d-9fed51e1bdc0-cloud-credential-operator-serving-cert" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.805790 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4a829558-a672-4dc5-ae20-69884213482f" volumeName="kubernetes.io/projected/4a829558-a672-4dc5-ae20-69884213482f-kube-api-access" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.805809 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" volumeName="kubernetes.io/secret/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6-stats-auth" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.805826 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b2548aca-4a9d-4670-a60a-0d6361d1c441" volumeName="kubernetes.io/empty-dir/b2548aca-4a9d-4670-a60a-0d6361d1c441-catalog-content" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.805845 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ac523956-c8a3-4794-a1fa-660cd14966bb" volumeName="kubernetes.io/configmap/ac523956-c8a3-4794-a1fa-660cd14966bb-config" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.805867 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9" volumeName="kubernetes.io/configmap/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-ovnkube-config" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.805884 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0101c4ce-fd58-4ddb-94f7-abb8b2293cdb" volumeName="kubernetes.io/configmap/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-configmap-kubelet-serving-ca-bundle" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.805901 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0e52cbdc-1d46-4cc9-85ee-535aa449992f" volumeName="kubernetes.io/configmap/0e52cbdc-1d46-4cc9-85ee-535aa449992f-iptables-alerter-script" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.805919 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab" volumeName="kubernetes.io/configmap/3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab-config" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.805936 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0101c4ce-fd58-4ddb-94f7-abb8b2293cdb" volumeName="kubernetes.io/configmap/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-metrics-server-audit-profiles" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.805952 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c7097f64-1709-4f76-a725-5a6c6cc5919b" volumeName="kubernetes.io/projected/c7097f64-1709-4f76-a725-5a6c6cc5919b-kube-api-access-zvhx4" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.805971 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="db164b32-e20e-4d07-a9ae-98720321621d" volumeName="kubernetes.io/projected/db164b32-e20e-4d07-a9ae-98720321621d-kube-api-access-89wj5" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.805993 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c7097f64-1709-4f76-a725-5a6c6cc5919b" volumeName="kubernetes.io/configmap/c7097f64-1709-4f76-a725-5a6c6cc5919b-config" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806014 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="24ef1fb7-c8a1-4b50-b89f-2a81848ebb25" volumeName="kubernetes.io/empty-dir/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-node-exporter-textfile" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806033 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5a229b84-65bd-493b-90dd-b8194f842dc8" volumeName="kubernetes.io/configmap/5a229b84-65bd-493b-90dd-b8194f842dc8-service-ca" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806050 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" volumeName="kubernetes.io/secret/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6-default-certificate" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806067 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b2548aca-4a9d-4670-a60a-0d6361d1c441" volumeName="kubernetes.io/empty-dir/b2548aca-4a9d-4670-a60a-0d6361d1c441-utilities" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806113 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e76bc134-2a88-4f92-9aa7-f6854941b98f" volumeName="kubernetes.io/configmap/e76bc134-2a88-4f92-9aa7-f6854941b98f-config" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806130 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2b1a69b5-c946-495d-ae02-c56f788279e8" volumeName="kubernetes.io/empty-dir/2b1a69b5-c946-495d-ae02-c56f788279e8-available-featuregates" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806147 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3d2e1686-3a30-4021-9c03-02e472bc6ff3" volumeName="kubernetes.io/projected/3d2e1686-3a30-4021-9c03-02e472bc6ff3-kube-api-access-qv5kd" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806164 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="614f0a0f-5853-4cf6-bd3d-174141f0f1e2" volumeName="kubernetes.io/secret/614f0a0f-5853-4cf6-bd3d-174141f0f1e2-serving-cert" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806181 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6d770808-d390-41c1-a9d9-fc12b99fa9a9" volumeName="kubernetes.io/configmap/6d770808-d390-41c1-a9d9-fc12b99fa9a9-telemetry-config" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806240 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d70f4efb-e61a-4e88-a271-2f4af21ecdf3" volumeName="kubernetes.io/secret/d70f4efb-e61a-4e88-a271-2f4af21ecdf3-apiservice-cert" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806261 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9" volumeName="kubernetes.io/secret/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-ovn-node-metrics-cert" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806280 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2fbed2b8-f4c5-4f52-b29c-1907a2034f6f" volumeName="kubernetes.io/secret/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-serving-cert" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806297 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3d2e1686-3a30-4021-9c03-02e472bc6ff3" volumeName="kubernetes.io/configmap/3d2e1686-3a30-4021-9c03-02e472bc6ff3-auth-proxy-config" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806317 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6999cf38-e317-4727-98c9-d4e348e9e16a" volumeName="kubernetes.io/secret/6999cf38-e317-4727-98c9-d4e348e9e16a-image-registry-operator-tls" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806335 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5cf5a2ef-2498-40a0-a189-0753076fd3b6" volumeName="kubernetes.io/secret/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-operator-metrics" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806354 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e237ed52-5561-44c5-bcb1-de62691d6431" volumeName="kubernetes.io/projected/e237ed52-5561-44c5-bcb1-de62691d6431-kube-api-access-t99pg" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806375 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0" volumeName="kubernetes.io/secret/2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0-proxy-tls" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806418 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="56e11e7e-6946-4e11-bce9-e91a721fe4a7" volumeName="kubernetes.io/projected/56e11e7e-6946-4e11-bce9-e91a721fe4a7-kube-api-access-kmxq9" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806439 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5a229b84-65bd-493b-90dd-b8194f842dc8" volumeName="kubernetes.io/secret/5a229b84-65bd-493b-90dd-b8194f842dc8-serving-cert" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806459 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ae061e84-5e6a-415c-a735-fa14add7318a" volumeName="kubernetes.io/empty-dir/ae061e84-5e6a-415c-a735-fa14add7318a-volume-directive-shadow" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806478 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3fee96d7-75a7-46e4-9707-7bd292f10b84" volumeName="kubernetes.io/configmap/3fee96d7-75a7-46e4-9707-7bd292f10b84-ovnkube-config" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806497 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e302bc0b-7560-4f84-813f-d966c2dbe47c" volumeName="kubernetes.io/secret/e302bc0b-7560-4f84-813f-d966c2dbe47c-metrics-tls" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806515 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef0a3c84-98bb-4915-9010-d66fcbeafe09" volumeName="kubernetes.io/secret/ef0a3c84-98bb-4915-9010-d66fcbeafe09-serving-cert" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806532 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="460f09d8-a143-48d2-9db0-be247386984a" volumeName="kubernetes.io/projected/460f09d8-a143-48d2-9db0-be247386984a-kube-api-access-vj8sl" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806632 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" volumeName="kubernetes.io/projected/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6-kube-api-access-x9fv4" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806653 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="70892c23-554d-466c-a526-90a799439fe0" volumeName="kubernetes.io/projected/70892c23-554d-466c-a526-90a799439fe0-kube-api-access-kqjt7" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806672 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7ad8b9ea-ba1c-4507-9b70-ce2da170d480" volumeName="kubernetes.io/configmap/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-cni-sysctl-allowlist" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806691 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0101c4ce-fd58-4ddb-94f7-abb8b2293cdb" volumeName="kubernetes.io/projected/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-kube-api-access-b66xq" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806707 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1da0c222-4b59-424f-9817-48673083df00" volumeName="kubernetes.io/secret/1da0c222-4b59-424f-9817-48673083df00-webhook-certs" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806723 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b4f8517-1e54-4b41-ba6b-6c56fe66831a" volumeName="kubernetes.io/configmap/3b4f8517-1e54-4b41-ba6b-6c56fe66831a-auth-proxy-config" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806741 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="24ef1fb7-c8a1-4b50-b89f-2a81848ebb25" volumeName="kubernetes.io/secret/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-node-exporter-kube-rbac-proxy-config" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806758 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7da68e85-9170-499d-8050-139ecfac4600" volumeName="kubernetes.io/configmap/7da68e85-9170-499d-8050-139ecfac4600-cni-binary-copy" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806779 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1751db13-b792-43e2-8459-d1d4a0164dfb" volumeName="kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-image-import-ca" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806796 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d01c21a1-6c2c-49a7-9d85-254662851838" volumeName="kubernetes.io/projected/d01c21a1-6c2c-49a7-9d85-254662851838-kube-api-access-rt9pm" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806812 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e78057cd-5120-4a12-934d-9fed51e1bdc0" volumeName="kubernetes.io/projected/e78057cd-5120-4a12-934d-9fed51e1bdc0-kube-api-access-zgqmb" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806829 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="401bbef2-684c-4f55-b2c7-e6184c789e40" volumeName="kubernetes.io/projected/401bbef2-684c-4f55-b2c7-e6184c789e40-kube-api-access-mcqn9" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806842 23041 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806847 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="531e9339-968c-47bf-b8ea-c44d9ceef4b3" volumeName="kubernetes.io/configmap/531e9339-968c-47bf-b8ea-c44d9ceef4b3-etcd-serving-ca" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806881 23041 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806882 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ae061e84-5e6a-415c-a735-fa14add7318a" volumeName="kubernetes.io/secret/ae061e84-5e6a-415c-a735-fa14add7318a-kube-state-metrics-tls" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806910 23041 kubelet.go:2335] "Starting kubelet main sync loop"
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806905 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6d770808-d390-41c1-a9d9-fc12b99fa9a9" volumeName="kubernetes.io/secret/6d770808-d390-41c1-a9d9-fc12b99fa9a9-cluster-monitoring-operator-tls" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: E0308 00:31:34.806967 23041 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.806983 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c1abfb79-2c86-4ccb-bf91-7c48ad8c78d8" volumeName="kubernetes.io/projected/c1abfb79-2c86-4ccb-bf91-7c48ad8c78d8-kube-api-access-5q6hn" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.807007 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c7097f64-1709-4f76-a725-5a6c6cc5919b" volumeName="kubernetes.io/configmap/c7097f64-1709-4f76-a725-5a6c6cc5919b-images" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.807027 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0d0cb126-341c-4215-ad2e-a008193cc0b5" volumeName="kubernetes.io/secret/0d0cb126-341c-4215-ad2e-a008193cc0b5-tls-certificates" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.807052 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4ad37f40-c533-4a1e-882a-2e0973eff86d" volumeName="kubernetes.io/secret/4ad37f40-c533-4a1e-882a-2e0973eff86d-srv-cert" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.807073 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="58333089-2456-4a25-8ba7-6d557eefa177" volumeName="kubernetes.io/configmap/58333089-2456-4a25-8ba7-6d557eefa177-config" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.807092 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="531e9339-968c-47bf-b8ea-c44d9ceef4b3" volumeName="kubernetes.io/configmap/531e9339-968c-47bf-b8ea-c44d9ceef4b3-trusted-ca-bundle" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.807111 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cbcb0196-be5c-44a4-9749-5df9fbeaa718" volumeName="kubernetes.io/configmap/cbcb0196-be5c-44a4-9749-5df9fbeaa718-client-ca" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.807129 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1abf904b-0b8d-4d61-8231-0e8d00933192" volumeName="kubernetes.io/configmap/1abf904b-0b8d-4d61-8231-0e8d00933192-trusted-ca" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.807148 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4d0b9fbc-a1f8-4a98-99de-758734bd1a5b" volumeName="kubernetes.io/secret/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-metrics-tls" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.807171 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="70892c23-554d-466c-a526-90a799439fe0" volumeName="kubernetes.io/configmap/70892c23-554d-466c-a526-90a799439fe0-client-ca" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.807189 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d70f4efb-e61a-4e88-a271-2f4af21ecdf3" volumeName="kubernetes.io/secret/d70f4efb-e61a-4e88-a271-2f4af21ecdf3-webhook-cert" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.807283 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="58333089-2456-4a25-8ba7-6d557eefa177" volumeName="kubernetes.io/projected/58333089-2456-4a25-8ba7-6d557eefa177-kube-api-access-hhckc" seLinuxMountContext=""
Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.807305 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d70f4efb-e61a-4e88-a271-2f4af21ecdf3" volumeName="kubernetes.io/projected/d70f4efb-e61a-4e88-a271-2f4af21ecdf3-kube-api-access-pt6w4" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.807324 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e237ed52-5561-44c5-bcb1-de62691d6431" volumeName="kubernetes.io/secret/e237ed52-5561-44c5-bcb1-de62691d6431-prometheus-operator-kube-rbac-proxy-config" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.807343 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6999cf38-e317-4727-98c9-d4e348e9e16a" volumeName="kubernetes.io/projected/6999cf38-e317-4727-98c9-d4e348e9e16a-bound-sa-token" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.807361 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ae061e84-5e6a-415c-a735-fa14add7318a" volumeName="kubernetes.io/configmap/ae061e84-5e6a-415c-a735-fa14add7318a-metrics-client-ca" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.808831 23041 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.807380 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e78057cd-5120-4a12-934d-9fed51e1bdc0" volumeName="kubernetes.io/configmap/e78057cd-5120-4a12-934d-9fed51e1bdc0-cco-trusted-ca" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809016 23041 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="1751db13-b792-43e2-8459-d1d4a0164dfb" volumeName="kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-etcd-serving-ca" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809039 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="24ef1fb7-c8a1-4b50-b89f-2a81848ebb25" volumeName="kubernetes.io/secret/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-node-exporter-tls" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809059 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="55c8d406-5448-4056-ab3c-c8399217c024" volumeName="kubernetes.io/projected/55c8d406-5448-4056-ab3c-c8399217c024-kube-api-access-nljwf" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809080 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab" volumeName="kubernetes.io/projected/3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab-kube-api-access-44jml" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809099 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="531e9339-968c-47bf-b8ea-c44d9ceef4b3" volumeName="kubernetes.io/secret/531e9339-968c-47bf-b8ea-c44d9ceef4b3-etcd-client" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809147 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7ad8b9ea-ba1c-4507-9b70-ce2da170d480" volumeName="kubernetes.io/configmap/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-whereabouts-configmap" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809183 23041 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b" volumeName="kubernetes.io/empty-dir/a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b-utilities" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809216 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ae061e84-5e6a-415c-a735-fa14add7318a" volumeName="kubernetes.io/projected/ae061e84-5e6a-415c-a735-fa14add7318a-kube-api-access-qznbf" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809236 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bad9e63-1aa2-44a7-aaf8-a0e82f33ad6e" volumeName="kubernetes.io/projected/1bad9e63-1aa2-44a7-aaf8-a0e82f33ad6e-kube-api-access-gkh52" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809250 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bb8fea7-71ca-43a3-839d-9c1459bf8dfa" volumeName="kubernetes.io/empty-dir/1bb8fea7-71ca-43a3-839d-9c1459bf8dfa-cache" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809265 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2fbed2b8-f4c5-4f52-b29c-1907a2034f6f" volumeName="kubernetes.io/secret/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-etcd-client" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809280 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d01c21a1-6c2c-49a7-9d85-254662851838" volumeName="kubernetes.io/empty-dir/d01c21a1-6c2c-49a7-9d85-254662851838-cache" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809294 23041 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="d01c21a1-6c2c-49a7-9d85-254662851838" volumeName="kubernetes.io/secret/d01c21a1-6c2c-49a7-9d85-254662851838-catalogserver-certs" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809311 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e3f42081-387d-4798-b981-ac232e851bb4" volumeName="kubernetes.io/secret/e3f42081-387d-4798-b981-ac232e851bb4-samples-operator-tls" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809325 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2b1a69b5-c946-495d-ae02-c56f788279e8" volumeName="kubernetes.io/secret/2b1a69b5-c946-495d-ae02-c56f788279e8-serving-cert" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809339 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6d770808-d390-41c1-a9d9-fc12b99fa9a9" volumeName="kubernetes.io/projected/6d770808-d390-41c1-a9d9-fc12b99fa9a9-kube-api-access-6rfqt" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809354 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af391724-079a-4bac-a89e-978ffd471763" volumeName="kubernetes.io/secret/af391724-079a-4bac-a89e-978ffd471763-webhook-cert" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809416 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bb8fea7-71ca-43a3-839d-9c1459bf8dfa" volumeName="kubernetes.io/projected/1bb8fea7-71ca-43a3-839d-9c1459bf8dfa-kube-api-access-gh2h6" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809432 23041 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0" volumeName="kubernetes.io/projected/2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0-kube-api-access-h65c2" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809448 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="531e9339-968c-47bf-b8ea-c44d9ceef4b3" volumeName="kubernetes.io/configmap/531e9339-968c-47bf-b8ea-c44d9ceef4b3-audit-policies" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809466 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b9f4db1-3ba9-49a5-9a65-1d770ee59a65" volumeName="kubernetes.io/secret/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65-openshift-state-metrics-kube-rbac-proxy-config" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809481 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7ad8b9ea-ba1c-4507-9b70-ce2da170d480" volumeName="kubernetes.io/configmap/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-cni-binary-copy" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809498 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="84522c03-fd7b-4be7-9413-84e510b9dc5a" volumeName="kubernetes.io/configmap/84522c03-fd7b-4be7-9413-84e510b9dc5a-images" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809512 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="03f4bafb-c270-428a-bacf-8a424b3d1a05" volumeName="kubernetes.io/projected/03f4bafb-c270-428a-bacf-8a424b3d1a05-kube-api-access-pfdxc" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809528 23041 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="03f4bafb-c270-428a-bacf-8a424b3d1a05" volumeName="kubernetes.io/secret/03f4bafb-c270-428a-bacf-8a424b3d1a05-metrics-tls" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809542 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b4f8517-1e54-4b41-ba6b-6c56fe66831a" volumeName="kubernetes.io/configmap/3b4f8517-1e54-4b41-ba6b-6c56fe66831a-images" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809567 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b94acad3-cf4e-443d-80fb-5e68a4074336" volumeName="kubernetes.io/projected/b94acad3-cf4e-443d-80fb-5e68a4074336-kube-api-access-7tml5" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809584 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e76bc134-2a88-4f92-9aa7-f6854941b98f" volumeName="kubernetes.io/secret/e76bc134-2a88-4f92-9aa7-f6854941b98f-serving-cert" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809599 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e97435ee-522e-427d-9efc-40bc3d2b0d02" volumeName="kubernetes.io/projected/e97435ee-522e-427d-9efc-40bc3d2b0d02-kube-api-access-bv9fl" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809616 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0101c4ce-fd58-4ddb-94f7-abb8b2293cdb" volumeName="kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-secret-metrics-server-tls" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809631 23041 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0" volumeName="kubernetes.io/configmap/2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0-mcc-auth-proxy-config" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809650 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="460f09d8-a143-48d2-9db0-be247386984a" volumeName="kubernetes.io/secret/460f09d8-a143-48d2-9db0-be247386984a-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809667 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0f496486-70d5-4c5c-b4f3-6cc19427762f" volumeName="kubernetes.io/secret/0f496486-70d5-4c5c-b4f3-6cc19427762f-cluster-storage-operator-serving-cert" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809682 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1751db13-b792-43e2-8459-d1d4a0164dfb" volumeName="kubernetes.io/secret/1751db13-b792-43e2-8459-d1d4a0164dfb-encryption-config" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809696 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e237ed52-5561-44c5-bcb1-de62691d6431" volumeName="kubernetes.io/secret/e237ed52-5561-44c5-bcb1-de62691d6431-prometheus-operator-tls" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809711 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7" volumeName="kubernetes.io/configmap/ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7-mcd-auth-proxy-config" seLinuxMountContext="" Mar 08 00:31:34.809779 
master-0 kubenswrapper[23041]: I0308 00:31:34.809725 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c2ce2ea7-bd25-4294-8f3a-11ce53577830" volumeName="kubernetes.io/configmap/c2ce2ea7-bd25-4294-8f3a-11ce53577830-config" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809741 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef0a3c84-98bb-4915-9010-d66fcbeafe09" volumeName="kubernetes.io/configmap/ef0a3c84-98bb-4915-9010-d66fcbeafe09-config" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809754 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="55c8d406-5448-4056-ab3c-c8399217c024" volumeName="kubernetes.io/empty-dir/55c8d406-5448-4056-ab3c-c8399217c024-catalog-content" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809767 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b9f4db1-3ba9-49a5-9a65-1d770ee59a65" volumeName="kubernetes.io/projected/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65-kube-api-access-stxt7" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809782 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7da68e85-9170-499d-8050-139ecfac4600" volumeName="kubernetes.io/projected/7da68e85-9170-499d-8050-139ecfac4600-kube-api-access-bg5d9" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809797 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4f5539c1-fb87-42d6-b735-6de53421bb6b" volumeName="kubernetes.io/configmap/4f5539c1-fb87-42d6-b735-6de53421bb6b-signing-cabundle" seLinuxMountContext="" Mar 08 00:31:34.809779 
master-0 kubenswrapper[23041]: I0308 00:31:34.809811 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b9f4db1-3ba9-49a5-9a65-1d770ee59a65" volumeName="kubernetes.io/secret/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65-openshift-state-metrics-tls" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809826 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="70892c23-554d-466c-a526-90a799439fe0" volumeName="kubernetes.io/configmap/70892c23-554d-466c-a526-90a799439fe0-config" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809839 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0101c4ce-fd58-4ddb-94f7-abb8b2293cdb" volumeName="kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-client-ca-bundle" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809854 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1751db13-b792-43e2-8459-d1d4a0164dfb" volumeName="kubernetes.io/secret/1751db13-b792-43e2-8459-d1d4a0164dfb-etcd-client" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809869 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2fbed2b8-f4c5-4f52-b29c-1907a2034f6f" volumeName="kubernetes.io/configmap/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-etcd-service-ca" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809882 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2ac55f03-dd6f-4ead-bacc-c69aeca146dc" volumeName="kubernetes.io/projected/2ac55f03-dd6f-4ead-bacc-c69aeca146dc-kube-api-access-8d4xz" seLinuxMountContext="" Mar 08 00:31:34.809779 
master-0 kubenswrapper[23041]: I0308 00:31:34.809897 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3d2e1686-3a30-4021-9c03-02e472bc6ff3" volumeName="kubernetes.io/secret/3d2e1686-3a30-4021-9c03-02e472bc6ff3-cert" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809910 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9" volumeName="kubernetes.io/configmap/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-env-overrides" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809923 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a68ad726-392e-4a7a-a384-409108df9c8b" volumeName="kubernetes.io/secret/a68ad726-392e-4a7a-a384-409108df9c8b-certs" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809937 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af391724-079a-4bac-a89e-978ffd471763" volumeName="kubernetes.io/configmap/af391724-079a-4bac-a89e-978ffd471763-ovnkube-identity-cm" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809949 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e302bc0b-7560-4f84-813f-d966c2dbe47c" volumeName="kubernetes.io/configmap/e302bc0b-7560-4f84-813f-d966c2dbe47c-config-volume" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809965 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3fee96d7-75a7-46e4-9707-7bd292f10b84" volumeName="kubernetes.io/projected/3fee96d7-75a7-46e4-9707-7bd292f10b84-kube-api-access-ntks9" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 
kubenswrapper[23041]: I0308 00:31:34.809979 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" volumeName="kubernetes.io/secret/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6-metrics-certs" seLinuxMountContext="" Mar 08 00:31:34.809779 master-0 kubenswrapper[23041]: I0308 00:31:34.809995 23041 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d810f7f-258a-47ce-9f99-7b1d93388aee" volumeName="kubernetes.io/configmap/9d810f7f-258a-47ce-9f99-7b1d93388aee-images" seLinuxMountContext="" Mar 08 00:31:34.817830 master-0 kubenswrapper[23041]: I0308 00:31:34.810009 23041 reconstruct.go:97] "Volume reconstruction finished" Mar 08 00:31:34.817830 master-0 kubenswrapper[23041]: I0308 00:31:34.810019 23041 reconciler.go:26] "Reconciler: start to sync state" Mar 08 00:31:34.823545 master-0 kubenswrapper[23041]: I0308 00:31:34.823491 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8565d84698-49hzm_ef0a3c84-98bb-4915-9010-d66fcbeafe09/openshift-controller-manager-operator/1.log" Mar 08 00:31:34.823611 master-0 kubenswrapper[23041]: I0308 00:31:34.823565 23041 generic.go:334] "Generic (PLEG): container finished" podID="ef0a3c84-98bb-4915-9010-d66fcbeafe09" containerID="5aac2b21c945fd8c5f04ccb41b60633f9bb7e3c9d3e901a7648d97792b4bc569" exitCode=255 Mar 08 00:31:34.827483 master-0 kubenswrapper[23041]: I0308 00:31:34.827439 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-677db989d6-blw5x_4d0b9fbc-a1f8-4a98-99de-758734bd1a5b/ingress-operator/2.log" Mar 08 00:31:34.827781 master-0 kubenswrapper[23041]: I0308 00:31:34.827740 23041 generic.go:334] "Generic (PLEG): container finished" podID="4d0b9fbc-a1f8-4a98-99de-758734bd1a5b" containerID="9d40712043dab52958ea0afce9459c44f1ac9aa0390229d73de4eebe33434e94" 
exitCode=1 Mar 08 00:31:34.843423 master-0 kubenswrapper[23041]: I0308 00:31:34.843335 23041 generic.go:334] "Generic (PLEG): container finished" podID="56e11e7e-6946-4e11-bce9-e91a721fe4a7" containerID="819bab5050551748fadc71568c0e7c229f38c2b2cb38e16a3bd09395d5299f4e" exitCode=0 Mar 08 00:31:34.843423 master-0 kubenswrapper[23041]: I0308 00:31:34.843403 23041 generic.go:334] "Generic (PLEG): container finished" podID="56e11e7e-6946-4e11-bce9-e91a721fe4a7" containerID="cceb2895b7ad1a9aea1a615553362ea80d4700a89b8411dc29278d45b0d40f09" exitCode=0 Mar 08 00:31:34.846714 master-0 kubenswrapper[23041]: I0308 00:31:34.846682 23041 generic.go:334] "Generic (PLEG): container finished" podID="db164b32-e20e-4d07-a9ae-98720321621d" containerID="7c4e1b361ff558ca25f7a79150dde84f1533aa652ade34de4925ff4983cee4b2" exitCode=0 Mar 08 00:31:34.846714 master-0 kubenswrapper[23041]: I0308 00:31:34.846709 23041 generic.go:334] "Generic (PLEG): container finished" podID="db164b32-e20e-4d07-a9ae-98720321621d" containerID="09b799c18c45feaba6859a57b3c549da1772578d33ab2e69691bfdb4a7740bc3" exitCode=0 Mar 08 00:31:34.847653 master-0 kubenswrapper[23041]: I0308 00:31:34.846721 23041 generic.go:334] "Generic (PLEG): container finished" podID="db164b32-e20e-4d07-a9ae-98720321621d" containerID="1b42fcb0b0ae8c854969b1967188fb3b2c0ac7365173440cfbf5c3f93e5315cf" exitCode=0 Mar 08 00:31:34.853060 master-0 kubenswrapper[23041]: I0308 00:31:34.853011 23041 generic.go:334] "Generic (PLEG): container finished" podID="c2ce2ea7-bd25-4294-8f3a-11ce53577830" containerID="8c7c5dbb2587ce1659649afce2da4e5a5c04c0ab193dda1e438bb8ca083926e4" exitCode=0 Mar 08 00:31:34.856551 master-0 kubenswrapper[23041]: I0308 00:31:34.856234 23041 generic.go:334] "Generic (PLEG): container finished" podID="ac523956-c8a3-4794-a1fa-660cd14966bb" containerID="322f3ad793e93ca7f32b8558fd2506b5cf8b8be4b12165040ac02501040fbe03" exitCode=0 Mar 08 00:31:34.861298 master-0 kubenswrapper[23041]: I0308 00:31:34.861256 23041 generic.go:334] 
"Generic (PLEG): container finished" podID="a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b" containerID="0238d925fb5b554e7f8df9102a9ba758748ba0abdd9b4e92ab97dadd2793a34a" exitCode=0 Mar 08 00:31:34.861298 master-0 kubenswrapper[23041]: I0308 00:31:34.861291 23041 generic.go:334] "Generic (PLEG): container finished" podID="a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b" containerID="04665dc4db4c2d82c8d11a97a36abe0b11fe894bbbd6e5c64a1b3a502d59c374" exitCode=0 Mar 08 00:31:34.863374 master-0 kubenswrapper[23041]: I0308 00:31:34.863338 23041 generic.go:334] "Generic (PLEG): container finished" podID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerID="915f71c7c1f314a02b658e63c673b9b34d83af6828634db211d8fa70c691f01b" exitCode=0 Mar 08 00:31:34.866296 master-0 kubenswrapper[23041]: I0308 00:31:34.866241 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7c649bf6d4-st2sr_ec2d22f2-c260-42a6-a9da-ee0f44f42303/network-operator/1.log" Mar 08 00:31:34.866348 master-0 kubenswrapper[23041]: I0308 00:31:34.866317 23041 generic.go:334] "Generic (PLEG): container finished" podID="ec2d22f2-c260-42a6-a9da-ee0f44f42303" containerID="25ae9c9f82c094082383cc214e49a9f1d3d4d26dc8ffcbe8cff3194531736ede" exitCode=255 Mar 08 00:31:34.871813 master-0 kubenswrapper[23041]: I0308 00:31:34.871762 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5884b9cd56-27phk_2fbed2b8-f4c5-4f52-b29c-1907a2034f6f/etcd-operator/1.log" Mar 08 00:31:34.871904 master-0 kubenswrapper[23041]: I0308 00:31:34.871829 23041 generic.go:334] "Generic (PLEG): container finished" podID="2fbed2b8-f4c5-4f52-b29c-1907a2034f6f" containerID="d2e8edf542df46c295f392d43d676bb039cfcddee9661264a6bee3005ba21922" exitCode=255 Mar 08 00:31:34.874473 master-0 kubenswrapper[23041]: I0308 00:31:34.874433 23041 generic.go:334] "Generic (PLEG): container finished" podID="66915251-1fdd-40f3-a59b-054776b214df" 
containerID="d9e68f104ff64d94c7bc0d96bb172cf910cbd61300635334957f518556f38bfc" exitCode=0
Mar 08 00:31:34.879472 master-0 kubenswrapper[23041]: I0308 00:31:34.879395 23041 generic.go:334] "Generic (PLEG): container finished" podID="fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9" containerID="9caf746e34f3ceb9b7a0c15d058a8c3ef6549037b6840e762c5d26db1b3afa1f" exitCode=0
Mar 08 00:31:34.881893 master-0 kubenswrapper[23041]: I0308 00:31:34.881855 23041 generic.go:334] "Generic (PLEG): container finished" podID="e76bc134-2a88-4f92-9aa7-f6854941b98f" containerID="ad08463ed7ab691e56f4dfe0288960876b6a58370e90937b6cc2efea5e0f4441" exitCode=0
Mar 08 00:31:34.885828 master-0 kubenswrapper[23041]: I0308 00:31:34.885794 23041 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Mar 08 00:31:34.886758 master-0 kubenswrapper[23041]: I0308 00:31:34.886718 23041 generic.go:334] "Generic (PLEG): container finished" podID="614f0a0f-5853-4cf6-bd3d-174141f0f1e2" containerID="ad3a46887dab7ea3bfa412ad6cf5418fcbb18c2c14aa2dc59012eeca70fc7d9a" exitCode=0
Mar 08 00:31:34.888404 master-0 kubenswrapper[23041]: I0308 00:31:34.888353 23041 generic.go:334] "Generic (PLEG): container finished" podID="4217b755-ca87-45cf-9e52-7b2681660f41" containerID="6c847624822fb2ae11b6027b5155999eb848a04181b2d105ba183b9e9a68d9b4" exitCode=0
Mar 08 00:31:34.891584 master-0 kubenswrapper[23041]: I0308 00:31:34.891535 23041 generic.go:334] "Generic (PLEG): container finished" podID="21dd42b1-2628-4a24-97e7-6759888ed316" containerID="f70bb9a5f0e3f9b911feb28654c30e151d3e1fb5d9549e6e2016049387b17fb2" exitCode=0
Mar 08 00:31:34.905289 master-0 kubenswrapper[23041]: I0308 00:31:34.905168 23041 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="620aae0686e0d0747f86c66dccb5f833f425852d851da5976e803bb0ce3011ba" exitCode=0
Mar 08 00:31:34.905289 master-0 kubenswrapper[23041]: I0308 00:31:34.905231 23041 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="c8de3ced39581b8ad5acd40157b9e893206291d5fd34e7516c2c1b0358ea17a6" exitCode=0
Mar 08 00:31:34.905289 master-0 kubenswrapper[23041]: I0308 00:31:34.905245 23041 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="182e67e6b82b83c4d47d4c01d3dcbdede2056c9bcdcf8367c8a6959d0eeac8ea" exitCode=0
Mar 08 00:31:34.907213 master-0 kubenswrapper[23041]: E0308 00:31:34.907160 23041 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Mar 08 00:31:34.908953 master-0 kubenswrapper[23041]: I0308 00:31:34.908911 23041 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="1218edf42145af942d644b560a08c25d071edefe5ebdbdbb1dda99cfd07700fd" exitCode=1
Mar 08 00:31:34.916104 master-0 kubenswrapper[23041]: I0308 00:31:34.916030 23041 generic.go:334] "Generic (PLEG): container finished" podID="24ef1fb7-c8a1-4b50-b89f-2a81848ebb25" containerID="95c20172ebbb05524877a835e30132f4f70ded4813cb99373d344901a324181d" exitCode=0
Mar 08 00:31:34.924493 master-0 kubenswrapper[23041]: I0308 00:31:34.924454 23041 generic.go:334] "Generic (PLEG): container finished" podID="55c8d406-5448-4056-ab3c-c8399217c024" containerID="f1165833632b857988bef725397f89c163ab44ca5ba27c1f2f567224751fe8ad" exitCode=0
Mar 08 00:31:34.924493 master-0 kubenswrapper[23041]: I0308 00:31:34.924485 23041 generic.go:334] "Generic (PLEG): container finished" podID="55c8d406-5448-4056-ab3c-c8399217c024" containerID="5c5fe88ca84d34535298e53e21f41989f9811c3fb403419a0f79b41f340064f5" exitCode=0
Mar 08 00:31:34.926706 master-0 kubenswrapper[23041]: I0308 00:31:34.926665 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/2.log"
Mar 08 00:31:34.927014 master-0 kubenswrapper[23041]: I0308 00:31:34.926978 23041 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="3c9cc0ea8b8c8c3c9346819b130170a92470b9a87fb7c1462d7680ef7197ef47" exitCode=1
Mar 08 00:31:34.927014 master-0 kubenswrapper[23041]: I0308 00:31:34.927004 23041 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="013b718ae531bd264f0d08436f90a352773f432fb8153c8f5baaf771bc43f460" exitCode=0
Mar 08 00:31:34.932932 master-0 kubenswrapper[23041]: I0308 00:31:34.932885 23041 generic.go:334] "Generic (PLEG): container finished" podID="531e9339-968c-47bf-b8ea-c44d9ceef4b3" containerID="829e088d3beb6bbaa940412e9e43d8b3ba4f7b2b62947bd685d43db99e68005b" exitCode=0
Mar 08 00:31:34.937240 master-0 kubenswrapper[23041]: I0308 00:31:34.937193 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_55216a56-677a-4f28-a530-77d44bded8a2/installer/0.log"
Mar 08 00:31:34.937314 master-0 kubenswrapper[23041]: I0308 00:31:34.937243 23041 generic.go:334] "Generic (PLEG): container finished" podID="55216a56-677a-4f28-a530-77d44bded8a2" containerID="1a0afc6f5f43ae0c03dad4b66580da08dbfc175218d88b6ca2b45fa8794895ad" exitCode=1
Mar 08 00:31:34.943725 master-0 kubenswrapper[23041]: I0308 00:31:34.943683 23041 generic.go:334] "Generic (PLEG): container finished" podID="cbcb0196-be5c-44a4-9749-5df9fbeaa718" containerID="92c985a5a70112d59265249efbf6fce7869432625027fbf9a567a14e08ff9807" exitCode=0
Mar 08 00:31:34.947129 master-0 kubenswrapper[23041]: I0308 00:31:34.947092 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-m7549_af391724-079a-4bac-a89e-978ffd471763/approver/0.log"
Mar 08 00:31:34.947561 master-0 kubenswrapper[23041]: I0308 00:31:34.947514 23041 generic.go:334] "Generic (PLEG): container finished" podID="af391724-079a-4bac-a89e-978ffd471763" containerID="c9e6fa5d3ccf4015c27e14ffdb2578ad6435947b5bdd16e602ffdf86284246dc" exitCode=1
Mar 08 00:31:34.958284 master-0 kubenswrapper[23041]: I0308 00:31:34.958245 23041 generic.go:334] "Generic (PLEG): container finished" podID="1453f6461bf5d599ad65a4656343ee91" containerID="16143328d55448f305f6ab28c116011527d147a9f464f1696ddaa4f87b24902d" exitCode=0
Mar 08 00:31:34.962347 master-0 kubenswrapper[23041]: I0308 00:31:34.962317 23041 generic.go:334] "Generic (PLEG): container finished" podID="7ad8b9ea-ba1c-4507-9b70-ce2da170d480" containerID="ee1bfab2130a9c72df8adc63c3382589fac2b085c9ce4752d92d10429ef61f76" exitCode=0
Mar 08 00:31:34.962347 master-0 kubenswrapper[23041]: I0308 00:31:34.962343 23041 generic.go:334] "Generic (PLEG): container finished" podID="7ad8b9ea-ba1c-4507-9b70-ce2da170d480" containerID="c7031bd4261187339ddcdbbf17642c8a944a5d40ae330e696f51959987e70da4" exitCode=0
Mar 08 00:31:34.962445 master-0 kubenswrapper[23041]: I0308 00:31:34.962354 23041 generic.go:334] "Generic (PLEG): container finished" podID="7ad8b9ea-ba1c-4507-9b70-ce2da170d480" containerID="d4bd6afbd87673cd3e0a5753c92817e5f63b4859d724983c90d010a8db1fe80e" exitCode=0
Mar 08 00:31:34.962445 master-0 kubenswrapper[23041]: I0308 00:31:34.962363 23041 generic.go:334] "Generic (PLEG): container finished" podID="7ad8b9ea-ba1c-4507-9b70-ce2da170d480" containerID="7264af89c3bcf80c9a189b3bddcd203436764c691f9c5c52533e7f598dddfac4" exitCode=0
Mar 08 00:31:34.962445 master-0 kubenswrapper[23041]: I0308 00:31:34.962371 23041 generic.go:334] "Generic (PLEG): container finished" podID="7ad8b9ea-ba1c-4507-9b70-ce2da170d480" containerID="48c6a8c71ab87bd002a24ce7589e179bd20778d506e7cd037500b0c5771c655a" exitCode=0
Mar 08 00:31:34.962445 master-0 kubenswrapper[23041]: I0308 00:31:34.962378 23041 generic.go:334] "Generic (PLEG): container finished" podID="7ad8b9ea-ba1c-4507-9b70-ce2da170d480" containerID="f8e210245fcf5757a0858988b80936bb56e15ab6a7c3881f301f7f4cb8a8f550" exitCode=0
Mar 08 00:31:34.964722 master-0 kubenswrapper[23041]: I0308 00:31:34.964696 23041 generic.go:334] "Generic (PLEG): container finished" podID="365dc4ac-fbc8-4589-a799-8327b3ebd0a5" containerID="08c17f5be4c6cd32671af564801dff89f871520231b6fd523ba49a05d5c50b3c" exitCode=0
Mar 08 00:31:34.972465 master-0 kubenswrapper[23041]: I0308 00:31:34.972082 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-64488f9d78-vnl28_2b1a69b5-c946-495d-ae02-c56f788279e8/openshift-config-operator/2.log"
Mar 08 00:31:34.972704 master-0 kubenswrapper[23041]: I0308 00:31:34.972609 23041 generic.go:334] "Generic (PLEG): container finished" podID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerID="4c55f1200add2af42f95d0106d6d887be04568b435704100c4cfbfdbdabd7d73" exitCode=255
Mar 08 00:31:34.972704 master-0 kubenswrapper[23041]: I0308 00:31:34.972647 23041 generic.go:334] "Generic (PLEG): container finished" podID="2b1a69b5-c946-495d-ae02-c56f788279e8" containerID="a8112b99efb51a20fdb91fac566b95eaf004df0ff11f9408140898bfa467ea7c" exitCode=0
Mar 08 00:31:34.975291 master-0 kubenswrapper[23041]: I0308 00:31:34.974729 23041 generic.go:334] "Generic (PLEG): container finished" podID="c4cab26a-fe31-4cf2-a938-b280f1934d99" containerID="d6af0d3578bc6ae0d4e0f5d4dbddc52dc70217cef15e030aab47b2704363ffe2" exitCode=0
Mar 08 00:31:34.985978 master-0 kubenswrapper[23041]: I0308 00:31:34.985911 23041 generic.go:334] "Generic (PLEG): container finished" podID="3fee96d7-75a7-46e4-9707-7bd292f10b84" containerID="52998e126ba781dde5afc9f3fdb3cf64a817b4497f29c74abbb0c4aa09aa4379" exitCode=0
Mar 08 00:31:34.991792 master-0 kubenswrapper[23041]: I0308 00:31:34.991721 23041 generic.go:334] "Generic (PLEG): container finished" podID="58333089-2456-4a25-8ba7-6d557eefa177" containerID="dc923284309376403cb95e44ae08001b8c778273ed731a0f98310a7899bb3d2d" exitCode=0
Mar 08 00:31:34.994042 master-0 kubenswrapper[23041]: I0308 00:31:34.993984 23041 generic.go:334] "Generic (PLEG): container finished" podID="b2548aca-4a9d-4670-a60a-0d6361d1c441" containerID="031c64f86b4914d8ed85469cff79e56b7a2e1cbd518e0fd70f47211192095f45" exitCode=0
Mar 08 00:31:34.994042 master-0 kubenswrapper[23041]: I0308 00:31:34.994024 23041 generic.go:334] "Generic (PLEG): container finished" podID="b2548aca-4a9d-4670-a60a-0d6361d1c441" containerID="fe58071840dc6349204161e59ca64944f26b1ff66582767c1106a706a17472e1" exitCode=0
Mar 08 00:31:34.996944 master-0 kubenswrapper[23041]: I0308 00:31:34.996897 23041 generic.go:334] "Generic (PLEG): container finished" podID="5cf5a2ef-2498-40a0-a189-0753076fd3b6" containerID="04817105ab63ed3d02352e545fc19277b913254d7947d42a71d84846748fcfc3" exitCode=0
Mar 08 00:31:34.999076 master-0 kubenswrapper[23041]: I0308 00:31:34.999032 23041 generic.go:334] "Generic (PLEG): container finished" podID="1751db13-b792-43e2-8459-d1d4a0164dfb" containerID="8e5eb8c3a997190fe55fe0f74af3ee5e0a5480af9438a723ead360bc861186ec" exitCode=0
Mar 08 00:31:35.004341 master-0 kubenswrapper[23041]: I0308 00:31:35.004270 23041 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerID="f73d55f2e8434f88a6be502a595c0bcf07e53cfb094b52a7ac92890beaa91d58" exitCode=0
Mar 08 00:31:35.013970 master-0 kubenswrapper[23041]: I0308 00:31:35.013942 23041 generic.go:334] "Generic (PLEG): container finished" podID="3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab" containerID="459a84ed9e1a3d8f522635c123baf95a666dd88b0c40648d94dbbfdfad737d00" exitCode=0
Mar 08 00:31:35.016470 master-0 kubenswrapper[23041]: I0308 00:31:35.016448 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7f8b8b6f4c-w2q2q_d01c21a1-6c2c-49a7-9d85-254662851838/manager/0.log"
Mar 08 00:31:35.016709 master-0 kubenswrapper[23041]: I0308 00:31:35.016685 23041 generic.go:334] "Generic (PLEG): container finished" podID="d01c21a1-6c2c-49a7-9d85-254662851838" containerID="f272f0c8300d99d74de3b6533eb08fc6f13727844131b874ef0ec089cec086c7" exitCode=1
Mar 08 00:31:35.018164 master-0 kubenswrapper[23041]: I0308 00:31:35.018140 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-vd52m_e97435ee-522e-427d-9efc-40bc3d2b0d02/snapshot-controller/1.log"
Mar 08 00:31:35.018248 master-0 kubenswrapper[23041]: I0308 00:31:35.018164 23041 generic.go:334] "Generic (PLEG): container finished" podID="e97435ee-522e-427d-9efc-40bc3d2b0d02" containerID="f8579510b3d4eb37fa166a47f1175d9203069f85aea52cc88554ccc7a9077266" exitCode=1
Mar 08 00:31:35.026033 master-0 kubenswrapper[23041]: I0308 00:31:35.025996 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-6598bfb6c4-7nhvs_1bb8fea7-71ca-43a3-839d-9c1459bf8dfa/manager/0.log"
Mar 08 00:31:35.026033 master-0 kubenswrapper[23041]: I0308 00:31:35.026027 23041 generic.go:334] "Generic (PLEG): container finished" podID="1bb8fea7-71ca-43a3-839d-9c1459bf8dfa" containerID="1a894ff93f34b75d7c364cee700320b9938207036c1164fc914fd25a46ac6869" exitCode=1
Mar 08 00:31:35.030684 master-0 kubenswrapper[23041]: I0308 00:31:35.030645 23041 generic.go:334] "Generic (PLEG): container finished" podID="b100ce12-965e-409e-8cdb-8f99ef51a82b" containerID="5883c7f053a567c57162616ec25d9b4c38f468aaa6a93afc0931684514320848" exitCode=0
Mar 08 00:31:35.107369 master-0 kubenswrapper[23041]: E0308 00:31:35.107297 23041 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Mar 08 00:31:35.508538 master-0 kubenswrapper[23041]: E0308 00:31:35.508466 23041 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Mar 08 00:31:35.742736 master-0 kubenswrapper[23041]: I0308 00:31:35.742654 23041 apiserver.go:52] "Watching apiserver"
Mar 08 00:31:35.768671 master-0 kubenswrapper[23041]: I0308 00:31:35.768555 23041 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 08 00:31:36.311380 master-0 kubenswrapper[23041]: E0308 00:31:36.308979 23041 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Mar 08 00:31:37.909380 master-0 kubenswrapper[23041]: E0308 00:31:37.909307 23041 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Mar 08 00:31:41.110172 master-0 kubenswrapper[23041]: E0308 00:31:41.110083 23041 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Mar 08 00:31:46.111198 master-0 kubenswrapper[23041]: E0308 00:31:46.111134 23041 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Mar 08 00:31:50.539315 master-0 kubenswrapper[23041]: E0308 00:31:50.539180 23041 summary_sys_containers.go:89] "Failed to get system container stats" err="failed to get cgroup stats for \"/system.slice/crio.service\": failed to get container info for \"/system.slice/crio.service\": unknown container \"/system.slice/crio.service\"" containerName="/system.slice/crio.service"
Mar 08 00:31:50.541307 master-0 kubenswrapper[23041]: E0308 00:31:50.540535 23041 summary_sys_containers.go:89] "Failed to get system container stats" err="failed to get cgroup stats for \"/system.slice\": failed to get container info for \"/system.slice\": unknown container \"/system.slice\"" containerName="/system.slice"
Mar 08 00:31:50.541307 master-0 kubenswrapper[23041]: E0308 00:31:50.541079 23041 summary_sys_containers.go:89] "Failed to get system container stats" err="failed to get cgroup stats for \"/kubepods.slice\": failed to get container info for \"/kubepods.slice\": unknown container \"/kubepods.slice\"" containerName="/kubepods.slice"
Mar 08 00:31:50.887295 master-0 kubenswrapper[23041]: I0308 00:31:50.884694 23041 manager.go:324] Recovery completed
Mar 08 00:31:50.971975 master-0 kubenswrapper[23041]: I0308 00:31:50.971924 23041 cpu_manager.go:225] "Starting CPU manager" policy="none"
Mar 08 00:31:50.971975 master-0 kubenswrapper[23041]: I0308 00:31:50.971951 23041 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Mar 08 00:31:50.971975 master-0 kubenswrapper[23041]: I0308 00:31:50.971980 23041 state_mem.go:36] "Initialized new in-memory state store"
Mar 08 00:31:50.972265 master-0 kubenswrapper[23041]: I0308 00:31:50.972159 23041 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Mar 08 00:31:50.972265 master-0 kubenswrapper[23041]: I0308 00:31:50.972170 23041 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Mar 08 00:31:50.972265 master-0 kubenswrapper[23041]: I0308 00:31:50.972195 23041 state_checkpoint.go:136] "State checkpoint: restored state from checkpoint"
Mar 08 00:31:50.972265 master-0 kubenswrapper[23041]: I0308 00:31:50.972217 23041 state_checkpoint.go:137] "State checkpoint: defaultCPUSet" defaultCpuSet=""
Mar 08 00:31:50.972265 master-0 kubenswrapper[23041]: I0308 00:31:50.972225 23041 policy_none.go:49] "None policy: Start"
Mar 08 00:31:50.977045 master-0 kubenswrapper[23041]: I0308 00:31:50.976642 23041 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 08 00:31:50.977045 master-0 kubenswrapper[23041]: I0308 00:31:50.976692 23041 state_mem.go:35] "Initializing new in-memory state store"
Mar 08 00:31:50.977045 master-0 kubenswrapper[23041]: I0308 00:31:50.976921 23041 state_mem.go:75] "Updated machine memory state"
Mar 08 00:31:50.977045 master-0 kubenswrapper[23041]: I0308 00:31:50.976930 23041 state_checkpoint.go:82] "State checkpoint: restored state from checkpoint"
Mar 08 00:31:50.990426 master-0 kubenswrapper[23041]: I0308 00:31:50.990398 23041 manager.go:334] "Starting Device Plugin manager"
Mar 08 00:31:50.990547 master-0 kubenswrapper[23041]: I0308 00:31:50.990472 23041 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 08 00:31:50.990547 master-0 kubenswrapper[23041]: I0308 00:31:50.990486 23041 server.go:79] "Starting device plugin registration server"
Mar 08 00:31:50.990957 master-0 kubenswrapper[23041]: I0308 00:31:50.990919 23041 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 08 00:31:50.991005 master-0 kubenswrapper[23041]: I0308 00:31:50.990941 23041 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 08 00:31:50.991219 master-0 kubenswrapper[23041]: I0308 00:31:50.991081 23041 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Mar 08 00:31:50.991280 master-0 kubenswrapper[23041]: I0308 00:31:50.991224 23041 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 08 00:31:50.991280 master-0 kubenswrapper[23041]: I0308 00:31:50.991236 23041 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 08 00:31:51.091784 master-0 kubenswrapper[23041]: I0308 00:31:51.091545 23041 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 00:31:51.094119 master-0 kubenswrapper[23041]: I0308 00:31:51.094024 23041 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 00:31:51.094119 master-0 kubenswrapper[23041]: I0308 00:31:51.094072 23041 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 00:31:51.094119 master-0 kubenswrapper[23041]: I0308 00:31:51.094081 23041 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 00:31:51.094359 master-0 kubenswrapper[23041]: I0308 00:31:51.094234 23041 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 08 00:31:51.112458 master-0 kubenswrapper[23041]: I0308 00:31:51.112363 23041 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0","openshift-kube-apiserver/kube-apiserver-master-0","openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0","kube-system/bootstrap-kube-controller-manager-master-0","openshift-kube-scheduler/openshift-kube-scheduler-master-0"]
Mar 08 00:31:51.113903 master-0 kubenswrapper[23041]: I0308 00:31:51.113306 23041 kubelet_node_status.go:115] "Node was previously registered" node="master-0"
Mar 08 00:31:51.113903 master-0 kubenswrapper[23041]: I0308 00:31:51.113427 23041 kubelet_node_status.go:79] "Successfully registered node" node="master-0"
Mar 08 00:31:51.113903 master-0 kubenswrapper[23041]: I0308 00:31:51.113735 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq","openshift-machine-api/machine-api-operator-84bf6db4f9-bncfj","openshift-marketplace/community-operators-6t5lg","openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-st8tx","openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8qtmf","openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2","openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh","openshift-service-ca-operator/service-ca-operator-69b6fc6b88-p8hlq","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-machine-config-operator/machine-config-server-wkt98","openshift-monitoring/openshift-state-metrics-74cc79fd76-s9b9v","openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-bh886","openshift-multus/network-metrics-daemon-krv7c","openshift-config-operator/openshift-config-operator-64488f9d78-vnl28","openshift-ingress-operator/ingress-operator-677db989d6-blw5x","openshift-kube-controller-manager/installer-2-master-0","openshift-kube-storage-version-migrator/migrator-57ccdf9b5-tbcsh","openshift-service-ca/service-ca-84bfdbbb7f-bc2m2","openshift-cluster-version/cluster-version-operator-8c9c967c7-vm7rj","openshift-ingress/router-default-79f8cd6fdd-r6nkv","openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-pfdrx","openshift-marketplace/certified-operators-9nqqp","openshift-monitoring/prometheus-operator-5ff8674d55-qxpv9","openshift-network-operator/iptables-alerter-rfnqf","kube-system/bootstrap-kube-controller-manager-master-0","openshift-etcd/installer-1-master-0","openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2","openshift-multus/multus-additional-cni-plugins-d5jxb","openshift-network-diagnostics/network-check-target-w5fjg","openshift-network-node-identity/network-node-identity-m7549","openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w","openshift-kube-scheduler/installer-5-master-0","openshift-machine-config-operator/machine-config-daemon-k7pnc","openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s","openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v","openshift-monitoring/node-exporter-bx9dn","openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs","openshift-dns/dns-default-jfjzg","openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6","openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-st7mk","openshift-machine-api/cluster-autoscaler-operator-69576476f7-dpg4q","openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs","openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-5v8g4","openshift-etcd-operator/etcd-operator-5884b9cd56-27phk","openshift-etcd/etcd-master-0","openshift-machine-api/control-plane-machine-set-operator-6686554ddc-8krst","openshift-network-operator/network-operator-7c649bf6d4-st2sr","openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q","openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-nrb7q","openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9","openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-8jr6f","openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-sdsks","openshift-dns/node-resolver-l9pkr","openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc","openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-vd52m","openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-49hzm","openshift-dns-operator/dns-operator-589895fbb7-gmvnl","openshift-insights/insights-operator-8f89dfddd-brq9l","openshift-marketplace/redhat-operators-9j9zs","openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-rj9cl","openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-r9zcq","openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-8lf4q","openshift-multus/multus-dllkj","openshift-ovn-kubernetes/ovnkube-node-2w9mf","openshift-kube-apiserver/installer-1-retry-1-master-0","openshift-kube-apiserver/kube-apiserver-master-0","openshift-monitoring/metrics-server-6474759988-dnw4m","openshift-multus/multus-admission-controller-7769569c45-5n69x","openshift-network-diagnostics/network-check-source-7c67b67d47-sctv9","openshift-apiserver/apiserver-85cb8cb9bb-bmx44","openshift-cluster-machine-approver/machine-approver-754bdc9f9d-xpl2b","openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-7gtw2","openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj","openshift-operator-lifecycle-manager/packageserver-9c44c86f9-rplwv","assisted-installer/assisted-installer-controller-v949k","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","openshift-machine-config-operator/machine-config-controller-ff46b7bdf-z5fkp","openshift-kube-scheduler/openshift-kube-scheduler-master-0","openshift-machine-config-operator/machine-config-operator-fdb5c78b5-5nbfk","openshift-marketplace/redhat-marketplace-4fjw9","openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4","openshift-cluster-node-tuning-operator/tuned-67jx5","openshift-kube-apiserver/installer-1-master-0"]
Mar 08 00:31:51.114129 master-0 kubenswrapper[23041]: I0308 00:31:51.113990 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-v949k"
Mar 08 00:31:51.126432 master-0 kubenswrapper[23041]: I0308 00:31:51.126387 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 08 00:31:51.126763 master-0 kubenswrapper[23041]: I0308 00:31:51.126739 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 08 00:31:51.127074 master-0 kubenswrapper[23041]: I0308 00:31:51.127047 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 08 00:31:51.130158 master-0 kubenswrapper[23041]: I0308 00:31:51.130124 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 08 00:31:51.133180 master-0 kubenswrapper[23041]: I0308 00:31:51.132067 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 08 00:31:51.133180 master-0 kubenswrapper[23041]: I0308 00:31:51.132436 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 08 00:31:51.133180 master-0 kubenswrapper[23041]: I0308 00:31:51.132474 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 08 00:31:51.133180 master-0 kubenswrapper[23041]: I0308 00:31:51.132735 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 08 00:31:51.137861 master-0 kubenswrapper[23041]: I0308 00:31:51.137765 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 08 00:31:51.137861 master-0 kubenswrapper[23041]: I0308 00:31:51.137835 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 08 00:31:51.138061 master-0 kubenswrapper[23041]: I0308 00:31:51.137959 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 08 00:31:51.138256 master-0 kubenswrapper[23041]: I0308 00:31:51.138214 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 08 00:31:51.138318 master-0 kubenswrapper[23041]: I0308 00:31:51.138266 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 08 00:31:51.138318 master-0 kubenswrapper[23041]: I0308 00:31:51.138312 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 08 00:31:51.139535 master-0 kubenswrapper[23041]: I0308 00:31:51.138490 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 08 00:31:51.139535 master-0 kubenswrapper[23041]: I0308 00:31:51.138633 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 08 00:31:51.139535 master-0 kubenswrapper[23041]: I0308 00:31:51.138737 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 08 00:31:51.139535 master-0 kubenswrapper[23041]: I0308 00:31:51.138937 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 08 00:31:51.139535 master-0 kubenswrapper[23041]: I0308 00:31:51.138995 23041 kubelet.go:2566] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="8b087322-b76a-4293-8e6b-786c5f01f37f"
Mar 08 00:31:51.139535 master-0 kubenswrapper[23041]: I0308 00:31:51.139140 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 08 00:31:51.139535 master-0 kubenswrapper[23041]: I0308 00:31:51.139278 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 08 00:31:51.141600 master-0 kubenswrapper[23041]: I0308 00:31:51.140144 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt"
Mar 08 00:31:51.141600 master-0 kubenswrapper[23041]: I0308 00:31:51.140443 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 08 00:31:51.141600 master-0 kubenswrapper[23041]: I0308 00:31:51.140553 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 08 00:31:51.141600 master-0 kubenswrapper[23041]: I0308 00:31:51.140682 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 08 00:31:51.141600 master-0 kubenswrapper[23041]: I0308 00:31:51.140803 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 08 00:31:51.141600 master-0 kubenswrapper[23041]: I0308 00:31:51.140925 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 08 00:31:51.141600 master-0 kubenswrapper[23041]: I0308 00:31:51.141114 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 08 00:31:51.141600 master-0 kubenswrapper[23041]: I0308 00:31:51.141278 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 08 00:31:51.141923 master-0 kubenswrapper[23041]: I0308 00:31:51.141786 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 08 00:31:51.141923 master-0 kubenswrapper[23041]: E0308 00:31:51.141892 23041 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0\" already exists" pod="openshift-etcd/etcd-master-0"
Mar 08 00:31:51.141999 master-0 kubenswrapper[23041]: E0308 00:31:51.141992 23041 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-master-0\" already exists" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 00:31:51.142342 master-0 kubenswrapper[23041]: I0308 00:31:51.142294 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 08 00:31:51.142423 master-0 kubenswrapper[23041]: I0308 00:31:51.142404 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 08 00:31:51.142519 master-0 kubenswrapper[23041]: E0308 00:31:51.142482 23041 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 08 00:31:51.142594 master-0 kubenswrapper[23041]: I0308 00:31:51.142489 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 08 00:31:51.142594 master-0 kubenswrapper[23041]: I0308 00:31:51.142579 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 08 00:31:51.142684 master-0 kubenswrapper[23041]: I0308 00:31:51.142499 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 08 00:31:51.142739 master-0 kubenswrapper[23041]: I0308 00:31:51.142720 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 08 00:31:51.142787 master-0 kubenswrapper[23041]: I0308 00:31:51.142509 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 08 00:31:51.142827 master-0 kubenswrapper[23041]: I0308 00:31:51.142806 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 08 00:31:51.142862 master-0 kubenswrapper[23041]: I0308 00:31:51.142520 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 08 00:31:51.142862 master-0 kubenswrapper[23041]: I0308 00:31:51.142859 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 08 00:31:51.142938 master-0 kubenswrapper[23041]: I0308 00:31:51.142531 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 08 00:31:51.142938 master-0 kubenswrapper[23041]: I0308 00:31:51.142541 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 08 00:31:51.143011 master-0 kubenswrapper[23041]: I0308 00:31:51.142963 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt"
Mar 08 00:31:51.143081 master-0 kubenswrapper[23041]: I0308 00:31:51.143060 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 08 00:31:51.143140 master-0 kubenswrapper[23041]: I0308 00:31:51.143093 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 08 00:31:51.143140 master-0 kubenswrapper[23041]: I0308 00:31:51.143103 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 08 00:31:51.143361 master-0 kubenswrapper[23041]: I0308 00:31:51.143342 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 08 00:31:51.144552 master-0 kubenswrapper[23041]: I0308 00:31:51.144518 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0"
Mar 08 00:31:51.144753 master-0 kubenswrapper[23041]: I0308 00:31:51.144713 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0"
Mar 08 00:31:51.144803 master-0 kubenswrapper[23041]: I0308 00:31:51.144794 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 08 00:31:51.144935 master-0 kubenswrapper[23041]: I0308 00:31:51.144910 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0"
Mar 08 00:31:51.145533 master-0 kubenswrapper[23041]: I0308 00:31:51.145507 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 08 00:31:51.146616 master-0 kubenswrapper[23041]: I0308 00:31:51.146503 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 08 00:31:51.150371 master-0 kubenswrapper[23041]: I0308 00:31:51.150305 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 08 00:31:51.152112 master-0 kubenswrapper[23041]: I0308 00:31:51.152090 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 08 00:31:51.152430 master-0 kubenswrapper[23041]: I0308 00:31:51.152408 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 08 00:31:51.152546 master-0 kubenswrapper[23041]: I0308 00:31:51.152533 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt"
Mar 08 00:31:51.152667 master-0 kubenswrapper[23041]: I0308 00:31:51.152533 23041 generic.go:334] "Generic (PLEG): container finished" podID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerID="1fdc0977a8b34be93d33d2377b4810454b6ad9c4cfeec0c8fce160478572354d" exitCode=0
Mar 08 00:31:51.152749 master-0 kubenswrapper[23041]: I0308 00:31:51.152567 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 08 00:31:51.152914 master-0 kubenswrapper[23041]: I0308 00:31:51.152883 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 08 00:31:51.153018 master-0 kubenswrapper[23041]: I0308 00:31:51.152758 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 08 00:31:51.154299 master-0 kubenswrapper[23041]: I0308 00:31:51.153271 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 08 00:31:51.154628 master-0 kubenswrapper[23041]: I0308 00:31:51.153319 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 08 00:31:51.154808 master-0 kubenswrapper[23041]: I0308 00:31:51.153584 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 08 00:31:51.157035 master-0 kubenswrapper[23041]: I0308 00:31:51.157001 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 08 00:31:51.157158 master-0 kubenswrapper[23041]: I0308 00:31:51.157131 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 08 00:31:51.157241 master-0 kubenswrapper[23041]: I0308 00:31:51.157184 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 08 00:31:51.157297 master-0 kubenswrapper[23041]: I0308 00:31:51.157189 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt"
Mar 08 00:31:51.157347 master-0 kubenswrapper[23041]: I0308 00:31:51.157297 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 08 00:31:51.157347 master-0 kubenswrapper[23041]: I0308 00:31:51.157310 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 08 00:31:51.157427 master-0 kubenswrapper[23041]: I0308
00:31:51.157365 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 08 00:31:51.159276 master-0 kubenswrapper[23041]: I0308 00:31:51.159251 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 08 00:31:51.160540 master-0 kubenswrapper[23041]: I0308 00:31:51.160522 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 08 00:31:51.160902 master-0 kubenswrapper[23041]: I0308 00:31:51.160880 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 08 00:31:51.161282 master-0 kubenswrapper[23041]: I0308 00:31:51.161249 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 08 00:31:51.161525 master-0 kubenswrapper[23041]: I0308 00:31:51.161508 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Mar 08 00:31:51.161662 master-0 kubenswrapper[23041]: I0308 00:31:51.161648 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 08 00:31:51.161870 master-0 kubenswrapper[23041]: I0308 00:31:51.161850 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config" Mar 08 00:31:51.164346 master-0 kubenswrapper[23041]: I0308 00:31:51.164326 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Mar 08 00:31:51.164586 master-0 kubenswrapper[23041]: I0308 00:31:51.164557 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 08 00:31:51.164869 master-0 kubenswrapper[23041]: 
I0308 00:31:51.164840 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Mar 08 00:31:51.165036 master-0 kubenswrapper[23041]: I0308 00:31:51.165004 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Mar 08 00:31:51.165414 master-0 kubenswrapper[23041]: I0308 00:31:51.165394 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Mar 08 00:31:51.166163 master-0 kubenswrapper[23041]: I0308 00:31:51.165943 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 08 00:31:51.166355 master-0 kubenswrapper[23041]: I0308 00:31:51.166330 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Mar 08 00:31:51.168428 master-0 kubenswrapper[23041]: I0308 00:31:51.168387 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 08 00:31:51.168657 master-0 kubenswrapper[23041]: I0308 00:31:51.168620 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Mar 08 00:31:51.169361 master-0 kubenswrapper[23041]: I0308 00:31:51.169319 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 08 00:31:51.169361 master-0 kubenswrapper[23041]: I0308 00:31:51.169340 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Mar 08 00:31:51.169676 master-0 kubenswrapper[23041]: I0308 00:31:51.169657 23041 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 08 00:31:51.171188 master-0 kubenswrapper[23041]: I0308 00:31:51.170927 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 08 00:31:51.173277 master-0 kubenswrapper[23041]: I0308 00:31:51.173193 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 00:31:51.177554 master-0 kubenswrapper[23041]: I0308 00:31:51.177123 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Mar 08 00:31:51.180242 master-0 kubenswrapper[23041]: I0308 00:31:51.179992 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 08 00:31:51.180242 master-0 kubenswrapper[23041]: I0308 00:31:51.180066 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 08 00:31:51.183620 master-0 kubenswrapper[23041]: I0308 00:31:51.183569 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-5qffz"] Mar 08 00:31:51.184220 master-0 kubenswrapper[23041]: E0308 00:31:51.184168 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55216a56-677a-4f28-a530-77d44bded8a2" containerName="installer" Mar 08 00:31:51.184220 master-0 kubenswrapper[23041]: I0308 00:31:51.184200 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="55216a56-677a-4f28-a530-77d44bded8a2" containerName="installer" Mar 08 00:31:51.184326 master-0 kubenswrapper[23041]: E0308 00:31:51.184238 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4cab26a-fe31-4cf2-a938-b280f1934d99" containerName="assisted-installer-controller" Mar 08 00:31:51.184326 master-0 kubenswrapper[23041]: I0308 00:31:51.184246 23041 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="c4cab26a-fe31-4cf2-a938-b280f1934d99" containerName="assisted-installer-controller" Mar 08 00:31:51.184326 master-0 kubenswrapper[23041]: E0308 00:31:51.184263 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4217b755-ca87-45cf-9e52-7b2681660f41" containerName="installer" Mar 08 00:31:51.184326 master-0 kubenswrapper[23041]: I0308 00:31:51.184269 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="4217b755-ca87-45cf-9e52-7b2681660f41" containerName="installer" Mar 08 00:31:51.184326 master-0 kubenswrapper[23041]: E0308 00:31:51.184277 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21dd42b1-2628-4a24-97e7-6759888ed316" containerName="installer" Mar 08 00:31:51.184326 master-0 kubenswrapper[23041]: I0308 00:31:51.184284 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="21dd42b1-2628-4a24-97e7-6759888ed316" containerName="installer" Mar 08 00:31:51.184539 master-0 kubenswrapper[23041]: I0308 00:31:51.184421 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4cab26a-fe31-4cf2-a938-b280f1934d99" containerName="assisted-installer-controller" Mar 08 00:31:51.184539 master-0 kubenswrapper[23041]: I0308 00:31:51.184472 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="55216a56-677a-4f28-a530-77d44bded8a2" containerName="installer" Mar 08 00:31:51.184539 master-0 kubenswrapper[23041]: I0308 00:31:51.184487 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="21dd42b1-2628-4a24-97e7-6759888ed316" containerName="installer" Mar 08 00:31:51.184539 master-0 kubenswrapper[23041]: I0308 00:31:51.184499 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="4217b755-ca87-45cf-9e52-7b2681660f41" containerName="installer" Mar 08 00:31:51.184905 master-0 kubenswrapper[23041]: I0308 00:31:51.184872 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5qffz"] Mar 
08 00:31:51.184953 master-0 kubenswrapper[23041]: I0308 00:31:51.184898 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-49hzm" event={"ID":"ef0a3c84-98bb-4915-9010-d66fcbeafe09","Type":"ContainerStarted","Data":"e48e7bed76a9d2cfdf09898508b2c13d610c4aac80f76a7b83dcca91233aa06a"} Mar 08 00:31:51.184953 master-0 kubenswrapper[23041]: I0308 00:31:51.184950 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-6c7fb6b958-db7d8"] Mar 08 00:31:51.185030 master-0 kubenswrapper[23041]: I0308 00:31:51.184997 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5qffz" Mar 08 00:31:51.185911 master-0 kubenswrapper[23041]: I0308 00:31:51.185841 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 08 00:31:51.186122 master-0 kubenswrapper[23041]: I0308 00:31:51.186078 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-49hzm" event={"ID":"ef0a3c84-98bb-4915-9010-d66fcbeafe09","Type":"ContainerDied","Data":"5aac2b21c945fd8c5f04ccb41b60633f9bb7e3c9d3e901a7648d97792b4bc569"} Mar 08 00:31:51.186178 master-0 kubenswrapper[23041]: I0308 00:31:51.186120 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-49hzm" event={"ID":"ef0a3c84-98bb-4915-9010-d66fcbeafe09","Type":"ContainerStarted","Data":"3c8994f66c1270da68fac1ff2499afd806b950d0568c9f85327b0714473db68c"} Mar 08 00:31:51.186178 master-0 kubenswrapper[23041]: I0308 00:31:51.186142 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-6c7fb6b958-db7d8"] Mar 08 00:31:51.186178 master-0 kubenswrapper[23041]: I0308 
00:31:51.186160 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh" event={"ID":"70892c23-554d-466c-a526-90a799439fe0","Type":"ContainerStarted","Data":"ae7bb35d674e364ba7abb3d8a4e36b86062b8a56cb462417c0258160c034b1cd"} Mar 08 00:31:51.186178 master-0 kubenswrapper[23041]: I0308 00:31:51.186176 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-7b45f5889c-z48tj"] Mar 08 00:31:51.186365 master-0 kubenswrapper[23041]: I0308 00:31:51.186191 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 08 00:31:51.186411 master-0 kubenswrapper[23041]: I0308 00:31:51.186390 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 08 00:31:51.186525 master-0 kubenswrapper[23041]: I0308 00:31:51.186497 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-6c7fb6b958-db7d8" Mar 08 00:31:51.186857 master-0 kubenswrapper[23041]: I0308 00:31:51.186826 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/metrics-server-6474759988-dnw4m"] Mar 08 00:31:51.186857 master-0 kubenswrapper[23041]: I0308 00:31:51.186857 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7b45f5889c-z48tj"] Mar 08 00:31:51.186961 master-0 kubenswrapper[23041]: I0308 00:31:51.186869 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh" event={"ID":"70892c23-554d-466c-a526-90a799439fe0","Type":"ContainerStarted","Data":"1647ce1acf481d17be37f6cfd515be4f74eaddbda6620f025db77860f5acbd00"} Mar 08 00:31:51.186961 master-0 kubenswrapper[23041]: I0308 00:31:51.186883 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x" event={"ID":"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b","Type":"ContainerStarted","Data":"acd8ffd92596b76a588170b227f4d7ab4a872868344965430ac8c8d78ec037e1"} Mar 08 00:31:51.186961 master-0 kubenswrapper[23041]: I0308 00:31:51.186898 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-6cfc594d97-x62fk"] Mar 08 00:31:51.187080 master-0 kubenswrapper[23041]: I0308 00:31:51.187055 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-7b45f5889c-z48tj" Mar 08 00:31:51.187162 master-0 kubenswrapper[23041]: E0308 00:31:51.187096 23041 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[audit-log client-ca-bundle configmap-kubelet-serving-ca-bundle kube-api-access-b66xq metrics-server-audit-profiles secret-metrics-client-certs secret-metrics-server-tls], unattached volumes=[], failed to process volumes=[audit-log client-ca-bundle configmap-kubelet-serving-ca-bundle kube-api-access-b66xq metrics-server-audit-profiles secret-metrics-client-certs secret-metrics-server-tls]: context canceled" pod="openshift-monitoring/metrics-server-6474759988-dnw4m" podUID="0101c4ce-fd58-4ddb-94f7-abb8b2293cdb" Mar 08 00:31:51.189596 master-0 kubenswrapper[23041]: I0308 00:31:51.189543 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 08 00:31:51.190051 master-0 kubenswrapper[23041]: I0308 00:31:51.190001 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 08 00:31:51.190183 master-0 kubenswrapper[23041]: I0308 00:31:51.190153 23041 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Mar 08 00:31:51.190370 master-0 kubenswrapper[23041]: I0308 00:31:51.190326 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x" event={"ID":"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b","Type":"ContainerDied","Data":"9d40712043dab52958ea0afce9459c44f1ac9aa0390229d73de4eebe33434e94"} Mar 08 00:31:51.190424 master-0 kubenswrapper[23041]: I0308 00:31:51.190376 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6cfc594d97-x62fk"] Mar 08 00:31:51.190424 master-0 kubenswrapper[23041]: I0308 00:31:51.190396 23041 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/monitoring-plugin-6db79546f6-gdz4k"] Mar 08 00:31:51.190950 master-0 kubenswrapper[23041]: I0308 00:31:51.190485 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk" Mar 08 00:31:51.190950 master-0 kubenswrapper[23041]: I0308 00:31:51.190622 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Mar 08 00:31:51.191091 master-0 kubenswrapper[23041]: I0308 00:31:51.191064 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-6db79546f6-gdz4k"] Mar 08 00:31:51.191091 master-0 kubenswrapper[23041]: I0308 00:31:51.191086 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x" event={"ID":"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b","Type":"ContainerStarted","Data":"2c10faa546580f627c778e91e9b7663017d55077528cad866312878aae39b47a"} Mar 08 00:31:51.191173 master-0 kubenswrapper[23041]: I0308 00:31:51.191100 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 08 00:31:51.191173 master-0 kubenswrapper[23041]: I0308 00:31:51.191115 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x" event={"ID":"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b","Type":"ContainerStarted","Data":"9da3ea5c4393051eef91cb7af969405949bc3c6b97f5782d6bc10af29a80c30d"} Mar 08 00:31:51.191173 master-0 kubenswrapper[23041]: I0308 00:31:51.191125 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dpg4q" event={"ID":"3d2e1686-3a30-4021-9c03-02e472bc6ff3","Type":"ContainerStarted","Data":"34ce99c1480780527cadfa670226036ef9c17ba4caf6288b67da10db8e7da68e"} Mar 08 00:31:51.191173 master-0 kubenswrapper[23041]: I0308 
00:31:51.191137 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dpg4q" event={"ID":"3d2e1686-3a30-4021-9c03-02e472bc6ff3","Type":"ContainerStarted","Data":"d3f16b3080bd84cd315c0103c50c0e4fe4f94ba52854cacca3c3dd9366155a93"} Mar 08 00:31:51.191173 master-0 kubenswrapper[23041]: I0308 00:31:51.191145 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dpg4q" event={"ID":"3d2e1686-3a30-4021-9c03-02e472bc6ff3","Type":"ContainerStarted","Data":"8ff474153830a652e4ddb7aadf249d8bcfad8aa4e41fc72213e841bb0817ffeb"} Mar 08 00:31:51.191173 master-0 kubenswrapper[23041]: I0308 00:31:51.191153 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2" event={"ID":"6999cf38-e317-4727-98c9-d4e348e9e16a","Type":"ContainerStarted","Data":"4b93ca0ef506b0c02846ca33f17d63f5a824052f00f7d19371fbf7e2b8abc456"} Mar 08 00:31:51.191173 master-0 kubenswrapper[23041]: I0308 00:31:51.191163 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2" event={"ID":"6999cf38-e317-4727-98c9-d4e348e9e16a","Type":"ContainerStarted","Data":"11fc2d0ea92ac8231758b019e771de66de17673da31d79a4aab6fc0b796373e6"} Mar 08 00:31:51.191173 master-0 kubenswrapper[23041]: I0308 00:31:51.191171 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dllkj" event={"ID":"7da68e85-9170-499d-8050-139ecfac4600","Type":"ContainerStarted","Data":"8c4dfda663d3108e0d4d75d8ea37376292f3986c7575fe504d33fabc4e8a91ef"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191223 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dllkj" 
event={"ID":"7da68e85-9170-499d-8050-139ecfac4600","Type":"ContainerStarted","Data":"99e304c6af03a3e08278f5797ee6f99e79aaff1289a963780c8099a86643591f"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191236 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-67jx5" event={"ID":"401bbef2-684c-4f55-b2c7-e6184c789e40","Type":"ContainerStarted","Data":"5c2b1421622aa51b1e3f3309e1cecee04d47b8ec5a2290e918d8137ddcf8b78c"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191245 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-67jx5" event={"ID":"401bbef2-684c-4f55-b2c7-e6184c789e40","Type":"ContainerStarted","Data":"62a62c397b340be942f32a53629ca1820e5ed2199aae4350c1b9148fffbcc52d"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191254 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"4a829558-a672-4dc5-ae20-69884213482f","Type":"ContainerStarted","Data":"75e221d268f8334bee9d063ac79605ca72f10402851cefdf7624001eae8cbb17"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191263 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"4a829558-a672-4dc5-ae20-69884213482f","Type":"ContainerStarted","Data":"388b509d4fc31b4d0508a9d9464942cef558c545f646f2395c6df6984fdeb45b"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191272 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-8jr6f" event={"ID":"b94acad3-cf4e-443d-80fb-5e68a4074336","Type":"ContainerStarted","Data":"95fc7c4c4a487643b9831f1cedf5dda283cc70c5afdd39d20b4d5ea8bc0108bd"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191281 23041 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-8jr6f" event={"ID":"b94acad3-cf4e-443d-80fb-5e68a4074336","Type":"ContainerStarted","Data":"cd06e32b994481471c1008a22765ea8fb7d4c0eac4c1085f974725068e466db7"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191290 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7pnc" event={"ID":"ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7","Type":"ContainerStarted","Data":"881492ede708564c2b50f2504981788dae1af5d233f3feb7510c408faa94d0fe"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191301 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7pnc" event={"ID":"ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7","Type":"ContainerStarted","Data":"7ab4fa4e971789d5db1c529b4678cdec74ff9e32562173d88e9c894bbbe80a3b"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191309 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7pnc" event={"ID":"ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7","Type":"ContainerStarted","Data":"60f1d2698bbdc9af90765d1ef46cd020d376aa4c007400334c8fc83e64d3d86f"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191319 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"f417e14665db2ffffa887ce21c9ff0ed","Type":"ContainerStarted","Data":"fa423e54fafba3982d7bb2d5466fcee2c23cbdcb2db478a9c800bb36094dd0d1"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191328 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" 
event={"ID":"f417e14665db2ffffa887ce21c9ff0ed","Type":"ContainerStarted","Data":"fea4d98f3d9db64dd863f1c17ed52c6613cd3bc9028a466c54e0fb69e9d3b0a8"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191332 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-6db79546f6-gdz4k" Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191337 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9nqqp" event={"ID":"56e11e7e-6946-4e11-bce9-e91a721fe4a7","Type":"ContainerStarted","Data":"384c65ce883105e112d84de0e43a4a493c36b10bc529d9576a7501903ba90ca3"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191348 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9nqqp" event={"ID":"56e11e7e-6946-4e11-bce9-e91a721fe4a7","Type":"ContainerDied","Data":"819bab5050551748fadc71568c0e7c229f38c2b2cb38e16a3bd09395d5299f4e"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191358 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9nqqp" event={"ID":"56e11e7e-6946-4e11-bce9-e91a721fe4a7","Type":"ContainerDied","Data":"cceb2895b7ad1a9aea1a615553362ea80d4700a89b8411dc29278d45b0d40f09"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191367 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9nqqp" event={"ID":"56e11e7e-6946-4e11-bce9-e91a721fe4a7","Type":"ContainerStarted","Data":"7f2851a3eb6c41b727b5c53073d970f5dd84de3034b2055a355a0ab0bcf3b48d"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191375 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-r9zcq" 
event={"ID":"db164b32-e20e-4d07-a9ae-98720321621d","Type":"ContainerStarted","Data":"dc658077d52293b3c4b33ff4dc755cf2b234d7c6150c15f85d599f2e125c3427"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191386 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-r9zcq" event={"ID":"db164b32-e20e-4d07-a9ae-98720321621d","Type":"ContainerDied","Data":"7c4e1b361ff558ca25f7a79150dde84f1533aa652ade34de4925ff4983cee4b2"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191396 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-r9zcq" event={"ID":"db164b32-e20e-4d07-a9ae-98720321621d","Type":"ContainerDied","Data":"09b799c18c45feaba6859a57b3c549da1772578d33ab2e69691bfdb4a7740bc3"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191405 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-r9zcq" event={"ID":"db164b32-e20e-4d07-a9ae-98720321621d","Type":"ContainerDied","Data":"1b42fcb0b0ae8c854969b1967188fb3b2c0ac7365173440cfbf5c3f93e5315cf"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191413 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-r9zcq" event={"ID":"db164b32-e20e-4d07-a9ae-98720321621d","Type":"ContainerStarted","Data":"90c63e0b66f405ad9ba1342c113ed69565fb8227cabd7f3b8504079a44ce002c"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191423 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-krv7c" event={"ID":"815fd565-0609-4d8f-ac05-8656f198b008","Type":"ContainerStarted","Data":"b08200bbfa16b2def7e8e435dbba2b2fcca8a8d3de5ace290d9e40ef68f64f02"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191431 23041 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-krv7c" event={"ID":"815fd565-0609-4d8f-ac05-8656f198b008","Type":"ContainerStarted","Data":"044eaa58832f79354645bb27892aef22a346fbf4bfe737dea79901ffa64d2090"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191442 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-krv7c" event={"ID":"815fd565-0609-4d8f-ac05-8656f198b008","Type":"ContainerStarted","Data":"874da80b3858b9b5a8a2258c3b83f19f5f0c80010ec82d07a7dc18d61c4292fa"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191453 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj" event={"ID":"8f71fd39-a16b-47d2-b781-c8ce37bcb9b2","Type":"ContainerStarted","Data":"7b9f0eb1c41cef5d8230e9e1038d90bce9d1d6ac13eb84abd28591cfa2cf66a5"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191463 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj" event={"ID":"8f71fd39-a16b-47d2-b781-c8ce37bcb9b2","Type":"ContainerStarted","Data":"624e0a9861955168af6025f0fb5bf70d719c984169b8149f4ff044bbd9836cbd"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191472 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj" event={"ID":"8f71fd39-a16b-47d2-b781-c8ce37bcb9b2","Type":"ContainerStarted","Data":"2e47d8d2ffbca29135c63c0ec58db9d105e81fa73da896958637e9f0815629eb"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191481 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-p8hlq" 
event={"ID":"c2ce2ea7-bd25-4294-8f3a-11ce53577830","Type":"ContainerStarted","Data":"632cf41c6d751c39c9bc533a8eb31489a926eb05ad69c14fc4cbdd3ab7d57165"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191490 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-p8hlq" event={"ID":"c2ce2ea7-bd25-4294-8f3a-11ce53577830","Type":"ContainerDied","Data":"8c7c5dbb2587ce1659649afce2da4e5a5c04c0ab193dda1e438bb8ca083926e4"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191500 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-p8hlq" event={"ID":"c2ce2ea7-bd25-4294-8f3a-11ce53577830","Type":"ContainerStarted","Data":"f37ac8237d1707faf128fbd37cb4fc4383ed09260c056f6f33db8e0a42308015"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191508 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-st7mk" event={"ID":"ac523956-c8a3-4794-a1fa-660cd14966bb","Type":"ContainerStarted","Data":"96ba39646fac17d0697e88bae6a2ecb9f089f04e9a05c825a6c18dbce7611ea1"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191641 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-st7mk" event={"ID":"ac523956-c8a3-4794-a1fa-660cd14966bb","Type":"ContainerDied","Data":"322f3ad793e93ca7f32b8558fd2506b5cf8b8be4b12165040ac02501040fbe03"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191652 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-st7mk" 
event={"ID":"ac523956-c8a3-4794-a1fa-660cd14966bb","Type":"ContainerStarted","Data":"16a0ef8737c1e2416e14cc076fc6b1d7ef645b2043e268561b096173dd7a6b0e"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191662 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6t5lg" event={"ID":"a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b","Type":"ContainerStarted","Data":"96340e4adcba39009221d3be0b5592f41b18ec7a6d4f125088b3408673ad95fe"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191672 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6t5lg" event={"ID":"a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b","Type":"ContainerDied","Data":"0238d925fb5b554e7f8df9102a9ba758748ba0abdd9b4e92ab97dadd2793a34a"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191682 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6t5lg" event={"ID":"a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b","Type":"ContainerDied","Data":"04665dc4db4c2d82c8d11a97a36abe0b11fe894bbbd6e5c64a1b3a502d59c374"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191710 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6t5lg" event={"ID":"a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b","Type":"ContainerStarted","Data":"f661c7de8e4aded6ffb76b6f77c2ac0e5ed6e7e0e7ebfcafe40f9c953ec5ee63"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191722 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" event={"ID":"6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6","Type":"ContainerStarted","Data":"1fdc0977a8b34be93d33d2377b4810454b6ad9c4cfeec0c8fce160478572354d"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191733 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" event={"ID":"6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6","Type":"ContainerDied","Data":"915f71c7c1f314a02b658e63c673b9b34d83af6828634db211d8fa70c691f01b"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191744 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" event={"ID":"6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6","Type":"ContainerStarted","Data":"f860ea80aed55d2d8aefcd014e94c8e07b481ea1bac54429f957dafad3d193dc"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191753 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-st2sr" event={"ID":"ec2d22f2-c260-42a6-a9da-ee0f44f42303","Type":"ContainerStarted","Data":"a09a8a648b8b5d27ffa03ef33629e6462dc3a71bc00700334560a54ac0509ef1"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191762 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-st2sr" event={"ID":"ec2d22f2-c260-42a6-a9da-ee0f44f42303","Type":"ContainerDied","Data":"25ae9c9f82c094082383cc214e49a9f1d3d4d26dc8ffcbe8cff3194531736ede"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191772 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-st2sr" event={"ID":"ec2d22f2-c260-42a6-a9da-ee0f44f42303","Type":"ContainerStarted","Data":"b24cb8b6e833d760382f41e5306d191f11027b327de5f975b19e63833c3ea28b"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191780 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk" event={"ID":"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f","Type":"ContainerStarted","Data":"94f6cbcf36ce22a8ad98b49d60bec50375421ad5c3b08a57f781b8f9d633b332"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 
00:31:51.191790 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk" event={"ID":"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f","Type":"ContainerDied","Data":"d2e8edf542df46c295f392d43d676bb039cfcddee9661264a6bee3005ba21922"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191799 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk" event={"ID":"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f","Type":"ContainerStarted","Data":"813c8ed04b18f307078b38a00cf3865fc1feedea034a383e0342d8429ae20e6b"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191808 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" event={"ID":"66915251-1fdd-40f3-a59b-054776b214df","Type":"ContainerDied","Data":"d9e68f104ff64d94c7bc0d96bb172cf910cbd61300635334957f518556f38bfc"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191820 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" event={"ID":"66915251-1fdd-40f3-a59b-054776b214df","Type":"ContainerDied","Data":"c753a2a6e010f70aa63ed8c11f23ed59bf96ec555e7e82acdd68bc431c4a37ef"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191828 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c753a2a6e010f70aa63ed8c11f23ed59bf96ec555e7e82acdd68bc431c4a37ef" Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191837 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" event={"ID":"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9","Type":"ContainerStarted","Data":"c407babaf75d2857cc7e7f6f987ae592ab0417bd9fa8a7e43b350cf7332b8d44"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191847 23041 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" event={"ID":"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9","Type":"ContainerStarted","Data":"ceff6be6c2bd2d352cdfcc056386b4f3985ee7a4045231ee2b8afcebd43ff3a7"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191855 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" event={"ID":"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9","Type":"ContainerStarted","Data":"a02e1889bcd7a1cba9295de5ccf81a5e8bc3df65e6e184f470b35e714f23b1f8"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191864 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" event={"ID":"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9","Type":"ContainerStarted","Data":"0e49be2e23bf477ec14120fb40ddb29719e2b5af3f6beddaffe4770b79d6d46c"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191891 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" event={"ID":"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9","Type":"ContainerStarted","Data":"02f576e5daa548d8e13a03a2b6ed259a4ebcc6364353cb4ddfacfe054ec613fd"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191901 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" event={"ID":"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9","Type":"ContainerStarted","Data":"9c94225f58476ed65c6fffafd91e571dd0b2ef9f295936cd52d8b1901c360298"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191909 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" event={"ID":"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9","Type":"ContainerStarted","Data":"2d125229cb85fc818282339c6e80d7cb921f3ddfa9564d713ed3dd5e74ec9a38"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191916 23041 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" event={"ID":"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9","Type":"ContainerStarted","Data":"54c6e47bd54c96d470a2821fbe979f217369d59cac4fe994745beff2a29276c1"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191926 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" event={"ID":"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9","Type":"ContainerDied","Data":"9caf746e34f3ceb9b7a0c15d058a8c3ef6549037b6840e762c5d26db1b3afa1f"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191936 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" event={"ID":"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9","Type":"ContainerStarted","Data":"b3e77cc21c0092533e2573fd7bc828eb1f314192461aa4ad0d7a1a79afb2a5b9"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191945 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-bh886" event={"ID":"e76bc134-2a88-4f92-9aa7-f6854941b98f","Type":"ContainerStarted","Data":"780095bbe85e78933cef6be83dd1325e378e4033a839880b601dba51dbb6eb8a"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191954 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-bh886" event={"ID":"e76bc134-2a88-4f92-9aa7-f6854941b98f","Type":"ContainerDied","Data":"ad08463ed7ab691e56f4dfe0288960876b6a58370e90937b6cc2efea5e0f4441"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191963 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-bh886" 
event={"ID":"e76bc134-2a88-4f92-9aa7-f6854941b98f","Type":"ContainerStarted","Data":"7f9bd3b95fa9a96d599ef5d38ab2c65bfd39d0c75616669dcd2a59a811c0de79"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191972 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-5nbfk" event={"ID":"9d810f7f-258a-47ce-9f99-7b1d93388aee","Type":"ContainerStarted","Data":"46b0fd729a946db9b13eac5c57c40b40e4b8a56cd0aeaad608c0b0bcae727675"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191983 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-5nbfk" event={"ID":"9d810f7f-258a-47ce-9f99-7b1d93388aee","Type":"ContainerStarted","Data":"4ade0408e709b8d3bfa126728a922decfde81b90bd3f67b5bee03661da1d8a83"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.191993 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-5nbfk" event={"ID":"9d810f7f-258a-47ce-9f99-7b1d93388aee","Type":"ContainerStarted","Data":"3fcfcac3d94a68502eedf27bec2a63baba722b253947b783bc8a405ac2ab5cd7"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192001 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-8f89dfddd-brq9l" event={"ID":"614f0a0f-5853-4cf6-bd3d-174141f0f1e2","Type":"ContainerStarted","Data":"b19f92b1598dbc89d7fa4f28fc4aac7b76c5f4ec1d5d7efb6ada3eb88a730a6f"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192011 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-8f89dfddd-brq9l" event={"ID":"614f0a0f-5853-4cf6-bd3d-174141f0f1e2","Type":"ContainerDied","Data":"ad3a46887dab7ea3bfa412ad6cf5418fcbb18c2c14aa2dc59012eeca70fc7d9a"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 
00:31:51.192020 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-8f89dfddd-brq9l" event={"ID":"614f0a0f-5853-4cf6-bd3d-174141f0f1e2","Type":"ContainerStarted","Data":"a0d7955b7085045599d0a7ea45ff20f907bc225ec27c46ed3dcc33b59207b912"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192158 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"4217b755-ca87-45cf-9e52-7b2681660f41","Type":"ContainerDied","Data":"6c847624822fb2ae11b6027b5155999eb848a04181b2d105ba183b9e9a68d9b4"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192171 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"4217b755-ca87-45cf-9e52-7b2681660f41","Type":"ContainerDied","Data":"dc0f970c88c1737a47be41b249ed6c2014805b33e5ea7b0be6fb9cb719bf9d5b"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192181 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc0f970c88c1737a47be41b249ed6c2014805b33e5ea7b0be6fb9cb719bf9d5b" Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192195 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-9c44c86f9-rplwv" event={"ID":"d70f4efb-e61a-4e88-a271-2f4af21ecdf3","Type":"ContainerStarted","Data":"e019ef0a1aaa25b302b6691d82feab7cd7bb9ac300d9fa2874c54e4a866f472b"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192139 23041 scope.go:117] "RemoveContainer" containerID="915f71c7c1f314a02b658e63c673b9b34d83af6828634db211d8fa70c691f01b" Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192217 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-9c44c86f9-rplwv" 
event={"ID":"d70f4efb-e61a-4e88-a271-2f4af21ecdf3","Type":"ContainerStarted","Data":"e0a85ed7ebd2e07f65048b3255f6189a3d4d65a56d9c1df41b7b05764ef3bd29"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192230 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"21dd42b1-2628-4a24-97e7-6759888ed316","Type":"ContainerDied","Data":"f70bb9a5f0e3f9b911feb28654c30e151d3e1fb5d9549e6e2016049387b17fb2"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192240 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"21dd42b1-2628-4a24-97e7-6759888ed316","Type":"ContainerDied","Data":"f81e16a049afccd7df86e2ab910ff92e4bea5bed8e76ac4e62191e1c15f7228a"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192249 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f81e16a049afccd7df86e2ab910ff92e4bea5bed8e76ac4e62191e1c15f7228a" Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192257 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jfjzg" event={"ID":"e302bc0b-7560-4f84-813f-d966c2dbe47c","Type":"ContainerStarted","Data":"52be315580333e096a5c394dfb3b50ff79852b6010007ad83ccb2074b85db43b"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192266 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jfjzg" event={"ID":"e302bc0b-7560-4f84-813f-d966c2dbe47c","Type":"ContainerStarted","Data":"f4026f3f82c087e6f1133285b0314080fd77636b3f28a79b0a59695dc64ab709"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192274 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jfjzg" 
event={"ID":"e302bc0b-7560-4f84-813f-d966c2dbe47c","Type":"ContainerStarted","Data":"4e0af367cee5aa7ace0374f562c3ebde99ff63afaf075a5612625be33276de36"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192283 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-tbcsh" event={"ID":"2ac55f03-dd6f-4ead-bacc-c69aeca146dc","Type":"ContainerStarted","Data":"0c63daa306d3bdb05b05608e24f1b29d4e891aa0f9db9588343aba567dfea148"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192292 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-tbcsh" event={"ID":"2ac55f03-dd6f-4ead-bacc-c69aeca146dc","Type":"ContainerStarted","Data":"103608f45f7694c0e9140e9dcbf75b86f00880d60c9896b112d4ee32ecef5e6c"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192300 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-tbcsh" event={"ID":"2ac55f03-dd6f-4ead-bacc-c69aeca146dc","Type":"ContainerStarted","Data":"0ce2140e8d5f4ac383fcfe274d59d3771538ece4764c91b8cb4e301d3fe26bbf"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192340 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-z5fkp" event={"ID":"2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0","Type":"ContainerStarted","Data":"e06e3b0cd0c498549672bad1fef5caf7eaac361c9fc1607113d2582022a9ec7a"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192353 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-z5fkp" event={"ID":"2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0","Type":"ContainerStarted","Data":"849f8c9c0130860af59ecc5126efd43b717473a9bed214260e499c901acfe39b"} Mar 08 00:31:51.201283 master-0 
kubenswrapper[23041]: I0308 00:31:51.192362 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-z5fkp" event={"ID":"2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0","Type":"ContainerStarted","Data":"1cc242574263ef7c849076452db10d6f32fa75aeb983a9e0f9150bc85db0911a"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192371 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"787fa634ee36f327997b592447e9aadba40183c4e7e4d25f5519ae9957121e6e"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192382 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"4262f462df3c892c070c1769f302b6c7878bc5f82d5342928245d488b3431f6d"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192390 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"0e06c006df1e1e63e0f6188a23b5e393fde4aa4984ad610de00e8c675da914c7"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192397 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"ea5ec65ba12dfaaa4f58b3b64547a3d98d2937c3aa58a7bc6dc14040003a38a9"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192406 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"d8889d6936248c826e33628006d790b900bbbcacc9529b4c35a79aa987893d39"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192414 23041 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerDied","Data":"620aae0686e0d0747f86c66dccb5f833f425852d851da5976e803bb0ce3011ba"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192424 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerDied","Data":"c8de3ced39581b8ad5acd40157b9e893206291d5fd34e7516c2c1b0358ea17a6"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192440 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerDied","Data":"182e67e6b82b83c4d47d4c01d3dcbdede2056c9bcdcf8367c8a6959d0eeac8ea"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192449 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"ce67cd1e37e90c976b5eb1d98a8adbdd3c36380a0d4d75edb38584db8eeda1f5"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192457 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"0ff7bbc7a985bc357be1e5b9c59697e1d623b2b3f3a45fbbbeea036170f5e8b0"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192466 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerDied","Data":"1218edf42145af942d644b560a08c25d071edefe5ebdbdbb1dda99cfd07700fd"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192477 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"09882f77899e1a73f2e7f7b1d393cad387349597cd777096a1f2accf4684e1d0"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192486 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"0bdf70a6acef734c900a623db8a8cd37b2a2e6c50fe84f9293c0fc0c5705c71d"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192495 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bx9dn" event={"ID":"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25","Type":"ContainerStarted","Data":"32dcf127d578ad6c3485b23863e0464ac0748c6e4e51332f9bfa899ee478383c"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192503 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bx9dn" event={"ID":"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25","Type":"ContainerStarted","Data":"7861ba3338916d9e9552052b5b66db2f7a34066b6d4805406b4ac88bb57796dc"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192515 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bx9dn" event={"ID":"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25","Type":"ContainerDied","Data":"95c20172ebbb05524877a835e30132f4f70ded4813cb99373d344901a324181d"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192524 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bx9dn" event={"ID":"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25","Type":"ContainerStarted","Data":"88364d0cec48d65744e1beec8c11b2e217cd014d5b9879cec4ffa6513fb0fe68"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192532 23041 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-5v8g4" event={"ID":"c1abfb79-2c86-4ccb-bf91-7c48ad8c78d8","Type":"ContainerStarted","Data":"ba271e81a6fd420c562722e45c96eb9a2bb2cadcb564df2912b43989b4296570"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192541 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-5v8g4" event={"ID":"c1abfb79-2c86-4ccb-bf91-7c48ad8c78d8","Type":"ContainerStarted","Data":"c9dc377ca2fdac8594f81d6df8e7c069a1b5189bee06d288ed063183ce36a834"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192550 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6474759988-dnw4m" event={"ID":"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb","Type":"ContainerStarted","Data":"d10ba8d248cc13e58fc18237bf3fc8704307376acdb97eeeff019b2173aa233c"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192559 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6474759988-dnw4m" event={"ID":"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb","Type":"ContainerStarted","Data":"e690a192a3d0aa0e87e9cbde66640402b6c73d23b93fc09f09a46f66f560f7c6"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192568 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc" event={"ID":"ae061e84-5e6a-415c-a735-fa14add7318a","Type":"ContainerStarted","Data":"259daa6bdeec002c66ab5644c463905cc1e9ced2ca36801084d0b2095f73b07b"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192577 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc" 
event={"ID":"ae061e84-5e6a-415c-a735-fa14add7318a","Type":"ContainerStarted","Data":"9fbf00cbaa1fd82a7fe4efbcd60b1cc35a5cc55ea94035411c2b6572009208c7"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192585 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc" event={"ID":"ae061e84-5e6a-415c-a735-fa14add7318a","Type":"ContainerStarted","Data":"562107d3f93627171c40e2da601929ce58908bf598b7af4d1af0d420323bb2a7"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192594 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc" event={"ID":"ae061e84-5e6a-415c-a735-fa14add7318a","Type":"ContainerStarted","Data":"c6dfb6a757149a4059a400948a504adf47ce562d49ab223062b37eafa8275000"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192603 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4fjw9" event={"ID":"55c8d406-5448-4056-ab3c-c8399217c024","Type":"ContainerStarted","Data":"bb5c6100970e1f98de7541d0e14fa48c4311bd4754ce859444be673afbee41d8"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192612 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4fjw9" event={"ID":"55c8d406-5448-4056-ab3c-c8399217c024","Type":"ContainerDied","Data":"f1165833632b857988bef725397f89c163ab44ca5ba27c1f2f567224751fe8ad"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192621 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4fjw9" event={"ID":"55c8d406-5448-4056-ab3c-c8399217c024","Type":"ContainerDied","Data":"5c5fe88ca84d34535298e53e21f41989f9811c3fb403419a0f79b41f340064f5"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192630 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-4fjw9" event={"ID":"55c8d406-5448-4056-ab3c-c8399217c024","Type":"ContainerStarted","Data":"ff2ce08940304b5b606944a45d5884c507d106440aae4429902a0d2f21368070"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192638 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"aa5ad4a36fb34e3b8448dce44870bd90294e9dfdbc77705a2449657049d35017"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192649 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"3c9cc0ea8b8c8c3c9346819b130170a92470b9a87fb7c1462d7680ef7197ef47"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192657 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"013b718ae531bd264f0d08436f90a352773f432fb8153c8f5baaf771bc43f460"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192665 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"c4b3dad7b177ddc417477ab1f0d5f78969f5ec394aa11addfa7a3ce44aa14aed"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192674 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" event={"ID":"531e9339-968c-47bf-b8ea-c44d9ceef4b3","Type":"ContainerStarted","Data":"1e770f05b7d4f3abd180562cc940e1a0486ee998d3fc21227af26eb82314570e"} Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: 
I0308 00:31:51.192683 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" event={"ID":"531e9339-968c-47bf-b8ea-c44d9ceef4b3","Type":"ContainerDied","Data":"829e088d3beb6bbaa940412e9e43d8b3ba4f7b2b62947bd685d43db99e68005b"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192691 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" event={"ID":"531e9339-968c-47bf-b8ea-c44d9ceef4b3","Type":"ContainerStarted","Data":"e21ecaa295b51fd30f3e30feccdaaffb5d26d81a05305635fb9f903bb9b8a90e"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192700 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-wkt98" event={"ID":"a68ad726-392e-4a7a-a384-409108df9c8b","Type":"ContainerStarted","Data":"51d72f735ac2d22ad572e2bfd6c4c3d9ef60ea8d95d8d615afffbd72430f0283"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192708 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-wkt98" event={"ID":"a68ad726-392e-4a7a-a384-409108df9c8b","Type":"ContainerStarted","Data":"8e70531b1dbd5c8e6c17416c362305f1eea7b7b018f96a22eb1f0bb98b78a034"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192716 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"55216a56-677a-4f28-a530-77d44bded8a2","Type":"ContainerDied","Data":"1a0afc6f5f43ae0c03dad4b66580da08dbfc175218d88b6ca2b45fa8794895ad"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192731 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"55216a56-677a-4f28-a530-77d44bded8a2","Type":"ContainerDied","Data":"2d79f79d79186c94eacd319b18a19e02c3739e81bc2d84288b2f6f2697c49ad7"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192739 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d79f79d79186c94eacd319b18a19e02c3739e81bc2d84288b2f6f2697c49ad7"
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192746 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" event={"ID":"cbcb0196-be5c-44a4-9749-5df9fbeaa718","Type":"ContainerStarted","Data":"a89aafabc1e522f342463d98f2fa1cfd6a92e881b88c10677cf22bc178649255"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192755 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" event={"ID":"cbcb0196-be5c-44a4-9749-5df9fbeaa718","Type":"ContainerDied","Data":"92c985a5a70112d59265249efbf6fce7869432625027fbf9a567a14e08ff9807"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192763 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" event={"ID":"cbcb0196-be5c-44a4-9749-5df9fbeaa718","Type":"ContainerStarted","Data":"ddbc9d4d3c5ffe04f1f188d461103a088e60e8f552f5a7337527098fe0216d97"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192772 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-m7549" event={"ID":"af391724-079a-4bac-a89e-978ffd471763","Type":"ContainerStarted","Data":"171aa9f17bab1693340df88dc9687b17839bec3452bff1e75aeedd920e40b060"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192781 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-m7549" event={"ID":"af391724-079a-4bac-a89e-978ffd471763","Type":"ContainerDied","Data":"c9e6fa5d3ccf4015c27e14ffdb2578ad6435947b5bdd16e602ffdf86284246dc"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192791 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-m7549" event={"ID":"af391724-079a-4bac-a89e-978ffd471763","Type":"ContainerStarted","Data":"cd375a476d29bf57c7b9e43c8cd23f02bf2bb9a153d14c3da6003737a55dbb0d"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192798 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-m7549" event={"ID":"af391724-079a-4bac-a89e-978ffd471763","Type":"ContainerStarted","Data":"c70a49bdf7ce76b550fe89e6bb326288e459f3c83c699e27a995807b0355a90e"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192809 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-8krst" event={"ID":"460f09d8-a143-48d2-9db0-be247386984a","Type":"ContainerStarted","Data":"da7f059bc7425c70bc4a951221ce223000707cc405db21efd57cd77b538e3498"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192818 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-8krst" event={"ID":"460f09d8-a143-48d2-9db0-be247386984a","Type":"ContainerStarted","Data":"78f167041d0e1e5dfadee1e9a27a600120c1dc54a22d62ff9910e1942faef008"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192826 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-s9b9v" event={"ID":"5b9f4db1-3ba9-49a5-9a65-1d770ee59a65","Type":"ContainerStarted","Data":"117c49c3263ee766fe1829d23251703c5640786c8cddbb7c33f70514fe438945"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192836 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-s9b9v" event={"ID":"5b9f4db1-3ba9-49a5-9a65-1d770ee59a65","Type":"ContainerStarted","Data":"348aea2a915fd68a226048223a20a87a7f16c78c005410713b0290068a8f6dc3"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192844 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-s9b9v" event={"ID":"5b9f4db1-3ba9-49a5-9a65-1d770ee59a65","Type":"ContainerStarted","Data":"324a3f66919d93d357f8f2bce22ca197a2c40c573bb476ff1dafbf1389ca9177"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192852 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-s9b9v" event={"ID":"5b9f4db1-3ba9-49a5-9a65-1d770ee59a65","Type":"ContainerStarted","Data":"55b01a8834cc0e66e80c4742dda9dcd76cc7d21fc646a73322aabbcb9e7a815d"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192862 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1453f6461bf5d599ad65a4656343ee91","Type":"ContainerStarted","Data":"c01e48ad99b01d18f3c32d8971fb8a634df39b838fcb697c02d699ac7e0bf59b"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192872 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1453f6461bf5d599ad65a4656343ee91","Type":"ContainerStarted","Data":"4a9781cd54b6849919a2e1ded759e631816b24203f18a3cce8ca11053a994a64"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192881 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1453f6461bf5d599ad65a4656343ee91","Type":"ContainerStarted","Data":"2bda97f02cc22c73814013d78c2e90a28eb3ed0437db127445efbed0e90aa23d"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192889 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1453f6461bf5d599ad65a4656343ee91","Type":"ContainerDied","Data":"16143328d55448f305f6ab28c116011527d147a9f464f1696ddaa4f87b24902d"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192900 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1453f6461bf5d599ad65a4656343ee91","Type":"ContainerStarted","Data":"635c9c2985fac1a14beab73539e4661fa51cd796fbfb9d8b1faa5701a4b68e88"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192908 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d5jxb" event={"ID":"7ad8b9ea-ba1c-4507-9b70-ce2da170d480","Type":"ContainerStarted","Data":"67889792ebb5d4e854f7fdede5676d644567a2db7df33390da8134c0d480ee11"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192917 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d5jxb" event={"ID":"7ad8b9ea-ba1c-4507-9b70-ce2da170d480","Type":"ContainerDied","Data":"ee1bfab2130a9c72df8adc63c3382589fac2b085c9ce4752d92d10429ef61f76"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192927 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d5jxb" event={"ID":"7ad8b9ea-ba1c-4507-9b70-ce2da170d480","Type":"ContainerDied","Data":"c7031bd4261187339ddcdbbf17642c8a944a5d40ae330e696f51959987e70da4"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192937 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d5jxb" event={"ID":"7ad8b9ea-ba1c-4507-9b70-ce2da170d480","Type":"ContainerDied","Data":"d4bd6afbd87673cd3e0a5753c92817e5f63b4859d724983c90d010a8db1fe80e"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192947 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d5jxb" event={"ID":"7ad8b9ea-ba1c-4507-9b70-ce2da170d480","Type":"ContainerDied","Data":"7264af89c3bcf80c9a189b3bddcd203436764c691f9c5c52533e7f598dddfac4"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192955 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d5jxb" event={"ID":"7ad8b9ea-ba1c-4507-9b70-ce2da170d480","Type":"ContainerDied","Data":"48c6a8c71ab87bd002a24ce7589e179bd20778d506e7cd037500b0c5771c655a"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192964 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d5jxb" event={"ID":"7ad8b9ea-ba1c-4507-9b70-ce2da170d480","Type":"ContainerDied","Data":"f8e210245fcf5757a0858988b80936bb56e15ab6a7c3881f301f7f4cb8a8f550"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192974 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d5jxb" event={"ID":"7ad8b9ea-ba1c-4507-9b70-ce2da170d480","Type":"ContainerStarted","Data":"7a5857552aa1339fd1907b2666246b77b57ec97f6cccfaf339c644659664d85c"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.192983 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-pfdrx" event={"ID":"365dc4ac-fbc8-4589-a799-8327b3ebd0a5","Type":"ContainerStarted","Data":"362e2ee6abee655de6dfb5a64eddbe14fe4be437a3b12293690dd8327410ffad"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.193004 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-pfdrx" event={"ID":"365dc4ac-fbc8-4589-a799-8327b3ebd0a5","Type":"ContainerDied","Data":"08c17f5be4c6cd32671af564801dff89f871520231b6fd523ba49a05d5c50b3c"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.193014 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-pfdrx" event={"ID":"365dc4ac-fbc8-4589-a799-8327b3ebd0a5","Type":"ContainerStarted","Data":"8f1055f3dc7c655a333a3fa311c8f94b2ceda0b473d7673f490a6875c1158919"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.193022 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7c67b67d47-sctv9" event={"ID":"786e30f1-d30a-43e1-85cb-d8ea1495422e","Type":"ContainerStarted","Data":"cdc0e9685b65a455abbaad494c4a6513ad0b9438ee9d2cc8a13ca432a7107cef"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.193032 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7c67b67d47-sctv9" event={"ID":"786e30f1-d30a-43e1-85cb-d8ea1495422e","Type":"ContainerStarted","Data":"8d0d8e23ae25ced02b7cdc0775a6f94c8fcc52f337331a56804c82208fb25ced"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.193041 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" event={"ID":"2b1a69b5-c946-495d-ae02-c56f788279e8","Type":"ContainerStarted","Data":"e8ef418892e89b7f3833d29c636f71f3f5b9cf6ffda7232c93e00417ddde5f8d"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.193050 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" event={"ID":"2b1a69b5-c946-495d-ae02-c56f788279e8","Type":"ContainerDied","Data":"4c55f1200add2af42f95d0106d6d887be04568b435704100c4cfbfdbdabd7d73"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.193060 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" event={"ID":"2b1a69b5-c946-495d-ae02-c56f788279e8","Type":"ContainerDied","Data":"a8112b99efb51a20fdb91fac566b95eaf004df0ff11f9408140898bfa467ea7c"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.193069 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" event={"ID":"2b1a69b5-c946-495d-ae02-c56f788279e8","Type":"ContainerStarted","Data":"7bcc330c034a7032e8bd43ea29408b50fdad12339c2d89f6fc2a01fc9d43af95"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.193077 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-v949k" event={"ID":"c4cab26a-fe31-4cf2-a938-b280f1934d99","Type":"ContainerDied","Data":"d6af0d3578bc6ae0d4e0f5d4dbddc52dc70217cef15e030aab47b2704363ffe2"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.193088 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-v949k" event={"ID":"c4cab26a-fe31-4cf2-a938-b280f1934d99","Type":"ContainerDied","Data":"48589610dea61d404b3894a555948d67264374c9f204d16a7ec77740894d856e"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.193096 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48589610dea61d404b3894a555948d67264374c9f204d16a7ec77740894d856e"
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.193103 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-w5fjg" event={"ID":"1f63cb2f-779f-4fde-bf92-cf0414844a77","Type":"ContainerStarted","Data":"36185d93a870a181655e4436861864047a9af33496ef86d20302731ff777317a"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.193112 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-w5fjg" event={"ID":"1f63cb2f-779f-4fde-bf92-cf0414844a77","Type":"ContainerStarted","Data":"fd2c01cdd304d39e575ca69d83c243fee0060006da5d42ff4d10f498f54d4b60"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.193123 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-bncfj" event={"ID":"c7097f64-1709-4f76-a725-5a6c6cc5919b","Type":"ContainerStarted","Data":"0bfb5bceaa149162c15931fa6c19adc19bff0abfffe5914519da3718cfa8c3bf"}
Mar 08 00:31:51.201283 master-0 kubenswrapper[23041]: I0308 00:31:51.193132 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-bncfj" event={"ID":"c7097f64-1709-4f76-a725-5a6c6cc5919b","Type":"ContainerStarted","Data":"594372803c90fd234334b17b9df7ae74ff21542e2952be96f9e083d29faca78a"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193142 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-bncfj" event={"ID":"c7097f64-1709-4f76-a725-5a6c6cc5919b","Type":"ContainerStarted","Data":"1cddeda960c60a71faf688d26e861f0212c8666ffc3672e89502d43761b93cd2"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193150 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6" event={"ID":"84522c03-fd7b-4be7-9413-84e510b9dc5a","Type":"ContainerStarted","Data":"10e105765ad69984ad662df10f70f89fc3258bff9a6fa6179599d2b62b4cdd81"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193159 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6" event={"ID":"84522c03-fd7b-4be7-9413-84e510b9dc5a","Type":"ContainerStarted","Data":"8db7391cc36022b8c4fa21dd3d33b8e00c7e53dfad0cc53ffef3d1fff055fc5c"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193167 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6" event={"ID":"84522c03-fd7b-4be7-9413-84e510b9dc5a","Type":"ContainerStarted","Data":"4ba757467f3e4fadf37ce1d9a907a1771ea5751b999a31bf5bb5f0ab9351aa7f"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193175 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-vm7rj" event={"ID":"5a229b84-65bd-493b-90dd-b8194f842dc8","Type":"ContainerStarted","Data":"40763ecf359c193fdc57eccfc3f99287edfc631f03df7363e0563b373121c528"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193185 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-vm7rj" event={"ID":"5a229b84-65bd-493b-90dd-b8194f842dc8","Type":"ContainerStarted","Data":"f5e085e04fcec71a7384a042b53e9f6db9dd0fc0eed95804aa4550ea011dc40a"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193193 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2" event={"ID":"3fee96d7-75a7-46e4-9707-7bd292f10b84","Type":"ContainerStarted","Data":"c756595c785c16416805ae901384336bd79f4ee2a5921d1dafe30a90cfdb5b66"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193218 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2" event={"ID":"3fee96d7-75a7-46e4-9707-7bd292f10b84","Type":"ContainerDied","Data":"52998e126ba781dde5afc9f3fdb3cf64a817b4497f29c74abbb0c4aa09aa4379"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193228 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2" event={"ID":"3fee96d7-75a7-46e4-9707-7bd292f10b84","Type":"ContainerStarted","Data":"53e5b4e15707abe8f63034abbcefc6b4a23fa99d2992c497080fcbc4818458e4"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193236 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2" event={"ID":"3fee96d7-75a7-46e4-9707-7bd292f10b84","Type":"ContainerStarted","Data":"96c247b918e2a9450964a3ea1162342c6ccc7c2330777e8d76f1128e74c9ecae"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193246 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq" event={"ID":"3b4f8517-1e54-4b41-ba6b-6c56fe66831a","Type":"ContainerStarted","Data":"2a75a237fef308cfc9e8dc829c307d2c38c0fdad09816e4ff80123079e47f8b1"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193255 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq" event={"ID":"3b4f8517-1e54-4b41-ba6b-6c56fe66831a","Type":"ContainerStarted","Data":"6703d449ef58e82f6711f4fb4077c407ce4e8f1fc186664220b3722e268d3aa7"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193263 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq" event={"ID":"3b4f8517-1e54-4b41-ba6b-6c56fe66831a","Type":"ContainerStarted","Data":"6b085935f4ebb70afc5a958163f7053b9a42b89c690b039c32d56dcc51668fae"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193272 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq" event={"ID":"3b4f8517-1e54-4b41-ba6b-6c56fe66831a","Type":"ContainerStarted","Data":"79898c1495b01b774fa3705ded4d271b0617e5b224dd28c48dac5c9a238260f3"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193280 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-7769569c45-5n69x" event={"ID":"1da0c222-4b59-424f-9817-48673083df00","Type":"ContainerStarted","Data":"9e323e1fa7d402b7efb1afca10f5c1139ebd69bd1d0ac77477dbe3652009da9a"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193289 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-7769569c45-5n69x" event={"ID":"1da0c222-4b59-424f-9817-48673083df00","Type":"ContainerStarted","Data":"9a99ff1aaac045cf6ae25e4ec837d836f7e6fbd938939ae00536d553ed630283"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193298 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-7769569c45-5n69x" event={"ID":"1da0c222-4b59-424f-9817-48673083df00","Type":"ContainerStarted","Data":"3824dde14e6e2df8fdeaf0d3586d91846c024a16aa684e52f4497805143ba494"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193307 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4" event={"ID":"58333089-2456-4a25-8ba7-6d557eefa177","Type":"ContainerStarted","Data":"a93852bdddf78dff65ecf8b8ffc6457b3e060c3ee09b055521d9a24e262b9408"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193315 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4" event={"ID":"58333089-2456-4a25-8ba7-6d557eefa177","Type":"ContainerDied","Data":"dc923284309376403cb95e44ae08001b8c778273ed731a0f98310a7899bb3d2d"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193325 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4" event={"ID":"58333089-2456-4a25-8ba7-6d557eefa177","Type":"ContainerStarted","Data":"733e43352408d7f83022f1e2789901cb1e3830089ecad3dc5ac2ffbae10f60ad"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193334 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9j9zs" event={"ID":"b2548aca-4a9d-4670-a60a-0d6361d1c441","Type":"ContainerStarted","Data":"dac2b4107815aa7d9649b2815ef78f301ab7916075e5059aa3a49b2c981a36fe"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193343 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9j9zs" event={"ID":"b2548aca-4a9d-4670-a60a-0d6361d1c441","Type":"ContainerDied","Data":"031c64f86b4914d8ed85469cff79e56b7a2e1cbd518e0fd70f47211192095f45"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193352 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9j9zs" event={"ID":"b2548aca-4a9d-4670-a60a-0d6361d1c441","Type":"ContainerDied","Data":"fe58071840dc6349204161e59ca64944f26b1ff66582767c1106a706a17472e1"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193361 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9j9zs" event={"ID":"b2548aca-4a9d-4670-a60a-0d6361d1c441","Type":"ContainerStarted","Data":"28355b7f7227fe6a0abd3c3085ac0299e8c24ec4f49691a081d1fe68b8bde287"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193370 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-nrb7q" event={"ID":"e78057cd-5120-4a12-934d-9fed51e1bdc0","Type":"ContainerStarted","Data":"55fbbec4f49e2e61889c0fced169d57405e19efe1cb7fb53095eac0414a18aa2"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193416 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-nrb7q" event={"ID":"e78057cd-5120-4a12-934d-9fed51e1bdc0","Type":"ContainerStarted","Data":"7ded812b6494fa846c4ec3519032a6a79758aaa664ea0250e508e50f52908363"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193424 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-nrb7q" event={"ID":"e78057cd-5120-4a12-934d-9fed51e1bdc0","Type":"ContainerStarted","Data":"c7b839bc1440105484eefd605ce2dd49ac3adf1072ca232cf569d9cfecdcc1f4"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193433 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" event={"ID":"5cf5a2ef-2498-40a0-a189-0753076fd3b6","Type":"ContainerStarted","Data":"9640b5a39ba1c8d22970de560d1644963302e95dae8ebd4e31dc3deaa2d4d495"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193442 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" event={"ID":"5cf5a2ef-2498-40a0-a189-0753076fd3b6","Type":"ContainerDied","Data":"04817105ab63ed3d02352e545fc19277b913254d7947d42a71d84846748fcfc3"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193451 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" event={"ID":"5cf5a2ef-2498-40a0-a189-0753076fd3b6","Type":"ContainerStarted","Data":"a2af0127ad556015336cd256817276cc9d6a8a08dbbf295a1bf7821d7309d19c"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193462 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" event={"ID":"1751db13-b792-43e2-8459-d1d4a0164dfb","Type":"ContainerStarted","Data":"b2496d08ba7d24c47b88064d6a60a25e9b169662cfe39cc7b5569d25f4f5e236"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193471 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" event={"ID":"1751db13-b792-43e2-8459-d1d4a0164dfb","Type":"ContainerStarted","Data":"6eaa4eebadf626880d254857e0b5071188feb8436fd6122d3cce0a00f572ec73"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193480 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" event={"ID":"1751db13-b792-43e2-8459-d1d4a0164dfb","Type":"ContainerDied","Data":"8e5eb8c3a997190fe55fe0f74af3ee5e0a5480af9438a723ead360bc861186ec"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193490 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" event={"ID":"1751db13-b792-43e2-8459-d1d4a0164dfb","Type":"ContainerStarted","Data":"5993f0db8eb571541ffd45db324c8f25d80729c838e2d7b2910b9b88c3eb3de6"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193500 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-xpl2b" event={"ID":"7317ceda-df6f-4826-aa1a-15304c2b0fcd","Type":"ContainerStarted","Data":"4bf845493478fab338d4b9ab87cadf5b607d6c9eebb501f29c76a34495978f4a"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193509 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-xpl2b" event={"ID":"7317ceda-df6f-4826-aa1a-15304c2b0fcd","Type":"ContainerStarted","Data":"b6fa88efbe7764411e628b9931e04b59a0f6aad2f1656156d14674b5a960082d"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193517 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-xpl2b" event={"ID":"7317ceda-df6f-4826-aa1a-15304c2b0fcd","Type":"ContainerStarted","Data":"f3cab32904f1f3dc9eae1dc7b47ec8d51b63661baeb9517ad66b59248d52dfef"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193525 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"20d63cb89e6de090d808330bf46fb0e0be192834ba95d40ddc9444894194c2fc"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193534 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"e7984e9dd6f4f9e4be878ed8775f1cba364ff5628bee5337e37a1ab208526924"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193542 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"6bc55a348461d3cbf163ebf709ddec0e4c002365488c110e26f97e8640a91aac"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193551 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"f5a01a96f572cf6cdc2165118b1618cfc34c74c159113a86d01ad4567971ce7c"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193559 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"b772417e1e99d8ea0e7f16b30732d2d8fa0d59084df9326d11ee8f293502bf15"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193568 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerDied","Data":"f73d55f2e8434f88a6be502a595c0bcf07e53cfb094b52a7ac92890beaa91d58"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193577 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"a2271776808f809754ea9910dbf17284aca2a88f19582f5163627216da7a3ba8"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193585 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-8lf4q" event={"ID":"e3f42081-387d-4798-b981-ac232e851bb4","Type":"ContainerStarted","Data":"b2272201017b4214b0d3b2d37079305086623f271eb44fd6320c5be45bef2f26"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193594 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-8lf4q" event={"ID":"e3f42081-387d-4798-b981-ac232e851bb4","Type":"ContainerStarted","Data":"1581c52c50b103d88a3f7e59b35292fc1d1154d3b7d7ca2cbb56b6eef1ed3e4b"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193602 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-8lf4q" event={"ID":"e3f42081-387d-4798-b981-ac232e851bb4","Type":"ContainerStarted","Data":"dc6431dd72c27cd0cc50f525ef4684b1138ca71254e30382dcc7425a8c604797"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193611 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-st8tx" event={"ID":"0d0cb126-341c-4215-ad2e-a008193cc0b5","Type":"ContainerStarted","Data":"6e25dc9a5f14568083319c0b4bbd12c19766fcb10a82c2e247c421c6684c8ec8"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193620 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-st8tx" event={"ID":"0d0cb126-341c-4215-ad2e-a008193cc0b5","Type":"ContainerStarted","Data":"27f4354a5f2d519381a516d1dc4209edc63d8a7a92b44222c7f0143dbf2a908f"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193628 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8qtmf" event={"ID":"4ad37f40-c533-4a1e-882a-2e0973eff86d","Type":"ContainerStarted","Data":"fec761ba111693d32c9163242c81a699413cc2198220381020f06b4d5f0d4c4e"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193637 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8qtmf" event={"ID":"4ad37f40-c533-4a1e-882a-2e0973eff86d","Type":"ContainerStarted","Data":"036c8d5e00b57ec77b752ae2bc46eb3d7ff2904d9ebc488665656ab787ecd5a5"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193645 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-589895fbb7-gmvnl" event={"ID":"03f4bafb-c270-428a-bacf-8a424b3d1a05","Type":"ContainerStarted","Data":"f54577a28417110d9e7f61afc4cf54e4382b8b583a37c474d5a4196b61d34559"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193655 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-589895fbb7-gmvnl" event={"ID":"03f4bafb-c270-428a-bacf-8a424b3d1a05","Type":"ContainerStarted","Data":"82710a6421f9ccb619f042e68b9675e392f987444180b3a6a9731863e8381221"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193663 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-589895fbb7-gmvnl" event={"ID":"03f4bafb-c270-428a-bacf-8a424b3d1a05","Type":"ContainerStarted","Data":"4297b6122cd668a28e80b28ce2f18556120772700fd7e586762ab1c6f70eea07"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193671 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5ff8674d55-qxpv9" event={"ID":"e237ed52-5561-44c5-bcb1-de62691d6431","Type":"ContainerStarted","Data":"8024d8e07c10843d58afa6b354d719252942b7cc674963d8b1fab2a5ad838405"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193681 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5ff8674d55-qxpv9" event={"ID":"e237ed52-5561-44c5-bcb1-de62691d6431","Type":"ContainerStarted","Data":"91975b539efd51be35527a0d8a61481b74eddf77df1b9a337c3002feaa1bf444"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193689 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5ff8674d55-qxpv9" event={"ID":"e237ed52-5561-44c5-bcb1-de62691d6431","Type":"ContainerStarted","Data":"aaafa12a616f7369af11bbeebe18962338e3a83e1b72c0a692864a7176225e0a"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193697 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-rj9cl" event={"ID":"3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab","Type":"ContainerStarted","Data":"75840c04f6b695db51ec61cebbf998b4b3060ea46b87261c880157ccbd62f9ba"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193706 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-rj9cl" event={"ID":"3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab","Type":"ContainerDied","Data":"459a84ed9e1a3d8f522635c123baf95a666dd88b0c40648d94dbbfdfad737d00"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193715 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-rj9cl" event={"ID":"3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab","Type":"ContainerStarted","Data":"31406fc5b2c5472ac716e4c8cdca7909539075e5cc335f68e4b469dfc56a38f1"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193724 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-84bfdbbb7f-bc2m2" event={"ID":"4f5539c1-fb87-42d6-b735-6de53421bb6b","Type":"ContainerStarted","Data":"2d85b8a41ac1a5d7ec38487553cf098502219f3e61e7670ec8b3fc64cf28df17"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193733 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-84bfdbbb7f-bc2m2" event={"ID":"4f5539c1-fb87-42d6-b735-6de53421bb6b","Type":"ContainerStarted","Data":"79a6fb0d44533a4c06691dbc28101325df1e65724145bd5bed4068656b402865"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193743 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" event={"ID":"d01c21a1-6c2c-49a7-9d85-254662851838","Type":"ContainerStarted","Data":"dc254aaf3bd5aa2a3c6e69f8abd5a98d092e318f7ea622432462747a16cce142"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193752 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" event={"ID":"d01c21a1-6c2c-49a7-9d85-254662851838","Type":"ContainerDied","Data":"f272f0c8300d99d74de3b6533eb08fc6f13727844131b874ef0ec089cec086c7"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193762 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" event={"ID":"d01c21a1-6c2c-49a7-9d85-254662851838","Type":"ContainerStarted","Data":"ef1557fdf295530164fddc6e32be204cb91e899b1392304c5810a0afd29e77ff"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193770 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" event={"ID":"d01c21a1-6c2c-49a7-9d85-254662851838","Type":"ContainerStarted","Data":"bb8dfd749824585a5971cc6ceb0409c06052a233c71d6156a9b5d20725022dcf"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193778 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-vd52m" event={"ID":"e97435ee-522e-427d-9efc-40bc3d2b0d02","Type":"ContainerStarted","Data":"12285832d9ae011d03a37f69d825d599f3efa2810a8db6a158e7e5aac2654198"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193787 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-vd52m" event={"ID":"e97435ee-522e-427d-9efc-40bc3d2b0d02","Type":"ContainerDied","Data":"f8579510b3d4eb37fa166a47f1175d9203069f85aea52cc88554ccc7a9077266"}
Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193795 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-vd52m"
event={"ID":"e97435ee-522e-427d-9efc-40bc3d2b0d02","Type":"ContainerStarted","Data":"302cab9bf3dbf255daeb9370ab65a4f19b214019a7009e2da9e307530afd287e"} Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193804 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s" event={"ID":"6d770808-d390-41c1-a9d9-fc12b99fa9a9","Type":"ContainerStarted","Data":"14725fd0b5b18b46ce9bdb373030cfe8e6d0b6e93e752dd6c68eaa4f70173138"} Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193812 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s" event={"ID":"6d770808-d390-41c1-a9d9-fc12b99fa9a9","Type":"ContainerStarted","Data":"c0511cfa10b44562c51d17ac29eccf8315f318be9fcd77f37c978f1bbeeb8000"} Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193821 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-sdsks" event={"ID":"0f496486-70d5-4c5c-b4f3-6cc19427762f","Type":"ContainerStarted","Data":"f74d256abcdb5398186b869309f30f30a8ba6d7a0454838bd1b4e98ad498b4cd"} Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193829 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-sdsks" event={"ID":"0f496486-70d5-4c5c-b4f3-6cc19427762f","Type":"ContainerStarted","Data":"a68be094b9128e17cfcb273f66f3867ebf81ebb395668f57f098ee489c8a0035"} Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193838 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rfnqf" event={"ID":"0e52cbdc-1d46-4cc9-85ee-535aa449992f","Type":"ContainerStarted","Data":"0cd7d1d536e3e73fb9ed25ec4d69ad5db01a51017e617e72f4fa58f319d499f9"} Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 
00:31:51.193846 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rfnqf" event={"ID":"0e52cbdc-1d46-4cc9-85ee-535aa449992f","Type":"ContainerStarted","Data":"e4c3df22ea5b25cdb4fb25d7746e4d1c319e0fa007db70463be2670c88f00662"} Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193854 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" event={"ID":"1bb8fea7-71ca-43a3-839d-9c1459bf8dfa","Type":"ContainerStarted","Data":"62a90dd1c822377c4aa48689f26940e9273c8eaf2e5b09cbf6dadaba768ab7d5"} Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193864 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" event={"ID":"1bb8fea7-71ca-43a3-839d-9c1459bf8dfa","Type":"ContainerStarted","Data":"288a9f605fdc9bb30bb45ee47783409a88bbd8f20083c4f59dc94085a87e4e3b"} Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193872 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" event={"ID":"1bb8fea7-71ca-43a3-839d-9c1459bf8dfa","Type":"ContainerDied","Data":"1a894ff93f34b75d7c364cee700320b9938207036c1164fc914fd25a46ac6869"} Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193882 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" event={"ID":"1bb8fea7-71ca-43a3-839d-9c1459bf8dfa","Type":"ContainerStarted","Data":"773f19015576d673121563aa615f577b8c93848d40403e9cc4d2c3a87bec1183"} Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.193891 23041 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="4da316e5c8941b4baace90ce20646816051133ec406a841a63f02453e48ca25a" Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.194082 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-7gtw2" event={"ID":"b100ce12-965e-409e-8cdb-8f99ef51a82b","Type":"ContainerStarted","Data":"76a35028a8d9b23a680ded5da7f57ea40c69742d5b697c8b44c79baa58b379ed"} Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.194748 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-7gtw2" event={"ID":"b100ce12-965e-409e-8cdb-8f99ef51a82b","Type":"ContainerDied","Data":"5883c7f053a567c57162616ec25d9b4c38f468aaa6a93afc0931684514320848"} Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.194766 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-7gtw2" event={"ID":"b100ce12-965e-409e-8cdb-8f99ef51a82b","Type":"ContainerStarted","Data":"8493b96f9e2317bb2258ca024aff023f604de77234681da55a05bccbc932bc9a"} Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.194775 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" event={"ID":"6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6","Type":"ContainerDied","Data":"1fdc0977a8b34be93d33d2377b4810454b6ad9c4cfeec0c8fce160478572354d"} Mar 08 00:31:51.215342 master-0 kubenswrapper[23041]: I0308 00:31:51.198440 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 08 00:31:51.219247 master-0 kubenswrapper[23041]: I0308 00:31:51.216540 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 08 00:31:51.221846 master-0 kubenswrapper[23041]: I0308 00:31:51.221677 23041 
scope.go:117] "RemoveContainer" containerID="915f71c7c1f314a02b658e63c673b9b34d83af6828634db211d8fa70c691f01b" Mar 08 00:31:51.225402 master-0 kubenswrapper[23041]: E0308 00:31:51.225356 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"915f71c7c1f314a02b658e63c673b9b34d83af6828634db211d8fa70c691f01b\": container with ID starting with 915f71c7c1f314a02b658e63c673b9b34d83af6828634db211d8fa70c691f01b not found: ID does not exist" containerID="915f71c7c1f314a02b658e63c673b9b34d83af6828634db211d8fa70c691f01b" Mar 08 00:31:51.225569 master-0 kubenswrapper[23041]: I0308 00:31:51.225409 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"915f71c7c1f314a02b658e63c673b9b34d83af6828634db211d8fa70c691f01b"} err="failed to get container status \"915f71c7c1f314a02b658e63c673b9b34d83af6828634db211d8fa70c691f01b\": rpc error: code = NotFound desc = could not find container \"915f71c7c1f314a02b658e63c673b9b34d83af6828634db211d8fa70c691f01b\": container with ID starting with 915f71c7c1f314a02b658e63c673b9b34d83af6828634db211d8fa70c691f01b not found: ID does not exist" Mar 08 00:31:51.227171 master-0 kubenswrapper[23041]: I0308 00:31:51.227138 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q6hn\" (UniqueName: \"kubernetes.io/projected/c1abfb79-2c86-4ccb-bf91-7c48ad8c78d8-kube-api-access-5q6hn\") pod \"csi-snapshot-controller-operator-5685fbc7d-5v8g4\" (UID: \"c1abfb79-2c86-4ccb-bf91-7c48ad8c78d8\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-5v8g4" Mar 08 00:31:51.227257 master-0 kubenswrapper[23041]: I0308 00:31:51.227175 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6-service-ca-bundle\") pod \"router-default-79f8cd6fdd-r6nkv\" 
(UID: \"6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6\") " pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" Mar 08 00:31:51.227257 master-0 kubenswrapper[23041]: I0308 00:31:51.227222 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7317ceda-df6f-4826-aa1a-15304c2b0fcd-machine-approver-tls\") pod \"machine-approver-754bdc9f9d-xpl2b\" (UID: \"7317ceda-df6f-4826-aa1a-15304c2b0fcd\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-xpl2b" Mar 08 00:31:51.227370 master-0 kubenswrapper[23041]: I0308 00:31:51.227346 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:31:51.227583 master-0 kubenswrapper[23041]: I0308 00:31:51.227507 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:31:51.227629 master-0 kubenswrapper[23041]: I0308 00:31:51.227613 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-secret-metrics-server-tls\") pod \"metrics-server-7b45f5889c-z48tj\" (UID: \"c26f36ee-5dd4-40b7-8cb9-7f4835f120fd\") " pod="openshift-monitoring/metrics-server-7b45f5889c-z48tj" Mar 08 00:31:51.227725 master-0 kubenswrapper[23041]: I0308 00:31:51.227695 23041 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:31:51.227760 master-0 kubenswrapper[23041]: I0308 00:31:51.227723 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-multus-conf-dir\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:31:51.227793 master-0 kubenswrapper[23041]: I0308 00:31:51.227763 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-sysctl-d\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:31:51.227821 master-0 kubenswrapper[23041]: I0308 00:31:51.227796 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3b4f8517-1e54-4b41-ba6b-6c56fe66831a-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-nwttq\" (UID: \"3b4f8517-1e54-4b41-ba6b-6c56fe66831a\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq" Mar 08 00:31:51.227863 master-0 kubenswrapper[23041]: I0308 00:31:51.227845 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlzcz\" (UniqueName: \"kubernetes.io/projected/ec2d22f2-c260-42a6-a9da-ee0f44f42303-kube-api-access-xlzcz\") pod \"network-operator-7c649bf6d4-st2sr\" (UID: 
\"ec2d22f2-c260-42a6-a9da-ee0f44f42303\") " pod="openshift-network-operator/network-operator-7c649bf6d4-st2sr" Mar 08 00:31:51.227897 master-0 kubenswrapper[23041]: I0308 00:31:51.227875 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-system-cni-dir\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:31:51.227926 master-0 kubenswrapper[23041]: I0308 00:31:51.227917 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3fee96d7-75a7-46e4-9707-7bd292f10b84-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-m77x2\" (UID: \"3fee96d7-75a7-46e4-9707-7bd292f10b84\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2" Mar 08 00:31:51.227956 master-0 kubenswrapper[23041]: I0308 00:31:51.227939 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7317ceda-df6f-4826-aa1a-15304c2b0fcd-config\") pod \"machine-approver-754bdc9f9d-xpl2b\" (UID: \"7317ceda-df6f-4826-aa1a-15304c2b0fcd\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-xpl2b" Mar 08 00:31:51.227988 master-0 kubenswrapper[23041]: I0308 00:31:51.227961 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-hostroot\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:31:51.228021 master-0 kubenswrapper[23041]: I0308 00:31:51.228006 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tml5\" (UniqueName: 
\"kubernetes.io/projected/b94acad3-cf4e-443d-80fb-5e68a4074336-kube-api-access-7tml5\") pod \"catalog-operator-7d9c49f57b-8jr6f\" (UID: \"b94acad3-cf4e-443d-80fb-5e68a4074336\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-8jr6f" Mar 08 00:31:51.228059 master-0 kubenswrapper[23041]: I0308 00:31:51.228029 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l22cn\" (UniqueName: \"kubernetes.io/projected/0f496486-70d5-4c5c-b4f3-6cc19427762f-kube-api-access-l22cn\") pod \"cluster-storage-operator-6fbfc8dc8f-sdsks\" (UID: \"0f496486-70d5-4c5c-b4f3-6cc19427762f\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-sdsks" Mar 08 00:31:51.228100 master-0 kubenswrapper[23041]: I0308 00:31:51.228070 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/db164b32-e20e-4d07-a9ae-98720321621d-operand-assets\") pod \"cluster-olm-operator-77899cf6d-r9zcq\" (UID: \"db164b32-e20e-4d07-a9ae-98720321621d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-r9zcq" Mar 08 00:31:51.228133 master-0 kubenswrapper[23041]: I0308 00:31:51.228099 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef0a3c84-98bb-4915-9010-d66fcbeafe09-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-49hzm\" (UID: \"ef0a3c84-98bb-4915-9010-d66fcbeafe09\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-49hzm" Mar 08 00:31:51.228166 master-0 kubenswrapper[23041]: I0308 00:31:51.228155 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1abf904b-0b8d-4d61-8231-0e8d00933192-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: 
\"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9" Mar 08 00:31:51.228289 master-0 kubenswrapper[23041]: I0308 00:31:51.228185 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/5a229b84-65bd-493b-90dd-b8194f842dc8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-8c9c967c7-vm7rj\" (UID: \"5a229b84-65bd-493b-90dd-b8194f842dc8\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-vm7rj" Mar 08 00:31:51.228289 master-0 kubenswrapper[23041]: I0308 00:31:51.228247 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/db164b32-e20e-4d07-a9ae-98720321621d-operand-assets\") pod \"cluster-olm-operator-77899cf6d-r9zcq\" (UID: \"db164b32-e20e-4d07-a9ae-98720321621d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-r9zcq" Mar 08 00:31:51.228382 master-0 kubenswrapper[23041]: I0308 00:31:51.228268 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv5kd\" (UniqueName: \"kubernetes.io/projected/3d2e1686-3a30-4021-9c03-02e472bc6ff3-kube-api-access-qv5kd\") pod \"cluster-autoscaler-operator-69576476f7-dpg4q\" (UID: \"3d2e1686-3a30-4021-9c03-02e472bc6ff3\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dpg4q" Mar 08 00:31:51.228418 master-0 kubenswrapper[23041]: I0308 00:31:51.228394 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bmgb\" (UniqueName: \"kubernetes.io/projected/e302bc0b-7560-4f84-813f-d966c2dbe47c-kube-api-access-9bmgb\") pod \"dns-default-jfjzg\" (UID: \"e302bc0b-7560-4f84-813f-d966c2dbe47c\") " pod="openshift-dns/dns-default-jfjzg" Mar 08 00:31:51.228475 master-0 kubenswrapper[23041]: I0308 00:31:51.228453 23041 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef0a3c84-98bb-4915-9010-d66fcbeafe09-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-49hzm\" (UID: \"ef0a3c84-98bb-4915-9010-d66fcbeafe09\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-49hzm" Mar 08 00:31:51.228555 master-0 kubenswrapper[23041]: I0308 00:31:51.228520 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-mgb5v\" (UID: \"5cf5a2ef-2498-40a0-a189-0753076fd3b6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" Mar 08 00:31:51.228589 master-0 kubenswrapper[23041]: I0308 00:31:51.228569 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t99pg\" (UniqueName: \"kubernetes.io/projected/e237ed52-5561-44c5-bcb1-de62691d6431-kube-api-access-t99pg\") pod \"prometheus-operator-5ff8674d55-qxpv9\" (UID: \"e237ed52-5561-44c5-bcb1-de62691d6431\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-qxpv9" Mar 08 00:31:51.228622 master-0 kubenswrapper[23041]: I0308 00:31:51.228588 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-client-ca-bundle\") pod \"metrics-server-7b45f5889c-z48tj\" (UID: \"c26f36ee-5dd4-40b7-8cb9-7f4835f120fd\") " pod="openshift-monitoring/metrics-server-7b45f5889c-z48tj" Mar 08 00:31:51.228622 master-0 kubenswrapper[23041]: I0308 00:31:51.228610 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/365dc4ac-fbc8-4589-a799-8327b3ebd0a5-kube-api-access\") pod 
\"kube-controller-manager-operator-86d7cdfdfb-pfdrx\" (UID: \"365dc4ac-fbc8-4589-a799-8327b3ebd0a5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-pfdrx" Mar 08 00:31:51.228682 master-0 kubenswrapper[23041]: I0308 00:31:51.228629 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcl7q\" (UniqueName: \"kubernetes.io/projected/4f5539c1-fb87-42d6-b735-6de53421bb6b-kube-api-access-bcl7q\") pod \"service-ca-84bfdbbb7f-bc2m2\" (UID: \"4f5539c1-fb87-42d6-b735-6de53421bb6b\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-bc2m2" Mar 08 00:31:51.228682 master-0 kubenswrapper[23041]: I0308 00:31:51.228638 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1abf904b-0b8d-4d61-8231-0e8d00933192-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9" Mar 08 00:31:51.228682 master-0 kubenswrapper[23041]: I0308 00:31:51.228654 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3fee96d7-75a7-46e4-9707-7bd292f10b84-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-m77x2\" (UID: \"3fee96d7-75a7-46e4-9707-7bd292f10b84\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2" Mar 08 00:31:51.228762 master-0 kubenswrapper[23041]: I0308 00:31:51.228647 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1751db13-b792-43e2-8459-d1d4a0164dfb-encryption-config\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:31:51.228793 master-0 kubenswrapper[23041]: I0308 00:31:51.228776 23041 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-run\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:31:51.229025 master-0 kubenswrapper[23041]: I0308 00:31:51.228961 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1bad9e63-1aa2-44a7-aaf8-a0e82f33ad6e-hosts-file\") pod \"node-resolver-l9pkr\" (UID: \"1bad9e63-1aa2-44a7-aaf8-a0e82f33ad6e\") " pod="openshift-dns/node-resolver-l9pkr" Mar 08 00:31:51.229129 master-0 kubenswrapper[23041]: I0308 00:31:51.229036 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-mgb5v\" (UID: \"5cf5a2ef-2498-40a0-a189-0753076fd3b6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" Mar 08 00:31:51.229129 master-0 kubenswrapper[23041]: I0308 00:31:51.229043 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jd2n\" (UniqueName: \"kubernetes.io/projected/b22c3046-5193-4c1d-91c0-7c15745265be-kube-api-access-2jd2n\") pod \"console-operator-6c7fb6b958-db7d8\" (UID: \"b22c3046-5193-4c1d-91c0-7c15745265be\") " pod="openshift-console-operator/console-operator-6c7fb6b958-db7d8" Mar 08 00:31:51.229129 master-0 kubenswrapper[23041]: I0308 00:31:51.229103 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-audit-log\") pod \"metrics-server-7b45f5889c-z48tj\" (UID: \"c26f36ee-5dd4-40b7-8cb9-7f4835f120fd\") " 
pod="openshift-monitoring/metrics-server-7b45f5889c-z48tj" Mar 08 00:31:51.229347 master-0 kubenswrapper[23041]: I0308 00:31:51.229129 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-run-netns\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.229347 master-0 kubenswrapper[23041]: I0308 00:31:51.229186 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crfg9\" (UniqueName: \"kubernetes.io/projected/531e9339-968c-47bf-b8ea-c44d9ceef4b3-kube-api-access-crfg9\") pod \"apiserver-74444d8fbc-g7z4w\" (UID: \"531e9339-968c-47bf-b8ea-c44d9ceef4b3\") " pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" Mar 08 00:31:51.229347 master-0 kubenswrapper[23041]: I0308 00:31:51.229246 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac523956-c8a3-4794-a1fa-660cd14966bb-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-st7mk\" (UID: \"ac523956-c8a3-4794-a1fa-660cd14966bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-st7mk" Mar 08 00:31:51.229347 master-0 kubenswrapper[23041]: I0308 00:31:51.229274 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d01c21a1-6c2c-49a7-9d85-254662851838-cache\") pod \"catalogd-controller-manager-7f8b8b6f4c-w2q2q\" (UID: \"d01c21a1-6c2c-49a7-9d85-254662851838\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" Mar 08 00:31:51.229347 master-0 kubenswrapper[23041]: I0308 00:31:51.229333 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-secret-metrics-client-certs\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m" Mar 08 00:31:51.229484 master-0 kubenswrapper[23041]: I0308 00:31:51.229358 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03f4bafb-c270-428a-bacf-8a424b3d1a05-metrics-tls\") pod \"dns-operator-589895fbb7-gmvnl\" (UID: \"03f4bafb-c270-428a-bacf-8a424b3d1a05\") " pod="openshift-dns-operator/dns-operator-589895fbb7-gmvnl" Mar 08 00:31:51.229484 master-0 kubenswrapper[23041]: I0308 00:31:51.229406 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a68ad726-392e-4a7a-a384-409108df9c8b-node-bootstrap-token\") pod \"machine-config-server-wkt98\" (UID: \"a68ad726-392e-4a7a-a384-409108df9c8b\") " pod="openshift-machine-config-operator/machine-config-server-wkt98" Mar 08 00:31:51.229484 master-0 kubenswrapper[23041]: I0308 00:31:51.229432 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:31:51.229484 master-0 kubenswrapper[23041]: I0308 00:31:51.229479 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " 
pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9" Mar 08 00:31:51.229706 master-0 kubenswrapper[23041]: I0308 00:31:51.229504 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-node-exporter-textfile\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn" Mar 08 00:31:51.229706 master-0 kubenswrapper[23041]: I0308 00:31:51.229553 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht8zb\" (UniqueName: \"kubernetes.io/projected/84522c03-fd7b-4be7-9413-84e510b9dc5a-kube-api-access-ht8zb\") pod \"cluster-baremetal-operator-5cdb4c5598-qldx6\" (UID: \"84522c03-fd7b-4be7-9413-84e510b9dc5a\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6" Mar 08 00:31:51.229706 master-0 kubenswrapper[23041]: I0308 00:31:51.229580 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh6nz\" (UniqueName: \"kubernetes.io/projected/815fd565-0609-4d8f-ac05-8656f198b008-kube-api-access-sh6nz\") pod \"network-metrics-daemon-krv7c\" (UID: \"815fd565-0609-4d8f-ac05-8656f198b008\") " pod="openshift-multus/network-metrics-daemon-krv7c" Mar 08 00:31:51.229706 master-0 kubenswrapper[23041]: I0308 00:31:51.229626 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/ae061e84-5e6a-415c-a735-fa14add7318a-volume-directive-shadow\") pod \"kube-state-metrics-68b88f8cb5-qjxhc\" (UID: \"ae061e84-5e6a-415c-a735-fa14add7318a\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc" Mar 08 00:31:51.229706 master-0 kubenswrapper[23041]: I0308 00:31:51.229657 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:31:51.229706 master-0 kubenswrapper[23041]: I0308 00:31:51.229685 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-run-ovn-kubernetes\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.230915 master-0 kubenswrapper[23041]: I0308 00:31:51.229740 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txt48\" (UniqueName: \"kubernetes.io/projected/1da0c222-4b59-424f-9817-48673083df00-kube-api-access-txt48\") pod \"multus-admission-controller-7769569c45-5n69x\" (UID: \"1da0c222-4b59-424f-9817-48673083df00\") " pod="openshift-multus/multus-admission-controller-7769569c45-5n69x" Mar 08 00:31:51.230915 master-0 kubenswrapper[23041]: I0308 00:31:51.230834 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7-proxy-tls\") pod \"machine-config-daemon-k7pnc\" (UID: \"ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7\") " pod="openshift-machine-config-operator/machine-config-daemon-k7pnc" Mar 08 00:31:51.230915 master-0 kubenswrapper[23041]: I0308 00:31:51.230908 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70892c23-554d-466c-a526-90a799439fe0-config\") pod \"route-controller-manager-544c885f6d-dr4gh\" (UID: \"70892c23-554d-466c-a526-90a799439fe0\") " 
pod="openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh" Mar 08 00:31:51.230997 master-0 kubenswrapper[23041]: I0308 00:31:51.230931 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc78c\" (UniqueName: \"kubernetes.io/projected/795e6115-95cc-4c0a-a407-e0a6f14118e5-kube-api-access-kc78c\") pod \"telemeter-client-6cfc594d97-x62fk\" (UID: \"795e6115-95cc-4c0a-a407-e0a6f14118e5\") " pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk" Mar 08 00:31:51.230997 master-0 kubenswrapper[23041]: I0308 00:31:51.230976 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-multus-cni-dir\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:31:51.231053 master-0 kubenswrapper[23041]: I0308 00:31:51.231004 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vj8sl\" (UniqueName: \"kubernetes.io/projected/460f09d8-a143-48d2-9db0-be247386984a-kube-api-access-vj8sl\") pod \"control-plane-machine-set-operator-6686554ddc-8krst\" (UID: \"460f09d8-a143-48d2-9db0-be247386984a\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-8krst" Mar 08 00:31:51.231053 master-0 kubenswrapper[23041]: I0308 00:31:51.231049 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-metrics-tls\") pod \"ingress-operator-677db989d6-blw5x\" (UID: \"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x" Mar 08 00:31:51.231107 master-0 kubenswrapper[23041]: I0308 00:31:51.231074 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66915251-1fdd-40f3-a59b-054776b214df-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"66915251-1fdd-40f3-a59b-054776b214df\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 00:31:51.231343 master-0 kubenswrapper[23041]: I0308 00:31:51.231315 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d01c21a1-6c2c-49a7-9d85-254662851838-cache\") pod \"catalogd-controller-manager-7f8b8b6f4c-w2q2q\" (UID: \"d01c21a1-6c2c-49a7-9d85-254662851838\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" Mar 08 00:31:51.231382 master-0 kubenswrapper[23041]: I0308 00:31:51.231320 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac523956-c8a3-4794-a1fa-660cd14966bb-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-st7mk\" (UID: \"ac523956-c8a3-4794-a1fa-660cd14966bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-st7mk" Mar 08 00:31:51.231572 master-0 kubenswrapper[23041]: I0308 00:31:51.231455 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-node-exporter-textfile\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn" Mar 08 00:31:51.231633 master-0 kubenswrapper[23041]: I0308 00:31:51.231607 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/ae061e84-5e6a-415c-a735-fa14add7318a-volume-directive-shadow\") pod \"kube-state-metrics-68b88f8cb5-qjxhc\" (UID: \"ae061e84-5e6a-415c-a735-fa14add7318a\") " 
pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc" Mar 08 00:31:51.231698 master-0 kubenswrapper[23041]: I0308 00:31:51.231643 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9" Mar 08 00:31:51.231777 master-0 kubenswrapper[23041]: I0308 00:31:51.231732 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ec2d22f2-c260-42a6-a9da-ee0f44f42303-metrics-tls\") pod \"network-operator-7c649bf6d4-st2sr\" (UID: \"ec2d22f2-c260-42a6-a9da-ee0f44f42303\") " pod="openshift-network-operator/network-operator-7c649bf6d4-st2sr" Mar 08 00:31:51.231811 master-0 kubenswrapper[23041]: I0308 00:31:51.231792 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-log-socket\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.231866 master-0 kubenswrapper[23041]: I0308 00:31:51.231819 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-ovnkube-config\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.231906 master-0 kubenswrapper[23041]: I0308 00:31:51.231880 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/365dc4ac-fbc8-4589-a799-8327b3ebd0a5-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-pfdrx\" (UID: \"365dc4ac-fbc8-4589-a799-8327b3ebd0a5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-pfdrx" Mar 08 00:31:51.231961 master-0 kubenswrapper[23041]: I0308 00:31:51.231941 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0-mcc-auth-proxy-config\") pod \"machine-config-controller-ff46b7bdf-z5fkp\" (UID: \"2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-z5fkp" Mar 08 00:31:51.231996 master-0 kubenswrapper[23041]: I0308 00:31:51.231977 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt6w4\" (UniqueName: \"kubernetes.io/projected/d70f4efb-e61a-4e88-a271-2f4af21ecdf3-kube-api-access-pt6w4\") pod \"packageserver-9c44c86f9-rplwv\" (UID: \"d70f4efb-e61a-4e88-a271-2f4af21ecdf3\") " pod="openshift-operator-lifecycle-manager/packageserver-9c44c86f9-rplwv" Mar 08 00:31:51.232055 master-0 kubenswrapper[23041]: I0308 00:31:51.232032 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e302bc0b-7560-4f84-813f-d966c2dbe47c-metrics-tls\") pod \"dns-default-jfjzg\" (UID: \"e302bc0b-7560-4f84-813f-d966c2dbe47c\") " pod="openshift-dns/dns-default-jfjzg" Mar 08 00:31:51.232132 master-0 kubenswrapper[23041]: I0308 00:31:51.232094 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b1a69b5-c946-495d-ae02-c56f788279e8-serving-cert\") pod \"openshift-config-operator-64488f9d78-vnl28\" (UID: \"2b1a69b5-c946-495d-ae02-c56f788279e8\") " 
pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" Mar 08 00:31:51.232184 master-0 kubenswrapper[23041]: I0308 00:31:51.232149 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2ce2ea7-bd25-4294-8f3a-11ce53577830-config\") pod \"service-ca-operator-69b6fc6b88-p8hlq\" (UID: \"c2ce2ea7-bd25-4294-8f3a-11ce53577830\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-p8hlq" Mar 08 00:31:51.232238 master-0 kubenswrapper[23041]: I0308 00:31:51.232220 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03f4bafb-c270-428a-bacf-8a424b3d1a05-metrics-tls\") pod \"dns-operator-589895fbb7-gmvnl\" (UID: \"03f4bafb-c270-428a-bacf-8a424b3d1a05\") " pod="openshift-dns-operator/dns-operator-589895fbb7-gmvnl" Mar 08 00:31:51.232269 master-0 kubenswrapper[23041]: I0308 00:31:51.232248 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:31:51.232301 master-0 kubenswrapper[23041]: I0308 00:31:51.232279 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-var-lib-kubelet\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:31:51.232344 master-0 kubenswrapper[23041]: I0308 00:31:51.231995 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-metrics-tls\") pod \"ingress-operator-677db989d6-blw5x\" (UID: 
\"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x" Mar 08 00:31:51.232593 master-0 kubenswrapper[23041]: I0308 00:31:51.232561 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ec2d22f2-c260-42a6-a9da-ee0f44f42303-metrics-tls\") pod \"network-operator-7c649bf6d4-st2sr\" (UID: \"ec2d22f2-c260-42a6-a9da-ee0f44f42303\") " pod="openshift-network-operator/network-operator-7c649bf6d4-st2sr" Mar 08 00:31:51.232682 master-0 kubenswrapper[23041]: I0308 00:31:51.232652 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0-proxy-tls\") pod \"machine-config-controller-ff46b7bdf-z5fkp\" (UID: \"2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-z5fkp" Mar 08 00:31:51.232682 master-0 kubenswrapper[23041]: I0308 00:31:51.232663 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2ce2ea7-bd25-4294-8f3a-11ce53577830-config\") pod \"service-ca-operator-69b6fc6b88-p8hlq\" (UID: \"c2ce2ea7-bd25-4294-8f3a-11ce53577830\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-p8hlq" Mar 08 00:31:51.233032 master-0 kubenswrapper[23041]: I0308 00:31:51.232999 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwsqr\" (UniqueName: \"kubernetes.io/projected/6999cf38-e317-4727-98c9-d4e348e9e16a-kube-api-access-pwsqr\") pod \"cluster-image-registry-operator-86d6d77c7c-k7dp2\" (UID: \"6999cf38-e317-4727-98c9-d4e348e9e16a\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2" Mar 08 00:31:51.233076 master-0 kubenswrapper[23041]: I0308 00:31:51.233037 23041 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-var-lib-kubelet\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:31:51.233140 master-0 kubenswrapper[23041]: I0308 00:31:51.233107 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9fv4\" (UniqueName: \"kubernetes.io/projected/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6-kube-api-access-x9fv4\") pod \"router-default-79f8cd6fdd-r6nkv\" (UID: \"6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6\") " pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" Mar 08 00:31:51.233179 master-0 kubenswrapper[23041]: I0308 00:31:51.233149 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chnhh\" (UniqueName: \"kubernetes.io/projected/2b1a69b5-c946-495d-ae02-c56f788279e8-kube-api-access-chnhh\") pod \"openshift-config-operator-64488f9d78-vnl28\" (UID: \"2b1a69b5-c946-495d-ae02-c56f788279e8\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" Mar 08 00:31:51.233227 master-0 kubenswrapper[23041]: I0308 00:31:51.233174 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b1a69b5-c946-495d-ae02-c56f788279e8-serving-cert\") pod \"openshift-config-operator-64488f9d78-vnl28\" (UID: \"2b1a69b5-c946-495d-ae02-c56f788279e8\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" Mar 08 00:31:51.233227 master-0 kubenswrapper[23041]: I0308 00:31:51.233183 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/365dc4ac-fbc8-4589-a799-8327b3ebd0a5-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-pfdrx\" (UID: \"365dc4ac-fbc8-4589-a799-8327b3ebd0a5\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-pfdrx" Mar 08 00:31:51.233227 master-0 kubenswrapper[23041]: I0308 00:31:51.233213 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/e78057cd-5120-4a12-934d-9fed51e1bdc0-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-55d85b7b47-nrb7q\" (UID: \"e78057cd-5120-4a12-934d-9fed51e1bdc0\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-nrb7q" Mar 08 00:31:51.233392 master-0 kubenswrapper[23041]: I0308 00:31:51.233238 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-node-exporter-tls\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn" Mar 08 00:31:51.233392 master-0 kubenswrapper[23041]: I0308 00:31:51.233287 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-etc-kubernetes\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:31:51.233392 master-0 kubenswrapper[23041]: I0308 00:31:51.233313 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkl4m\" (UniqueName: \"kubernetes.io/projected/af391724-079a-4bac-a89e-978ffd471763-kube-api-access-gkl4m\") pod \"network-node-identity-m7549\" (UID: \"af391724-079a-4bac-a89e-978ffd471763\") " pod="openshift-network-node-identity/network-node-identity-m7549" Mar 08 00:31:51.233580 master-0 kubenswrapper[23041]: E0308 00:31:51.233547 23041 kubelet.go:1929] "Failed creating a mirror pod for" err="pods 
\"openshift-kube-scheduler-master-0\" already exists" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 00:31:51.233613 master-0 kubenswrapper[23041]: I0308 00:31:51.233592 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ae061e84-5e6a-415c-a735-fa14add7318a-metrics-client-ca\") pod \"kube-state-metrics-68b88f8cb5-qjxhc\" (UID: \"ae061e84-5e6a-415c-a735-fa14add7318a\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc" Mar 08 00:31:51.233678 master-0 kubenswrapper[23041]: I0308 00:31:51.233660 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58333089-2456-4a25-8ba7-6d557eefa177-config\") pod \"authentication-operator-7c6989d6c4-dkqc4\" (UID: \"58333089-2456-4a25-8ba7-6d557eefa177\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4" Mar 08 00:31:51.233707 master-0 kubenswrapper[23041]: I0308 00:31:51.233686 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s99rr\" (UniqueName: \"kubernetes.io/projected/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-kube-api-access-s99rr\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.233747 master-0 kubenswrapper[23041]: I0308 00:31:51.233726 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1453f6461bf5d599ad65a4656343ee91\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 00:31:51.233779 master-0 kubenswrapper[23041]: I0308 00:31:51.233748 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-run-multus-certs\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:31:51.233779 master-0 kubenswrapper[23041]: I0308 00:31:51.233767 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjcjb\" (UniqueName: \"kubernetes.io/projected/ac523956-c8a3-4794-a1fa-660cd14966bb-kube-api-access-wjcjb\") pod \"kube-storage-version-migrator-operator-7f65c457f5-st7mk\" (UID: \"ac523956-c8a3-4794-a1fa-660cd14966bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-st7mk" Mar 08 00:31:51.233838 master-0 kubenswrapper[23041]: I0308 00:31:51.233812 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw6xw\" (UniqueName: \"kubernetes.io/projected/7317ceda-df6f-4826-aa1a-15304c2b0fcd-kube-api-access-cw6xw\") pod \"machine-approver-754bdc9f9d-xpl2b\" (UID: \"7317ceda-df6f-4826-aa1a-15304c2b0fcd\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-xpl2b" Mar 08 00:31:51.233838 master-0 kubenswrapper[23041]: I0308 00:31:51.233832 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9" Mar 08 00:31:51.233900 master-0 kubenswrapper[23041]: I0308 00:31:51.233849 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1da0c222-4b59-424f-9817-48673083df00-webhook-certs\") pod 
\"multus-admission-controller-7769569c45-5n69x\" (UID: \"1da0c222-4b59-424f-9817-48673083df00\") " pod="openshift-multus/multus-admission-controller-7769569c45-5n69x" Mar 08 00:31:51.233900 master-0 kubenswrapper[23041]: I0308 00:31:51.233885 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4a829558-a672-4dc5-ae20-69884213482f-var-lock\") pod \"installer-2-master-0\" (UID: \"4a829558-a672-4dc5-ae20-69884213482f\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 08 00:31:51.234534 master-0 kubenswrapper[23041]: I0308 00:31:51.233902 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4t5k\" (UniqueName: \"kubernetes.io/projected/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-kube-api-access-r4t5k\") pod \"metrics-server-7b45f5889c-z48tj\" (UID: \"c26f36ee-5dd4-40b7-8cb9-7f4835f120fd\") " pod="openshift-monitoring/metrics-server-7b45f5889c-z48tj" Mar 08 00:31:51.234534 master-0 kubenswrapper[23041]: I0308 00:31:51.233919 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58333089-2456-4a25-8ba7-6d557eefa177-config\") pod \"authentication-operator-7c6989d6c4-dkqc4\" (UID: \"58333089-2456-4a25-8ba7-6d557eefa177\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4" Mar 08 00:31:51.234534 master-0 kubenswrapper[23041]: I0308 00:31:51.233936 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b100ce12-965e-409e-8cdb-8f99ef51a82b-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-7gtw2\" (UID: \"b100ce12-965e-409e-8cdb-8f99ef51a82b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-7gtw2" Mar 08 00:31:51.234534 master-0 kubenswrapper[23041]: I0308 00:31:51.232889 23041 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-ovnkube-config\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.234758 master-0 kubenswrapper[23041]: I0308 00:31:51.234724 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/1abf904b-0b8d-4d61-8231-0e8d00933192-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9" Mar 08 00:31:51.234865 master-0 kubenswrapper[23041]: I0308 00:31:51.234832 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvvvn\" (UniqueName: \"kubernetes.io/projected/b2548aca-4a9d-4670-a60a-0d6361d1c441-kube-api-access-dvvvn\") pod \"redhat-operators-9j9zs\" (UID: \"b2548aca-4a9d-4670-a60a-0d6361d1c441\") " pod="openshift-marketplace/redhat-operators-9j9zs" Mar 08 00:31:51.234914 master-0 kubenswrapper[23041]: I0308 00:31:51.234893 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/d01c21a1-6c2c-49a7-9d85-254662851838-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-w2q2q\" (UID: \"d01c21a1-6c2c-49a7-9d85-254662851838\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" Mar 08 00:31:51.234944 master-0 kubenswrapper[23041]: I0308 00:31:51.234921 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/d01c21a1-6c2c-49a7-9d85-254662851838-ca-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-w2q2q\" (UID: 
\"d01c21a1-6c2c-49a7-9d85-254662851838\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" Mar 08 00:31:51.234979 master-0 kubenswrapper[23041]: I0308 00:31:51.234964 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55c8d406-5448-4056-ab3c-c8399217c024-catalog-content\") pod \"redhat-marketplace-4fjw9\" (UID: \"55c8d406-5448-4056-ab3c-c8399217c024\") " pod="openshift-marketplace/redhat-marketplace-4fjw9" Mar 08 00:31:51.235008 master-0 kubenswrapper[23041]: I0308 00:31:51.234984 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qznbf\" (UniqueName: \"kubernetes.io/projected/ae061e84-5e6a-415c-a735-fa14add7318a-kube-api-access-qznbf\") pod \"kube-state-metrics-68b88f8cb5-qjxhc\" (UID: \"ae061e84-5e6a-415c-a735-fa14add7318a\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc" Mar 08 00:31:51.235008 master-0 kubenswrapper[23041]: I0308 00:31:51.235004 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3d2e1686-3a30-4021-9c03-02e472bc6ff3-auth-proxy-config\") pod \"cluster-autoscaler-operator-69576476f7-dpg4q\" (UID: \"3d2e1686-3a30-4021-9c03-02e472bc6ff3\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dpg4q" Mar 08 00:31:51.235070 master-0 kubenswrapper[23041]: I0308 00:31:51.235041 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b100ce12-965e-409e-8cdb-8f99ef51a82b-serving-cert\") pod \"kube-apiserver-operator-68bd585b-7gtw2\" (UID: \"b100ce12-965e-409e-8cdb-8f99ef51a82b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-7gtw2" Mar 08 00:31:51.235070 master-0 kubenswrapper[23041]: I0308 00:31:51.235063 23041 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:31:51.235125 master-0 kubenswrapper[23041]: I0308 00:31:51.235076 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55c8d406-5448-4056-ab3c-c8399217c024-catalog-content\") pod \"redhat-marketplace-4fjw9\" (UID: \"55c8d406-5448-4056-ab3c-c8399217c024\") " pod="openshift-marketplace/redhat-marketplace-4fjw9" Mar 08 00:31:51.235125 master-0 kubenswrapper[23041]: I0308 00:31:51.235083 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2548aca-4a9d-4670-a60a-0d6361d1c441-utilities\") pod \"redhat-operators-9j9zs\" (UID: \"b2548aca-4a9d-4670-a60a-0d6361d1c441\") " pod="openshift-marketplace/redhat-operators-9j9zs" Mar 08 00:31:51.235426 master-0 kubenswrapper[23041]: I0308 00:31:51.235228 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2548aca-4a9d-4670-a60a-0d6361d1c441-utilities\") pod \"redhat-operators-9j9zs\" (UID: \"b2548aca-4a9d-4670-a60a-0d6361d1c441\") " pod="openshift-marketplace/redhat-operators-9j9zs" Mar 08 00:31:51.235426 master-0 kubenswrapper[23041]: E0308 00:31:51.235328 23041 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-controller-manager-master-0\" already exists" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:31:51.235511 master-0 kubenswrapper[23041]: I0308 00:31:51.235483 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs\") pod 
\"network-metrics-daemon-krv7c\" (UID: \"815fd565-0609-4d8f-ac05-8656f198b008\") " pod="openshift-multus/network-metrics-daemon-krv7c"
Mar 08 00:31:51.235511 master-0 kubenswrapper[23041]: I0308 00:31:51.235494 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b100ce12-965e-409e-8cdb-8f99ef51a82b-serving-cert\") pod \"kube-apiserver-operator-68bd585b-7gtw2\" (UID: \"b100ce12-965e-409e-8cdb-8f99ef51a82b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-7gtw2"
Mar 08 00:31:51.235562 master-0 kubenswrapper[23041]: I0308 00:31:51.235537 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkh52\" (UniqueName: \"kubernetes.io/projected/1bad9e63-1aa2-44a7-aaf8-a0e82f33ad6e-kube-api-access-gkh52\") pod \"node-resolver-l9pkr\" (UID: \"1bad9e63-1aa2-44a7-aaf8-a0e82f33ad6e\") " pod="openshift-dns/node-resolver-l9pkr"
Mar 08 00:31:51.235610 master-0 kubenswrapper[23041]: I0308 00:31:51.235585 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/3b4f8517-1e54-4b41-ba6b-6c56fe66831a-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-nwttq\" (UID: \"3b4f8517-1e54-4b41-ba6b-6c56fe66831a\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq"
Mar 08 00:31:51.235656 master-0 kubenswrapper[23041]: I0308 00:31:51.235635 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65-metrics-client-ca\") pod \"openshift-state-metrics-74cc79fd76-s9b9v\" (UID: \"5b9f4db1-3ba9-49a5-9a65-1d770ee59a65\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-s9b9v"
Mar 08 00:31:51.235686 master-0 kubenswrapper[23041]: I0308 00:31:51.235667 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stxt7\" (UniqueName: \"kubernetes.io/projected/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65-kube-api-access-stxt7\") pod \"openshift-state-metrics-74cc79fd76-s9b9v\" (UID: \"5b9f4db1-3ba9-49a5-9a65-1d770ee59a65\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-s9b9v"
Mar 08 00:31:51.235718 master-0 kubenswrapper[23041]: I0308 00:31:51.235695 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qshd\" (UniqueName: \"kubernetes.io/projected/1751db13-b792-43e2-8459-d1d4a0164dfb-kube-api-access-6qshd\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44"
Mar 08 00:31:51.235769 master-0 kubenswrapper[23041]: I0308 00:31:51.235724 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6-stats-auth\") pod \"router-default-79f8cd6fdd-r6nkv\" (UID: \"6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6\") " pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv"
Mar 08 00:31:51.235769 master-0 kubenswrapper[23041]: I0308 00:31:51.235753 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7b45f5889c-z48tj\" (UID: \"c26f36ee-5dd4-40b7-8cb9-7f4835f120fd\") " pod="openshift-monitoring/metrics-server-7b45f5889c-z48tj"
Mar 08 00:31:51.235832 master-0 kubenswrapper[23041]: I0308 00:31:51.235783 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-multus-socket-dir-parent\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:31:51.235832 master-0 kubenswrapper[23041]: I0308 00:31:51.235811 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-var-lib-cni-bin\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:31:51.235885 master-0 kubenswrapper[23041]: I0308 00:31:51.235837 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/531e9339-968c-47bf-b8ea-c44d9ceef4b3-encryption-config\") pod \"apiserver-74444d8fbc-g7z4w\" (UID: \"531e9339-968c-47bf-b8ea-c44d9ceef4b3\") " pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w"
Mar 08 00:31:51.235885 master-0 kubenswrapper[23041]: I0308 00:31:51.235864 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-modprobe-d\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5"
Mar 08 00:31:51.235956 master-0 kubenswrapper[23041]: I0308 00:31:51.235891 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgqmb\" (UniqueName: \"kubernetes.io/projected/e78057cd-5120-4a12-934d-9fed51e1bdc0-kube-api-access-zgqmb\") pod \"cloud-credential-operator-55d85b7b47-nrb7q\" (UID: \"e78057cd-5120-4a12-934d-9fed51e1bdc0\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-nrb7q"
Mar 08 00:31:51.235956 master-0 kubenswrapper[23041]: I0308 00:31:51.235941 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-secret-metrics-client-certs\") pod \"metrics-server-7b45f5889c-z48tj\" (UID: \"c26f36ee-5dd4-40b7-8cb9-7f4835f120fd\") " pod="openshift-monitoring/metrics-server-7b45f5889c-z48tj"
Mar 08 00:31:51.236018 master-0 kubenswrapper[23041]: I0308 00:31:51.236005 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef0a3c84-98bb-4915-9010-d66fcbeafe09-config\") pod \"openshift-controller-manager-operator-8565d84698-49hzm\" (UID: \"ef0a3c84-98bb-4915-9010-d66fcbeafe09\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-49hzm"
Mar 08 00:31:51.236107 master-0 kubenswrapper[23041]: I0308 00:31:51.236084 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk"
Mar 08 00:31:51.236180 master-0 kubenswrapper[23041]: I0308 00:31:51.236159 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef0a3c84-98bb-4915-9010-d66fcbeafe09-config\") pod \"openshift-controller-manager-operator-8565d84698-49hzm\" (UID: \"ef0a3c84-98bb-4915-9010-d66fcbeafe09\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-49hzm"
Mar 08 00:31:51.236276 master-0 kubenswrapper[23041]: I0308 00:31:51.236222 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-audit-log\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m"
Mar 08 00:31:51.236353 master-0 kubenswrapper[23041]: I0308 00:31:51.236331 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk"
Mar 08 00:31:51.236391 master-0 kubenswrapper[23041]: I0308 00:31:51.236342 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-audit\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44"
Mar 08 00:31:51.236391 master-0 kubenswrapper[23041]: I0308 00:31:51.236375 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a229b84-65bd-493b-90dd-b8194f842dc8-kube-api-access\") pod \"cluster-version-operator-8c9c967c7-vm7rj\" (UID: \"5a229b84-65bd-493b-90dd-b8194f842dc8\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-vm7rj"
Mar 08 00:31:51.236455 master-0 kubenswrapper[23041]: I0308 00:31:51.236399 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfdxc\" (UniqueName: \"kubernetes.io/projected/03f4bafb-c270-428a-bacf-8a424b3d1a05-kube-api-access-pfdxc\") pod \"dns-operator-589895fbb7-gmvnl\" (UID: \"03f4bafb-c270-428a-bacf-8a424b3d1a05\") " pod="openshift-dns-operator/dns-operator-589895fbb7-gmvnl"
Mar 08 00:31:51.236487 master-0 kubenswrapper[23041]: I0308 00:31:51.236467 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/66915251-1fdd-40f3-a59b-054776b214df-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"66915251-1fdd-40f3-a59b-054776b214df\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 08 00:31:51.236515 master-0 kubenswrapper[23041]: I0308 00:31:51.236506 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e302bc0b-7560-4f84-813f-d966c2dbe47c-config-volume\") pod \"dns-default-jfjzg\" (UID: \"e302bc0b-7560-4f84-813f-d966c2dbe47c\") " pod="openshift-dns/dns-default-jfjzg"
Mar 08 00:31:51.236543 master-0 kubenswrapper[23041]: I0308 00:31:51.236523 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0"
Mar 08 00:31:51.236576 master-0 kubenswrapper[23041]: I0308 00:31:51.236555 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7da68e85-9170-499d-8050-139ecfac4600-multus-daemon-config\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:31:51.236633 master-0 kubenswrapper[23041]: I0308 00:31:51.236603 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-client-ca-bundle\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m"
Mar 08 00:31:51.236633 master-0 kubenswrapper[23041]: I0308 00:31:51.236629 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-etcd-serving-ca\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44"
Mar 08 00:31:51.236711 master-0 kubenswrapper[23041]: I0308 00:31:51.236692 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6999cf38-e317-4727-98c9-d4e348e9e16a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-k7dp2\" (UID: \"6999cf38-e317-4727-98c9-d4e348e9e16a\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2"
Mar 08 00:31:51.236741 master-0 kubenswrapper[23041]: I0308 00:31:51.236720 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3fee96d7-75a7-46e4-9707-7bd292f10b84-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-m77x2\" (UID: \"3fee96d7-75a7-46e4-9707-7bd292f10b84\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2"
Mar 08 00:31:51.236775 master-0 kubenswrapper[23041]: I0308 00:31:51.236745 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7da68e85-9170-499d-8050-139ecfac4600-multus-daemon-config\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:31:51.236843 master-0 kubenswrapper[23041]: I0308 00:31:51.236828 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-run-openvswitch\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:31:51.236877 master-0 kubenswrapper[23041]: I0308 00:31:51.236866 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9d810f7f-258a-47ce-9f99-7b1d93388aee-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-5nbfk\" (UID: \"9d810f7f-258a-47ce-9f99-7b1d93388aee\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-5nbfk"
Mar 08 00:31:51.236907 master-0 kubenswrapper[23041]: I0308 00:31:51.236883 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1751db13-b792-43e2-8459-d1d4a0164dfb-serving-cert\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44"
Mar 08 00:31:51.236907 master-0 kubenswrapper[23041]: I0308 00:31:51.236902 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-bound-sa-token\") pod \"ingress-operator-677db989d6-blw5x\" (UID: \"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x"
Mar 08 00:31:51.236963 master-0 kubenswrapper[23041]: I0308 00:31:51.236913 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3fee96d7-75a7-46e4-9707-7bd292f10b84-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-m77x2\" (UID: \"3fee96d7-75a7-46e4-9707-7bd292f10b84\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2"
Mar 08 00:31:51.236963 master-0 kubenswrapper[23041]: I0308 00:31:51.236930 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxk5x\" (UniqueName: \"kubernetes.io/projected/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-kube-api-access-bxk5x\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb"
Mar 08 00:31:51.236963 master-0 kubenswrapper[23041]: I0308 00:31:51.236954 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-var-lib-cni-multus\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:31:51.237795 master-0 kubenswrapper[23041]: I0308 00:31:51.237018 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-audit-log\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m"
Mar 08 00:31:51.237795 master-0 kubenswrapper[23041]: I0308 00:31:51.237074 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg5d9\" (UniqueName: \"kubernetes.io/projected/7da68e85-9170-499d-8050-139ecfac4600-kube-api-access-bg5d9\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:31:51.237795 master-0 kubenswrapper[23041]: I0308 00:31:51.237104 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-env-overrides\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:31:51.237795 master-0 kubenswrapper[23041]: I0308 00:31:51.237126 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6999cf38-e317-4727-98c9-d4e348e9e16a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-k7dp2\" (UID: \"6999cf38-e317-4727-98c9-d4e348e9e16a\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2"
Mar 08 00:31:51.237795 master-0 kubenswrapper[23041]: I0308 00:31:51.237232 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 00:31:51.237795 master-0 kubenswrapper[23041]: I0308 00:31:51.237280 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-env-overrides\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:31:51.237795 master-0 kubenswrapper[23041]: I0308 00:31:51.237308 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7-mcd-auth-proxy-config\") pod \"machine-config-daemon-k7pnc\" (UID: \"ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7\") " pod="openshift-machine-config-operator/machine-config-daemon-k7pnc"
Mar 08 00:31:51.237795 master-0 kubenswrapper[23041]: I0308 00:31:51.237370 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0"
Mar 08 00:31:51.237795 master-0 kubenswrapper[23041]: I0308 00:31:51.237411 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-metrics-client-ca\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn"
Mar 08 00:31:51.237795 master-0 kubenswrapper[23041]: I0308 00:31:51.237478 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/af391724-079a-4bac-a89e-978ffd471763-webhook-cert\") pod \"network-node-identity-m7549\" (UID: \"af391724-079a-4bac-a89e-978ffd471763\") " pod="openshift-network-node-identity/network-node-identity-m7549"
Mar 08 00:31:51.237795 master-0 kubenswrapper[23041]: I0308 00:31:51.237509 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-trusted-ca-bundle\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44"
Mar 08 00:31:51.237795 master-0 kubenswrapper[23041]: I0308 00:31:51.237705 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55c8d406-5448-4056-ab3c-c8399217c024-utilities\") pod \"redhat-marketplace-4fjw9\" (UID: \"55c8d406-5448-4056-ab3c-c8399217c024\") " pod="openshift-marketplace/redhat-marketplace-4fjw9"
Mar 08 00:31:51.237795 master-0 kubenswrapper[23041]: I0308 00:31:51.237708 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/af391724-079a-4bac-a89e-978ffd471763-webhook-cert\") pod \"network-node-identity-m7549\" (UID: \"af391724-079a-4bac-a89e-978ffd471763\") " pod="openshift-network-node-identity/network-node-identity-m7549"
Mar 08 00:31:51.237795 master-0 kubenswrapper[23041]: I0308 00:31:51.237748 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vb4n9\" (UniqueName: \"kubernetes.io/projected/3b4f8517-1e54-4b41-ba6b-6c56fe66831a-kube-api-access-vb4n9\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-nwttq\" (UID: \"3b4f8517-1e54-4b41-ba6b-6c56fe66831a\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq"
Mar 08 00:31:51.237795 master-0 kubenswrapper[23041]: I0308 00:31:51.237768 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/795e6115-95cc-4c0a-a407-e0a6f14118e5-metrics-client-ca\") pod \"telemeter-client-6cfc594d97-x62fk\" (UID: \"795e6115-95cc-4c0a-a407-e0a6f14118e5\") " pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk"
Mar 08 00:31:51.237795 master-0 kubenswrapper[23041]: I0308 00:31:51.237792 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55c8d406-5448-4056-ab3c-c8399217c024-utilities\") pod \"redhat-marketplace-4fjw9\" (UID: \"55c8d406-5448-4056-ab3c-c8399217c024\") " pod="openshift-marketplace/redhat-marketplace-4fjw9"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.237828 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab-config\") pod \"openshift-apiserver-operator-799b6db4d7-rj9cl\" (UID: \"3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-rj9cl"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.237850 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-host\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.237869 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-os-release\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.237889 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6-default-certificate\") pod \"router-default-79f8cd6fdd-r6nkv\" (UID: \"6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6\") " pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.237908 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/531e9339-968c-47bf-b8ea-c44d9ceef4b3-serving-cert\") pod \"apiserver-74444d8fbc-g7z4w\" (UID: \"531e9339-968c-47bf-b8ea-c44d9ceef4b3\") " pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.237931 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5a229b84-65bd-493b-90dd-b8194f842dc8-service-ca\") pod \"cluster-version-operator-8c9c967c7-vm7rj\" (UID: \"5a229b84-65bd-493b-90dd-b8194f842dc8\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-vm7rj"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.238013 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c7097f64-1709-4f76-a725-5a6c6cc5919b-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-bncfj\" (UID: \"c7097f64-1709-4f76-a725-5a6c6cc5919b\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-bncfj"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.238015 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab-config\") pod \"openshift-apiserver-operator-799b6db4d7-rj9cl\" (UID: \"3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-rj9cl"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.238060 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9l64\" (UniqueName: \"kubernetes.io/projected/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-kube-api-access-z9l64\") pod \"ingress-operator-677db989d6-blw5x\" (UID: \"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.238130 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.238199 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-etc-openvswitch\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.238311 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58333089-2456-4a25-8ba7-6d557eefa177-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-dkqc4\" (UID: \"58333089-2456-4a25-8ba7-6d557eefa177\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.238331 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3b4f8517-1e54-4b41-ba6b-6c56fe66831a-images\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-nwttq\" (UID: \"3b4f8517-1e54-4b41-ba6b-6c56fe66831a\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.238348 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-kubelet\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.238367 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e237ed52-5561-44c5-bcb1-de62691d6431-metrics-client-ca\") pod \"prometheus-operator-5ff8674d55-qxpv9\" (UID: \"e237ed52-5561-44c5-bcb1-de62691d6431\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-qxpv9"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.238387 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f9kl\" (UniqueName: \"kubernetes.io/projected/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-kube-api-access-2f9kl\") pod \"package-server-manager-854648ff6d-phgxj\" (UID: \"8f71fd39-a16b-47d2-b781-c8ce37bcb9b2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.238405 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-config\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.238422 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58333089-2456-4a25-8ba7-6d557eefa177-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-dkqc4\" (UID: \"58333089-2456-4a25-8ba7-6d557eefa177\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.238441 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f496486-70d5-4c5c-b4f3-6cc19427762f-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-6fbfc8dc8f-sdsks\" (UID: \"0f496486-70d5-4c5c-b4f3-6cc19427762f\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-sdsks"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.238460 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh9cz\" (UniqueName: \"kubernetes.io/projected/1f63cb2f-779f-4fde-bf92-cf0414844a77-kube-api-access-wh9cz\") pod \"network-check-target-w5fjg\" (UID: \"1f63cb2f-779f-4fde-bf92-cf0414844a77\") " pod="openshift-network-diagnostics/network-check-target-w5fjg"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.238482 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/614f0a0f-5853-4cf6-bd3d-174141f0f1e2-trusted-ca-bundle\") pod \"insights-operator-8f89dfddd-brq9l\" (UID: \"614f0a0f-5853-4cf6-bd3d-174141f0f1e2\") " pod="openshift-insights/insights-operator-8f89dfddd-brq9l"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.238498 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65-openshift-state-metrics-tls\") pod \"openshift-state-metrics-74cc79fd76-s9b9v\" (UID: \"5b9f4db1-3ba9-49a5-9a65-1d770ee59a65\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-s9b9v"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.238557 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ggmz\" (UniqueName: \"kubernetes.io/projected/ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7-kube-api-access-2ggmz\") pod \"machine-config-daemon-k7pnc\" (UID: \"ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7\") " pod="openshift-machine-config-operator/machine-config-daemon-k7pnc"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.238594 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cbcb0196-be5c-44a4-9749-5df9fbeaa718-client-ca\") pod \"controller-manager-5b4bdf67b6-8rdjs\" (UID: \"cbcb0196-be5c-44a4-9749-5df9fbeaa718\") " pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.238612 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0e52cbdc-1d46-4cc9-85ee-535aa449992f-host-slash\") pod \"iptables-alerter-rfnqf\" (UID: \"0e52cbdc-1d46-4cc9-85ee-535aa449992f\") " pod="openshift-network-operator/iptables-alerter-rfnqf"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.238632 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/af391724-079a-4bac-a89e-978ffd471763-env-overrides\") pod \"network-node-identity-m7549\" (UID: \"af391724-079a-4bac-a89e-978ffd471763\") " pod="openshift-network-node-identity/network-node-identity-m7549"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.238681 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/ae061e84-5e6a-415c-a735-fa14add7318a-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-68b88f8cb5-qjxhc\" (UID: \"ae061e84-5e6a-415c-a735-fa14add7318a\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.238708 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/401bbef2-684c-4f55-b2c7-e6184c789e40-tmp\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.238728 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wrq9\" (UniqueName: \"kubernetes.io/projected/4ad37f40-c533-4a1e-882a-2e0973eff86d-kube-api-access-6wrq9\") pod \"olm-operator-d64cfc9db-8qtmf\" (UID: \"4ad37f40-c533-4a1e-882a-2e0973eff86d\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8qtmf"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.238748 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fstf\" (UniqueName: \"kubernetes.io/projected/ef0a3c84-98bb-4915-9010-d66fcbeafe09-kube-api-access-8fstf\") pod \"openshift-controller-manager-operator-8565d84698-49hzm\" (UID: \"ef0a3c84-98bb-4915-9010-d66fcbeafe09\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-49hzm"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.238768 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70892c23-554d-466c-a526-90a799439fe0-serving-cert\") pod \"route-controller-manager-544c885f6d-dr4gh\" (UID: \"70892c23-554d-466c-a526-90a799439fe0\") " pod="openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.238854 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/401bbef2-684c-4f55-b2c7-e6184c789e40-tmp\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.238882 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7da68e85-9170-499d-8050-139ecfac4600-cni-binary-copy\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.238901 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c7097f64-1709-4f76-a725-5a6c6cc5919b-images\") pod \"machine-api-operator-84bf6db4f9-bncfj\" (UID: \"c7097f64-1709-4f76-a725-5a6c6cc5919b\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-bncfj"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.238903 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58333089-2456-4a25-8ba7-6d557eefa177-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-dkqc4\" (UID: \"58333089-2456-4a25-8ba7-6d557eefa177\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.238921 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-tuned\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.238982 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-tuned\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.239062 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-run-ovn\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.239078 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/af391724-079a-4bac-a89e-978ffd471763-env-overrides\") pod \"network-node-identity-m7549\" (UID: \"af391724-079a-4bac-a89e-978ffd471763\") " pod="openshift-network-node-identity/network-node-identity-m7549"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.239094 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/84522c03-fd7b-4be7-9413-84e510b9dc5a-images\") pod \"cluster-baremetal-operator-5cdb4c5598-qldx6\" (UID: \"84522c03-fd7b-4be7-9413-84e510b9dc5a\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.239110 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58333089-2456-4a25-8ba7-6d557eefa177-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-dkqc4\" (UID: \"58333089-2456-4a25-8ba7-6d557eefa177\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.239118 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/70892c23-554d-466c-a526-90a799439fe0-client-ca\") pod \"route-controller-manager-544c885f6d-dr4gh\" (UID: \"70892c23-554d-466c-a526-90a799439fe0\") " pod="openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh"
Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.239168 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-os-release\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") "
pod="openshift-multus/multus-dllkj" Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.239195 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg9bq\" (UniqueName: \"kubernetes.io/projected/e884e46e-e520-4e0a-9f15-43d4b74af63e-kube-api-access-wg9bq\") pod \"ingress-canary-5qffz\" (UID: \"e884e46e-e520-4e0a-9f15-43d4b74af63e\") " pod="openshift-ingress-canary/ingress-canary-5qffz" Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.239237 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1bb8fea7-71ca-43a3-839d-9c1459bf8dfa-cache\") pod \"operator-controller-controller-manager-6598bfb6c4-7nhvs\" (UID: \"1bb8fea7-71ca-43a3-839d-9c1459bf8dfa\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.239257 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6999cf38-e317-4727-98c9-d4e348e9e16a-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-k7dp2\" (UID: \"6999cf38-e317-4727-98c9-d4e348e9e16a\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2" Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.239280 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv9fl\" (UniqueName: \"kubernetes.io/projected/e97435ee-522e-427d-9efc-40bc3d2b0d02-kube-api-access-bv9fl\") pod \"csi-snapshot-controller-7577d6f48-vd52m\" (UID: \"e97435ee-522e-427d-9efc-40bc3d2b0d02\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-vd52m" Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.239355 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1bb8fea7-71ca-43a3-839d-9c1459bf8dfa-cache\") pod \"operator-controller-controller-manager-6598bfb6c4-7nhvs\" (UID: \"1bb8fea7-71ca-43a3-839d-9c1459bf8dfa\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.239374 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7da68e85-9170-499d-8050-139ecfac4600-cni-binary-copy\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.239387 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-sys\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn" Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.239428 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-node-exporter-wtmp\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn" Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.239458 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/795e6115-95cc-4c0a-a407-e0a6f14118e5-serving-certs-ca-bundle\") pod \"telemeter-client-6cfc594d97-x62fk\" (UID: \"795e6115-95cc-4c0a-a407-e0a6f14118e5\") " pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk" Mar 08 00:31:51.240799 master-0 
kubenswrapper[23041]: I0308 00:31:51.239480 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ae061e84-5e6a-415c-a735-fa14add7318a-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-68b88f8cb5-qjxhc\" (UID: \"ae061e84-5e6a-415c-a735-fa14add7318a\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc" Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.239500 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.239522 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/531e9339-968c-47bf-b8ea-c44d9ceef4b3-audit-policies\") pod \"apiserver-74444d8fbc-g7z4w\" (UID: \"531e9339-968c-47bf-b8ea-c44d9ceef4b3\") " pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.239542 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4f5539c1-fb87-42d6-b735-6de53421bb6b-signing-key\") pod \"service-ca-84bfdbbb7f-bc2m2\" (UID: \"4f5539c1-fb87-42d6-b735-6de53421bb6b\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-bc2m2" Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.239563 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll99v\" (UniqueName: \"kubernetes.io/projected/a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b-kube-api-access-ll99v\") pod 
\"community-operators-6t5lg\" (UID: \"a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b\") " pod="openshift-marketplace/community-operators-6t5lg" Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.239582 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e3f42081-387d-4798-b981-ac232e851bb4-samples-operator-tls\") pod \"cluster-samples-operator-664cb58b85-8lf4q\" (UID: \"e3f42081-387d-4798-b981-ac232e851bb4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-8lf4q" Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.239658 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-root\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn" Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.239708 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-systemd-units\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.239764 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-config\") pod \"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk" Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.239941 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.239988 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-config\") pod \"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk" Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.240020 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6d770808-d390-41c1-a9d9-fc12b99fa9a9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-cxs8s\" (UID: \"6d770808-d390-41c1-a9d9-fc12b99fa9a9\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s" Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.240058 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d70f4efb-e61a-4e88-a271-2f4af21ecdf3-apiservice-cert\") pod \"packageserver-9c44c86f9-rplwv\" (UID: \"d70f4efb-e61a-4e88-a271-2f4af21ecdf3\") " pod="openshift-operator-lifecycle-manager/packageserver-9c44c86f9-rplwv" Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.240078 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-sysconfig\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:31:51.240799 master-0 
kubenswrapper[23041]: I0308 00:31:51.240098 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-etcd-ca\") pod \"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk" Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.240158 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/6d770808-d390-41c1-a9d9-fc12b99fa9a9-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-cxs8s\" (UID: \"6d770808-d390-41c1-a9d9-fc12b99fa9a9\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s" Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.240195 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 08 00:31:51.240799 master-0 kubenswrapper[23041]: I0308 00:31:51.240372 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6d770808-d390-41c1-a9d9-fc12b99fa9a9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-cxs8s\" (UID: \"6d770808-d390-41c1-a9d9-fc12b99fa9a9\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s" Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.241247 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b22c3046-5193-4c1d-91c0-7c15745265be-trusted-ca\") pod \"console-operator-6c7fb6b958-db7d8\" (UID: \"b22c3046-5193-4c1d-91c0-7c15745265be\") " pod="openshift-console-operator/console-operator-6c7fb6b958-db7d8" Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.241278 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e78057cd-5120-4a12-934d-9fed51e1bdc0-cco-trusted-ca\") pod \"cloud-credential-operator-55d85b7b47-nrb7q\" (UID: \"e78057cd-5120-4a12-934d-9fed51e1bdc0\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-nrb7q" Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.241319 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqkqn\" (UniqueName: \"kubernetes.io/projected/0e52cbdc-1d46-4cc9-85ee-535aa449992f-kube-api-access-xqkqn\") pod \"iptables-alerter-rfnqf\" (UID: \"0e52cbdc-1d46-4cc9-85ee-535aa449992f\") " pod="openshift-network-operator/iptables-alerter-rfnqf" Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.241365 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-rj9cl\" (UID: \"3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-rj9cl" Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.241572 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-etcd-ca\") pod \"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk" Mar 08 00:31:51.252200 master-0 
kubenswrapper[23041]: I0308 00:31:51.241706 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-rj9cl\" (UID: \"3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-rj9cl" Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.241767 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2b1a69b5-c946-495d-ae02-c56f788279e8-available-featuregates\") pod \"openshift-config-operator-64488f9d78-vnl28\" (UID: \"2b1a69b5-c946-495d-ae02-c56f788279e8\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.241801 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac523956-c8a3-4794-a1fa-660cd14966bb-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-st7mk\" (UID: \"ac523956-c8a3-4794-a1fa-660cd14966bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-st7mk" Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.241844 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b-utilities\") pod \"community-operators-6t5lg\" (UID: \"a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b\") " pod="openshift-marketplace/community-operators-6t5lg" Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.241889 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/e76bc134-2a88-4f92-9aa7-f6854941b98f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-bh886\" (UID: \"e76bc134-2a88-4f92-9aa7-f6854941b98f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-bh886" Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.241930 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4f5539c1-fb87-42d6-b735-6de53421bb6b-signing-cabundle\") pod \"service-ca-84bfdbbb7f-bc2m2\" (UID: \"4f5539c1-fb87-42d6-b735-6de53421bb6b\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-bc2m2" Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.241950 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqjt7\" (UniqueName: \"kubernetes.io/projected/70892c23-554d-466c-a526-90a799439fe0-kube-api-access-kqjt7\") pod \"route-controller-manager-544c885f6d-dr4gh\" (UID: \"70892c23-554d-466c-a526-90a799439fe0\") " pod="openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh" Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.241968 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-cni-netd\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.242042 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/531e9339-968c-47bf-b8ea-c44d9ceef4b3-audit-dir\") pod \"apiserver-74444d8fbc-g7z4w\" (UID: \"531e9339-968c-47bf-b8ea-c44d9ceef4b3\") " 
pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.242084 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2548aca-4a9d-4670-a60a-0d6361d1c441-catalog-content\") pod \"redhat-operators-9j9zs\" (UID: \"b2548aca-4a9d-4670-a60a-0d6361d1c441\") " pod="openshift-marketplace/redhat-operators-9j9zs" Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.242104 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7097f64-1709-4f76-a725-5a6c6cc5919b-config\") pod \"machine-api-operator-84bf6db4f9-bncfj\" (UID: \"c7097f64-1709-4f76-a725-5a6c6cc5919b\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-bncfj" Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.242120 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.242137 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/6d770808-d390-41c1-a9d9-fc12b99fa9a9-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-cxs8s\" (UID: \"6d770808-d390-41c1-a9d9-fc12b99fa9a9\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s" Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.242157 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/d70f4efb-e61a-4e88-a271-2f4af21ecdf3-tmpfs\") pod \"packageserver-9c44c86f9-rplwv\" (UID: \"d70f4efb-e61a-4e88-a271-2f4af21ecdf3\") " pod="openshift-operator-lifecycle-manager/packageserver-9c44c86f9-rplwv" Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.242180 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7317ceda-df6f-4826-aa1a-15304c2b0fcd-auth-proxy-config\") pod \"machine-approver-754bdc9f9d-xpl2b\" (UID: \"7317ceda-df6f-4826-aa1a-15304c2b0fcd\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-xpl2b" Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.242232 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvglb\" (UniqueName: \"kubernetes.io/projected/786e30f1-d30a-43e1-85cb-d8ea1495422e-kube-api-access-dvglb\") pod \"network-check-source-7c67b67d47-sctv9\" (UID: \"786e30f1-d30a-43e1-85cb-d8ea1495422e\") " pod="openshift-network-diagnostics/network-check-source-7c67b67d47-sctv9" Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.242459 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.242481 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-cnibin\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 
00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.242501 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t8np\" (UniqueName: \"kubernetes.io/projected/cbcb0196-be5c-44a4-9749-5df9fbeaa718-kube-api-access-4t8np\") pod \"controller-manager-5b4bdf67b6-8rdjs\" (UID: \"cbcb0196-be5c-44a4-9749-5df9fbeaa718\") " pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.242520 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-metrics-server-audit-profiles\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m" Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.242541 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncncc\" (UniqueName: \"kubernetes.io/projected/a68ad726-392e-4a7a-a384-409108df9c8b-kube-api-access-ncncc\") pod \"machine-config-server-wkt98\" (UID: \"a68ad726-392e-4a7a-a384-409108df9c8b\") " pod="openshift-machine-config-operator/machine-config-server-wkt98" Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.242559 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-ovnkube-script-lib\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.242576 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/614f0a0f-5853-4cf6-bd3d-174141f0f1e2-snapshots\") 
pod \"insights-operator-8f89dfddd-brq9l\" (UID: \"614f0a0f-5853-4cf6-bd3d-174141f0f1e2\") " pod="openshift-insights/insights-operator-8f89dfddd-brq9l" Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.242593 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e76bc134-2a88-4f92-9aa7-f6854941b98f-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-bh886\" (UID: \"e76bc134-2a88-4f92-9aa7-f6854941b98f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-bh886" Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.242610 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-system-cni-dir\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.242628 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5knc\" (UniqueName: \"kubernetes.io/projected/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-kube-api-access-d5knc\") pod \"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk" Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.242645 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cbcb0196-be5c-44a4-9749-5df9fbeaa718-serving-cert\") pod \"controller-manager-5b4bdf67b6-8rdjs\" (UID: \"cbcb0196-be5c-44a4-9749-5df9fbeaa718\") " pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.242673 23041 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dz874\" (UniqueName: \"kubernetes.io/projected/9d810f7f-258a-47ce-9f99-7b1d93388aee-kube-api-access-dz874\") pod \"machine-config-operator-fdb5c78b5-5nbfk\" (UID: \"9d810f7f-258a-47ce-9f99-7b1d93388aee\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-5nbfk"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.242693 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.242710 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b66xq\" (UniqueName: \"kubernetes.io/projected/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-kube-api-access-b66xq\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.243642 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6999cf38-e317-4727-98c9-d4e348e9e16a-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-k7dp2\" (UID: \"6999cf38-e317-4727-98c9-d4e348e9e16a\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.243781 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2548aca-4a9d-4670-a60a-0d6361d1c441-catalog-content\") pod \"redhat-operators-9j9zs\" (UID: \"b2548aca-4a9d-4670-a60a-0d6361d1c441\") " pod="openshift-marketplace/redhat-operators-9j9zs"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.243858 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d70f4efb-e61a-4e88-a271-2f4af21ecdf3-tmpfs\") pod \"packageserver-9c44c86f9-rplwv\" (UID: \"d70f4efb-e61a-4e88-a271-2f4af21ecdf3\") " pod="openshift-operator-lifecycle-manager/packageserver-9c44c86f9-rplwv"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.244042 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2b1a69b5-c946-495d-ae02-c56f788279e8-available-featuregates\") pod \"openshift-config-operator-64488f9d78-vnl28\" (UID: \"2b1a69b5-c946-495d-ae02-c56f788279e8\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.244123 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b-utilities\") pod \"community-operators-6t5lg\" (UID: \"a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b\") " pod="openshift-marketplace/community-operators-6t5lg"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.244434 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac523956-c8a3-4794-a1fa-660cd14966bb-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-st7mk\" (UID: \"ac523956-c8a3-4794-a1fa-660cd14966bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-st7mk"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.244600 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e76bc134-2a88-4f92-9aa7-f6854941b98f-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-bh886\" (UID: \"e76bc134-2a88-4f92-9aa7-f6854941b98f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-bh886"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.244897 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4ad37f40-c533-4a1e-882a-2e0973eff86d-srv-cert\") pod \"olm-operator-d64cfc9db-8qtmf\" (UID: \"4ad37f40-c533-4a1e-882a-2e0973eff86d\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8qtmf"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.245327 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ae061e84-5e6a-415c-a735-fa14add7318a-kube-state-metrics-tls\") pod \"kube-state-metrics-68b88f8cb5-qjxhc\" (UID: \"ae061e84-5e6a-415c-a735-fa14add7318a\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.244898 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/614f0a0f-5853-4cf6-bd3d-174141f0f1e2-snapshots\") pod \"insights-operator-8f89dfddd-brq9l\" (UID: \"614f0a0f-5853-4cf6-bd3d-174141f0f1e2\") " pod="openshift-insights/insights-operator-8f89dfddd-brq9l"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.245410 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d4xz\" (UniqueName: \"kubernetes.io/projected/2ac55f03-dd6f-4ead-bacc-c69aeca146dc-kube-api-access-8d4xz\") pod \"migrator-57ccdf9b5-tbcsh\" (UID: \"2ac55f03-dd6f-4ead-bacc-c69aeca146dc\") " pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-tbcsh"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.245431 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e76bc134-2a88-4f92-9aa7-f6854941b98f-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-bh886\" (UID: \"e76bc134-2a88-4f92-9aa7-f6854941b98f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-bh886"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.245457 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbcb0196-be5c-44a4-9749-5df9fbeaa718-config\") pod \"controller-manager-5b4bdf67b6-8rdjs\" (UID: \"cbcb0196-be5c-44a4-9749-5df9fbeaa718\") " pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.245482 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9d810f7f-258a-47ce-9f99-7b1d93388aee-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-5nbfk\" (UID: \"9d810f7f-258a-47ce-9f99-7b1d93388aee\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-5nbfk"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.245501 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-cni-binary-copy\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.245570 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6999cf38-e317-4727-98c9-d4e348e9e16a-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-k7dp2\" (UID: \"6999cf38-e317-4727-98c9-d4e348e9e16a\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.245573 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.245634 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh2h6\" (UniqueName: \"kubernetes.io/projected/1bb8fea7-71ca-43a3-839d-9c1459bf8dfa-kube-api-access-gh2h6\") pod \"operator-controller-controller-manager-6598bfb6c4-7nhvs\" (UID: \"1bb8fea7-71ca-43a3-839d-9c1459bf8dfa\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.245655 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3fee96d7-75a7-46e4-9707-7bd292f10b84-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-m77x2\" (UID: \"3fee96d7-75a7-46e4-9707-7bd292f10b84\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.245708 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4a829558-a672-4dc5-ae20-69884213482f-kube-api-access\") pod \"installer-2-master-0\" (UID: \"4a829558-a672-4dc5-ae20-69884213482f\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.245719 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-cni-binary-copy\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.245727 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/614f0a0f-5853-4cf6-bd3d-174141f0f1e2-service-ca-bundle\") pod \"insights-operator-8f89dfddd-brq9l\" (UID: \"614f0a0f-5853-4cf6-bd3d-174141f0f1e2\") " pod="openshift-insights/insights-operator-8f89dfddd-brq9l"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.245768 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcqn9\" (UniqueName: \"kubernetes.io/projected/401bbef2-684c-4f55-b2c7-e6184c789e40-kube-api-access-mcqn9\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.245792 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6-metrics-certs\") pod \"router-default-79f8cd6fdd-r6nkv\" (UID: \"6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6\") " pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.246006 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3fee96d7-75a7-46e4-9707-7bd292f10b84-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-m77x2\" (UID: \"3fee96d7-75a7-46e4-9707-7bd292f10b84\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.246019 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4ad37f40-c533-4a1e-882a-2e0973eff86d-srv-cert\") pod \"olm-operator-d64cfc9db-8qtmf\" (UID: \"4ad37f40-c533-4a1e-882a-2e0973eff86d\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8qtmf"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.246259 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e76bc134-2a88-4f92-9aa7-f6854941b98f-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-bh886\" (UID: \"e76bc134-2a88-4f92-9aa7-f6854941b98f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-bh886"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.246294 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-ovnkube-script-lib\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.246317 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-mgb5v\" (UID: \"5cf5a2ef-2498-40a0-a189-0753076fd3b6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.246343 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e237ed52-5561-44c5-bcb1-de62691d6431-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5ff8674d55-qxpv9\" (UID: \"e237ed52-5561-44c5-bcb1-de62691d6431\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-qxpv9"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.246478 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2ce2ea7-bd25-4294-8f3a-11ce53577830-serving-cert\") pod \"service-ca-operator-69b6fc6b88-p8hlq\" (UID: \"c2ce2ea7-bd25-4294-8f3a-11ce53577830\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-p8hlq"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.246518 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1751db13-b792-43e2-8459-d1d4a0164dfb-node-pullsecrets\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.246536 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/1bb8fea7-71ca-43a3-839d-9c1459bf8dfa-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-7nhvs\" (UID: \"1bb8fea7-71ca-43a3-839d-9c1459bf8dfa\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.246556 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58333089-2456-4a25-8ba7-6d557eefa177-serving-cert\") pod \"authentication-operator-7c6989d6c4-dkqc4\" (UID: \"58333089-2456-4a25-8ba7-6d557eefa177\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.246632 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/ec2d22f2-c260-42a6-a9da-ee0f44f42303-host-etc-kube\") pod \"network-operator-7c649bf6d4-st2sr\" (UID: \"ec2d22f2-c260-42a6-a9da-ee0f44f42303\") " pod="openshift-network-operator/network-operator-7c649bf6d4-st2sr"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.246675 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.246695 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.246714 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b-catalog-content\") pod \"community-operators-6t5lg\" (UID: \"a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b\") " pod="openshift-marketplace/community-operators-6t5lg"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.246754 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9d810f7f-258a-47ce-9f99-7b1d93388aee-images\") pod \"machine-config-operator-fdb5c78b5-5nbfk\" (UID: \"9d810f7f-258a-47ce-9f99-7b1d93388aee\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-5nbfk"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.246777 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1751db13-b792-43e2-8459-d1d4a0164dfb-etcd-client\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.246920 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/db164b32-e20e-4d07-a9ae-98720321621d-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-r9zcq\" (UID: \"db164b32-e20e-4d07-a9ae-98720321621d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-r9zcq"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.246946 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qpkj\" (UniqueName: \"kubernetes.io/projected/c2ce2ea7-bd25-4294-8f3a-11ce53577830-kube-api-access-9qpkj\") pod \"service-ca-operator-69b6fc6b88-p8hlq\" (UID: \"c2ce2ea7-bd25-4294-8f3a-11ce53577830\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-p8hlq"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.246967 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-cnibin\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.247010 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/531e9339-968c-47bf-b8ea-c44d9ceef4b3-trusted-ca-bundle\") pod \"apiserver-74444d8fbc-g7z4w\" (UID: \"531e9339-968c-47bf-b8ea-c44d9ceef4b3\") " pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.247046 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/d01c21a1-6c2c-49a7-9d85-254662851838-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-w2q2q\" (UID: \"d01c21a1-6c2c-49a7-9d85-254662851838\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.247067 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/af391724-079a-4bac-a89e-978ffd471763-ovnkube-identity-cm\") pod \"network-node-identity-m7549\" (UID: \"af391724-079a-4bac-a89e-978ffd471763\") " pod="openshift-network-node-identity/network-node-identity-m7549"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.247089 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d70f4efb-e61a-4e88-a271-2f4af21ecdf3-webhook-cert\") pod \"packageserver-9c44c86f9-rplwv\" (UID: \"d70f4efb-e61a-4e88-a271-2f4af21ecdf3\") " pod="openshift-operator-lifecycle-manager/packageserver-9c44c86f9-rplwv"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.247127 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-lib-modules\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.247146 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/66915251-1fdd-40f3-a59b-054776b214df-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"66915251-1fdd-40f3-a59b-054776b214df\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.247167 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smnrc\" (UniqueName: \"kubernetes.io/projected/e3f42081-387d-4798-b981-ac232e851bb4-kube-api-access-smnrc\") pod \"cluster-samples-operator-664cb58b85-8lf4q\" (UID: \"e3f42081-387d-4798-b981-ac232e851bb4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-8lf4q"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.247233 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5cf5a2ef-2498-40a0-a189-0753076fd3b6-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-mgb5v\" (UID: \"5cf5a2ef-2498-40a0-a189-0753076fd3b6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.247338 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wllt8\" (UniqueName: \"kubernetes.io/projected/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-kube-api-access-wllt8\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.247437 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/795e6115-95cc-4c0a-a407-e0a6f14118e5-secret-telemeter-client\") pod \"telemeter-client-6cfc594d97-x62fk\" (UID: \"795e6115-95cc-4c0a-a407-e0a6f14118e5\") " pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.247488 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84522c03-fd7b-4be7-9413-84e510b9dc5a-config\") pod \"cluster-baremetal-operator-5cdb4c5598-qldx6\" (UID: \"84522c03-fd7b-4be7-9413-84e510b9dc5a\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.247515 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58333089-2456-4a25-8ba7-6d557eefa177-serving-cert\") pod \"authentication-operator-7c6989d6c4-dkqc4\" (UID: \"58333089-2456-4a25-8ba7-6d557eefa177\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.247553 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-metrics-server-audit-profiles\") pod \"metrics-server-7b45f5889c-z48tj\" (UID: \"c26f36ee-5dd4-40b7-8cb9-7f4835f120fd\") " pod="openshift-monitoring/metrics-server-7b45f5889c-z48tj"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.247584 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0e52cbdc-1d46-4cc9-85ee-535aa449992f-iptables-alerter-script\") pod \"iptables-alerter-rfnqf\" (UID: \"0e52cbdc-1d46-4cc9-85ee-535aa449992f\") " pod="openshift-network-operator/iptables-alerter-rfnqf"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.247638 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e884e46e-e520-4e0a-9f15-43d4b74af63e-cert\") pod \"ingress-canary-5qffz\" (UID: \"e884e46e-e520-4e0a-9f15-43d4b74af63e\") " pod="openshift-ingress-canary/ingress-canary-5qffz"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.247663 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a68ad726-392e-4a7a-a384-409108df9c8b-certs\") pod \"machine-config-server-wkt98\" (UID: \"a68ad726-392e-4a7a-a384-409108df9c8b\") " pod="openshift-machine-config-operator/machine-config-server-wkt98"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.247682 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2ce2ea7-bd25-4294-8f3a-11ce53577830-serving-cert\") pod \"service-ca-operator-69b6fc6b88-p8hlq\" (UID: \"c2ce2ea7-bd25-4294-8f3a-11ce53577830\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-p8hlq"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.247739 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntks9\" (UniqueName: \"kubernetes.io/projected/3fee96d7-75a7-46e4-9707-7bd292f10b84-kube-api-access-ntks9\") pod \"ovnkube-control-plane-66b55d57d-m77x2\" (UID: \"3fee96d7-75a7-46e4-9707-7bd292f10b84\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.247772 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b-catalog-content\") pod \"community-operators-6t5lg\" (UID: \"a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b\") " pod="openshift-marketplace/community-operators-6t5lg"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.247783 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/0d0cb126-341c-4215-ad2e-a008193cc0b5-tls-certificates\") pod \"prometheus-operator-admission-webhook-8464df8497-st8tx\" (UID: \"0d0cb126-341c-4215-ad2e-a008193cc0b5\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-st8tx"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.247819 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v5hl\" (UniqueName: \"kubernetes.io/projected/614f0a0f-5853-4cf6-bd3d-174141f0f1e2-kube-api-access-8v5hl\") pod \"insights-operator-8f89dfddd-brq9l\" (UID: \"614f0a0f-5853-4cf6-bd3d-174141f0f1e2\") " pod="openshift-insights/insights-operator-8f89dfddd-brq9l"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.247867 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nljwf\" (UniqueName: \"kubernetes.io/projected/55c8d406-5448-4056-ab3c-c8399217c024-kube-api-access-nljwf\") pod \"redhat-marketplace-4fjw9\" (UID: \"55c8d406-5448-4056-ab3c-c8399217c024\") " pod="openshift-marketplace/redhat-marketplace-4fjw9"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.247886 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-run-systemd\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.247905 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7-rootfs\") pod \"machine-config-daemon-k7pnc\" (UID: \"ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7\") " pod="openshift-machine-config-operator/machine-config-daemon-k7pnc"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.247944 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56e11e7e-6946-4e11-bce9-e91a721fe4a7-utilities\") pod \"certified-operators-9nqqp\" (UID: \"56e11e7e-6946-4e11-bce9-e91a721fe4a7\") " pod="openshift-marketplace/certified-operators-9nqqp"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.247966 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1751db13-b792-43e2-8459-d1d4a0164dfb-audit-dir\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.247946 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/db164b32-e20e-4d07-a9ae-98720321621d-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-r9zcq\" (UID: \"db164b32-e20e-4d07-a9ae-98720321621d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-r9zcq"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.248104 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhckc\" (UniqueName: \"kubernetes.io/projected/58333089-2456-4a25-8ba7-6d557eefa177-kube-api-access-hhckc\") pod \"authentication-operator-7c6989d6c4-dkqc4\" (UID: \"58333089-2456-4a25-8ba7-6d557eefa177\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.248127 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-sys\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.248144 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.248160 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.248178 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-slash\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.248239 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/af391724-079a-4bac-a89e-978ffd471763-ovnkube-identity-cm\") pod \"network-node-identity-m7549\" (UID: \"af391724-079a-4bac-a89e-978ffd471763\") " pod="openshift-network-node-identity/network-node-identity-m7549"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.248248 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-phgxj\" (UID: \"8f71fd39-a16b-47d2-b781-c8ce37bcb9b2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.248273 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/365dc4ac-fbc8-4589-a799-8327b3ebd0a5-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-pfdrx\" (UID: \"365dc4ac-fbc8-4589-a799-8327b3ebd0a5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-pfdrx"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.248291 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cbcb0196-be5c-44a4-9749-5df9fbeaa718-proxy-ca-bundles\") pod \"controller-manager-5b4bdf67b6-8rdjs\" (UID: \"cbcb0196-be5c-44a4-9749-5df9fbeaa718\") " pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.248309 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-secret-metrics-server-tls\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.248331 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44jml\" (UniqueName: \"kubernetes.io/projected/3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab-kube-api-access-44jml\") pod \"openshift-apiserver-operator-799b6db4d7-rj9cl\" (UID: \"3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-rj9cl"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.248352 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-cni-bin\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.248596 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56e11e7e-6946-4e11-bce9-e91a721fe4a7-utilities\") pod \"certified-operators-9nqqp\" (UID: \"56e11e7e-6946-4e11-bce9-e91a721fe4a7\") " pod="openshift-marketplace/certified-operators-9nqqp"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.248683 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/795e6115-95cc-4c0a-a407-e0a6f14118e5-federate-client-tls\") pod \"telemeter-client-6cfc594d97-x62fk\" (UID: \"795e6115-95cc-4c0a-a407-e0a6f14118e5\") " pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.248753 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d2e1686-3a30-4021-9c03-02e472bc6ff3-cert\") pod \"cluster-autoscaler-operator-69576476f7-dpg4q\" (UID: \"3d2e1686-3a30-4021-9c03-02e472bc6ff3\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dpg4q"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.248776 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.248800 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-run-k8s-cni-cncf-io\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.248827 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-phgxj\" (UID: \"8f71fd39-a16b-47d2-b781-c8ce37bcb9b2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.248830 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/531e9339-968c-47bf-b8ea-c44d9ceef4b3-etcd-serving-ca\") pod \"apiserver-74444d8fbc-g7z4w\" (UID: \"531e9339-968c-47bf-b8ea-c44d9ceef4b3\") " pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.248887 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt9pm\" (UniqueName: \"kubernetes.io/projected/d01c21a1-6c2c-49a7-9d85-254662851838-kube-api-access-rt9pm\") pod \"catalogd-controller-manager-7f8b8b6f4c-w2q2q\" (UID: \"d01c21a1-6c2c-49a7-9d85-254662851838\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.248920 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-whereabouts-configmap\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.248992 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b100ce12-965e-409e-8cdb-8f99ef51a82b-config\") pod \"kube-apiserver-operator-68bd585b-7gtw2\" (UID: \"b100ce12-965e-409e-8cdb-8f99ef51a82b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-7gtw2"
Mar 08 00:31:51.252200 master-0 kubenswrapper[23041]: I0308 00:31:51.249091 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/365dc4ac-fbc8-4589-a799-8327b3ebd0a5-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-pfdrx\" (UID: \"365dc4ac-fbc8-4589-a799-8327b3ebd0a5\") "
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-pfdrx" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.249172 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1453f6461bf5d599ad65a4656343ee91\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.249351 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-whereabouts-configmap\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.249388 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-etcd-client\") pod \"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.249420 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/5a229b84-65bd-493b-90dd-b8194f842dc8-etc-ssl-certs\") pod \"cluster-version-operator-8c9c967c7-vm7rj\" (UID: \"5a229b84-65bd-493b-90dd-b8194f842dc8\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-vm7rj" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.249440 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/795e6115-95cc-4c0a-a407-e0a6f14118e5-telemeter-client-tls\") pod \"telemeter-client-6cfc594d97-x62fk\" (UID: \"795e6115-95cc-4c0a-a407-e0a6f14118e5\") " pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.249442 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b100ce12-965e-409e-8cdb-8f99ef51a82b-config\") pod \"kube-apiserver-operator-68bd585b-7gtw2\" (UID: \"b100ce12-965e-409e-8cdb-8f99ef51a82b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-7gtw2" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.249478 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/531e9339-968c-47bf-b8ea-c44d9ceef4b3-etcd-client\") pod \"apiserver-74444d8fbc-g7z4w\" (UID: \"531e9339-968c-47bf-b8ea-c44d9ceef4b3\") " pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.249594 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-serving-cert\") pod \"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.249667 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-etcd-client\") pod \"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 
00:31:51.249720 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rfqt\" (UniqueName: \"kubernetes.io/projected/6d770808-d390-41c1-a9d9-fc12b99fa9a9-kube-api-access-6rfqt\") pod \"cluster-monitoring-operator-674cbfbd9d-cxs8s\" (UID: \"6d770808-d390-41c1-a9d9-fc12b99fa9a9\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.250032 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-serving-cert\") pod \"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.250090 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-trusted-ca\") pod \"ingress-operator-677db989d6-blw5x\" (UID: \"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.250121 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/3b4f8517-1e54-4b41-ba6b-6c56fe66831a-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-nwttq\" (UID: \"3b4f8517-1e54-4b41-ba6b-6c56fe66831a\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.250224 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k88m9\" (UniqueName: 
\"kubernetes.io/projected/5cf5a2ef-2498-40a0-a189-0753076fd3b6-kube-api-access-k88m9\") pod \"marketplace-operator-64bf9778cb-mgb5v\" (UID: \"5cf5a2ef-2498-40a0-a189-0753076fd3b6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.250281 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/614f0a0f-5853-4cf6-bd3d-174141f0f1e2-serving-cert\") pod \"insights-operator-8f89dfddd-brq9l\" (UID: \"614f0a0f-5853-4cf6-bd3d-174141f0f1e2\") " pod="openshift-insights/insights-operator-8f89dfddd-brq9l" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.250318 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/d01c21a1-6c2c-49a7-9d85-254662851838-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-w2q2q\" (UID: \"d01c21a1-6c2c-49a7-9d85-254662851838\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.250545 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-trusted-ca\") pod \"ingress-operator-677db989d6-blw5x\" (UID: \"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.250648 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:31:51.256891 master-0 
kubenswrapper[23041]: I0308 00:31:51.250672 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.250725 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/e237ed52-5561-44c5-bcb1-de62691d6431-prometheus-operator-tls\") pod \"prometheus-operator-5ff8674d55-qxpv9\" (UID: \"e237ed52-5561-44c5-bcb1-de62691d6431\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-qxpv9" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.250747 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/1bb8fea7-71ca-43a3-839d-9c1459bf8dfa-etc-docker\") pod \"operator-controller-controller-manager-6598bfb6c4-7nhvs\" (UID: \"1bb8fea7-71ca-43a3-839d-9c1459bf8dfa\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.250764 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-kubernetes\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.250783 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56e11e7e-6946-4e11-bce9-e91a721fe4a7-catalog-content\") 
pod \"certified-operators-9nqqp\" (UID: \"56e11e7e-6946-4e11-bce9-e91a721fe4a7\") " pod="openshift-marketplace/certified-operators-9nqqp" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.250890 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56e11e7e-6946-4e11-bce9-e91a721fe4a7-catalog-content\") pod \"certified-operators-9nqqp\" (UID: \"56e11e7e-6946-4e11-bce9-e91a721fe4a7\") " pod="openshift-marketplace/certified-operators-9nqqp" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.250942 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmxq9\" (UniqueName: \"kubernetes.io/projected/56e11e7e-6946-4e11-bce9-e91a721fe4a7-kube-api-access-kmxq9\") pod \"certified-operators-9nqqp\" (UID: \"56e11e7e-6946-4e11-bce9-e91a721fe4a7\") " pod="openshift-marketplace/certified-operators-9nqqp" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.250984 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4a829558-a672-4dc5-ae20-69884213482f-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"4a829558-a672-4dc5-ae20-69884213482f\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.251007 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/84522c03-fd7b-4be7-9413-84e510b9dc5a-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-qldx6\" (UID: \"84522c03-fd7b-4be7-9413-84e510b9dc5a\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.251176 23041 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/1bb8fea7-71ca-43a3-839d-9c1459bf8dfa-ca-certs\") pod \"operator-controller-controller-manager-6598bfb6c4-7nhvs\" (UID: \"1bb8fea7-71ca-43a3-839d-9c1459bf8dfa\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.251345 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a229b84-65bd-493b-90dd-b8194f842dc8-serving-cert\") pod \"cluster-version-operator-8c9c967c7-vm7rj\" (UID: \"5a229b84-65bd-493b-90dd-b8194f842dc8\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-vm7rj" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.251386 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-run-netns\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.251412 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b94acad3-cf4e-443d-80fb-5e68a4074336-srv-cert\") pod \"catalog-operator-7d9c49f57b-8jr6f\" (UID: \"b94acad3-cf4e-443d-80fb-5e68a4074336\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-8jr6f" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.251440 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvhx4\" (UniqueName: \"kubernetes.io/projected/c7097f64-1709-4f76-a725-5a6c6cc5919b-kube-api-access-zvhx4\") pod \"machine-api-operator-84bf6db4f9-bncfj\" (UID: \"c7097f64-1709-4f76-a725-5a6c6cc5919b\") " 
pod="openshift-machine-api/machine-api-operator-84bf6db4f9-bncfj" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.251492 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-systemd\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.251516 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b22c3046-5193-4c1d-91c0-7c15745265be-config\") pod \"console-operator-6c7fb6b958-db7d8\" (UID: \"b22c3046-5193-4c1d-91c0-7c15745265be\") " pod="openshift-console-operator/console-operator-6c7fb6b958-db7d8" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.251542 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-node-log\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.251587 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-ovn-node-metrics-cert\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.251620 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbdd4\" (UniqueName: 
\"kubernetes.io/projected/1abf904b-0b8d-4d61-8231-0e8d00933192-kube-api-access-dbdd4\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.251644 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-74cc79fd76-s9b9v\" (UID: \"5b9f4db1-3ba9-49a5-9a65-1d770ee59a65\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-s9b9v" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.251674 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h65c2\" (UniqueName: \"kubernetes.io/projected/2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0-kube-api-access-h65c2\") pod \"machine-config-controller-ff46b7bdf-z5fkp\" (UID: \"2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-z5fkp" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.251749 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/460f09d8-a143-48d2-9db0-be247386984a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6686554ddc-8krst\" (UID: \"460f09d8-a143-48d2-9db0-be247386984a\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-8krst" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.251773 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.251789 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b94acad3-cf4e-443d-80fb-5e68a4074336-srv-cert\") pod \"catalog-operator-7d9c49f57b-8jr6f\" (UID: \"b94acad3-cf4e-443d-80fb-5e68a4074336\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-8jr6f" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.251795 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/795e6115-95cc-4c0a-a407-e0a6f14118e5-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6cfc594d97-x62fk\" (UID: \"795e6115-95cc-4c0a-a407-e0a6f14118e5\") " pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.251831 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/795e6115-95cc-4c0a-a407-e0a6f14118e5-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6cfc594d97-x62fk\" (UID: \"795e6115-95cc-4c0a-a407-e0a6f14118e5\") " pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.251864 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-image-import-ca\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 
00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.251900 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-sysctl-conf\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.252012 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b22c3046-5193-4c1d-91c0-7c15745265be-serving-cert\") pod \"console-operator-6c7fb6b958-db7d8\" (UID: \"b22c3046-5193-4c1d-91c0-7c15745265be\") " pod="openshift-console-operator/console-operator-6c7fb6b958-db7d8" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.252595 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-ovn-node-metrics-cert\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.252705 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89wj5\" (UniqueName: \"kubernetes.io/projected/db164b32-e20e-4d07-a9ae-98720321621d-kube-api-access-89wj5\") pod \"cluster-olm-operator-77899cf6d-r9zcq\" (UID: \"db164b32-e20e-4d07-a9ae-98720321621d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-r9zcq" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.252736 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-var-lib-openvswitch\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.252780 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.252959 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84522c03-fd7b-4be7-9413-84e510b9dc5a-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-qldx6\" (UID: \"84522c03-fd7b-4be7-9413-84e510b9dc5a\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6" Mar 08 00:31:51.256891 master-0 kubenswrapper[23041]: I0308 00:31:51.256371 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 08 00:31:51.266931 master-0 kubenswrapper[23041]: I0308 00:31:51.266857 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/815fd565-0609-4d8f-ac05-8656f198b008-metrics-certs\") pod \"network-metrics-daemon-krv7c\" (UID: \"815fd565-0609-4d8f-ac05-8656f198b008\") " pod="openshift-multus/network-metrics-daemon-krv7c" Mar 08 00:31:51.277272 master-0 kubenswrapper[23041]: I0308 00:31:51.276943 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 08 00:31:51.288157 master-0 kubenswrapper[23041]: I0308 00:31:51.288109 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/531e9339-968c-47bf-b8ea-c44d9ceef4b3-etcd-serving-ca\") pod \"apiserver-74444d8fbc-g7z4w\" (UID: \"531e9339-968c-47bf-b8ea-c44d9ceef4b3\") " pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" Mar 08 00:31:51.307806 master-0 kubenswrapper[23041]: I0308 00:31:51.302888 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 08 00:31:51.310779 master-0 kubenswrapper[23041]: I0308 00:31:51.310718 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/531e9339-968c-47bf-b8ea-c44d9ceef4b3-etcd-client\") pod \"apiserver-74444d8fbc-g7z4w\" (UID: \"531e9339-968c-47bf-b8ea-c44d9ceef4b3\") " pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" Mar 08 00:31:51.317035 master-0 kubenswrapper[23041]: I0308 00:31:51.316986 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 08 00:31:51.318401 master-0 kubenswrapper[23041]: I0308 00:31:51.318341 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/531e9339-968c-47bf-b8ea-c44d9ceef4b3-serving-cert\") pod \"apiserver-74444d8fbc-g7z4w\" (UID: \"531e9339-968c-47bf-b8ea-c44d9ceef4b3\") " pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" Mar 08 00:31:51.336115 master-0 kubenswrapper[23041]: I0308 00:31:51.336053 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 08 00:31:51.337374 master-0 kubenswrapper[23041]: I0308 00:31:51.337332 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/531e9339-968c-47bf-b8ea-c44d9ceef4b3-encryption-config\") pod \"apiserver-74444d8fbc-g7z4w\" (UID: \"531e9339-968c-47bf-b8ea-c44d9ceef4b3\") " 
pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" Mar 08 00:31:51.354436 master-0 kubenswrapper[23041]: I0308 00:31:51.354371 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-node-exporter-wtmp\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn" Mar 08 00:31:51.354610 master-0 kubenswrapper[23041]: I0308 00:31:51.354461 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-os-release\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:31:51.354610 master-0 kubenswrapper[23041]: I0308 00:31:51.354489 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg9bq\" (UniqueName: \"kubernetes.io/projected/e884e46e-e520-4e0a-9f15-43d4b74af63e-kube-api-access-wg9bq\") pod \"ingress-canary-5qffz\" (UID: \"e884e46e-e520-4e0a-9f15-43d4b74af63e\") " pod="openshift-ingress-canary/ingress-canary-5qffz" Mar 08 00:31:51.354610 master-0 kubenswrapper[23041]: I0308 00:31:51.354497 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-node-exporter-wtmp\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn" Mar 08 00:31:51.354724 master-0 kubenswrapper[23041]: I0308 00:31:51.354645 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-sys\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " 
pod="openshift-monitoring/node-exporter-bx9dn" Mar 08 00:31:51.354724 master-0 kubenswrapper[23041]: I0308 00:31:51.354643 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-os-release\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:31:51.354724 master-0 kubenswrapper[23041]: I0308 00:31:51.354669 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/795e6115-95cc-4c0a-a407-e0a6f14118e5-serving-certs-ca-bundle\") pod \"telemeter-client-6cfc594d97-x62fk\" (UID: \"795e6115-95cc-4c0a-a407-e0a6f14118e5\") " pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk" Mar 08 00:31:51.354724 master-0 kubenswrapper[23041]: I0308 00:31:51.354707 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-sys\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn" Mar 08 00:31:51.354836 master-0 kubenswrapper[23041]: I0308 00:31:51.354807 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-root\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn" Mar 08 00:31:51.354870 master-0 kubenswrapper[23041]: I0308 00:31:51.354839 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-systemd-units\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 
00:31:51.354933 master-0 kubenswrapper[23041]: I0308 00:31:51.354913 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-sysconfig\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:31:51.354963 master-0 kubenswrapper[23041]: I0308 00:31:51.354943 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 08 00:31:51.354963 master-0 kubenswrapper[23041]: I0308 00:31:51.354955 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-root\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn" Mar 08 00:31:51.355022 master-0 kubenswrapper[23041]: I0308 00:31:51.354994 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-systemd-units\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.355054 master-0 kubenswrapper[23041]: I0308 00:31:51.355003 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-sysconfig\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:31:51.355086 master-0 
kubenswrapper[23041]: I0308 00:31:51.355059 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b22c3046-5193-4c1d-91c0-7c15745265be-trusted-ca\") pod \"console-operator-6c7fb6b958-db7d8\" (UID: \"b22c3046-5193-4c1d-91c0-7c15745265be\") " pod="openshift-console-operator/console-operator-6c7fb6b958-db7d8" Mar 08 00:31:51.355913 master-0 kubenswrapper[23041]: I0308 00:31:51.355124 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-cni-netd\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.355913 master-0 kubenswrapper[23041]: I0308 00:31:51.355148 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/531e9339-968c-47bf-b8ea-c44d9ceef4b3-audit-dir\") pod \"apiserver-74444d8fbc-g7z4w\" (UID: \"531e9339-968c-47bf-b8ea-c44d9ceef4b3\") " pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" Mar 08 00:31:51.355913 master-0 kubenswrapper[23041]: I0308 00:31:51.355173 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 08 00:31:51.355913 master-0 kubenswrapper[23041]: I0308 00:31:51.355225 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.355913 master-0 kubenswrapper[23041]: I0308 00:31:51.355250 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-cnibin\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:31:51.355913 master-0 kubenswrapper[23041]: I0308 00:31:51.355262 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 08 00:31:51.355913 master-0 kubenswrapper[23041]: I0308 00:31:51.355400 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-system-cni-dir\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:31:51.355913 master-0 kubenswrapper[23041]: I0308 00:31:51.355525 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:31:51.355913 master-0 kubenswrapper[23041]: I0308 00:31:51.355616 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1751db13-b792-43e2-8459-d1d4a0164dfb-node-pullsecrets\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " 
pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:31:51.355913 master-0 kubenswrapper[23041]: I0308 00:31:51.355643 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/1bb8fea7-71ca-43a3-839d-9c1459bf8dfa-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-7nhvs\" (UID: \"1bb8fea7-71ca-43a3-839d-9c1459bf8dfa\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" Mar 08 00:31:51.355913 master-0 kubenswrapper[23041]: I0308 00:31:51.355672 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/ec2d22f2-c260-42a6-a9da-ee0f44f42303-host-etc-kube\") pod \"network-operator-7c649bf6d4-st2sr\" (UID: \"ec2d22f2-c260-42a6-a9da-ee0f44f42303\") " pod="openshift-network-operator/network-operator-7c649bf6d4-st2sr" Mar 08 00:31:51.355913 master-0 kubenswrapper[23041]: I0308 00:31:51.355739 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/1bb8fea7-71ca-43a3-839d-9c1459bf8dfa-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-7nhvs\" (UID: \"1bb8fea7-71ca-43a3-839d-9c1459bf8dfa\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" Mar 08 00:31:51.355913 master-0 kubenswrapper[23041]: I0308 00:31:51.355750 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1751db13-b792-43e2-8459-d1d4a0164dfb-node-pullsecrets\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:31:51.355913 master-0 kubenswrapper[23041]: I0308 00:31:51.355753 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-system-cni-dir\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:31:51.355913 master-0 kubenswrapper[23041]: I0308 00:31:51.355809 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 08 00:31:51.355913 master-0 kubenswrapper[23041]: I0308 00:31:51.355809 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-cni-netd\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.355913 master-0 kubenswrapper[23041]: I0308 00:31:51.355825 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.355913 master-0 kubenswrapper[23041]: I0308 00:31:51.355826 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:31:51.355913 master-0 kubenswrapper[23041]: I0308 00:31:51.355875 23041 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:31:51.355913 master-0 kubenswrapper[23041]: I0308 00:31:51.355874 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-cnibin\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:31:51.355913 master-0 kubenswrapper[23041]: I0308 00:31:51.355906 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:31:51.355913 master-0 kubenswrapper[23041]: I0308 00:31:51.355917 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:31:51.356524 master-0 kubenswrapper[23041]: I0308 00:31:51.356001 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:31:51.356524 master-0 kubenswrapper[23041]: I0308 00:31:51.356038 23041 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/531e9339-968c-47bf-b8ea-c44d9ceef4b3-audit-dir\") pod \"apiserver-74444d8fbc-g7z4w\" (UID: \"531e9339-968c-47bf-b8ea-c44d9ceef4b3\") " pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" Mar 08 00:31:51.356524 master-0 kubenswrapper[23041]: I0308 00:31:51.356034 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/ec2d22f2-c260-42a6-a9da-ee0f44f42303-host-etc-kube\") pod \"network-operator-7c649bf6d4-st2sr\" (UID: \"ec2d22f2-c260-42a6-a9da-ee0f44f42303\") " pod="openshift-network-operator/network-operator-7c649bf6d4-st2sr" Mar 08 00:31:51.356524 master-0 kubenswrapper[23041]: I0308 00:31:51.356048 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-cnibin\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:31:51.356524 master-0 kubenswrapper[23041]: I0308 00:31:51.356105 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-cnibin\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:31:51.356524 master-0 kubenswrapper[23041]: I0308 00:31:51.356143 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/d01c21a1-6c2c-49a7-9d85-254662851838-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-w2q2q\" (UID: \"d01c21a1-6c2c-49a7-9d85-254662851838\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" Mar 08 
00:31:51.356524 master-0 kubenswrapper[23041]: I0308 00:31:51.356239 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-lib-modules\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:31:51.356524 master-0 kubenswrapper[23041]: I0308 00:31:51.356279 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 08 00:31:51.356524 master-0 kubenswrapper[23041]: I0308 00:31:51.356332 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/66915251-1fdd-40f3-a59b-054776b214df-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"66915251-1fdd-40f3-a59b-054776b214df\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 00:31:51.356524 master-0 kubenswrapper[23041]: I0308 00:31:51.356230 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/d01c21a1-6c2c-49a7-9d85-254662851838-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-w2q2q\" (UID: \"d01c21a1-6c2c-49a7-9d85-254662851838\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" Mar 08 00:31:51.356524 master-0 kubenswrapper[23041]: I0308 00:31:51.356379 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/795e6115-95cc-4c0a-a407-e0a6f14118e5-secret-telemeter-client\") pod \"telemeter-client-6cfc594d97-x62fk\" (UID: \"795e6115-95cc-4c0a-a407-e0a6f14118e5\") " pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk" Mar 08 00:31:51.356524 master-0 kubenswrapper[23041]: I0308 00:31:51.356360 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-lib-modules\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:31:51.356524 master-0 kubenswrapper[23041]: I0308 00:31:51.356434 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/66915251-1fdd-40f3-a59b-054776b214df-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"66915251-1fdd-40f3-a59b-054776b214df\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 00:31:51.356524 master-0 kubenswrapper[23041]: I0308 00:31:51.356487 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-metrics-server-audit-profiles\") pod \"metrics-server-7b45f5889c-z48tj\" (UID: \"c26f36ee-5dd4-40b7-8cb9-7f4835f120fd\") " pod="openshift-monitoring/metrics-server-7b45f5889c-z48tj" Mar 08 00:31:51.359214 master-0 kubenswrapper[23041]: I0308 00:31:51.359058 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e884e46e-e520-4e0a-9f15-43d4b74af63e-cert\") pod \"ingress-canary-5qffz\" (UID: \"e884e46e-e520-4e0a-9f15-43d4b74af63e\") " pod="openshift-ingress-canary/ingress-canary-5qffz" Mar 08 00:31:51.359214 master-0 kubenswrapper[23041]: I0308 00:31:51.359169 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-run-systemd\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.359214 master-0 kubenswrapper[23041]: I0308 00:31:51.359208 23041 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7-rootfs\") pod \"machine-config-daemon-k7pnc\" (UID: \"ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7\") " pod="openshift-machine-config-operator/machine-config-daemon-k7pnc" Mar 08 00:31:51.359351 master-0 kubenswrapper[23041]: I0308 00:31:51.359239 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1751db13-b792-43e2-8459-d1d4a0164dfb-audit-dir\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:31:51.359351 master-0 kubenswrapper[23041]: I0308 00:31:51.359337 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/531e9339-968c-47bf-b8ea-c44d9ceef4b3-trusted-ca-bundle\") pod \"apiserver-74444d8fbc-g7z4w\" (UID: \"531e9339-968c-47bf-b8ea-c44d9ceef4b3\") " pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" Mar 08 00:31:51.359441 master-0 kubenswrapper[23041]: I0308 00:31:51.359407 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-sys\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:31:51.359480 master-0 kubenswrapper[23041]: I0308 00:31:51.359410 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1751db13-b792-43e2-8459-d1d4a0164dfb-audit-dir\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:31:51.359480 master-0 kubenswrapper[23041]: I0308 00:31:51.359411 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-run-systemd\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.359480 master-0 kubenswrapper[23041]: I0308 00:31:51.359450 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:31:51.359480 master-0 kubenswrapper[23041]: I0308 00:31:51.359478 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:31:51.359585 master-0 kubenswrapper[23041]: I0308 00:31:51.359504 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7-rootfs\") pod \"machine-config-daemon-k7pnc\" (UID: \"ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7\") " pod="openshift-machine-config-operator/machine-config-daemon-k7pnc" Mar 08 00:31:51.359585 master-0 kubenswrapper[23041]: I0308 00:31:51.359518 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:31:51.359585 master-0 kubenswrapper[23041]: I0308 00:31:51.359545 23041 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-sys\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:31:51.359585 master-0 kubenswrapper[23041]: I0308 00:31:51.359550 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:31:51.359689 master-0 kubenswrapper[23041]: I0308 00:31:51.359603 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-slash\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.359770 master-0 kubenswrapper[23041]: I0308 00:31:51.359713 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-cni-bin\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.359811 master-0 kubenswrapper[23041]: I0308 00:31:51.359784 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/795e6115-95cc-4c0a-a407-e0a6f14118e5-federate-client-tls\") pod \"telemeter-client-6cfc594d97-x62fk\" (UID: \"795e6115-95cc-4c0a-a407-e0a6f14118e5\") " pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk" Mar 08 00:31:51.359891 master-0 kubenswrapper[23041]: I0308 00:31:51.359853 23041 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:31:51.359961 master-0 kubenswrapper[23041]: I0308 00:31:51.359890 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-cni-bin\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.359961 master-0 kubenswrapper[23041]: I0308 00:31:51.359941 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-run-k8s-cni-cncf-io\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:31:51.360029 master-0 kubenswrapper[23041]: I0308 00:31:51.359959 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-slash\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.360029 master-0 kubenswrapper[23041]: I0308 00:31:51.360006 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:31:51.360029 master-0 kubenswrapper[23041]: I0308 00:31:51.360023 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1453f6461bf5d599ad65a4656343ee91\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 00:31:51.360130 master-0 kubenswrapper[23041]: I0308 00:31:51.360042 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-run-k8s-cni-cncf-io\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:31:51.360130 master-0 kubenswrapper[23041]: I0308 00:31:51.360080 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1453f6461bf5d599ad65a4656343ee91\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 00:31:51.360130 master-0 kubenswrapper[23041]: I0308 00:31:51.360085 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/5a229b84-65bd-493b-90dd-b8194f842dc8-etc-ssl-certs\") pod \"cluster-version-operator-8c9c967c7-vm7rj\" (UID: \"5a229b84-65bd-493b-90dd-b8194f842dc8\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-vm7rj" Mar 08 00:31:51.360130 master-0 kubenswrapper[23041]: I0308 00:31:51.360119 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/795e6115-95cc-4c0a-a407-e0a6f14118e5-telemeter-client-tls\") pod \"telemeter-client-6cfc594d97-x62fk\" (UID: \"795e6115-95cc-4c0a-a407-e0a6f14118e5\") " pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk" Mar 08 00:31:51.360273 master-0 kubenswrapper[23041]: I0308 00:31:51.360224 23041 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/5a229b84-65bd-493b-90dd-b8194f842dc8-etc-ssl-certs\") pod \"cluster-version-operator-8c9c967c7-vm7rj\" (UID: \"5a229b84-65bd-493b-90dd-b8194f842dc8\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-vm7rj" Mar 08 00:31:51.360392 master-0 kubenswrapper[23041]: I0308 00:31:51.360354 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/d01c21a1-6c2c-49a7-9d85-254662851838-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-w2q2q\" (UID: \"d01c21a1-6c2c-49a7-9d85-254662851838\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" Mar 08 00:31:51.360444 master-0 kubenswrapper[23041]: I0308 00:31:51.360389 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:31:51.360444 master-0 kubenswrapper[23041]: I0308 00:31:51.360422 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:31:51.360552 master-0 kubenswrapper[23041]: I0308 00:31:51.360502 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 
00:31:51.360590 master-0 kubenswrapper[23041]: I0308 00:31:51.360557 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/d01c21a1-6c2c-49a7-9d85-254662851838-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-w2q2q\" (UID: \"d01c21a1-6c2c-49a7-9d85-254662851838\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" Mar 08 00:31:51.360590 master-0 kubenswrapper[23041]: I0308 00:31:51.360570 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:31:51.360663 master-0 kubenswrapper[23041]: I0308 00:31:51.360641 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/1bb8fea7-71ca-43a3-839d-9c1459bf8dfa-etc-docker\") pod \"operator-controller-controller-manager-6598bfb6c4-7nhvs\" (UID: \"1bb8fea7-71ca-43a3-839d-9c1459bf8dfa\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" Mar 08 00:31:51.360703 master-0 kubenswrapper[23041]: I0308 00:31:51.360678 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-kubernetes\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:31:51.361124 master-0 kubenswrapper[23041]: I0308 00:31:51.360773 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/1bb8fea7-71ca-43a3-839d-9c1459bf8dfa-etc-docker\") pod \"operator-controller-controller-manager-6598bfb6c4-7nhvs\" 
(UID: \"1bb8fea7-71ca-43a3-839d-9c1459bf8dfa\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" Mar 08 00:31:51.361124 master-0 kubenswrapper[23041]: I0308 00:31:51.360797 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-kubernetes\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:31:51.361124 master-0 kubenswrapper[23041]: I0308 00:31:51.360821 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4a829558-a672-4dc5-ae20-69884213482f-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"4a829558-a672-4dc5-ae20-69884213482f\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 08 00:31:51.361124 master-0 kubenswrapper[23041]: I0308 00:31:51.360886 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-run-netns\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:31:51.361124 master-0 kubenswrapper[23041]: I0308 00:31:51.360905 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4a829558-a672-4dc5-ae20-69884213482f-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"4a829558-a672-4dc5-ae20-69884213482f\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 08 00:31:51.361124 master-0 kubenswrapper[23041]: I0308 00:31:51.360941 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-systemd\") pod \"tuned-67jx5\" 
(UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:31:51.361124 master-0 kubenswrapper[23041]: I0308 00:31:51.360967 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b22c3046-5193-4c1d-91c0-7c15745265be-config\") pod \"console-operator-6c7fb6b958-db7d8\" (UID: \"b22c3046-5193-4c1d-91c0-7c15745265be\") " pod="openshift-console-operator/console-operator-6c7fb6b958-db7d8" Mar 08 00:31:51.361124 master-0 kubenswrapper[23041]: I0308 00:31:51.360990 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-node-log\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.361124 master-0 kubenswrapper[23041]: I0308 00:31:51.361019 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-systemd\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:31:51.361124 master-0 kubenswrapper[23041]: I0308 00:31:51.361051 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-run-netns\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:31:51.361124 master-0 kubenswrapper[23041]: I0308 00:31:51.361072 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " 
pod="openshift-etcd/etcd-master-0" Mar 08 00:31:51.361124 master-0 kubenswrapper[23041]: I0308 00:31:51.361107 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/795e6115-95cc-4c0a-a407-e0a6f14118e5-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6cfc594d97-x62fk\" (UID: \"795e6115-95cc-4c0a-a407-e0a6f14118e5\") " pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk" Mar 08 00:31:51.361124 master-0 kubenswrapper[23041]: I0308 00:31:51.361114 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-node-log\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.361523 master-0 kubenswrapper[23041]: I0308 00:31:51.361149 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/795e6115-95cc-4c0a-a407-e0a6f14118e5-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6cfc594d97-x62fk\" (UID: \"795e6115-95cc-4c0a-a407-e0a6f14118e5\") " pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk" Mar 08 00:31:51.361523 master-0 kubenswrapper[23041]: I0308 00:31:51.361187 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:31:51.361523 master-0 kubenswrapper[23041]: I0308 00:31:51.361232 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-sysctl-conf\") pod \"tuned-67jx5\" 
(UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:31:51.361523 master-0 kubenswrapper[23041]: I0308 00:31:51.361262 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b22c3046-5193-4c1d-91c0-7c15745265be-serving-cert\") pod \"console-operator-6c7fb6b958-db7d8\" (UID: \"b22c3046-5193-4c1d-91c0-7c15745265be\") " pod="openshift-console-operator/console-operator-6c7fb6b958-db7d8" Mar 08 00:31:51.361523 master-0 kubenswrapper[23041]: I0308 00:31:51.361295 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-var-lib-openvswitch\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.361523 master-0 kubenswrapper[23041]: I0308 00:31:51.361345 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:31:51.361523 master-0 kubenswrapper[23041]: I0308 00:31:51.361365 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:31:51.361523 master-0 kubenswrapper[23041]: I0308 00:31:51.361383 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-var-lib-openvswitch\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.361523 master-0 kubenswrapper[23041]: I0308 00:31:51.361386 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-sysctl-conf\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:31:51.361523 master-0 kubenswrapper[23041]: I0308 00:31:51.361390 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-secret-metrics-server-tls\") pod \"metrics-server-7b45f5889c-z48tj\" (UID: \"c26f36ee-5dd4-40b7-8cb9-7f4835f120fd\") " pod="openshift-monitoring/metrics-server-7b45f5889c-z48tj" Mar 08 00:31:51.361523 master-0 kubenswrapper[23041]: I0308 00:31:51.361424 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:31:51.361523 master-0 kubenswrapper[23041]: I0308 00:31:51.361436 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:31:51.361523 master-0 kubenswrapper[23041]: I0308 00:31:51.361462 23041 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-multus-conf-dir\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:31:51.361523 master-0 kubenswrapper[23041]: I0308 00:31:51.361470 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:31:51.361523 master-0 kubenswrapper[23041]: I0308 00:31:51.361482 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-multus-conf-dir\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:31:51.361523 master-0 kubenswrapper[23041]: I0308 00:31:51.361465 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:31:51.361523 master-0 kubenswrapper[23041]: I0308 00:31:51.361490 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-sysctl-d\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:31:51.361523 master-0 kubenswrapper[23041]: I0308 00:31:51.361532 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-system-cni-dir\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:31:51.362124 master-0 kubenswrapper[23041]: I0308 00:31:51.361561 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-hostroot\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:31:51.362124 master-0 kubenswrapper[23041]: I0308 00:31:51.361584 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-sysctl-d\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:31:51.362124 master-0 kubenswrapper[23041]: I0308 00:31:51.361613 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-hostroot\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:31:51.362124 master-0 kubenswrapper[23041]: I0308 00:31:51.361648 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/5a229b84-65bd-493b-90dd-b8194f842dc8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-8c9c967c7-vm7rj\" (UID: \"5a229b84-65bd-493b-90dd-b8194f842dc8\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-vm7rj" Mar 08 00:31:51.362124 master-0 kubenswrapper[23041]: I0308 00:31:51.361684 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-system-cni-dir\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:31:51.362124 master-0 kubenswrapper[23041]: I0308 00:31:51.361697 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-client-ca-bundle\") pod \"metrics-server-7b45f5889c-z48tj\" (UID: \"c26f36ee-5dd4-40b7-8cb9-7f4835f120fd\") " pod="openshift-monitoring/metrics-server-7b45f5889c-z48tj" Mar 08 00:31:51.362124 master-0 kubenswrapper[23041]: I0308 00:31:51.361760 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/5a229b84-65bd-493b-90dd-b8194f842dc8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-8c9c967c7-vm7rj\" (UID: \"5a229b84-65bd-493b-90dd-b8194f842dc8\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-vm7rj" Mar 08 00:31:51.362124 master-0 kubenswrapper[23041]: I0308 00:31:51.361794 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-audit-log\") pod \"metrics-server-7b45f5889c-z48tj\" (UID: \"c26f36ee-5dd4-40b7-8cb9-7f4835f120fd\") " pod="openshift-monitoring/metrics-server-7b45f5889c-z48tj" Mar 08 00:31:51.362124 master-0 kubenswrapper[23041]: I0308 00:31:51.361877 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-run\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:31:51.362124 master-0 kubenswrapper[23041]: I0308 00:31:51.361925 23041 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1bad9e63-1aa2-44a7-aaf8-a0e82f33ad6e-hosts-file\") pod \"node-resolver-l9pkr\" (UID: \"1bad9e63-1aa2-44a7-aaf8-a0e82f33ad6e\") " pod="openshift-dns/node-resolver-l9pkr" Mar 08 00:31:51.362124 master-0 kubenswrapper[23041]: I0308 00:31:51.361993 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jd2n\" (UniqueName: \"kubernetes.io/projected/b22c3046-5193-4c1d-91c0-7c15745265be-kube-api-access-2jd2n\") pod \"console-operator-6c7fb6b958-db7d8\" (UID: \"b22c3046-5193-4c1d-91c0-7c15745265be\") " pod="openshift-console-operator/console-operator-6c7fb6b958-db7d8" Mar 08 00:31:51.362124 master-0 kubenswrapper[23041]: I0308 00:31:51.362079 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1bad9e63-1aa2-44a7-aaf8-a0e82f33ad6e-hosts-file\") pod \"node-resolver-l9pkr\" (UID: \"1bad9e63-1aa2-44a7-aaf8-a0e82f33ad6e\") " pod="openshift-dns/node-resolver-l9pkr" Mar 08 00:31:51.362473 master-0 kubenswrapper[23041]: I0308 00:31:51.362281 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:31:51.362473 master-0 kubenswrapper[23041]: I0308 00:31:51.362309 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-run\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:31:51.362473 master-0 kubenswrapper[23041]: I0308 00:31:51.362312 23041 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-run-netns\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.362473 master-0 kubenswrapper[23041]: I0308 00:31:51.362393 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-audit-log\") pod \"metrics-server-7b45f5889c-z48tj\" (UID: \"c26f36ee-5dd4-40b7-8cb9-7f4835f120fd\") " pod="openshift-monitoring/metrics-server-7b45f5889c-z48tj" Mar 08 00:31:51.362473 master-0 kubenswrapper[23041]: I0308 00:31:51.362435 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:31:51.362643 master-0 kubenswrapper[23041]: I0308 00:31:51.362336 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-run-netns\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.362916 master-0 kubenswrapper[23041]: I0308 00:31:51.362840 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:31:51.363012 master-0 
kubenswrapper[23041]: I0308 00:31:51.362919 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-run-ovn-kubernetes\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.363012 master-0 kubenswrapper[23041]: I0308 00:31:51.363003 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc78c\" (UniqueName: \"kubernetes.io/projected/795e6115-95cc-4c0a-a407-e0a6f14118e5-kube-api-access-kc78c\") pod \"telemeter-client-6cfc594d97-x62fk\" (UID: \"795e6115-95cc-4c0a-a407-e0a6f14118e5\") " pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk" Mar 08 00:31:51.363216 master-0 kubenswrapper[23041]: I0308 00:31:51.363169 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-run-ovn-kubernetes\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.363279 master-0 kubenswrapper[23041]: I0308 00:31:51.363182 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:31:51.363367 master-0 kubenswrapper[23041]: I0308 00:31:51.363345 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-multus-cni-dir\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " 
pod="openshift-multus/multus-dllkj" Mar 08 00:31:51.363418 master-0 kubenswrapper[23041]: I0308 00:31:51.363394 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-multus-cni-dir\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:31:51.363518 master-0 kubenswrapper[23041]: I0308 00:31:51.363493 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66915251-1fdd-40f3-a59b-054776b214df-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"66915251-1fdd-40f3-a59b-054776b214df\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 00:31:51.363567 master-0 kubenswrapper[23041]: I0308 00:31:51.363532 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-log-socket\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.363654 master-0 kubenswrapper[23041]: I0308 00:31:51.363629 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-log-socket\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.363687 master-0 kubenswrapper[23041]: I0308 00:31:51.363677 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:31:51.363728 master-0 
kubenswrapper[23041]: I0308 00:31:51.363709 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-var-lib-kubelet\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:31:51.363782 master-0 kubenswrapper[23041]: I0308 00:31:51.363764 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-var-lib-kubelet\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:31:51.363852 master-0 kubenswrapper[23041]: I0308 00:31:51.363824 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-etc-kubernetes\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:31:51.363911 master-0 kubenswrapper[23041]: I0308 00:31:51.363892 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1453f6461bf5d599ad65a4656343ee91\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 00:31:51.363940 master-0 kubenswrapper[23041]: I0308 00:31:51.363927 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4t5k\" (UniqueName: \"kubernetes.io/projected/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-kube-api-access-r4t5k\") pod \"metrics-server-7b45f5889c-z48tj\" (UID: \"c26f36ee-5dd4-40b7-8cb9-7f4835f120fd\") " pod="openshift-monitoring/metrics-server-7b45f5889c-z48tj" Mar 08 
00:31:51.363977 master-0 kubenswrapper[23041]: I0308 00:31:51.363956 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-run-multus-certs\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:31:51.364027 master-0 kubenswrapper[23041]: I0308 00:31:51.364009 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4a829558-a672-4dc5-ae20-69884213482f-var-lock\") pod \"installer-2-master-0\" (UID: \"4a829558-a672-4dc5-ae20-69884213482f\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 08 00:31:51.364106 master-0 kubenswrapper[23041]: I0308 00:31:51.364089 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:31:51.364150 master-0 kubenswrapper[23041]: I0308 00:31:51.364130 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/3b4f8517-1e54-4b41-ba6b-6c56fe66831a-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-nwttq\" (UID: \"3b4f8517-1e54-4b41-ba6b-6c56fe66831a\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq" Mar 08 00:31:51.364264 master-0 kubenswrapper[23041]: I0308 00:31:51.364242 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7b45f5889c-z48tj\" (UID: 
\"c26f36ee-5dd4-40b7-8cb9-7f4835f120fd\") " pod="openshift-monitoring/metrics-server-7b45f5889c-z48tj" Mar 08 00:31:51.364301 master-0 kubenswrapper[23041]: I0308 00:31:51.364289 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-multus-socket-dir-parent\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:31:51.364329 master-0 kubenswrapper[23041]: I0308 00:31:51.364317 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-var-lib-cni-bin\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:31:51.364361 master-0 kubenswrapper[23041]: I0308 00:31:51.364343 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-modprobe-d\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:31:51.364405 master-0 kubenswrapper[23041]: I0308 00:31:51.364387 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-secret-metrics-client-certs\") pod \"metrics-server-7b45f5889c-z48tj\" (UID: \"c26f36ee-5dd4-40b7-8cb9-7f4835f120fd\") " pod="openshift-monitoring/metrics-server-7b45f5889c-z48tj" Mar 08 00:31:51.364479 master-0 kubenswrapper[23041]: I0308 00:31:51.364460 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/66915251-1fdd-40f3-a59b-054776b214df-kubelet-dir\") 
pod \"installer-1-retry-1-master-0\" (UID: \"66915251-1fdd-40f3-a59b-054776b214df\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 00:31:51.364508 master-0 kubenswrapper[23041]: I0308 00:31:51.364489 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:31:51.364536 master-0 kubenswrapper[23041]: I0308 00:31:51.364519 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2040e5dc-b314-46a9-a61b-e80f1a046ce3-monitoring-plugin-cert\") pod \"monitoring-plugin-6db79546f6-gdz4k\" (UID: \"2040e5dc-b314-46a9-a61b-e80f1a046ce3\") " pod="openshift-monitoring/monitoring-plugin-6db79546f6-gdz4k" Mar 08 00:31:51.364602 master-0 kubenswrapper[23041]: I0308 00:31:51.364584 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-run-openvswitch\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.364677 master-0 kubenswrapper[23041]: I0308 00:31:51.364659 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-var-lib-cni-multus\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:31:51.364718 master-0 kubenswrapper[23041]: I0308 00:31:51.364697 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:31:51.364746 master-0 kubenswrapper[23041]: I0308 00:31:51.364728 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:31:51.364787 master-0 kubenswrapper[23041]: I0308 00:31:51.364770 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/795e6115-95cc-4c0a-a407-e0a6f14118e5-metrics-client-ca\") pod \"telemeter-client-6cfc594d97-x62fk\" (UID: \"795e6115-95cc-4c0a-a407-e0a6f14118e5\") " pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk" Mar 08 00:31:51.364822 master-0 kubenswrapper[23041]: I0308 00:31:51.364798 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-host\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:31:51.364850 master-0 kubenswrapper[23041]: I0308 00:31:51.364821 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-os-release\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:31:51.364893 master-0 kubenswrapper[23041]: I0308 00:31:51.364876 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:31:51.364925 master-0 kubenswrapper[23041]: I0308 00:31:51.364905 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-etc-openvswitch\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.364953 master-0 kubenswrapper[23041]: I0308 00:31:51.364935 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-kubelet\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.365063 master-0 kubenswrapper[23041]: I0308 00:31:51.365045 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0e52cbdc-1d46-4cc9-85ee-535aa449992f-host-slash\") pod \"iptables-alerter-rfnqf\" (UID: \"0e52cbdc-1d46-4cc9-85ee-535aa449992f\") " pod="openshift-network-operator/iptables-alerter-rfnqf" Mar 08 00:31:51.365131 master-0 kubenswrapper[23041]: I0308 00:31:51.365113 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-run-ovn\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.365283 master-0 kubenswrapper[23041]: I0308 00:31:51.365262 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-var-lib-kubelet\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:31:51.365336 master-0 kubenswrapper[23041]: I0308 00:31:51.365320 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-var-lib-kubelet\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:31:51.365386 master-0 kubenswrapper[23041]: I0308 00:31:51.365369 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-run-openvswitch\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.365433 master-0 kubenswrapper[23041]: I0308 00:31:51.365416 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-var-lib-cni-multus\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:31:51.365463 master-0 kubenswrapper[23041]: I0308 00:31:51.365408 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:31:51.365493 master-0 kubenswrapper[23041]: I0308 00:31:51.365466 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod 
\"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:31:51.365521 master-0 kubenswrapper[23041]: I0308 00:31:51.365489 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-etc-kubernetes\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:31:51.365556 master-0 kubenswrapper[23041]: I0308 00:31:51.365502 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:31:51.365587 master-0 kubenswrapper[23041]: I0308 00:31:51.365555 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1453f6461bf5d599ad65a4656343ee91\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 00:31:51.365587 master-0 kubenswrapper[23041]: I0308 00:31:51.365567 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-host\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:31:51.365646 master-0 kubenswrapper[23041]: I0308 00:31:51.365622 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4a829558-a672-4dc5-ae20-69884213482f-var-lock\") pod \"installer-2-master-0\" (UID: \"4a829558-a672-4dc5-ae20-69884213482f\") " 
pod="openshift-kube-controller-manager/installer-2-master-0" Mar 08 00:31:51.365646 master-0 kubenswrapper[23041]: I0308 00:31:51.365632 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-etc-openvswitch\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.365705 master-0 kubenswrapper[23041]: I0308 00:31:51.365671 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-host-kubelet\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.365705 master-0 kubenswrapper[23041]: I0308 00:31:51.365697 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-os-release\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:31:51.365762 master-0 kubenswrapper[23041]: I0308 00:31:51.365706 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0e52cbdc-1d46-4cc9-85ee-535aa449992f-host-slash\") pod \"iptables-alerter-rfnqf\" (UID: \"0e52cbdc-1d46-4cc9-85ee-535aa449992f\") " pod="openshift-network-operator/iptables-alerter-rfnqf" Mar 08 00:31:51.365762 master-0 kubenswrapper[23041]: I0308 00:31:51.365733 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " 
pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:31:51.365818 master-0 kubenswrapper[23041]: I0308 00:31:51.365760 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-run-multus-certs\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:31:51.365818 master-0 kubenswrapper[23041]: I0308 00:31:51.365810 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/3b4f8517-1e54-4b41-ba6b-6c56fe66831a-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-nwttq\" (UID: \"3b4f8517-1e54-4b41-ba6b-6c56fe66831a\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq" Mar 08 00:31:51.365872 master-0 kubenswrapper[23041]: I0308 00:31:51.365815 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-host-var-lib-cni-bin\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:31:51.365872 master-0 kubenswrapper[23041]: I0308 00:31:51.365832 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-run-ovn\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:51.365872 master-0 kubenswrapper[23041]: I0308 00:31:51.365861 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"etcd-master-0\" (UID: 
\"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:31:51.365945 master-0 kubenswrapper[23041]: I0308 00:31:51.365883 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7da68e85-9170-499d-8050-139ecfac4600-multus-socket-dir-parent\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:31:51.365945 master-0 kubenswrapper[23041]: I0308 00:31:51.365932 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/66915251-1fdd-40f3-a59b-054776b214df-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"66915251-1fdd-40f3-a59b-054776b214df\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 00:31:51.366080 master-0 kubenswrapper[23041]: I0308 00:31:51.366057 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/401bbef2-684c-4f55-b2c7-e6184c789e40-etc-modprobe-d\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:31:51.370770 master-0 kubenswrapper[23041]: I0308 00:31:51.370708 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:31:51.385487 master-0 kubenswrapper[23041]: I0308 00:31:51.385422 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 08 00:31:51.396281 master-0 kubenswrapper[23041]: I0308 00:31:51.396145 23041 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 08 00:31:51.416470 master-0 kubenswrapper[23041]: I0308 00:31:51.416413 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 00:31:51.437255 master-0 kubenswrapper[23041]: I0308 00:31:51.437173 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 08 00:31:51.441451 master-0 kubenswrapper[23041]: I0308 00:31:51.440635 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/531e9339-968c-47bf-b8ea-c44d9ceef4b3-audit-policies\") pod \"apiserver-74444d8fbc-g7z4w\" (UID: \"531e9339-968c-47bf-b8ea-c44d9ceef4b3\") " pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" Mar 08 00:31:51.446249 master-0 kubenswrapper[23041]: I0308 00:31:51.443139 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Mar 08 00:31:51.446249 master-0 kubenswrapper[23041]: I0308 00:31:51.443286 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0" Mar 08 00:31:51.446249 master-0 kubenswrapper[23041]: I0308 00:31:51.443494 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:31:51.446249 master-0 kubenswrapper[23041]: I0308 00:31:51.443607 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:31:51.446249 master-0 kubenswrapper[23041]: I0308 00:31:51.444098 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:31:51.448299 master-0 kubenswrapper[23041]: I0308 00:31:51.448273 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:31:51.448618 master-0 kubenswrapper[23041]: I0308 00:31:51.448596 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:31:51.457146 master-0 kubenswrapper[23041]: I0308 00:31:51.457092 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 00:31:51.460591 master-0 kubenswrapper[23041]: I0308 00:31:51.460556 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0" Mar 08 00:31:51.465249 master-0 kubenswrapper[23041]: I0308 00:31:51.465188 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cbcb0196-be5c-44a4-9749-5df9fbeaa718-serving-cert\") pod \"controller-manager-5b4bdf67b6-8rdjs\" (UID: \"cbcb0196-be5c-44a4-9749-5df9fbeaa718\") " pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" Mar 08 00:31:51.466748 master-0 kubenswrapper[23041]: I0308 00:31:51.466688 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2040e5dc-b314-46a9-a61b-e80f1a046ce3-monitoring-plugin-cert\") pod \"monitoring-plugin-6db79546f6-gdz4k\" (UID: \"2040e5dc-b314-46a9-a61b-e80f1a046ce3\") " pod="openshift-monitoring/monitoring-plugin-6db79546f6-gdz4k" Mar 08 00:31:51.479565 master-0 kubenswrapper[23041]: I0308 00:31:51.479506 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 00:31:51.487943 master-0 kubenswrapper[23041]: I0308 00:31:51.487857 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbcb0196-be5c-44a4-9749-5df9fbeaa718-config\") pod \"controller-manager-5b4bdf67b6-8rdjs\" (UID: 
\"cbcb0196-be5c-44a4-9749-5df9fbeaa718\") " pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" Mar 08 00:31:51.496352 master-0 kubenswrapper[23041]: I0308 00:31:51.496321 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 08 00:31:51.499861 master-0 kubenswrapper[23041]: I0308 00:31:51.499827 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cbcb0196-be5c-44a4-9749-5df9fbeaa718-client-ca\") pod \"controller-manager-5b4bdf67b6-8rdjs\" (UID: \"cbcb0196-be5c-44a4-9749-5df9fbeaa718\") " pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" Mar 08 00:31:51.523333 master-0 kubenswrapper[23041]: I0308 00:31:51.523227 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 08 00:31:51.529826 master-0 kubenswrapper[23041]: I0308 00:31:51.529785 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cbcb0196-be5c-44a4-9749-5df9fbeaa718-proxy-ca-bundles\") pod \"controller-manager-5b4bdf67b6-8rdjs\" (UID: \"cbcb0196-be5c-44a4-9749-5df9fbeaa718\") " pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" Mar 08 00:31:51.534749 master-0 kubenswrapper[23041]: I0308 00:31:51.534708 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 00:31:51.535874 master-0 kubenswrapper[23041]: I0308 00:31:51.535839 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 00:31:51.536583 master-0 kubenswrapper[23041]: I0308 00:31:51.536559 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:31:51.536643 
master-0 kubenswrapper[23041]: I0308 00:31:51.536594 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:31:51.536643 master-0 kubenswrapper[23041]: I0308 00:31:51.536613 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:31:51.536706 master-0 kubenswrapper[23041]: I0308 00:31:51.536648 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:31:51.538592 master-0 kubenswrapper[23041]: I0308 00:31:51.538568 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 00:31:51.541517 master-0 kubenswrapper[23041]: I0308 00:31:51.541491 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:31:51.545490 master-0 kubenswrapper[23041]: I0308 00:31:51.545461 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:31:51.556981 master-0 kubenswrapper[23041]: I0308 00:31:51.556918 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 08 00:31:51.559497 master-0 kubenswrapper[23041]: I0308 00:31:51.559462 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0e52cbdc-1d46-4cc9-85ee-535aa449992f-iptables-alerter-script\") pod \"iptables-alerter-rfnqf\" (UID: \"0e52cbdc-1d46-4cc9-85ee-535aa449992f\") " pod="openshift-network-operator/iptables-alerter-rfnqf" Mar 08 00:31:51.577139 master-0 kubenswrapper[23041]: I0308 00:31:51.577070 23041 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-dns"/"openshift-service-ca.crt" Mar 08 00:31:51.596860 master-0 kubenswrapper[23041]: I0308 00:31:51.596813 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 08 00:31:51.603538 master-0 kubenswrapper[23041]: I0308 00:31:51.603502 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e302bc0b-7560-4f84-813f-d966c2dbe47c-metrics-tls\") pod \"dns-default-jfjzg\" (UID: \"e302bc0b-7560-4f84-813f-d966c2dbe47c\") " pod="openshift-dns/dns-default-jfjzg" Mar 08 00:31:51.616798 master-0 kubenswrapper[23041]: I0308 00:31:51.616758 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 08 00:31:51.627908 master-0 kubenswrapper[23041]: I0308 00:31:51.627849 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e302bc0b-7560-4f84-813f-d966c2dbe47c-config-volume\") pod \"dns-default-jfjzg\" (UID: \"e302bc0b-7560-4f84-813f-d966c2dbe47c\") " pod="openshift-dns/dns-default-jfjzg" Mar 08 00:31:51.636879 master-0 kubenswrapper[23041]: I0308 00:31:51.636826 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 08 00:31:51.657864 master-0 kubenswrapper[23041]: I0308 00:31:51.657754 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 08 00:31:51.677133 master-0 kubenswrapper[23041]: I0308 00:31:51.677085 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Mar 08 00:31:51.678860 master-0 kubenswrapper[23041]: I0308 00:31:51.678835 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: 
\"kubernetes.io/secret/0d0cb126-341c-4215-ad2e-a008193cc0b5-tls-certificates\") pod \"prometheus-operator-admission-webhook-8464df8497-st8tx\" (UID: \"0d0cb126-341c-4215-ad2e-a008193cc0b5\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-st8tx" Mar 08 00:31:51.697956 master-0 kubenswrapper[23041]: I0308 00:31:51.697901 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 08 00:31:51.718829 master-0 kubenswrapper[23041]: I0308 00:31:51.718774 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 08 00:31:51.729380 master-0 kubenswrapper[23041]: I0308 00:31:51.729323 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6-default-certificate\") pod \"router-default-79f8cd6fdd-r6nkv\" (UID: \"6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6\") " pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" Mar 08 00:31:51.737460 master-0 kubenswrapper[23041]: I0308 00:31:51.737418 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 08 00:31:51.745176 master-0 kubenswrapper[23041]: I0308 00:31:51.745121 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-st8tx" Mar 08 00:31:51.746288 master-0 kubenswrapper[23041]: I0308 00:31:51.746248 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6-stats-auth\") pod \"router-default-79f8cd6fdd-r6nkv\" (UID: \"6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6\") " pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" Mar 08 00:31:51.749573 master-0 kubenswrapper[23041]: I0308 00:31:51.749548 23041 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-st8tx" Mar 08 00:31:51.756302 master-0 kubenswrapper[23041]: I0308 00:31:51.756261 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 08 00:31:51.758253 master-0 kubenswrapper[23041]: I0308 00:31:51.758195 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6-service-ca-bundle\") pod \"router-default-79f8cd6fdd-r6nkv\" (UID: \"6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6\") " pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" Mar 08 00:31:51.776736 master-0 kubenswrapper[23041]: I0308 00:31:51.776699 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 08 00:31:51.796274 master-0 kubenswrapper[23041]: I0308 00:31:51.796228 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 08 00:31:51.804768 master-0 kubenswrapper[23041]: I0308 00:31:51.804732 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a229b84-65bd-493b-90dd-b8194f842dc8-serving-cert\") pod \"cluster-version-operator-8c9c967c7-vm7rj\" (UID: \"5a229b84-65bd-493b-90dd-b8194f842dc8\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-vm7rj" Mar 08 00:31:51.816787 master-0 kubenswrapper[23041]: I0308 00:31:51.816748 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 08 00:31:51.818771 master-0 kubenswrapper[23041]: I0308 00:31:51.818688 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/5a229b84-65bd-493b-90dd-b8194f842dc8-service-ca\") pod \"cluster-version-operator-8c9c967c7-vm7rj\" (UID: \"5a229b84-65bd-493b-90dd-b8194f842dc8\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-vm7rj" Mar 08 00:31:51.836177 master-0 kubenswrapper[23041]: I0308 00:31:51.836148 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 08 00:31:51.839678 master-0 kubenswrapper[23041]: I0308 00:31:51.839629 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70892c23-554d-466c-a526-90a799439fe0-serving-cert\") pod \"route-controller-manager-544c885f6d-dr4gh\" (UID: \"70892c23-554d-466c-a526-90a799439fe0\") " pod="openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh" Mar 08 00:31:51.857728 master-0 kubenswrapper[23041]: I0308 00:31:51.857664 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 08 00:31:51.860276 master-0 kubenswrapper[23041]: I0308 00:31:51.860228 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/70892c23-554d-466c-a526-90a799439fe0-client-ca\") pod \"route-controller-manager-544c885f6d-dr4gh\" (UID: \"70892c23-554d-466c-a526-90a799439fe0\") " pod="openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh" Mar 08 00:31:51.876602 master-0 kubenswrapper[23041]: I0308 00:31:51.876540 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 08 00:31:51.898691 master-0 kubenswrapper[23041]: I0308 00:31:51.898622 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 08 00:31:51.916508 master-0 kubenswrapper[23041]: I0308 
00:31:51.916394 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 08 00:31:51.922343 master-0 kubenswrapper[23041]: I0308 00:31:51.922276 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70892c23-554d-466c-a526-90a799439fe0-config\") pod \"route-controller-manager-544c885f6d-dr4gh\" (UID: \"70892c23-554d-466c-a526-90a799439fe0\") " pod="openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh" Mar 08 00:31:51.936011 master-0 kubenswrapper[23041]: I0308 00:31:51.935953 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-8z76k" Mar 08 00:31:51.956455 master-0 kubenswrapper[23041]: I0308 00:31:51.956414 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt" Mar 08 00:31:51.976035 master-0 kubenswrapper[23041]: I0308 00:31:51.975983 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 08 00:31:51.977119 master-0 kubenswrapper[23041]: I0308 00:31:51.977077 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6-metrics-certs\") pod \"router-default-79f8cd6fdd-r6nkv\" (UID: \"6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6\") " pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" Mar 08 00:31:52.002685 master-0 kubenswrapper[23041]: I0308 00:31:52.002629 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle" Mar 08 00:31:52.016529 master-0 kubenswrapper[23041]: I0308 00:31:52.016477 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt" Mar 08 
00:31:52.024111 master-0 kubenswrapper[23041]: I0308 00:31:52.023854 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/1bb8fea7-71ca-43a3-839d-9c1459bf8dfa-ca-certs\") pod \"operator-controller-controller-manager-6598bfb6c4-7nhvs\" (UID: \"1bb8fea7-71ca-43a3-839d-9c1459bf8dfa\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" Mar 08 00:31:52.035834 master-0 kubenswrapper[23041]: I0308 00:31:52.035782 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 08 00:31:52.056475 master-0 kubenswrapper[23041]: I0308 00:31:52.056396 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-g5h9b" Mar 08 00:31:52.076314 master-0 kubenswrapper[23041]: I0308 00:31:52.076266 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert" Mar 08 00:31:52.079520 master-0 kubenswrapper[23041]: I0308 00:31:52.079462 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d2e1686-3a30-4021-9c03-02e472bc6ff3-cert\") pod \"cluster-autoscaler-operator-69576476f7-dpg4q\" (UID: \"3d2e1686-3a30-4021-9c03-02e472bc6ff3\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dpg4q" Mar 08 00:31:52.097992 master-0 kubenswrapper[23041]: I0308 00:31:52.097930 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert" Mar 08 00:31:52.099158 master-0 kubenswrapper[23041]: I0308 00:31:52.099117 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-storage-operator-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0f496486-70d5-4c5c-b4f3-6cc19427762f-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-6fbfc8dc8f-sdsks\" (UID: \"0f496486-70d5-4c5c-b4f3-6cc19427762f\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-sdsks" Mar 08 00:31:52.125551 master-0 kubenswrapper[23041]: I0308 00:31:52.125481 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator" Mar 08 00:31:52.126400 master-0 kubenswrapper[23041]: I0308 00:31:52.126372 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3d2e1686-3a30-4021-9c03-02e472bc6ff3-auth-proxy-config\") pod \"cluster-autoscaler-operator-69576476f7-dpg4q\" (UID: \"3d2e1686-3a30-4021-9c03-02e472bc6ff3\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dpg4q" Mar 08 00:31:52.137269 master-0 kubenswrapper[23041]: I0308 00:31:52.137224 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 08 00:31:52.155010 master-0 kubenswrapper[23041]: I0308 00:31:52.154665 23041 request.go:700] Waited for 1.00390017s due to client-side throttling, not priority and fairness, request: GET:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cloud-credential-operator/secrets?fieldSelector=metadata.name%3Dcloud-credential-operator-dockercfg-b4z2l&limit=500&resourceVersion=0 Mar 08 00:31:52.156257 master-0 kubenswrapper[23041]: I0308 00:31:52.156183 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-b4z2l" Mar 08 00:31:52.166361 master-0 kubenswrapper[23041]: I0308 00:31:52.166312 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 00:31:52.167002 master-0 kubenswrapper[23041]: I0308 00:31:52.166883 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/metrics-server-6474759988-dnw4m" podUID="0101c4ce-fd58-4ddb-94f7-abb8b2293cdb" containerName="metrics-server" containerID="cri-o://d10ba8d248cc13e58fc18237bf3fc8704307376acdb97eeeff019b2173aa233c" gracePeriod=170 Mar 08 00:31:52.177366 master-0 kubenswrapper[23041]: I0308 00:31:52.177296 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt" Mar 08 00:31:52.179539 master-0 kubenswrapper[23041]: I0308 00:31:52.179480 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 00:31:52.196574 master-0 kubenswrapper[23041]: I0308 00:31:52.196532 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Mar 08 00:31:52.204221 master-0 kubenswrapper[23041]: I0308 00:31:52.204155 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/e78057cd-5120-4a12-934d-9fed51e1bdc0-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-55d85b7b47-nrb7q\" (UID: \"e78057cd-5120-4a12-934d-9fed51e1bdc0\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-nrb7q" Mar 08 00:31:52.221593 master-0 kubenswrapper[23041]: I0308 00:31:52.221509 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca" Mar 08 00:31:52.223847 master-0 kubenswrapper[23041]: I0308 00:31:52.223808 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cco-trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/e78057cd-5120-4a12-934d-9fed51e1bdc0-cco-trusted-ca\") pod \"cloud-credential-operator-55d85b7b47-nrb7q\" (UID: \"e78057cd-5120-4a12-934d-9fed51e1bdc0\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-nrb7q" Mar 08 00:31:52.227856 master-0 kubenswrapper[23041]: E0308 00:31:52.227738 23041 secret.go:189] Couldn't get secret openshift-cluster-machine-approver/machine-approver-tls: failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.227963 master-0 kubenswrapper[23041]: E0308 00:31:52.227888 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7317ceda-df6f-4826-aa1a-15304c2b0fcd-machine-approver-tls podName:7317ceda-df6f-4826-aa1a-15304c2b0fcd nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.72785351 +0000 UTC m=+18.200690094 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "machine-approver-tls" (UniqueName: "kubernetes.io/secret/7317ceda-df6f-4826-aa1a-15304c2b0fcd-machine-approver-tls") pod "machine-approver-754bdc9f9d-xpl2b" (UID: "7317ceda-df6f-4826-aa1a-15304c2b0fcd") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.228038 master-0 kubenswrapper[23041]: E0308 00:31:52.227970 23041 configmap.go:193] Couldn't get configMap openshift-cloud-controller-manager-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:52.228038 master-0 kubenswrapper[23041]: E0308 00:31:52.228019 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3b4f8517-1e54-4b41-ba6b-6c56fe66831a-auth-proxy-config podName:3b4f8517-1e54-4b41-ba6b-6c56fe66831a nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.728009624 +0000 UTC m=+18.200846178 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/3b4f8517-1e54-4b41-ba6b-6c56fe66831a-auth-proxy-config") pod "cluster-cloud-controller-manager-operator-7c8df9b496-nwttq" (UID: "3b4f8517-1e54-4b41-ba6b-6c56fe66831a") : failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:52.228263 master-0 kubenswrapper[23041]: E0308 00:31:52.228091 23041 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/machine-approver-config: failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:52.228263 master-0 kubenswrapper[23041]: E0308 00:31:52.228221 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7317ceda-df6f-4826-aa1a-15304c2b0fcd-config podName:7317ceda-df6f-4826-aa1a-15304c2b0fcd nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.728178218 +0000 UTC m=+18.201014822 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/7317ceda-df6f-4826-aa1a-15304c2b0fcd-config") pod "machine-approver-754bdc9f9d-xpl2b" (UID: "7317ceda-df6f-4826-aa1a-15304c2b0fcd") : failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:52.228941 master-0 kubenswrapper[23041]: E0308 00:31:52.228892 23041 secret.go:189] Couldn't get secret openshift-apiserver/encryption-config-1: failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.229032 master-0 kubenswrapper[23041]: E0308 00:31:52.228964 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1751db13-b792-43e2-8459-d1d4a0164dfb-encryption-config podName:1751db13-b792-43e2-8459-d1d4a0164dfb nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.728944477 +0000 UTC m=+18.201781041 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "encryption-config" (UniqueName: "kubernetes.io/secret/1751db13-b792-43e2-8459-d1d4a0164dfb-encryption-config") pod "apiserver-85cb8cb9bb-bmx44" (UID: "1751db13-b792-43e2-8459-d1d4a0164dfb") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.231255 master-0 kubenswrapper[23041]: E0308 00:31:52.231225 23041 secret.go:189] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.231330 master-0 kubenswrapper[23041]: E0308 00:31:52.231273 23041 secret.go:189] Couldn't get secret openshift-machine-config-operator/proxy-tls: failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.231330 master-0 kubenswrapper[23041]: E0308 00:31:52.231304 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a68ad726-392e-4a7a-a384-409108df9c8b-node-bootstrap-token podName:a68ad726-392e-4a7a-a384-409108df9c8b nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.731285953 +0000 UTC m=+18.204122507 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/a68ad726-392e-4a7a-a384-409108df9c8b-node-bootstrap-token") pod "machine-config-server-wkt98" (UID: "a68ad726-392e-4a7a-a384-409108df9c8b") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.231330 master-0 kubenswrapper[23041]: E0308 00:31:52.231304 23041 secret.go:189] Couldn't get secret openshift-monitoring/metrics-client-certs: failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.231486 master-0 kubenswrapper[23041]: E0308 00:31:52.231336 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7-proxy-tls podName:ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7 nodeName:}" failed. 
No retries permitted until 2026-03-08 00:31:52.731320904 +0000 UTC m=+18.204157498 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7-proxy-tls") pod "machine-config-daemon-k7pnc" (UID: "ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.231486 master-0 kubenswrapper[23041]: E0308 00:31:52.231373 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-secret-metrics-client-certs podName:0101c4ce-fd58-4ddb-94f7-abb8b2293cdb nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.731356965 +0000 UTC m=+18.204193569 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-metrics-client-certs" (UniqueName: "kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-secret-metrics-client-certs") pod "metrics-server-6474759988-dnw4m" (UID: "0101c4ce-fd58-4ddb-94f7-abb8b2293cdb") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.233619 master-0 kubenswrapper[23041]: E0308 00:31:52.233596 23041 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:52.233678 master-0 kubenswrapper[23041]: E0308 00:31:52.233643 23041 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.233678 master-0 kubenswrapper[23041]: E0308 00:31:52.233668 23041 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:52.233741 master-0 kubenswrapper[23041]: E0308 00:31:52.233689 23041 secret.go:189] Couldn't get secret openshift-machine-config-operator/mcc-proxy-tls: failed to 
sync secret cache: timed out waiting for the condition Mar 08 00:31:52.233741 master-0 kubenswrapper[23041]: E0308 00:31:52.233649 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0-mcc-auth-proxy-config podName:2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0 nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.73363714 +0000 UTC m=+18.206473704 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "mcc-auth-proxy-config" (UniqueName: "kubernetes.io/configmap/2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0-mcc-auth-proxy-config") pod "machine-config-controller-ff46b7bdf-z5fkp" (UID: "2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0") : failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:52.233884 master-0 kubenswrapper[23041]: E0308 00:31:52.233749 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ae061e84-5e6a-415c-a735-fa14add7318a-metrics-client-ca podName:ae061e84-5e6a-415c-a735-fa14add7318a nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.733730522 +0000 UTC m=+18.206567076 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/ae061e84-5e6a-415c-a735-fa14add7318a-metrics-client-ca") pod "kube-state-metrics-68b88f8cb5-qjxhc" (UID: "ae061e84-5e6a-415c-a735-fa14add7318a") : failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:52.233884 master-0 kubenswrapper[23041]: E0308 00:31:52.233765 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0-proxy-tls podName:2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0 nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.733756823 +0000 UTC m=+18.206593377 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0-proxy-tls") pod "machine-config-controller-ff46b7bdf-z5fkp" (UID: "2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.233884 master-0 kubenswrapper[23041]: E0308 00:31:52.233776 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-node-exporter-tls podName:24ef1fb7-c8a1-4b50-b89f-2a81848ebb25 nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.733771763 +0000 UTC m=+18.206608317 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-node-exporter-tls") pod "node-exporter-bx9dn" (UID: "24ef1fb7-c8a1-4b50-b89f-2a81848ebb25") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.234901 master-0 kubenswrapper[23041]: E0308 00:31:52.234866 23041 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.234954 master-0 kubenswrapper[23041]: E0308 00:31:52.234905 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1da0c222-4b59-424f-9817-48673083df00-webhook-certs podName:1da0c222-4b59-424f-9817-48673083df00 nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.73489626 +0000 UTC m=+18.207732814 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/1da0c222-4b59-424f-9817-48673083df00-webhook-certs") pod "multus-admission-controller-7769569c45-5n69x" (UID: "1da0c222-4b59-424f-9817-48673083df00") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.235029 master-0 kubenswrapper[23041]: E0308 00:31:52.235005 23041 secret.go:189] Couldn't get secret openshift-catalogd/catalogserver-cert: failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.235060 master-0 kubenswrapper[23041]: E0308 00:31:52.235047 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d01c21a1-6c2c-49a7-9d85-254662851838-catalogserver-certs podName:d01c21a1-6c2c-49a7-9d85-254662851838 nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.735037873 +0000 UTC m=+18.207874427 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "catalogserver-certs" (UniqueName: "kubernetes.io/secret/d01c21a1-6c2c-49a7-9d85-254662851838-catalogserver-certs") pod "catalogd-controller-manager-7f8b8b6f4c-w2q2q" (UID: "d01c21a1-6c2c-49a7-9d85-254662851838") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.235103 master-0 kubenswrapper[23041]: E0308 00:31:52.235082 23041 projected.go:288] Couldn't get configMap openshift-catalogd/catalogd-trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:52.236100 master-0 kubenswrapper[23041]: E0308 00:31:52.236071 23041 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:52.236177 master-0 kubenswrapper[23041]: E0308 00:31:52.236128 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65-metrics-client-ca podName:5b9f4db1-3ba9-49a5-9a65-1d770ee59a65 nodeName:}" 
failed. No retries permitted until 2026-03-08 00:31:52.736118389 +0000 UTC m=+18.208954943 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65-metrics-client-ca") pod "openshift-state-metrics-74cc79fd76-s9b9v" (UID: "5b9f4db1-3ba9-49a5-9a65-1d770ee59a65") : failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:52.236379 master-0 kubenswrapper[23041]: I0308 00:31:52.236350 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt" Mar 08 00:31:52.236606 master-0 kubenswrapper[23041]: E0308 00:31:52.236581 23041 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-1: failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:52.236651 master-0 kubenswrapper[23041]: E0308 00:31:52.236621 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-audit podName:1751db13-b792-43e2-8459-d1d4a0164dfb nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.736612811 +0000 UTC m=+18.209449365 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-audit") pod "apiserver-85cb8cb9bb-bmx44" (UID: "1751db13-b792-43e2-8459-d1d4a0164dfb") : failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:52.236939 master-0 kubenswrapper[23041]: E0308 00:31:52.236909 23041 configmap.go:193] Couldn't get configMap openshift-apiserver/etcd-serving-ca: failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:52.237031 master-0 kubenswrapper[23041]: E0308 00:31:52.236945 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-etcd-serving-ca podName:1751db13-b792-43e2-8459-d1d4a0164dfb nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.736937689 +0000 UTC m=+18.209774243 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-etcd-serving-ca") pod "apiserver-85cb8cb9bb-bmx44" (UID: "1751db13-b792-43e2-8459-d1d4a0164dfb") : failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:52.237031 master-0 kubenswrapper[23041]: E0308 00:31:52.236947 23041 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-ffspe3f0nbfal: failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.237031 master-0 kubenswrapper[23041]: E0308 00:31:52.237005 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-client-ca-bundle podName:0101c4ce-fd58-4ddb-94f7-abb8b2293cdb nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.73699006 +0000 UTC m=+18.209826624 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca-bundle" (UniqueName: "kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-client-ca-bundle") pod "metrics-server-6474759988-dnw4m" (UID: "0101c4ce-fd58-4ddb-94f7-abb8b2293cdb") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.237031 master-0 kubenswrapper[23041]: E0308 00:31:52.237015 23041 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.237149 master-0 kubenswrapper[23041]: E0308 00:31:52.237051 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d810f7f-258a-47ce-9f99-7b1d93388aee-proxy-tls podName:9d810f7f-258a-47ce-9f99-7b1d93388aee nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.737037861 +0000 UTC m=+18.209874415 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9d810f7f-258a-47ce-9f99-7b1d93388aee-proxy-tls") pod "machine-config-operator-fdb5c78b5-5nbfk" (UID: "9d810f7f-258a-47ce-9f99-7b1d93388aee") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.237944 master-0 kubenswrapper[23041]: E0308 00:31:52.237902 23041 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:52.237944 master-0 kubenswrapper[23041]: E0308 00:31:52.237921 23041 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:52.238053 master-0 kubenswrapper[23041]: E0308 00:31:52.237906 23041 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.238053 master-0 kubenswrapper[23041]: E0308 00:31:52.237968 23041 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-metrics-client-ca podName:24ef1fb7-c8a1-4b50-b89f-2a81848ebb25 nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.737952993 +0000 UTC m=+18.210789567 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-metrics-client-ca") pod "node-exporter-bx9dn" (UID: "24ef1fb7-c8a1-4b50-b89f-2a81848ebb25") : failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:52.238053 master-0 kubenswrapper[23041]: E0308 00:31:52.237993 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1751db13-b792-43e2-8459-d1d4a0164dfb-serving-cert podName:1751db13-b792-43e2-8459-d1d4a0164dfb nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.737979844 +0000 UTC m=+18.210816398 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/1751db13-b792-43e2-8459-d1d4a0164dfb-serving-cert") pod "apiserver-85cb8cb9bb-bmx44" (UID: "1751db13-b792-43e2-8459-d1d4a0164dfb") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.238053 master-0 kubenswrapper[23041]: E0308 00:31:52.238009 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7-mcd-auth-proxy-config podName:ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7 nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.738003655 +0000 UTC m=+18.210840209 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "mcd-auth-proxy-config" (UniqueName: "kubernetes.io/configmap/ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7-mcd-auth-proxy-config") pod "machine-config-daemon-k7pnc" (UID: "ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7") : failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:52.238053 master-0 kubenswrapper[23041]: E0308 00:31:52.237902 23041 configmap.go:193] Couldn't get configMap openshift-apiserver/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:52.238053 master-0 kubenswrapper[23041]: E0308 00:31:52.238033 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-trusted-ca-bundle podName:1751db13-b792-43e2-8459-d1d4a0164dfb nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.738028425 +0000 UTC m=+18.210864979 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-trusted-ca-bundle") pod "apiserver-85cb8cb9bb-bmx44" (UID: "1751db13-b792-43e2-8459-d1d4a0164dfb") : failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:52.238324 master-0 kubenswrapper[23041]: E0308 00:31:52.238165 23041 secret.go:189] Couldn't get secret openshift-machine-api/machine-api-operator-tls: failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.238324 master-0 kubenswrapper[23041]: E0308 00:31:52.238227 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7097f64-1709-4f76-a725-5a6c6cc5919b-machine-api-operator-tls podName:c7097f64-1709-4f76-a725-5a6c6cc5919b nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.73821779 +0000 UTC m=+18.211054344 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "machine-api-operator-tls" (UniqueName: "kubernetes.io/secret/c7097f64-1709-4f76-a725-5a6c6cc5919b-machine-api-operator-tls") pod "machine-api-operator-84bf6db4f9-bncfj" (UID: "c7097f64-1709-4f76-a725-5a6c6cc5919b") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.239337 master-0 kubenswrapper[23041]: E0308 00:31:52.239308 23041 configmap.go:193] Couldn't get configMap openshift-apiserver/config: failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:52.239433 master-0 kubenswrapper[23041]: E0308 00:31:52.239377 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-config podName:1751db13-b792-43e2-8459-d1d4a0164dfb nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.739361227 +0000 UTC m=+18.212197831 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-config") pod "apiserver-85cb8cb9bb-bmx44" (UID: "1751db13-b792-43e2-8459-d1d4a0164dfb") : failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:52.239516 master-0 kubenswrapper[23041]: E0308 00:31:52.239430 23041 configmap.go:193] Couldn't get configMap openshift-cloud-controller-manager-operator/cloud-controller-manager-images: failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:52.239516 master-0 kubenswrapper[23041]: E0308 00:31:52.239477 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3b4f8517-1e54-4b41-ba6b-6c56fe66831a-images podName:3b4f8517-1e54-4b41-ba6b-6c56fe66831a nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.73946426 +0000 UTC m=+18.212300884 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/3b4f8517-1e54-4b41-ba6b-6c56fe66831a-images") pod "cluster-cloud-controller-manager-operator-7c8df9b496-nwttq" (UID: "3b4f8517-1e54-4b41-ba6b-6c56fe66831a") : failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:52.239516 master-0 kubenswrapper[23041]: E0308 00:31:52.239499 23041 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:52.239785 master-0 kubenswrapper[23041]: E0308 00:31:52.239518 23041 configmap.go:193] Couldn't get configMap openshift-monitoring/kube-state-metrics-custom-resource-state-configmap: failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:52.239785 master-0 kubenswrapper[23041]: E0308 00:31:52.239553 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e237ed52-5561-44c5-bcb1-de62691d6431-metrics-client-ca podName:e237ed52-5561-44c5-bcb1-de62691d6431 nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.739539032 +0000 UTC m=+18.212375656 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/e237ed52-5561-44c5-bcb1-de62691d6431-metrics-client-ca") pod "prometheus-operator-5ff8674d55-qxpv9" (UID: "e237ed52-5561-44c5-bcb1-de62691d6431") : failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:52.239785 master-0 kubenswrapper[23041]: E0308 00:31:52.239556 23041 configmap.go:193] Couldn't get configMap openshift-machine-api/cluster-baremetal-operator-images: failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:52.239785 master-0 kubenswrapper[23041]: E0308 00:31:52.239575 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ae061e84-5e6a-415c-a735-fa14add7318a-kube-state-metrics-custom-resource-state-configmap podName:ae061e84-5e6a-415c-a735-fa14add7318a nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.739564612 +0000 UTC m=+18.212401266 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-custom-resource-state-configmap" (UniqueName: "kubernetes.io/configmap/ae061e84-5e6a-415c-a735-fa14add7318a-kube-state-metrics-custom-resource-state-configmap") pod "kube-state-metrics-68b88f8cb5-qjxhc" (UID: "ae061e84-5e6a-415c-a735-fa14add7318a") : failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:52.239785 master-0 kubenswrapper[23041]: E0308 00:31:52.239574 23041 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.239785 master-0 kubenswrapper[23041]: E0308 00:31:52.239577 23041 configmap.go:193] Couldn't get configMap openshift-machine-api/machine-api-operator-images: failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:52.239785 master-0 kubenswrapper[23041]: E0308 00:31:52.239620 23041 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/84522c03-fd7b-4be7-9413-84e510b9dc5a-images podName:84522c03-fd7b-4be7-9413-84e510b9dc5a nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.739604163 +0000 UTC m=+18.212440797 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/84522c03-fd7b-4be7-9413-84e510b9dc5a-images") pod "cluster-baremetal-operator-5cdb4c5598-qldx6" (UID: "84522c03-fd7b-4be7-9413-84e510b9dc5a") : failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:52.239785 master-0 kubenswrapper[23041]: E0308 00:31:52.239660 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c7097f64-1709-4f76-a725-5a6c6cc5919b-images podName:c7097f64-1709-4f76-a725-5a6c6cc5919b nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.739646484 +0000 UTC m=+18.212483108 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/c7097f64-1709-4f76-a725-5a6c6cc5919b-images") pod "machine-api-operator-84bf6db4f9-bncfj" (UID: "c7097f64-1709-4f76-a725-5a6c6cc5919b") : failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:52.239785 master-0 kubenswrapper[23041]: E0308 00:31:52.239682 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65-openshift-state-metrics-tls podName:5b9f4db1-3ba9-49a5-9a65-1d770ee59a65 nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.739671555 +0000 UTC m=+18.212508199 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65-openshift-state-metrics-tls") pod "openshift-state-metrics-74cc79fd76-s9b9v" (UID: "5b9f4db1-3ba9-49a5-9a65-1d770ee59a65") : failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:52.240421 master-0 kubenswrapper[23041]: E0308 00:31:52.240387 23041 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:52.240495 master-0 kubenswrapper[23041]: E0308 00:31:52.240444 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d70f4efb-e61a-4e88-a271-2f4af21ecdf3-apiservice-cert podName:d70f4efb-e61a-4e88-a271-2f4af21ecdf3 nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.740428413 +0000 UTC m=+18.213264987 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/d70f4efb-e61a-4e88-a271-2f4af21ecdf3-apiservice-cert") pod "packageserver-9c44c86f9-rplwv" (UID: "d70f4efb-e61a-4e88-a271-2f4af21ecdf3") : failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:52.240495 master-0 kubenswrapper[23041]: E0308 00:31:52.240478 23041 configmap.go:193] Couldn't get configMap openshift-insights/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Mar 08 00:31:52.240613 master-0 kubenswrapper[23041]: E0308 00:31:52.240507 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/614f0a0f-5853-4cf6-bd3d-174141f0f1e2-trusted-ca-bundle podName:614f0a0f-5853-4cf6-bd3d-174141f0f1e2 nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.740498685 +0000 UTC m=+18.213335259 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/614f0a0f-5853-4cf6-bd3d-174141f0f1e2-trusted-ca-bundle") pod "insights-operator-8f89dfddd-brq9l" (UID: "614f0a0f-5853-4cf6-bd3d-174141f0f1e2") : failed to sync configmap cache: timed out waiting for the condition
Mar 08 00:31:52.240613 master-0 kubenswrapper[23041]: E0308 00:31:52.240526 23041 secret.go:189] Couldn't get secret openshift-service-ca/signing-key: failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:52.240613 master-0 kubenswrapper[23041]: E0308 00:31:52.240553 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f5539c1-fb87-42d6-b735-6de53421bb6b-signing-key podName:4f5539c1-fb87-42d6-b735-6de53421bb6b nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.740546276 +0000 UTC m=+18.213382840 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-key" (UniqueName: "kubernetes.io/secret/4f5539c1-fb87-42d6-b735-6de53421bb6b-signing-key") pod "service-ca-84bfdbbb7f-bc2m2" (UID: "4f5539c1-fb87-42d6-b735-6de53421bb6b") : failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:52.240613 master-0 kubenswrapper[23041]: E0308 00:31:52.240600 23041 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:52.240832 master-0 kubenswrapper[23041]: E0308 00:31:52.240629 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae061e84-5e6a-415c-a735-fa14add7318a-kube-state-metrics-kube-rbac-proxy-config podName:ae061e84-5e6a-415c-a735-fa14add7318a nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.740619998 +0000 UTC m=+18.213456572 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/ae061e84-5e6a-415c-a735-fa14add7318a-kube-state-metrics-kube-rbac-proxy-config") pod "kube-state-metrics-68b88f8cb5-qjxhc" (UID: "ae061e84-5e6a-415c-a735-fa14add7318a") : failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:52.241479 master-0 kubenswrapper[23041]: E0308 00:31:52.241442 23041 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:52.241539 master-0 kubenswrapper[23041]: E0308 00:31:52.241514 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3f42081-387d-4798-b981-ac232e851bb4-samples-operator-tls podName:e3f42081-387d-4798-b981-ac232e851bb4 nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.741494579 +0000 UTC m=+18.214331153 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e3f42081-387d-4798-b981-ac232e851bb4-samples-operator-tls") pod "cluster-samples-operator-664cb58b85-8lf4q" (UID: "e3f42081-387d-4798-b981-ac232e851bb4") : failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:52.245301 master-0 kubenswrapper[23041]: E0308 00:31:52.245278 23041 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition
Mar 08 00:31:52.245410 master-0 kubenswrapper[23041]: E0308 00:31:52.245329 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c7097f64-1709-4f76-a725-5a6c6cc5919b-config podName:c7097f64-1709-4f76-a725-5a6c6cc5919b nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.745319001 +0000 UTC m=+18.218155555 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/c7097f64-1709-4f76-a725-5a6c6cc5919b-config") pod "machine-api-operator-84bf6db4f9-bncfj" (UID: "c7097f64-1709-4f76-a725-5a6c6cc5919b") : failed to sync configmap cache: timed out waiting for the condition
Mar 08 00:31:52.245410 master-0 kubenswrapper[23041]: E0308 00:31:52.245356 23041 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition
Mar 08 00:31:52.245410 master-0 kubenswrapper[23041]: E0308 00:31:52.245386 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7317ceda-df6f-4826-aa1a-15304c2b0fcd-auth-proxy-config podName:7317ceda-df6f-4826-aa1a-15304c2b0fcd nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.745380502 +0000 UTC m=+18.218217056 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/7317ceda-df6f-4826-aa1a-15304c2b0fcd-auth-proxy-config") pod "machine-approver-754bdc9f9d-xpl2b" (UID: "7317ceda-df6f-4826-aa1a-15304c2b0fcd") : failed to sync configmap cache: timed out waiting for the condition
Mar 08 00:31:52.245410 master-0 kubenswrapper[23041]: E0308 00:31:52.245407 23041 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition
Mar 08 00:31:52.245410 master-0 kubenswrapper[23041]: E0308 00:31:52.245426 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4f5539c1-fb87-42d6-b735-6de53421bb6b-signing-cabundle podName:4f5539c1-fb87-42d6-b735-6de53421bb6b nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.745421123 +0000 UTC m=+18.218257677 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/4f5539c1-fb87-42d6-b735-6de53421bb6b-signing-cabundle") pod "service-ca-84bfdbbb7f-bc2m2" (UID: "4f5539c1-fb87-42d6-b735-6de53421bb6b") : failed to sync configmap cache: timed out waiting for the condition
Mar 08 00:31:52.246426 master-0 kubenswrapper[23041]: E0308 00:31:52.246393 23041 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition
Mar 08 00:31:52.246498 master-0 kubenswrapper[23041]: E0308 00:31:52.246440 23041 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-server-audit-profiles: failed to sync configmap cache: timed out waiting for the condition
Mar 08 00:31:52.246498 master-0 kubenswrapper[23041]: E0308 00:31:52.246473 23041 configmap.go:193] Couldn't get configMap openshift-insights/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Mar 08 00:31:52.246498 master-0 kubenswrapper[23041]: E0308 00:31:52.246475 23041 configmap.go:193] Couldn't get configMap openshift-monitoring/kubelet-serving-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Mar 08 00:31:52.246615 master-0 kubenswrapper[23041]: E0308 00:31:52.246502 23041 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:52.246615 master-0 kubenswrapper[23041]: E0308 00:31:52.246445 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9d810f7f-258a-47ce-9f99-7b1d93388aee-auth-proxy-config podName:9d810f7f-258a-47ce-9f99-7b1d93388aee nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.746437238 +0000 UTC m=+18.219273792 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/9d810f7f-258a-47ce-9f99-7b1d93388aee-auth-proxy-config") pod "machine-config-operator-fdb5c78b5-5nbfk" (UID: "9d810f7f-258a-47ce-9f99-7b1d93388aee") : failed to sync configmap cache: timed out waiting for the condition
Mar 08 00:31:52.246615 master-0 kubenswrapper[23041]: E0308 00:31:52.246540 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-metrics-server-audit-profiles podName:0101c4ce-fd58-4ddb-94f7-abb8b2293cdb nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.74653221 +0000 UTC m=+18.219368764 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-server-audit-profiles" (UniqueName: "kubernetes.io/configmap/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-metrics-server-audit-profiles") pod "metrics-server-6474759988-dnw4m" (UID: "0101c4ce-fd58-4ddb-94f7-abb8b2293cdb") : failed to sync configmap cache: timed out waiting for the condition
Mar 08 00:31:52.246615 master-0 kubenswrapper[23041]: E0308 00:31:52.246551 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/614f0a0f-5853-4cf6-bd3d-174141f0f1e2-service-ca-bundle podName:614f0a0f-5853-4cf6-bd3d-174141f0f1e2 nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.74654663 +0000 UTC m=+18.219383184 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/614f0a0f-5853-4cf6-bd3d-174141f0f1e2-service-ca-bundle") pod "insights-operator-8f89dfddd-brq9l" (UID: "614f0a0f-5853-4cf6-bd3d-174141f0f1e2") : failed to sync configmap cache: timed out waiting for the condition
Mar 08 00:31:52.246615 master-0 kubenswrapper[23041]: E0308 00:31:52.246563 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae061e84-5e6a-415c-a735-fa14add7318a-kube-state-metrics-tls podName:ae061e84-5e6a-415c-a735-fa14add7318a nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.746558351 +0000 UTC m=+18.219394905 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/ae061e84-5e6a-415c-a735-fa14add7318a-kube-state-metrics-tls") pod "kube-state-metrics-68b88f8cb5-qjxhc" (UID: "ae061e84-5e6a-415c-a735-fa14add7318a") : failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:52.246615 master-0 kubenswrapper[23041]: E0308 00:31:52.246575 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-configmap-kubelet-serving-ca-bundle podName:0101c4ce-fd58-4ddb-94f7-abb8b2293cdb nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.746569661 +0000 UTC m=+18.219406215 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "configmap-kubelet-serving-ca-bundle" (UniqueName: "kubernetes.io/configmap/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-configmap-kubelet-serving-ca-bundle") pod "metrics-server-6474759988-dnw4m" (UID: "0101c4ce-fd58-4ddb-94f7-abb8b2293cdb") : failed to sync configmap cache: timed out waiting for the condition
Mar 08 00:31:52.248137 master-0 kubenswrapper[23041]: E0308 00:31:52.248117 23041 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:52.248325 master-0 kubenswrapper[23041]: E0308 00:31:52.248148 23041 secret.go:189] Couldn't get secret openshift-apiserver/etcd-client: failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:52.248325 master-0 kubenswrapper[23041]: E0308 00:31:52.248166 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d70f4efb-e61a-4e88-a271-2f4af21ecdf3-webhook-cert podName:d70f4efb-e61a-4e88-a271-2f4af21ecdf3 nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.748152099 +0000 UTC m=+18.220988673 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/d70f4efb-e61a-4e88-a271-2f4af21ecdf3-webhook-cert") pod "packageserver-9c44c86f9-rplwv" (UID: "d70f4efb-e61a-4e88-a271-2f4af21ecdf3") : failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:52.248325 master-0 kubenswrapper[23041]: E0308 00:31:52.248164 23041 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/machine-config-operator-images: failed to sync configmap cache: timed out waiting for the condition
Mar 08 00:31:52.248325 master-0 kubenswrapper[23041]: E0308 00:31:52.248182 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1751db13-b792-43e2-8459-d1d4a0164dfb-etcd-client podName:1751db13-b792-43e2-8459-d1d4a0164dfb nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.748174939 +0000 UTC m=+18.221011493 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/1751db13-b792-43e2-8459-d1d4a0164dfb-etcd-client") pod "apiserver-85cb8cb9bb-bmx44" (UID: "1751db13-b792-43e2-8459-d1d4a0164dfb") : failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:52.248325 master-0 kubenswrapper[23041]: E0308 00:31:52.248189 23041 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:52.248325 master-0 kubenswrapper[23041]: E0308 00:31:52.248256 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9d810f7f-258a-47ce-9f99-7b1d93388aee-images podName:9d810f7f-258a-47ce-9f99-7b1d93388aee nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.74820279 +0000 UTC m=+18.221039344 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/9d810f7f-258a-47ce-9f99-7b1d93388aee-images") pod "machine-config-operator-fdb5c78b5-5nbfk" (UID: "9d810f7f-258a-47ce-9f99-7b1d93388aee") : failed to sync configmap cache: timed out waiting for the condition
Mar 08 00:31:52.248551 master-0 kubenswrapper[23041]: E0308 00:31:52.248372 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e237ed52-5561-44c5-bcb1-de62691d6431-prometheus-operator-kube-rbac-proxy-config podName:e237ed52-5561-44c5-bcb1-de62691d6431 nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.748327153 +0000 UTC m=+18.221163767 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/e237ed52-5561-44c5-bcb1-de62691d6431-prometheus-operator-kube-rbac-proxy-config") pod "prometheus-operator-5ff8674d55-qxpv9" (UID: "e237ed52-5561-44c5-bcb1-de62691d6431") : failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:52.249431 master-0 kubenswrapper[23041]: E0308 00:31:52.249399 23041 configmap.go:193] Couldn't get configMap openshift-machine-api/baremetal-kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition
Mar 08 00:31:52.249499 master-0 kubenswrapper[23041]: E0308 00:31:52.249430 23041 secret.go:189] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:52.249499 master-0 kubenswrapper[23041]: E0308 00:31:52.249445 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/84522c03-fd7b-4be7-9413-84e510b9dc5a-config podName:84522c03-fd7b-4be7-9413-84e510b9dc5a nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.74943549 +0000 UTC m=+18.222272044 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/84522c03-fd7b-4be7-9413-84e510b9dc5a-config") pod "cluster-baremetal-operator-5cdb4c5598-qldx6" (UID: "84522c03-fd7b-4be7-9413-84e510b9dc5a") : failed to sync configmap cache: timed out waiting for the condition
Mar 08 00:31:52.249499 master-0 kubenswrapper[23041]: E0308 00:31:52.249468 23041 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-tls: failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:52.249620 master-0 kubenswrapper[23041]: E0308 00:31:52.249508 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a68ad726-392e-4a7a-a384-409108df9c8b-certs podName:a68ad726-392e-4a7a-a384-409108df9c8b nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.749496051 +0000 UTC m=+18.222332615 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/a68ad726-392e-4a7a-a384-409108df9c8b-certs") pod "machine-config-server-wkt98" (UID: "a68ad726-392e-4a7a-a384-409108df9c8b") : failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:52.249620 master-0 kubenswrapper[23041]: E0308 00:31:52.249530 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-secret-metrics-server-tls podName:0101c4ce-fd58-4ddb-94f7-abb8b2293cdb nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.749521472 +0000 UTC m=+18.222358036 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-metrics-server-tls" (UniqueName: "kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-secret-metrics-server-tls") pod "metrics-server-6474759988-dnw4m" (UID: "0101c4ce-fd58-4ddb-94f7-abb8b2293cdb") : failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:52.251723 master-0 kubenswrapper[23041]: E0308 00:31:52.251686 23041 secret.go:189] Couldn't get secret openshift-insights/openshift-insights-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:52.251723 master-0 kubenswrapper[23041]: E0308 00:31:52.251709 23041 secret.go:189] Couldn't get secret openshift-cloud-controller-manager-operator/cloud-controller-manager-operator-tls: failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:52.251837 master-0 kubenswrapper[23041]: E0308 00:31:52.251740 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/614f0a0f-5853-4cf6-bd3d-174141f0f1e2-serving-cert podName:614f0a0f-5853-4cf6-bd3d-174141f0f1e2 nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.751729125 +0000 UTC m=+18.224565759 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/614f0a0f-5853-4cf6-bd3d-174141f0f1e2-serving-cert") pod "insights-operator-8f89dfddd-brq9l" (UID: "614f0a0f-5853-4cf6-bd3d-174141f0f1e2") : failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:52.251837 master-0 kubenswrapper[23041]: E0308 00:31:52.251769 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b4f8517-1e54-4b41-ba6b-6c56fe66831a-cloud-controller-manager-operator-tls podName:3b4f8517-1e54-4b41-ba6b-6c56fe66831a nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.751756516 +0000 UTC m=+18.224593070 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "cloud-controller-manager-operator-tls" (UniqueName: "kubernetes.io/secret/3b4f8517-1e54-4b41-ba6b-6c56fe66831a-cloud-controller-manager-operator-tls") pod "cluster-cloud-controller-manager-operator-7c8df9b496-nwttq" (UID: "3b4f8517-1e54-4b41-ba6b-6c56fe66831a") : failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:52.251837 master-0 kubenswrapper[23041]: E0308 00:31:52.251774 23041 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:52.251837 master-0 kubenswrapper[23041]: E0308 00:31:52.251785 23041 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:52.251837 master-0 kubenswrapper[23041]: E0308 00:31:52.251820 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84522c03-fd7b-4be7-9413-84e510b9dc5a-cluster-baremetal-operator-tls podName:84522c03-fd7b-4be7-9413-84e510b9dc5a nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.751813547 +0000 UTC m=+18.224650101 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/84522c03-fd7b-4be7-9413-84e510b9dc5a-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-qldx6" (UID: "84522c03-fd7b-4be7-9413-84e510b9dc5a") : failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:52.251837 master-0 kubenswrapper[23041]: E0308 00:31:52.251836 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e237ed52-5561-44c5-bcb1-de62691d6431-prometheus-operator-tls podName:e237ed52-5561-44c5-bcb1-de62691d6431 nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.751830477 +0000 UTC m=+18.224667021 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/e237ed52-5561-44c5-bcb1-de62691d6431-prometheus-operator-tls") pod "prometheus-operator-5ff8674d55-qxpv9" (UID: "e237ed52-5561-44c5-bcb1-de62691d6431") : failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:52.252427 master-0 kubenswrapper[23041]: E0308 00:31:52.252399 23041 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:52.252482 master-0 kubenswrapper[23041]: E0308 00:31:52.252454 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65-openshift-state-metrics-kube-rbac-proxy-config podName:5b9f4db1-3ba9-49a5-9a65-1d770ee59a65 nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.752441802 +0000 UTC m=+18.225278396 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65-openshift-state-metrics-kube-rbac-proxy-config") pod "openshift-state-metrics-74cc79fd76-s9b9v" (UID: "5b9f4db1-3ba9-49a5-9a65-1d770ee59a65") : failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:52.253306 master-0 kubenswrapper[23041]: E0308 00:31:52.253281 23041 secret.go:189] Couldn't get secret openshift-machine-api/control-plane-machine-set-operator-tls: failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:52.253407 master-0 kubenswrapper[23041]: E0308 00:31:52.253320 23041 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:52.253407 master-0 kubenswrapper[23041]: E0308 00:31:52.253340 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/460f09d8-a143-48d2-9db0-be247386984a-control-plane-machine-set-operator-tls podName:460f09d8-a143-48d2-9db0-be247386984a nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.753327984 +0000 UTC m=+18.226164578 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "control-plane-machine-set-operator-tls" (UniqueName: "kubernetes.io/secret/460f09d8-a143-48d2-9db0-be247386984a-control-plane-machine-set-operator-tls") pod "control-plane-machine-set-operator-6686554ddc-8krst" (UID: "460f09d8-a143-48d2-9db0-be247386984a") : failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:52.253407 master-0 kubenswrapper[23041]: E0308 00:31:52.253367 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84522c03-fd7b-4be7-9413-84e510b9dc5a-cert podName:84522c03-fd7b-4be7-9413-84e510b9dc5a nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.753355084 +0000 UTC m=+18.226191648 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84522c03-fd7b-4be7-9413-84e510b9dc5a-cert") pod "cluster-baremetal-operator-5cdb4c5598-qldx6" (UID: "84522c03-fd7b-4be7-9413-84e510b9dc5a") : failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:52.253407 master-0 kubenswrapper[23041]: E0308 00:31:52.253376 23041 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:52.253407 master-0 kubenswrapper[23041]: E0308 00:31:52.253383 23041 configmap.go:193] Couldn't get configMap openshift-apiserver/image-import-ca: failed to sync configmap cache: timed out waiting for the condition
Mar 08 00:31:52.253584 master-0 kubenswrapper[23041]: E0308 00:31:52.253415 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-node-exporter-kube-rbac-proxy-config podName:24ef1fb7-c8a1-4b50-b89f-2a81848ebb25 nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.753405885 +0000 UTC m=+18.226242519 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-node-exporter-kube-rbac-proxy-config") pod "node-exporter-bx9dn" (UID: "24ef1fb7-c8a1-4b50-b89f-2a81848ebb25") : failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:52.253584 master-0 kubenswrapper[23041]: E0308 00:31:52.253435 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-image-import-ca podName:1751db13-b792-43e2-8459-d1d4a0164dfb nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.753423216 +0000 UTC m=+18.226259820 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "image-import-ca" (UniqueName: "kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-image-import-ca") pod "apiserver-85cb8cb9bb-bmx44" (UID: "1751db13-b792-43e2-8459-d1d4a0164dfb") : failed to sync configmap cache: timed out waiting for the condition
Mar 08 00:31:52.255846 master-0 kubenswrapper[23041]: I0308 00:31:52.255780 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt"
Mar 08 00:31:52.276022 master-0 kubenswrapper[23041]: I0308 00:31:52.275949 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert"
Mar 08 00:31:52.290614 master-0 kubenswrapper[23041]: I0308 00:31:52.290563 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/66915251-1fdd-40f3-a59b-054776b214df-var-lock\") pod \"66915251-1fdd-40f3-a59b-054776b214df\" (UID: \"66915251-1fdd-40f3-a59b-054776b214df\") "
Mar 08 00:31:52.290732 master-0 kubenswrapper[23041]: I0308 00:31:52.290682 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/66915251-1fdd-40f3-a59b-054776b214df-var-lock" (OuterVolumeSpecName: "var-lock") pod "66915251-1fdd-40f3-a59b-054776b214df" (UID: "66915251-1fdd-40f3-a59b-054776b214df"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:31:52.290732 master-0 kubenswrapper[23041]: I0308 00:31:52.290716 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/66915251-1fdd-40f3-a59b-054776b214df-kubelet-dir\") pod \"66915251-1fdd-40f3-a59b-054776b214df\" (UID: \"66915251-1fdd-40f3-a59b-054776b214df\") "
Mar 08 00:31:52.290857 master-0 kubenswrapper[23041]: I0308 00:31:52.290815 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/66915251-1fdd-40f3-a59b-054776b214df-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "66915251-1fdd-40f3-a59b-054776b214df" (UID: "66915251-1fdd-40f3-a59b-054776b214df"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:31:52.292394 master-0 kubenswrapper[23041]: I0308 00:31:52.292373 23041 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/66915251-1fdd-40f3-a59b-054776b214df-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 08 00:31:52.292503 master-0 kubenswrapper[23041]: I0308 00:31:52.292395 23041 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/66915251-1fdd-40f3-a59b-054776b214df-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 08 00:31:52.302839 master-0 kubenswrapper[23041]: I0308 00:31:52.302783 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle"
Mar 08 00:31:52.316280 master-0 kubenswrapper[23041]: I0308 00:31:52.316231 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 08 00:31:52.337538 master-0 kubenswrapper[23041]: I0308 00:31:52.337463 23041 reflector.go:368] Caches populated for *v1.ConfigMap from
object-"openshift-catalogd"/"openshift-service-ca.crt"
Mar 08 00:31:52.347664 master-0 kubenswrapper[23041]: E0308 00:31:52.347596 23041 projected.go:194] Error preparing data for projected volume ca-certs for pod openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q: failed to sync configmap cache: timed out waiting for the condition
Mar 08 00:31:52.347907 master-0 kubenswrapper[23041]: E0308 00:31:52.347700 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d01c21a1-6c2c-49a7-9d85-254662851838-ca-certs podName:d01c21a1-6c2c-49a7-9d85-254662851838 nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.847679434 +0000 UTC m=+18.320515988 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "ca-certs" (UniqueName: "kubernetes.io/projected/d01c21a1-6c2c-49a7-9d85-254662851838-ca-certs") pod "catalogd-controller-manager-7f8b8b6f4c-w2q2q" (UID: "d01c21a1-6c2c-49a7-9d85-254662851838") : failed to sync configmap cache: timed out waiting for the condition
Mar 08 00:31:52.356656 master-0 kubenswrapper[23041]: E0308 00:31:52.354940 23041 configmap.go:193] Couldn't get configMap openshift-monitoring/telemeter-client-serving-certs-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Mar 08 00:31:52.356656 master-0 kubenswrapper[23041]: E0308 00:31:52.355013 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/795e6115-95cc-4c0a-a407-e0a6f14118e5-serving-certs-ca-bundle podName:795e6115-95cc-4c0a-a407-e0a6f14118e5 nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.85499606 +0000 UTC m=+18.327832604 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-certs-ca-bundle" (UniqueName: "kubernetes.io/configmap/795e6115-95cc-4c0a-a407-e0a6f14118e5-serving-certs-ca-bundle") pod "telemeter-client-6cfc594d97-x62fk" (UID: "795e6115-95cc-4c0a-a407-e0a6f14118e5") : failed to sync configmap cache: timed out waiting for the condition
Mar 08 00:31:52.356656 master-0 kubenswrapper[23041]: I0308 00:31:52.356028 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 08 00:31:52.356656 master-0 kubenswrapper[23041]: E0308 00:31:52.356336 23041 configmap.go:193] Couldn't get configMap openshift-console-operator/trusted-ca: failed to sync configmap cache: timed out waiting for the condition
Mar 08 00:31:52.356656 master-0 kubenswrapper[23041]: E0308 00:31:52.356415 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b22c3046-5193-4c1d-91c0-7c15745265be-trusted-ca podName:b22c3046-5193-4c1d-91c0-7c15745265be nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.856400854 +0000 UTC m=+18.329237408 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/b22c3046-5193-4c1d-91c0-7c15745265be-trusted-ca") pod "console-operator-6c7fb6b958-db7d8" (UID: "b22c3046-5193-4c1d-91c0-7c15745265be") : failed to sync configmap cache: timed out waiting for the condition
Mar 08 00:31:52.357051 master-0 kubenswrapper[23041]: E0308 00:31:52.356995 23041 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-server-audit-profiles: failed to sync configmap cache: timed out waiting for the condition
Mar 08 00:31:52.357051 master-0 kubenswrapper[23041]: E0308 00:31:52.357046 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-metrics-server-audit-profiles podName:c26f36ee-5dd4-40b7-8cb9-7f4835f120fd nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.857033779 +0000 UTC m=+18.329870343 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-server-audit-profiles" (UniqueName: "kubernetes.io/configmap/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-metrics-server-audit-profiles") pod "metrics-server-7b45f5889c-z48tj" (UID: "c26f36ee-5dd4-40b7-8cb9-7f4835f120fd") : failed to sync configmap cache: timed out waiting for the condition
Mar 08 00:31:52.357130 master-0 kubenswrapper[23041]: E0308 00:31:52.357049 23041 secret.go:189] Couldn't get secret openshift-monitoring/telemeter-client: failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:52.357130 master-0 kubenswrapper[23041]: E0308 00:31:52.357086 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/795e6115-95cc-4c0a-a407-e0a6f14118e5-secret-telemeter-client podName:795e6115-95cc-4c0a-a407-e0a6f14118e5 nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.85707725 +0000 UTC m=+18.329913804 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "secret-telemeter-client" (UniqueName: "kubernetes.io/secret/795e6115-95cc-4c0a-a407-e0a6f14118e5-secret-telemeter-client") pod "telemeter-client-6cfc594d97-x62fk" (UID: "795e6115-95cc-4c0a-a407-e0a6f14118e5") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.359630 master-0 kubenswrapper[23041]: E0308 00:31:52.359599 23041 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.359696 master-0 kubenswrapper[23041]: E0308 00:31:52.359654 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e884e46e-e520-4e0a-9f15-43d4b74af63e-cert podName:e884e46e-e520-4e0a-9f15-43d4b74af63e nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.859642192 +0000 UTC m=+18.332478746 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e884e46e-e520-4e0a-9f15-43d4b74af63e-cert") pod "ingress-canary-5qffz" (UID: "e884e46e-e520-4e0a-9f15-43d4b74af63e") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.360767 master-0 kubenswrapper[23041]: E0308 00:31:52.360732 23041 secret.go:189] Couldn't get secret openshift-monitoring/telemeter-client-tls: failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.360912 master-0 kubenswrapper[23041]: E0308 00:31:52.360752 23041 secret.go:189] Couldn't get secret openshift-monitoring/federate-client-certs: failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.360912 master-0 kubenswrapper[23041]: E0308 00:31:52.360859 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/795e6115-95cc-4c0a-a407-e0a6f14118e5-telemeter-client-tls podName:795e6115-95cc-4c0a-a407-e0a6f14118e5 nodeName:}" failed. 
No retries permitted until 2026-03-08 00:31:52.860839911 +0000 UTC m=+18.333676465 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "telemeter-client-tls" (UniqueName: "kubernetes.io/secret/795e6115-95cc-4c0a-a407-e0a6f14118e5-telemeter-client-tls") pod "telemeter-client-6cfc594d97-x62fk" (UID: "795e6115-95cc-4c0a-a407-e0a6f14118e5") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.360912 master-0 kubenswrapper[23041]: E0308 00:31:52.360880 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/795e6115-95cc-4c0a-a407-e0a6f14118e5-federate-client-tls podName:795e6115-95cc-4c0a-a407-e0a6f14118e5 nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.860873042 +0000 UTC m=+18.333709596 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "federate-client-tls" (UniqueName: "kubernetes.io/secret/795e6115-95cc-4c0a-a407-e0a6f14118e5-federate-client-tls") pod "telemeter-client-6cfc594d97-x62fk" (UID: "795e6115-95cc-4c0a-a407-e0a6f14118e5") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.361886 master-0 kubenswrapper[23041]: E0308 00:31:52.361856 23041 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-tls: failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.361943 master-0 kubenswrapper[23041]: E0308 00:31:52.361905 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-secret-metrics-server-tls podName:c26f36ee-5dd4-40b7-8cb9-7f4835f120fd nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.861894466 +0000 UTC m=+18.334731020 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-metrics-server-tls" (UniqueName: "kubernetes.io/secret/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-secret-metrics-server-tls") pod "metrics-server-7b45f5889c-z48tj" (UID: "c26f36ee-5dd4-40b7-8cb9-7f4835f120fd") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.361984 master-0 kubenswrapper[23041]: E0308 00:31:52.361944 23041 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-5fe8510kelpgf: failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.361984 master-0 kubenswrapper[23041]: E0308 00:31:52.361968 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-client-ca-bundle podName:c26f36ee-5dd4-40b7-8cb9-7f4835f120fd nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.861961878 +0000 UTC m=+18.334798432 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca-bundle" (UniqueName: "kubernetes.io/secret/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-client-ca-bundle") pod "metrics-server-7b45f5889c-z48tj" (UID: "c26f36ee-5dd4-40b7-8cb9-7f4835f120fd") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.361984 master-0 kubenswrapper[23041]: E0308 00:31:52.361967 23041 secret.go:189] Couldn't get secret openshift-monitoring/telemeter-client-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.362072 master-0 kubenswrapper[23041]: E0308 00:31:52.362010 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/795e6115-95cc-4c0a-a407-e0a6f14118e5-secret-telemeter-client-kube-rbac-proxy-config podName:795e6115-95cc-4c0a-a407-e0a6f14118e5 nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.862000829 +0000 UTC m=+18.334837383 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-telemeter-client-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/795e6115-95cc-4c0a-a407-e0a6f14118e5-secret-telemeter-client-kube-rbac-proxy-config") pod "telemeter-client-6cfc594d97-x62fk" (UID: "795e6115-95cc-4c0a-a407-e0a6f14118e5") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.362072 master-0 kubenswrapper[23041]: E0308 00:31:52.362018 23041 configmap.go:193] Couldn't get configMap openshift-console-operator/console-operator-config: failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:52.362072 master-0 kubenswrapper[23041]: E0308 00:31:52.362019 23041 configmap.go:193] Couldn't get configMap openshift-monitoring/telemeter-trusted-ca-bundle-8i12ta5c71j38: failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:52.362072 master-0 kubenswrapper[23041]: E0308 00:31:52.362044 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b22c3046-5193-4c1d-91c0-7c15745265be-config podName:b22c3046-5193-4c1d-91c0-7c15745265be nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.8620357 +0000 UTC m=+18.334872254 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/b22c3046-5193-4c1d-91c0-7c15745265be-config") pod "console-operator-6c7fb6b958-db7d8" (UID: "b22c3046-5193-4c1d-91c0-7c15745265be") : failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:52.362072 master-0 kubenswrapper[23041]: E0308 00:31:52.362067 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/795e6115-95cc-4c0a-a407-e0a6f14118e5-telemeter-trusted-ca-bundle podName:795e6115-95cc-4c0a-a407-e0a6f14118e5 nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.8620504 +0000 UTC m=+18.334886954 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "telemeter-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/795e6115-95cc-4c0a-a407-e0a6f14118e5-telemeter-trusted-ca-bundle") pod "telemeter-client-6cfc594d97-x62fk" (UID: "795e6115-95cc-4c0a-a407-e0a6f14118e5") : failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:52.362072 master-0 kubenswrapper[23041]: E0308 00:31:52.362067 23041 secret.go:189] Couldn't get secret openshift-console-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.362272 master-0 kubenswrapper[23041]: E0308 00:31:52.362106 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b22c3046-5193-4c1d-91c0-7c15745265be-serving-cert podName:b22c3046-5193-4c1d-91c0-7c15745265be nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.862097221 +0000 UTC m=+18.334933845 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/b22c3046-5193-4c1d-91c0-7c15745265be-serving-cert") pod "console-operator-6c7fb6b958-db7d8" (UID: "b22c3046-5193-4c1d-91c0-7c15745265be") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.366312 master-0 kubenswrapper[23041]: E0308 00:31:52.366250 23041 configmap.go:193] Couldn't get configMap openshift-monitoring/kubelet-serving-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:52.366312 master-0 kubenswrapper[23041]: E0308 00:31:52.366267 23041 secret.go:189] Couldn't get secret openshift-monitoring/metrics-client-certs: failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.366312 master-0 kubenswrapper[23041]: E0308 00:31:52.366292 23041 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:52.366432 master-0 kubenswrapper[23041]: E0308 
00:31:52.366296 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-configmap-kubelet-serving-ca-bundle podName:c26f36ee-5dd4-40b7-8cb9-7f4835f120fd nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.866285062 +0000 UTC m=+18.339121616 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "configmap-kubelet-serving-ca-bundle" (UniqueName: "kubernetes.io/configmap/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-configmap-kubelet-serving-ca-bundle") pod "metrics-server-7b45f5889c-z48tj" (UID: "c26f36ee-5dd4-40b7-8cb9-7f4835f120fd") : failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:52.366432 master-0 kubenswrapper[23041]: E0308 00:31:52.366336 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-secret-metrics-client-certs podName:c26f36ee-5dd4-40b7-8cb9-7f4835f120fd nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.866327523 +0000 UTC m=+18.339164077 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-metrics-client-certs" (UniqueName: "kubernetes.io/secret/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-secret-metrics-client-certs") pod "metrics-server-7b45f5889c-z48tj" (UID: "c26f36ee-5dd4-40b7-8cb9-7f4835f120fd") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.366432 master-0 kubenswrapper[23041]: E0308 00:31:52.366350 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/795e6115-95cc-4c0a-a407-e0a6f14118e5-metrics-client-ca podName:795e6115-95cc-4c0a-a407-e0a6f14118e5 nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.866344793 +0000 UTC m=+18.339181347 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/795e6115-95cc-4c0a-a407-e0a6f14118e5-metrics-client-ca") pod "telemeter-client-6cfc594d97-x62fk" (UID: "795e6115-95cc-4c0a-a407-e0a6f14118e5") : failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:52.376660 master-0 kubenswrapper[23041]: I0308 00:31:52.376626 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 08 00:31:52.396088 master-0 kubenswrapper[23041]: I0308 00:31:52.396041 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-dockercfg-v5zml" Mar 08 00:31:52.417076 master-0 kubenswrapper[23041]: I0308 00:31:52.416984 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-gqbjw" Mar 08 00:31:52.435862 master-0 kubenswrapper[23041]: I0308 00:31:52.435815 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 08 00:31:52.456570 master-0 kubenswrapper[23041]: I0308 00:31:52.456526 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 08 00:31:52.467533 master-0 kubenswrapper[23041]: E0308 00:31:52.467489 23041 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.467716 master-0 kubenswrapper[23041]: E0308 00:31:52.467581 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2040e5dc-b314-46a9-a61b-e80f1a046ce3-monitoring-plugin-cert podName:2040e5dc-b314-46a9-a61b-e80f1a046ce3 nodeName:}" failed. No retries permitted until 2026-03-08 00:31:52.967561879 +0000 UTC m=+18.440398433 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/2040e5dc-b314-46a9-a61b-e80f1a046ce3-monitoring-plugin-cert") pod "monitoring-plugin-6db79546f6-gdz4k" (UID: "2040e5dc-b314-46a9-a61b-e80f1a046ce3") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:52.475704 master-0 kubenswrapper[23041]: I0308 00:31:52.475669 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 08 00:31:52.495615 master-0 kubenswrapper[23041]: I0308 00:31:52.495570 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 08 00:31:52.516012 master-0 kubenswrapper[23041]: I0308 00:31:52.515980 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 08 00:31:52.535993 master-0 kubenswrapper[23041]: I0308 00:31:52.535934 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-wlrqc" Mar 08 00:31:52.556087 master-0 kubenswrapper[23041]: I0308 00:31:52.556021 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 08 00:31:52.576214 master-0 kubenswrapper[23041]: I0308 00:31:52.576161 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 08 00:31:52.597023 master-0 kubenswrapper[23041]: I0308 00:31:52.596974 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 08 00:31:52.617445 master-0 kubenswrapper[23041]: I0308 00:31:52.617392 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 08 00:31:52.636505 master-0 kubenswrapper[23041]: I0308 00:31:52.636460 23041 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 08 00:31:52.656153 master-0 kubenswrapper[23041]: I0308 00:31:52.656113 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 08 00:31:52.676339 master-0 kubenswrapper[23041]: I0308 00:31:52.676254 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 08 00:31:52.696302 master-0 kubenswrapper[23041]: I0308 00:31:52.696261 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-bkprm" Mar 08 00:31:52.716108 master-0 kubenswrapper[23041]: I0308 00:31:52.716057 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 08 00:31:52.736502 master-0 kubenswrapper[23041]: I0308 00:31:52.736457 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 08 00:31:52.756403 master-0 kubenswrapper[23041]: I0308 00:31:52.756363 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 08 00:31:52.776415 master-0 kubenswrapper[23041]: I0308 00:31:52.776369 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4tn2t" Mar 08 00:31:52.796215 master-0 kubenswrapper[23041]: I0308 00:31:52.796169 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 08 00:31:52.802849 master-0 kubenswrapper[23041]: I0308 00:31:52.802817 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/614f0a0f-5853-4cf6-bd3d-174141f0f1e2-trusted-ca-bundle\") pod \"insights-operator-8f89dfddd-brq9l\" (UID: \"614f0a0f-5853-4cf6-bd3d-174141f0f1e2\") " 
pod="openshift-insights/insights-operator-8f89dfddd-brq9l" Mar 08 00:31:52.802919 master-0 kubenswrapper[23041]: I0308 00:31:52.802859 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65-openshift-state-metrics-tls\") pod \"openshift-state-metrics-74cc79fd76-s9b9v\" (UID: \"5b9f4db1-3ba9-49a5-9a65-1d770ee59a65\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-s9b9v" Mar 08 00:31:52.802919 master-0 kubenswrapper[23041]: I0308 00:31:52.802894 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-config\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:31:52.803014 master-0 kubenswrapper[23041]: I0308 00:31:52.802924 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/ae061e84-5e6a-415c-a735-fa14add7318a-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-68b88f8cb5-qjxhc\" (UID: \"ae061e84-5e6a-415c-a735-fa14add7318a\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc" Mar 08 00:31:52.803275 master-0 kubenswrapper[23041]: I0308 00:31:52.803188 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/84522c03-fd7b-4be7-9413-84e510b9dc5a-images\") pod \"cluster-baremetal-operator-5cdb4c5598-qldx6\" (UID: \"84522c03-fd7b-4be7-9413-84e510b9dc5a\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6" Mar 08 00:31:52.803334 master-0 kubenswrapper[23041]: I0308 00:31:52.803287 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"images\" (UniqueName: \"kubernetes.io/configmap/c7097f64-1709-4f76-a725-5a6c6cc5919b-images\") pod \"machine-api-operator-84bf6db4f9-bncfj\" (UID: \"c7097f64-1709-4f76-a725-5a6c6cc5919b\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-bncfj" Mar 08 00:31:52.803376 master-0 kubenswrapper[23041]: I0308 00:31:52.803361 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ae061e84-5e6a-415c-a735-fa14add7318a-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-68b88f8cb5-qjxhc\" (UID: \"ae061e84-5e6a-415c-a735-fa14add7318a\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc" Mar 08 00:31:52.803421 master-0 kubenswrapper[23041]: I0308 00:31:52.803394 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e3f42081-387d-4798-b981-ac232e851bb4-samples-operator-tls\") pod \"cluster-samples-operator-664cb58b85-8lf4q\" (UID: \"e3f42081-387d-4798-b981-ac232e851bb4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-8lf4q" Mar 08 00:31:52.803466 master-0 kubenswrapper[23041]: I0308 00:31:52.803423 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4f5539c1-fb87-42d6-b735-6de53421bb6b-signing-key\") pod \"service-ca-84bfdbbb7f-bc2m2\" (UID: \"4f5539c1-fb87-42d6-b735-6de53421bb6b\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-bc2m2" Mar 08 00:31:52.803577 master-0 kubenswrapper[23041]: I0308 00:31:52.803547 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d70f4efb-e61a-4e88-a271-2f4af21ecdf3-apiservice-cert\") pod \"packageserver-9c44c86f9-rplwv\" (UID: \"d70f4efb-e61a-4e88-a271-2f4af21ecdf3\") " 
pod="openshift-operator-lifecycle-manager/packageserver-9c44c86f9-rplwv" Mar 08 00:31:52.803634 master-0 kubenswrapper[23041]: I0308 00:31:52.803597 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-config\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:31:52.803634 master-0 kubenswrapper[23041]: I0308 00:31:52.803617 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4f5539c1-fb87-42d6-b735-6de53421bb6b-signing-cabundle\") pod \"service-ca-84bfdbbb7f-bc2m2\" (UID: \"4f5539c1-fb87-42d6-b735-6de53421bb6b\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-bc2m2" Mar 08 00:31:52.803716 master-0 kubenswrapper[23041]: I0308 00:31:52.803683 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7097f64-1709-4f76-a725-5a6c6cc5919b-config\") pod \"machine-api-operator-84bf6db4f9-bncfj\" (UID: \"c7097f64-1709-4f76-a725-5a6c6cc5919b\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-bncfj" Mar 08 00:31:52.803768 master-0 kubenswrapper[23041]: I0308 00:31:52.803734 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7317ceda-df6f-4826-aa1a-15304c2b0fcd-auth-proxy-config\") pod \"machine-approver-754bdc9f9d-xpl2b\" (UID: \"7317ceda-df6f-4826-aa1a-15304c2b0fcd\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-xpl2b" Mar 08 00:31:52.803814 master-0 kubenswrapper[23041]: I0308 00:31:52.803798 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: 
\"kubernetes.io/configmap/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-metrics-server-audit-profiles\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m" Mar 08 00:31:52.803859 master-0 kubenswrapper[23041]: I0308 00:31:52.803823 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4f5539c1-fb87-42d6-b735-6de53421bb6b-signing-cabundle\") pod \"service-ca-84bfdbbb7f-bc2m2\" (UID: \"4f5539c1-fb87-42d6-b735-6de53421bb6b\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-bc2m2" Mar 08 00:31:52.803859 master-0 kubenswrapper[23041]: I0308 00:31:52.803823 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4f5539c1-fb87-42d6-b735-6de53421bb6b-signing-key\") pod \"service-ca-84bfdbbb7f-bc2m2\" (UID: \"4f5539c1-fb87-42d6-b735-6de53421bb6b\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-bc2m2" Mar 08 00:31:52.803945 master-0 kubenswrapper[23041]: I0308 00:31:52.803895 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m" Mar 08 00:31:52.803990 master-0 kubenswrapper[23041]: I0308 00:31:52.803951 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ae061e84-5e6a-415c-a735-fa14add7318a-kube-state-metrics-tls\") pod \"kube-state-metrics-68b88f8cb5-qjxhc\" (UID: \"ae061e84-5e6a-415c-a735-fa14add7318a\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc" Mar 08 00:31:52.804034 master-0 
kubenswrapper[23041]: I0308 00:31:52.803988 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9d810f7f-258a-47ce-9f99-7b1d93388aee-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-5nbfk\" (UID: \"9d810f7f-258a-47ce-9f99-7b1d93388aee\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-5nbfk" Mar 08 00:31:52.804034 master-0 kubenswrapper[23041]: I0308 00:31:52.804023 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/614f0a0f-5853-4cf6-bd3d-174141f0f1e2-service-ca-bundle\") pod \"insights-operator-8f89dfddd-brq9l\" (UID: \"614f0a0f-5853-4cf6-bd3d-174141f0f1e2\") " pod="openshift-insights/insights-operator-8f89dfddd-brq9l" Mar 08 00:31:52.804182 master-0 kubenswrapper[23041]: I0308 00:31:52.804145 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e237ed52-5561-44c5-bcb1-de62691d6431-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5ff8674d55-qxpv9\" (UID: \"e237ed52-5561-44c5-bcb1-de62691d6431\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-qxpv9" Mar 08 00:31:52.804259 master-0 kubenswrapper[23041]: I0308 00:31:52.804238 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9d810f7f-258a-47ce-9f99-7b1d93388aee-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-5nbfk\" (UID: \"9d810f7f-258a-47ce-9f99-7b1d93388aee\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-5nbfk" Mar 08 00:31:52.804316 master-0 kubenswrapper[23041]: I0308 00:31:52.804264 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/9d810f7f-258a-47ce-9f99-7b1d93388aee-images\") pod \"machine-config-operator-fdb5c78b5-5nbfk\" (UID: \"9d810f7f-258a-47ce-9f99-7b1d93388aee\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-5nbfk" Mar 08 00:31:52.804316 master-0 kubenswrapper[23041]: I0308 00:31:52.804303 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1751db13-b792-43e2-8459-d1d4a0164dfb-etcd-client\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:31:52.804397 master-0 kubenswrapper[23041]: I0308 00:31:52.804380 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d70f4efb-e61a-4e88-a271-2f4af21ecdf3-webhook-cert\") pod \"packageserver-9c44c86f9-rplwv\" (UID: \"d70f4efb-e61a-4e88-a271-2f4af21ecdf3\") " pod="openshift-operator-lifecycle-manager/packageserver-9c44c86f9-rplwv" Mar 08 00:31:52.804475 master-0 kubenswrapper[23041]: I0308 00:31:52.804455 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84522c03-fd7b-4be7-9413-84e510b9dc5a-config\") pod \"cluster-baremetal-operator-5cdb4c5598-qldx6\" (UID: \"84522c03-fd7b-4be7-9413-84e510b9dc5a\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6" Mar 08 00:31:52.804570 master-0 kubenswrapper[23041]: I0308 00:31:52.804552 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a68ad726-392e-4a7a-a384-409108df9c8b-certs\") pod \"machine-config-server-wkt98\" (UID: \"a68ad726-392e-4a7a-a384-409108df9c8b\") " pod="openshift-machine-config-operator/machine-config-server-wkt98" Mar 08 00:31:52.804657 master-0 kubenswrapper[23041]: I0308 00:31:52.804631 23041 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1751db13-b792-43e2-8459-d1d4a0164dfb-etcd-client\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44"
Mar 08 00:31:52.804824 master-0 kubenswrapper[23041]: I0308 00:31:52.804794 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-secret-metrics-server-tls\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m"
Mar 08 00:31:52.804914 master-0 kubenswrapper[23041]: I0308 00:31:52.804893 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/3b4f8517-1e54-4b41-ba6b-6c56fe66831a-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-nwttq\" (UID: \"3b4f8517-1e54-4b41-ba6b-6c56fe66831a\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq"
Mar 08 00:31:52.805046 master-0 kubenswrapper[23041]: I0308 00:31:52.805022 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/614f0a0f-5853-4cf6-bd3d-174141f0f1e2-serving-cert\") pod \"insights-operator-8f89dfddd-brq9l\" (UID: \"614f0a0f-5853-4cf6-bd3d-174141f0f1e2\") " pod="openshift-insights/insights-operator-8f89dfddd-brq9l"
Mar 08 00:31:52.805102 master-0 kubenswrapper[23041]: I0308 00:31:52.805082 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/e237ed52-5561-44c5-bcb1-de62691d6431-prometheus-operator-tls\") pod \"prometheus-operator-5ff8674d55-qxpv9\" (UID: \"e237ed52-5561-44c5-bcb1-de62691d6431\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-qxpv9"
Mar 08 00:31:52.805163 master-0 kubenswrapper[23041]: I0308 00:31:52.805143 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/84522c03-fd7b-4be7-9413-84e510b9dc5a-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-qldx6\" (UID: \"84522c03-fd7b-4be7-9413-84e510b9dc5a\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6"
Mar 08 00:31:52.805277 master-0 kubenswrapper[23041]: I0308 00:31:52.805249 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-74cc79fd76-s9b9v\" (UID: \"5b9f4db1-3ba9-49a5-9a65-1d770ee59a65\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-s9b9v"
Mar 08 00:31:52.805374 master-0 kubenswrapper[23041]: I0308 00:31:52.805353 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/460f09d8-a143-48d2-9db0-be247386984a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6686554ddc-8krst\" (UID: \"460f09d8-a143-48d2-9db0-be247386984a\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-8krst"
Mar 08 00:31:52.805447 master-0 kubenswrapper[23041]: I0308 00:31:52.805426 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn"
Mar 08 00:31:52.805499 master-0 kubenswrapper[23041]: I0308 00:31:52.805466 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84522c03-fd7b-4be7-9413-84e510b9dc5a-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-qldx6\" (UID: \"84522c03-fd7b-4be7-9413-84e510b9dc5a\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6"
Mar 08 00:31:52.805544 master-0 kubenswrapper[23041]: I0308 00:31:52.805518 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-image-import-ca\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44"
Mar 08 00:31:52.805595 master-0 kubenswrapper[23041]: I0308 00:31:52.805556 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7317ceda-df6f-4826-aa1a-15304c2b0fcd-machine-approver-tls\") pod \"machine-approver-754bdc9f9d-xpl2b\" (UID: \"7317ceda-df6f-4826-aa1a-15304c2b0fcd\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-xpl2b"
Mar 08 00:31:52.805653 master-0 kubenswrapper[23041]: I0308 00:31:52.805630 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3b4f8517-1e54-4b41-ba6b-6c56fe66831a-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-nwttq\" (UID: \"3b4f8517-1e54-4b41-ba6b-6c56fe66831a\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq"
Mar 08 00:31:52.805701 master-0 kubenswrapper[23041]: I0308 00:31:52.805682 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7317ceda-df6f-4826-aa1a-15304c2b0fcd-config\") pod \"machine-approver-754bdc9f9d-xpl2b\" (UID: \"7317ceda-df6f-4826-aa1a-15304c2b0fcd\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-xpl2b"
Mar 08 00:31:52.805751 master-0 kubenswrapper[23041]: I0308 00:31:52.805734 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/460f09d8-a143-48d2-9db0-be247386984a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6686554ddc-8krst\" (UID: \"460f09d8-a143-48d2-9db0-be247386984a\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-8krst"
Mar 08 00:31:52.805805 master-0 kubenswrapper[23041]: I0308 00:31:52.805787 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1751db13-b792-43e2-8459-d1d4a0164dfb-encryption-config\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44"
Mar 08 00:31:52.805879 master-0 kubenswrapper[23041]: I0308 00:31:52.805854 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-secret-metrics-client-certs\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m"
Mar 08 00:31:52.805951 master-0 kubenswrapper[23041]: I0308 00:31:52.805905 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a68ad726-392e-4a7a-a384-409108df9c8b-node-bootstrap-token\") pod \"machine-config-server-wkt98\" (UID: \"a68ad726-392e-4a7a-a384-409108df9c8b\") " pod="openshift-machine-config-operator/machine-config-server-wkt98"
Mar 08 00:31:52.806003 master-0 kubenswrapper[23041]: I0308 00:31:52.805981 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1751db13-b792-43e2-8459-d1d4a0164dfb-encryption-config\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44"
Mar 08 00:31:52.806045 master-0 kubenswrapper[23041]: I0308 00:31:52.806004 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7-proxy-tls\") pod \"machine-config-daemon-k7pnc\" (UID: \"ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7\") " pod="openshift-machine-config-operator/machine-config-daemon-k7pnc"
Mar 08 00:31:52.806174 master-0 kubenswrapper[23041]: I0308 00:31:52.806143 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0-mcc-auth-proxy-config\") pod \"machine-config-controller-ff46b7bdf-z5fkp\" (UID: \"2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-z5fkp"
Mar 08 00:31:52.806250 master-0 kubenswrapper[23041]: I0308 00:31:52.806220 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7-proxy-tls\") pod \"machine-config-daemon-k7pnc\" (UID: \"ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7\") " pod="openshift-machine-config-operator/machine-config-daemon-k7pnc"
Mar 08 00:31:52.806299 master-0 kubenswrapper[23041]: I0308 00:31:52.806276 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0-proxy-tls\") pod \"machine-config-controller-ff46b7bdf-z5fkp\" (UID: \"2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-z5fkp"
Mar 08 00:31:52.806347 master-0 kubenswrapper[23041]: I0308 00:31:52.806308 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-node-exporter-tls\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn"
Mar 08 00:31:52.806421 master-0 kubenswrapper[23041]: I0308 00:31:52.806399 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0-mcc-auth-proxy-config\") pod \"machine-config-controller-ff46b7bdf-z5fkp\" (UID: \"2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-z5fkp"
Mar 08 00:31:52.806594 master-0 kubenswrapper[23041]: I0308 00:31:52.806499 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ae061e84-5e6a-415c-a735-fa14add7318a-metrics-client-ca\") pod \"kube-state-metrics-68b88f8cb5-qjxhc\" (UID: \"ae061e84-5e6a-415c-a735-fa14add7318a\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc"
Mar 08 00:31:52.806658 master-0 kubenswrapper[23041]: I0308 00:31:52.806632 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1da0c222-4b59-424f-9817-48673083df00-webhook-certs\") pod \"multus-admission-controller-7769569c45-5n69x\" (UID: \"1da0c222-4b59-424f-9817-48673083df00\") " pod="openshift-multus/multus-admission-controller-7769569c45-5n69x"
Mar 08 00:31:52.806739 master-0 kubenswrapper[23041]: I0308 00:31:52.806717 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/d01c21a1-6c2c-49a7-9d85-254662851838-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-w2q2q\" (UID: \"d01c21a1-6c2c-49a7-9d85-254662851838\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q"
Mar 08 00:31:52.806941 master-0 kubenswrapper[23041]: I0308 00:31:52.806919 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65-metrics-client-ca\") pod \"openshift-state-metrics-74cc79fd76-s9b9v\" (UID: \"5b9f4db1-3ba9-49a5-9a65-1d770ee59a65\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-s9b9v"
Mar 08 00:31:52.807019 master-0 kubenswrapper[23041]: I0308 00:31:52.806993 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-audit\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44"
Mar 08 00:31:52.807076 master-0 kubenswrapper[23041]: I0308 00:31:52.807054 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-etcd-serving-ca\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44"
Mar 08 00:31:52.807123 master-0 kubenswrapper[23041]: I0308 00:31:52.807078 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-client-ca-bundle\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m"
Mar 08 00:31:52.807123 master-0 kubenswrapper[23041]: I0308 00:31:52.807116 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9d810f7f-258a-47ce-9f99-7b1d93388aee-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-5nbfk\" (UID: \"9d810f7f-258a-47ce-9f99-7b1d93388aee\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-5nbfk"
Mar 08 00:31:52.807265 master-0 kubenswrapper[23041]: I0308 00:31:52.807140 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1751db13-b792-43e2-8459-d1d4a0164dfb-serving-cert\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44"
Mar 08 00:31:52.807317 master-0 kubenswrapper[23041]: I0308 00:31:52.807287 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7-mcd-auth-proxy-config\") pod \"machine-config-daemon-k7pnc\" (UID: \"ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7\") " pod="openshift-machine-config-operator/machine-config-daemon-k7pnc"
Mar 08 00:31:52.807363 master-0 kubenswrapper[23041]: I0308 00:31:52.807332 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-etcd-serving-ca\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44"
Mar 08 00:31:52.807363 master-0 kubenswrapper[23041]: I0308 00:31:52.807340 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-metrics-client-ca\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn"
Mar 08 00:31:52.807446 master-0 kubenswrapper[23041]: I0308 00:31:52.807358 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-audit\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44"
Mar 08 00:31:52.807446 master-0 kubenswrapper[23041]: I0308 00:31:52.807417 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-trusted-ca-bundle\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44"
Mar 08 00:31:52.807536 master-0 kubenswrapper[23041]: I0308 00:31:52.807457 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c7097f64-1709-4f76-a725-5a6c6cc5919b-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-bncfj\" (UID: \"c7097f64-1709-4f76-a725-5a6c6cc5919b\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-bncfj"
Mar 08 00:31:52.807536 master-0 kubenswrapper[23041]: I0308 00:31:52.807485 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1751db13-b792-43e2-8459-d1d4a0164dfb-serving-cert\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44"
Mar 08 00:31:52.807536 master-0 kubenswrapper[23041]: I0308 00:31:52.807508 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e237ed52-5561-44c5-bcb1-de62691d6431-metrics-client-ca\") pod \"prometheus-operator-5ff8674d55-qxpv9\" (UID: \"e237ed52-5561-44c5-bcb1-de62691d6431\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-qxpv9"
Mar 08 00:31:52.807662 master-0 kubenswrapper[23041]: I0308 00:31:52.807548 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7-mcd-auth-proxy-config\") pod \"machine-config-daemon-k7pnc\" (UID: \"ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7\") " pod="openshift-machine-config-operator/machine-config-daemon-k7pnc"
Mar 08 00:31:52.807662 master-0 kubenswrapper[23041]: I0308 00:31:52.807086 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/d01c21a1-6c2c-49a7-9d85-254662851838-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-w2q2q\" (UID: \"d01c21a1-6c2c-49a7-9d85-254662851838\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q"
Mar 08 00:31:52.807730 master-0 kubenswrapper[23041]: I0308 00:31:52.807645 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3b4f8517-1e54-4b41-ba6b-6c56fe66831a-images\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-nwttq\" (UID: \"3b4f8517-1e54-4b41-ba6b-6c56fe66831a\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq"
Mar 08 00:31:52.819563 master-0 kubenswrapper[23041]: I0308 00:31:52.819495 23041 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID=""
Mar 08 00:31:52.819907 master-0 kubenswrapper[23041]: I0308 00:31:52.819875 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 08 00:31:52.826182 master-0 kubenswrapper[23041]: I0308 00:31:52.825940 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9d810f7f-258a-47ce-9f99-7b1d93388aee-images\") pod \"machine-config-operator-fdb5c78b5-5nbfk\" (UID: \"9d810f7f-258a-47ce-9f99-7b1d93388aee\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-5nbfk"
Mar 08 00:31:52.836952 master-0 kubenswrapper[23041]: I0308 00:31:52.836903 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 08 00:31:52.838535 master-0 kubenswrapper[23041]: I0308 00:31:52.838513 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9d810f7f-258a-47ce-9f99-7b1d93388aee-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-5nbfk\" (UID: \"9d810f7f-258a-47ce-9f99-7b1d93388aee\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-5nbfk"
Mar 08 00:31:52.855770 master-0 kubenswrapper[23041]: I0308 00:31:52.855733 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-l6qr9"
Mar 08 00:31:52.876373 master-0 kubenswrapper[23041]: I0308 00:31:52.876328 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert"
Mar 08 00:31:52.885530 master-0 kubenswrapper[23041]: I0308 00:31:52.885488 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/614f0a0f-5853-4cf6-bd3d-174141f0f1e2-serving-cert\") pod \"insights-operator-8f89dfddd-brq9l\" (UID: \"614f0a0f-5853-4cf6-bd3d-174141f0f1e2\") " pod="openshift-insights/insights-operator-8f89dfddd-brq9l"
Mar 08 00:31:52.896551 master-0 kubenswrapper[23041]: I0308 00:31:52.896499 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images"
Mar 08 00:31:52.904158 master-0 kubenswrapper[23041]: I0308 00:31:52.904126 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/84522c03-fd7b-4be7-9413-84e510b9dc5a-images\") pod \"cluster-baremetal-operator-5cdb4c5598-qldx6\" (UID: \"84522c03-fd7b-4be7-9413-84e510b9dc5a\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6"
Mar 08 00:31:52.908936 master-0 kubenswrapper[23041]: I0308 00:31:52.908870 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/d01c21a1-6c2c-49a7-9d85-254662851838-ca-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-w2q2q\" (UID: \"d01c21a1-6c2c-49a7-9d85-254662851838\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q"
Mar 08 00:31:52.909169 master-0 kubenswrapper[23041]: I0308 00:31:52.909129 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7b45f5889c-z48tj\" (UID: \"c26f36ee-5dd4-40b7-8cb9-7f4835f120fd\") " pod="openshift-monitoring/metrics-server-7b45f5889c-z48tj"
Mar 08 00:31:52.909243 master-0 kubenswrapper[23041]: I0308 00:31:52.909223 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-secret-metrics-client-certs\") pod \"metrics-server-7b45f5889c-z48tj\" (UID: \"c26f36ee-5dd4-40b7-8cb9-7f4835f120fd\") " pod="openshift-monitoring/metrics-server-7b45f5889c-z48tj"
Mar 08 00:31:52.909382 master-0 kubenswrapper[23041]: I0308 00:31:52.909357 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/795e6115-95cc-4c0a-a407-e0a6f14118e5-metrics-client-ca\") pod \"telemeter-client-6cfc594d97-x62fk\" (UID: \"795e6115-95cc-4c0a-a407-e0a6f14118e5\") " pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk"
Mar 08 00:31:52.909524 master-0 kubenswrapper[23041]: I0308 00:31:52.909498 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/795e6115-95cc-4c0a-a407-e0a6f14118e5-serving-certs-ca-bundle\") pod \"telemeter-client-6cfc594d97-x62fk\" (UID: \"795e6115-95cc-4c0a-a407-e0a6f14118e5\") " pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk"
Mar 08 00:31:52.909558 master-0 kubenswrapper[23041]: I0308 00:31:52.909509 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/d01c21a1-6c2c-49a7-9d85-254662851838-ca-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-w2q2q\" (UID: \"d01c21a1-6c2c-49a7-9d85-254662851838\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q"
Mar 08 00:31:52.909655 master-0 kubenswrapper[23041]: I0308 00:31:52.909624 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b22c3046-5193-4c1d-91c0-7c15745265be-trusted-ca\") pod \"console-operator-6c7fb6b958-db7d8\" (UID: \"b22c3046-5193-4c1d-91c0-7c15745265be\") " pod="openshift-console-operator/console-operator-6c7fb6b958-db7d8"
Mar 08 00:31:52.910100 master-0 kubenswrapper[23041]: I0308 00:31:52.910066 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/795e6115-95cc-4c0a-a407-e0a6f14118e5-secret-telemeter-client\") pod \"telemeter-client-6cfc594d97-x62fk\" (UID: \"795e6115-95cc-4c0a-a407-e0a6f14118e5\") " pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk"
Mar 08 00:31:52.910146 master-0 kubenswrapper[23041]: I0308 00:31:52.910127 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-metrics-server-audit-profiles\") pod \"metrics-server-7b45f5889c-z48tj\" (UID: \"c26f36ee-5dd4-40b7-8cb9-7f4835f120fd\") " pod="openshift-monitoring/metrics-server-7b45f5889c-z48tj"
Mar 08 00:31:52.910276 master-0 kubenswrapper[23041]: I0308 00:31:52.910186 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e884e46e-e520-4e0a-9f15-43d4b74af63e-cert\") pod \"ingress-canary-5qffz\" (UID: \"e884e46e-e520-4e0a-9f15-43d4b74af63e\") " pod="openshift-ingress-canary/ingress-canary-5qffz"
Mar 08 00:31:52.910401 master-0 kubenswrapper[23041]: I0308 00:31:52.910368 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/795e6115-95cc-4c0a-a407-e0a6f14118e5-federate-client-tls\") pod \"telemeter-client-6cfc594d97-x62fk\" (UID: \"795e6115-95cc-4c0a-a407-e0a6f14118e5\") " pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk"
Mar 08 00:31:52.910460 master-0 kubenswrapper[23041]: I0308 00:31:52.910438 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/795e6115-95cc-4c0a-a407-e0a6f14118e5-telemeter-client-tls\") pod \"telemeter-client-6cfc594d97-x62fk\" (UID: \"795e6115-95cc-4c0a-a407-e0a6f14118e5\") " pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk"
Mar 08 00:31:52.910606 master-0 kubenswrapper[23041]: I0308 00:31:52.910577 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b22c3046-5193-4c1d-91c0-7c15745265be-config\") pod \"console-operator-6c7fb6b958-db7d8\" (UID: \"b22c3046-5193-4c1d-91c0-7c15745265be\") " pod="openshift-console-operator/console-operator-6c7fb6b958-db7d8"
Mar 08 00:31:52.910689 master-0 kubenswrapper[23041]: I0308 00:31:52.910667 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/795e6115-95cc-4c0a-a407-e0a6f14118e5-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6cfc594d97-x62fk\" (UID: \"795e6115-95cc-4c0a-a407-e0a6f14118e5\") " pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk"
Mar 08 00:31:52.910748 master-0 kubenswrapper[23041]: I0308 00:31:52.910725 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b22c3046-5193-4c1d-91c0-7c15745265be-serving-cert\") pod \"console-operator-6c7fb6b958-db7d8\" (UID: \"b22c3046-5193-4c1d-91c0-7c15745265be\") " pod="openshift-console-operator/console-operator-6c7fb6b958-db7d8"
Mar 08 00:31:52.910837 master-0 kubenswrapper[23041]: I0308 00:31:52.910813 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/795e6115-95cc-4c0a-a407-e0a6f14118e5-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6cfc594d97-x62fk\" (UID: \"795e6115-95cc-4c0a-a407-e0a6f14118e5\") " pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk"
Mar 08 00:31:52.910906 master-0 kubenswrapper[23041]: I0308 00:31:52.910883 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-secret-metrics-server-tls\") pod \"metrics-server-7b45f5889c-z48tj\" (UID: \"c26f36ee-5dd4-40b7-8cb9-7f4835f120fd\") " pod="openshift-monitoring/metrics-server-7b45f5889c-z48tj"
Mar 08 00:31:52.911066 master-0 kubenswrapper[23041]: I0308 00:31:52.911032 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-client-ca-bundle\") pod \"metrics-server-7b45f5889c-z48tj\" (UID: \"c26f36ee-5dd4-40b7-8cb9-7f4835f120fd\") " pod="openshift-monitoring/metrics-server-7b45f5889c-z48tj"
Mar 08 00:31:52.924283 master-0 kubenswrapper[23041]: I0308 00:31:52.924197 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert"
Mar 08 00:31:52.926531 master-0 kubenswrapper[23041]: I0308 00:31:52.926406 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84522c03-fd7b-4be7-9413-84e510b9dc5a-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-qldx6\" (UID: \"84522c03-fd7b-4be7-9413-84e510b9dc5a\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6"
Mar 08 00:31:52.936800 master-0 kubenswrapper[23041]: I0308 00:31:52.936756 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 08 00:31:52.946432 master-0 kubenswrapper[23041]: I0308 00:31:52.946393 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-image-import-ca\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44"
Mar 08 00:31:52.956935 master-0 kubenswrapper[23041]: I0308 00:31:52.956898 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls"
Mar 08 00:31:52.965800 master-0 kubenswrapper[23041]: I0308 00:31:52.965756 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/84522c03-fd7b-4be7-9413-84e510b9dc5a-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-qldx6\" (UID: \"84522c03-fd7b-4be7-9413-84e510b9dc5a\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6"
Mar 08 00:31:52.976588 master-0 kubenswrapper[23041]: I0308 00:31:52.976544 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-dockercfg-nppj6"
Mar 08 00:31:52.995904 master-0 kubenswrapper[23041]: I0308 00:31:52.995848 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy"
Mar 08 00:31:53.005015 master-0 kubenswrapper[23041]: I0308 00:31:53.004955 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84522c03-fd7b-4be7-9413-84e510b9dc5a-config\") pod \"cluster-baremetal-operator-5cdb4c5598-qldx6\" (UID: \"84522c03-fd7b-4be7-9413-84e510b9dc5a\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6"
Mar 08 00:31:53.012570 master-0 kubenswrapper[23041]: I0308 00:31:53.012531 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2040e5dc-b314-46a9-a61b-e80f1a046ce3-monitoring-plugin-cert\") pod \"monitoring-plugin-6db79546f6-gdz4k\" (UID: \"2040e5dc-b314-46a9-a61b-e80f1a046ce3\") " pod="openshift-monitoring/monitoring-plugin-6db79546f6-gdz4k"
Mar 08 00:31:53.026067 master-0 kubenswrapper[23041]: I0308 00:31:53.026009 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 08 00:31:53.029465 master-0 kubenswrapper[23041]: I0308 00:31:53.029409 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1751db13-b792-43e2-8459-d1d4a0164dfb-trusted-ca-bundle\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44"
Mar 08 00:31:53.036509 master-0 kubenswrapper[23041]: I0308 00:31:53.036451 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 08 00:31:53.056368 master-0 kubenswrapper[23041]: I0308 00:31:53.056302 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle"
Mar 08 00:31:53.065196 master-0 kubenswrapper[23041]: I0308 00:31:53.065151 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/614f0a0f-5853-4cf6-bd3d-174141f0f1e2-service-ca-bundle\") pod \"insights-operator-8f89dfddd-brq9l\" (UID: \"614f0a0f-5853-4cf6-bd3d-174141f0f1e2\") " pod="openshift-insights/insights-operator-8f89dfddd-brq9l"
Mar 08 00:31:53.082365 master-0 kubenswrapper[23041]: I0308 00:31:53.082318 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle"
Mar 08 00:31:53.083481 master-0 kubenswrapper[23041]: I0308 00:31:53.083453 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/614f0a0f-5853-4cf6-bd3d-174141f0f1e2-trusted-ca-bundle\") pod \"insights-operator-8f89dfddd-brq9l\" (UID: \"614f0a0f-5853-4cf6-bd3d-174141f0f1e2\") " pod="openshift-insights/insights-operator-8f89dfddd-brq9l"
Mar 08 00:31:53.096803 master-0 kubenswrapper[23041]: I0308 00:31:53.096714 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt"
Mar 08 00:31:53.117566 master-0 kubenswrapper[23041]: I0308 00:31:53.117491 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-94mhc"
Mar 08 00:31:53.137783 master-0 kubenswrapper[23041]: I0308 00:31:53.137726 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-7cd6d"
Mar 08 00:31:53.155563 master-0 kubenswrapper[23041]: I0308 00:31:53.155483 23041 request.go:700] Waited for 2.00157844s due to client-side throttling, not priority and fairness, request: GET:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/secrets?fieldSelector=metadata.name%3Dmachine-api-operator-dockercfg-ppd6p&limit=500&resourceVersion=0
Mar 08 00:31:53.157642 master-0 kubenswrapper[23041]: I0308 00:31:53.157565 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-ppd6p"
Mar 08 00:31:53.172470 master-0 kubenswrapper[23041]: I0308 00:31:53.172416 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 08 00:31:53.176327 master-0 kubenswrapper[23041]: I0308 00:31:53.176283 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 08 00:31:53.196722 master-0 kubenswrapper[23041]: I0308 00:31:53.196600 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 08 00:31:53.204010 master-0 kubenswrapper[23041]: I0308 00:31:53.203969 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e3f42081-387d-4798-b981-ac232e851bb4-samples-operator-tls\") pod \"cluster-samples-operator-664cb58b85-8lf4q\" (UID: \"e3f42081-387d-4798-b981-ac232e851bb4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-8lf4q"
Mar 08 00:31:53.218302 master-0 kubenswrapper[23041]: I0308 00:31:53.218247 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 08 00:31:53.224292 master-0 kubenswrapper[23041]: I0308 00:31:53.224241 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c7097f64-1709-4f76-a725-5a6c6cc5919b-images\") pod \"machine-api-operator-84bf6db4f9-bncfj\" (UID: \"c7097f64-1709-4f76-a725-5a6c6cc5919b\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-bncfj"
Mar 08 00:31:53.236803 master-0 kubenswrapper[23041]: I0308 00:31:53.236747 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 08 00:31:53.256324 master-0 kubenswrapper[23041]: I0308 00:31:53.256270 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 08 00:31:53.258030 master-0 kubenswrapper[23041]: I0308 00:31:53.257985 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c7097f64-1709-4f76-a725-5a6c6cc5919b-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-bncfj\" (UID: \"c7097f64-1709-4f76-a725-5a6c6cc5919b\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-bncfj"
Mar 08 00:31:53.277590 master-0 kubenswrapper[23041]: I0308 00:31:53.277539 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 08 00:31:53.284088 master-0 kubenswrapper[23041]: I0308 00:31:53.284025 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7097f64-1709-4f76-a725-5a6c6cc5919b-config\") pod \"machine-api-operator-84bf6db4f9-bncfj\" (UID: \"c7097f64-1709-4f76-a725-5a6c6cc5919b\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-bncfj"
Mar 08 00:31:53.297119 master-0 kubenswrapper[23041]: I0308 00:31:53.296883 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-qvcg8"
Mar 08 00:31:53.316765 master-0 kubenswrapper[23041]: I0308 00:31:53.316696 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 08 00:31:53.327220 master-0 kubenswrapper[23041]: I0308 00:31:53.327139 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0-proxy-tls\") pod \"machine-config-controller-ff46b7bdf-z5fkp\" (UID: \"2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-z5fkp"
Mar 08 00:31:53.337155 master-0 kubenswrapper[23041]: I0308 00:31:53.337111 23041 reflector.go:368] Caches populated for *v1.Secret from
object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 08 00:31:53.344323 master-0 kubenswrapper[23041]: I0308 00:31:53.344283 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d70f4efb-e61a-4e88-a271-2f4af21ecdf3-apiservice-cert\") pod \"packageserver-9c44c86f9-rplwv\" (UID: \"d70f4efb-e61a-4e88-a271-2f4af21ecdf3\") " pod="openshift-operator-lifecycle-manager/packageserver-9c44c86f9-rplwv" Mar 08 00:31:53.345275 master-0 kubenswrapper[23041]: I0308 00:31:53.345200 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d70f4efb-e61a-4e88-a271-2f4af21ecdf3-webhook-cert\") pod \"packageserver-9c44c86f9-rplwv\" (UID: \"d70f4efb-e61a-4e88-a271-2f4af21ecdf3\") " pod="openshift-operator-lifecycle-manager/packageserver-9c44c86f9-rplwv" Mar 08 00:31:53.357183 master-0 kubenswrapper[23041]: I0308 00:31:53.357113 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Mar 08 00:31:53.377195 master-0 kubenswrapper[23041]: I0308 00:31:53.377117 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 08 00:31:53.386712 master-0 kubenswrapper[23041]: I0308 00:31:53.386657 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7317ceda-df6f-4826-aa1a-15304c2b0fcd-config\") pod \"machine-approver-754bdc9f9d-xpl2b\" (UID: \"7317ceda-df6f-4826-aa1a-15304c2b0fcd\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-xpl2b" Mar 08 00:31:53.397659 master-0 kubenswrapper[23041]: I0308 00:31:53.397599 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 08 00:31:53.416721 master-0 kubenswrapper[23041]: I0308 
00:31:53.416660 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 08 00:31:53.426329 master-0 kubenswrapper[23041]: I0308 00:31:53.426301 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7317ceda-df6f-4826-aa1a-15304c2b0fcd-machine-approver-tls\") pod \"machine-approver-754bdc9f9d-xpl2b\" (UID: \"7317ceda-df6f-4826-aa1a-15304c2b0fcd\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-xpl2b" Mar 08 00:31:53.436295 master-0 kubenswrapper[23041]: I0308 00:31:53.436265 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-njqpw" Mar 08 00:31:53.456746 master-0 kubenswrapper[23041]: I0308 00:31:53.456647 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 08 00:31:53.464495 master-0 kubenswrapper[23041]: I0308 00:31:53.464457 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7317ceda-df6f-4826-aa1a-15304c2b0fcd-auth-proxy-config\") pod \"machine-approver-754bdc9f9d-xpl2b\" (UID: \"7317ceda-df6f-4826-aa1a-15304c2b0fcd\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-xpl2b" Mar 08 00:31:53.475639 master-0 kubenswrapper[23041]: I0308 00:31:53.475601 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 08 00:31:53.495684 master-0 kubenswrapper[23041]: I0308 00:31:53.495645 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-jt6pk" Mar 08 00:31:53.515666 master-0 kubenswrapper[23041]: I0308 00:31:53.515607 23041 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-wkgmq" Mar 08 00:31:53.537080 master-0 kubenswrapper[23041]: I0308 00:31:53.536988 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 08 00:31:53.546801 master-0 kubenswrapper[23041]: I0308 00:31:53.546756 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a68ad726-392e-4a7a-a384-409108df9c8b-node-bootstrap-token\") pod \"machine-config-server-wkt98\" (UID: \"a68ad726-392e-4a7a-a384-409108df9c8b\") " pod="openshift-machine-config-operator/machine-config-server-wkt98" Mar 08 00:31:53.556866 master-0 kubenswrapper[23041]: I0308 00:31:53.556814 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 08 00:31:53.565983 master-0 kubenswrapper[23041]: I0308 00:31:53.565948 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a68ad726-392e-4a7a-a384-409108df9c8b-certs\") pod \"machine-config-server-wkt98\" (UID: \"a68ad726-392e-4a7a-a384-409108df9c8b\") " pod="openshift-machine-config-operator/machine-config-server-wkt98" Mar 08 00:31:53.575999 master-0 kubenswrapper[23041]: I0308 00:31:53.575970 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-qlv59" Mar 08 00:31:53.596017 master-0 kubenswrapper[23041]: I0308 00:31:53.595968 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Mar 08 00:31:53.598416 master-0 kubenswrapper[23041]: I0308 00:31:53.598377 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3b4f8517-1e54-4b41-ba6b-6c56fe66831a-images\") 
pod \"cluster-cloud-controller-manager-operator-7c8df9b496-nwttq\" (UID: \"3b4f8517-1e54-4b41-ba6b-6c56fe66831a\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq" Mar 08 00:31:53.616890 master-0 kubenswrapper[23041]: I0308 00:31:53.616846 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-fp767" Mar 08 00:31:53.635960 master-0 kubenswrapper[23041]: I0308 00:31:53.635919 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Mar 08 00:31:53.646101 master-0 kubenswrapper[23041]: I0308 00:31:53.646039 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/3b4f8517-1e54-4b41-ba6b-6c56fe66831a-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-nwttq\" (UID: \"3b4f8517-1e54-4b41-ba6b-6c56fe66831a\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq" Mar 08 00:31:53.656843 master-0 kubenswrapper[23041]: I0308 00:31:53.656786 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Mar 08 00:31:53.664665 master-0 kubenswrapper[23041]: I0308 00:31:53.664624 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65-openshift-state-metrics-tls\") pod \"openshift-state-metrics-74cc79fd76-s9b9v\" (UID: \"5b9f4db1-3ba9-49a5-9a65-1d770ee59a65\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-s9b9v" Mar 08 00:31:53.676272 master-0 kubenswrapper[23041]: I0308 00:31:53.676251 23041 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Mar 08 00:31:53.677191 master-0 kubenswrapper[23041]: I0308 00:31:53.677160 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ae061e84-5e6a-415c-a735-fa14add7318a-metrics-client-ca\") pod \"kube-state-metrics-68b88f8cb5-qjxhc\" (UID: \"ae061e84-5e6a-415c-a735-fa14add7318a\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc" Mar 08 00:31:53.677402 master-0 kubenswrapper[23041]: I0308 00:31:53.677382 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65-metrics-client-ca\") pod \"openshift-state-metrics-74cc79fd76-s9b9v\" (UID: \"5b9f4db1-3ba9-49a5-9a65-1d770ee59a65\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-s9b9v" Mar 08 00:31:53.677554 master-0 kubenswrapper[23041]: I0308 00:31:53.677532 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-metrics-client-ca\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn" Mar 08 00:31:53.677982 master-0 kubenswrapper[23041]: I0308 00:31:53.677961 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e237ed52-5561-44c5-bcb1-de62691d6431-metrics-client-ca\") pod \"prometheus-operator-5ff8674d55-qxpv9\" (UID: \"e237ed52-5561-44c5-bcb1-de62691d6431\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-qxpv9" Mar 08 00:31:53.680906 master-0 kubenswrapper[23041]: I0308 00:31:53.680879 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/795e6115-95cc-4c0a-a407-e0a6f14118e5-metrics-client-ca\") pod \"telemeter-client-6cfc594d97-x62fk\" (UID: \"795e6115-95cc-4c0a-a407-e0a6f14118e5\") " pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk" Mar 08 00:31:53.696185 master-0 kubenswrapper[23041]: I0308 00:31:53.696162 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Mar 08 00:31:53.706676 master-0 kubenswrapper[23041]: I0308 00:31:53.706638 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-74cc79fd76-s9b9v\" (UID: \"5b9f4db1-3ba9-49a5-9a65-1d770ee59a65\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-s9b9v" Mar 08 00:31:53.715874 master-0 kubenswrapper[23041]: I0308 00:31:53.715811 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Mar 08 00:31:53.735624 master-0 kubenswrapper[23041]: I0308 00:31:53.735589 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-bmgck" Mar 08 00:31:53.756053 master-0 kubenswrapper[23041]: I0308 00:31:53.756016 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Mar 08 00:31:53.766378 master-0 kubenswrapper[23041]: I0308 00:31:53.766329 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/e237ed52-5561-44c5-bcb1-de62691d6431-prometheus-operator-tls\") pod \"prometheus-operator-5ff8674d55-qxpv9\" (UID: \"e237ed52-5561-44c5-bcb1-de62691d6431\") " 
pod="openshift-monitoring/prometheus-operator-5ff8674d55-qxpv9" Mar 08 00:31:53.777028 master-0 kubenswrapper[23041]: I0308 00:31:53.776969 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Mar 08 00:31:53.786881 master-0 kubenswrapper[23041]: I0308 00:31:53.786811 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3b4f8517-1e54-4b41-ba6b-6c56fe66831a-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-nwttq\" (UID: \"3b4f8517-1e54-4b41-ba6b-6c56fe66831a\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq" Mar 08 00:31:53.796973 master-0 kubenswrapper[23041]: I0308 00:31:53.796930 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Mar 08 00:31:53.803550 master-0 kubenswrapper[23041]: E0308 00:31:53.803517 23041 configmap.go:193] Couldn't get configMap openshift-monitoring/kube-state-metrics-custom-resource-state-configmap: failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:53.803620 master-0 kubenswrapper[23041]: E0308 00:31:53.803590 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ae061e84-5e6a-415c-a735-fa14add7318a-kube-state-metrics-custom-resource-state-configmap podName:ae061e84-5e6a-415c-a735-fa14add7318a nodeName:}" failed. No retries permitted until 2026-03-08 00:31:54.80357277 +0000 UTC m=+20.276409324 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-custom-resource-state-configmap" (UniqueName: "kubernetes.io/configmap/ae061e84-5e6a-415c-a735-fa14add7318a-kube-state-metrics-custom-resource-state-configmap") pod "kube-state-metrics-68b88f8cb5-qjxhc" (UID: "ae061e84-5e6a-415c-a735-fa14add7318a") : failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:53.804065 master-0 kubenswrapper[23041]: E0308 00:31:53.804035 23041 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-server-audit-profiles: failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:53.804110 master-0 kubenswrapper[23041]: E0308 00:31:53.804080 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-metrics-server-audit-profiles podName:0101c4ce-fd58-4ddb-94f7-abb8b2293cdb nodeName:}" failed. No retries permitted until 2026-03-08 00:31:54.804071912 +0000 UTC m=+20.276908466 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-server-audit-profiles" (UniqueName: "kubernetes.io/configmap/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-metrics-server-audit-profiles") pod "metrics-server-6474759988-dnw4m" (UID: "0101c4ce-fd58-4ddb-94f7-abb8b2293cdb") : failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:53.804431 master-0 kubenswrapper[23041]: E0308 00:31:53.804239 23041 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:53.804431 master-0 kubenswrapper[23041]: E0308 00:31:53.804286 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae061e84-5e6a-415c-a735-fa14add7318a-kube-state-metrics-kube-rbac-proxy-config podName:ae061e84-5e6a-415c-a735-fa14add7318a nodeName:}" failed. 
No retries permitted until 2026-03-08 00:31:54.804272887 +0000 UTC m=+20.277109451 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-state-metrics-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/ae061e84-5e6a-415c-a735-fa14add7318a-kube-state-metrics-kube-rbac-proxy-config") pod "kube-state-metrics-68b88f8cb5-qjxhc" (UID: "ae061e84-5e6a-415c-a735-fa14add7318a") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:53.804431 master-0 kubenswrapper[23041]: E0308 00:31:53.804296 23041 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:53.804431 master-0 kubenswrapper[23041]: E0308 00:31:53.804352 23041 configmap.go:193] Couldn't get configMap openshift-monitoring/kubelet-serving-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:53.804431 master-0 kubenswrapper[23041]: E0308 00:31:53.804394 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae061e84-5e6a-415c-a735-fa14add7318a-kube-state-metrics-tls podName:ae061e84-5e6a-415c-a735-fa14add7318a nodeName:}" failed. No retries permitted until 2026-03-08 00:31:54.80437437 +0000 UTC m=+20.277210914 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/ae061e84-5e6a-415c-a735-fa14add7318a-kube-state-metrics-tls") pod "kube-state-metrics-68b88f8cb5-qjxhc" (UID: "ae061e84-5e6a-415c-a735-fa14add7318a") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:53.804431 master-0 kubenswrapper[23041]: E0308 00:31:53.804296 23041 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:53.804431 master-0 kubenswrapper[23041]: E0308 00:31:53.804414 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-configmap-kubelet-serving-ca-bundle podName:0101c4ce-fd58-4ddb-94f7-abb8b2293cdb nodeName:}" failed. No retries permitted until 2026-03-08 00:31:54.80440522 +0000 UTC m=+20.277241904 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "configmap-kubelet-serving-ca-bundle" (UniqueName: "kubernetes.io/configmap/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-configmap-kubelet-serving-ca-bundle") pod "metrics-server-6474759988-dnw4m" (UID: "0101c4ce-fd58-4ddb-94f7-abb8b2293cdb") : failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:53.804658 master-0 kubenswrapper[23041]: E0308 00:31:53.804454 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e237ed52-5561-44c5-bcb1-de62691d6431-prometheus-operator-kube-rbac-proxy-config podName:e237ed52-5561-44c5-bcb1-de62691d6431 nodeName:}" failed. No retries permitted until 2026-03-08 00:31:54.804442381 +0000 UTC m=+20.277278935 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/e237ed52-5561-44c5-bcb1-de62691d6431-prometheus-operator-kube-rbac-proxy-config") pod "prometheus-operator-5ff8674d55-qxpv9" (UID: "e237ed52-5561-44c5-bcb1-de62691d6431") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:53.805452 master-0 kubenswrapper[23041]: E0308 00:31:53.805425 23041 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-tls: failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:53.805524 master-0 kubenswrapper[23041]: E0308 00:31:53.805491 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-secret-metrics-server-tls podName:0101c4ce-fd58-4ddb-94f7-abb8b2293cdb nodeName:}" failed. No retries permitted until 2026-03-08 00:31:54.805478356 +0000 UTC m=+20.278315010 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-metrics-server-tls" (UniqueName: "kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-secret-metrics-server-tls") pod "metrics-server-6474759988-dnw4m" (UID: "0101c4ce-fd58-4ddb-94f7-abb8b2293cdb") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:53.806611 master-0 kubenswrapper[23041]: E0308 00:31:53.806513 23041 secret.go:189] Couldn't get secret openshift-monitoring/metrics-client-certs: failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:53.806611 master-0 kubenswrapper[23041]: E0308 00:31:53.806534 23041 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:53.806611 master-0 kubenswrapper[23041]: E0308 00:31:53.806569 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-secret-metrics-client-certs 
podName:0101c4ce-fd58-4ddb-94f7-abb8b2293cdb nodeName:}" failed. No retries permitted until 2026-03-08 00:31:54.806557862 +0000 UTC m=+20.279394486 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-metrics-client-certs" (UniqueName: "kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-secret-metrics-client-certs") pod "metrics-server-6474759988-dnw4m" (UID: "0101c4ce-fd58-4ddb-94f7-abb8b2293cdb") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:53.806611 master-0 kubenswrapper[23041]: E0308 00:31:53.806589 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-node-exporter-kube-rbac-proxy-config podName:24ef1fb7-c8a1-4b50-b89f-2a81848ebb25 nodeName:}" failed. No retries permitted until 2026-03-08 00:31:54.806581523 +0000 UTC m=+20.279418197 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-node-exporter-kube-rbac-proxy-config") pod "node-exporter-bx9dn" (UID: "24ef1fb7-c8a1-4b50-b89f-2a81848ebb25") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:53.807690 master-0 kubenswrapper[23041]: E0308 00:31:53.807651 23041 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:53.807690 master-0 kubenswrapper[23041]: E0308 00:31:53.807670 23041 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:53.807804 master-0 kubenswrapper[23041]: E0308 00:31:53.807695 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1da0c222-4b59-424f-9817-48673083df00-webhook-certs podName:1da0c222-4b59-424f-9817-48673083df00 nodeName:}" failed. 
No retries permitted until 2026-03-08 00:31:54.807686449 +0000 UTC m=+20.280523003 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/1da0c222-4b59-424f-9817-48673083df00-webhook-certs") pod "multus-admission-controller-7769569c45-5n69x" (UID: "1da0c222-4b59-424f-9817-48673083df00") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:53.807804 master-0 kubenswrapper[23041]: E0308 00:31:53.807703 23041 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-ffspe3f0nbfal: failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:53.807804 master-0 kubenswrapper[23041]: E0308 00:31:53.807713 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-node-exporter-tls podName:24ef1fb7-c8a1-4b50-b89f-2a81848ebb25 nodeName:}" failed. No retries permitted until 2026-03-08 00:31:54.80770668 +0000 UTC m=+20.280543234 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-node-exporter-tls") pod "node-exporter-bx9dn" (UID: "24ef1fb7-c8a1-4b50-b89f-2a81848ebb25") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:53.807804 master-0 kubenswrapper[23041]: E0308 00:31:53.807749 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-client-ca-bundle podName:0101c4ce-fd58-4ddb-94f7-abb8b2293cdb nodeName:}" failed. No retries permitted until 2026-03-08 00:31:54.807738001 +0000 UTC m=+20.280574635 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "client-ca-bundle" (UniqueName: "kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-client-ca-bundle") pod "metrics-server-6474759988-dnw4m" (UID: "0101c4ce-fd58-4ddb-94f7-abb8b2293cdb") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:31:53.816684 master-0 kubenswrapper[23041]: I0308 00:31:53.816628 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Mar 08 00:31:53.835962 master-0 kubenswrapper[23041]: I0308 00:31:53.835905 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Mar 08 00:31:53.856178 master-0 kubenswrapper[23041]: I0308 00:31:53.856123 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Mar 08 00:31:53.875997 master-0 kubenswrapper[23041]: I0308 00:31:53.875950 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Mar 08 00:31:53.896823 master-0 kubenswrapper[23041]: I0308 00:31:53.896740 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-5qzcm" Mar 08 00:31:53.909905 master-0 kubenswrapper[23041]: E0308 00:31:53.909834 23041 configmap.go:193] Couldn't get configMap openshift-monitoring/telemeter-client-serving-certs-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:53.909905 master-0 kubenswrapper[23041]: E0308 00:31:53.909896 23041 configmap.go:193] Couldn't get configMap openshift-monitoring/kubelet-serving-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 08 00:31:53.910192 master-0 kubenswrapper[23041]: E0308 00:31:53.909854 23041 secret.go:189] Couldn't get secret openshift-monitoring/metrics-client-certs: 
failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:53.910192 master-0 kubenswrapper[23041]: E0308 00:31:53.909862 23041 configmap.go:193] Couldn't get configMap openshift-console-operator/trusted-ca: failed to sync configmap cache: timed out waiting for the condition
Mar 08 00:31:53.910192 master-0 kubenswrapper[23041]: E0308 00:31:53.909957 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/795e6115-95cc-4c0a-a407-e0a6f14118e5-serving-certs-ca-bundle podName:795e6115-95cc-4c0a-a407-e0a6f14118e5 nodeName:}" failed. No retries permitted until 2026-03-08 00:31:54.90992956 +0000 UTC m=+20.382766114 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-certs-ca-bundle" (UniqueName: "kubernetes.io/configmap/795e6115-95cc-4c0a-a407-e0a6f14118e5-serving-certs-ca-bundle") pod "telemeter-client-6cfc594d97-x62fk" (UID: "795e6115-95cc-4c0a-a407-e0a6f14118e5") : failed to sync configmap cache: timed out waiting for the condition
Mar 08 00:31:53.910192 master-0 kubenswrapper[23041]: E0308 00:31:53.910164 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-configmap-kubelet-serving-ca-bundle podName:c26f36ee-5dd4-40b7-8cb9-7f4835f120fd nodeName:}" failed. No retries permitted until 2026-03-08 00:31:54.910123125 +0000 UTC m=+20.382959709 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "configmap-kubelet-serving-ca-bundle" (UniqueName: "kubernetes.io/configmap/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-configmap-kubelet-serving-ca-bundle") pod "metrics-server-7b45f5889c-z48tj" (UID: "c26f36ee-5dd4-40b7-8cb9-7f4835f120fd") : failed to sync configmap cache: timed out waiting for the condition
Mar 08 00:31:53.910419 master-0 kubenswrapper[23041]: E0308 00:31:53.910233 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-secret-metrics-client-certs podName:c26f36ee-5dd4-40b7-8cb9-7f4835f120fd nodeName:}" failed. No retries permitted until 2026-03-08 00:31:54.910194316 +0000 UTC m=+20.383030910 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-metrics-client-certs" (UniqueName: "kubernetes.io/secret/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-secret-metrics-client-certs") pod "metrics-server-7b45f5889c-z48tj" (UID: "c26f36ee-5dd4-40b7-8cb9-7f4835f120fd") : failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:53.910419 master-0 kubenswrapper[23041]: E0308 00:31:53.910234 23041 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-server-audit-profiles: failed to sync configmap cache: timed out waiting for the condition
Mar 08 00:31:53.910419 master-0 kubenswrapper[23041]: E0308 00:31:53.910260 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b22c3046-5193-4c1d-91c0-7c15745265be-trusted-ca podName:b22c3046-5193-4c1d-91c0-7c15745265be nodeName:}" failed. No retries permitted until 2026-03-08 00:31:54.910244897 +0000 UTC m=+20.383081481 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/b22c3046-5193-4c1d-91c0-7c15745265be-trusted-ca") pod "console-operator-6c7fb6b958-db7d8" (UID: "b22c3046-5193-4c1d-91c0-7c15745265be") : failed to sync configmap cache: timed out waiting for the condition
Mar 08 00:31:53.910419 master-0 kubenswrapper[23041]: E0308 00:31:53.910290 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-metrics-server-audit-profiles podName:c26f36ee-5dd4-40b7-8cb9-7f4835f120fd nodeName:}" failed. No retries permitted until 2026-03-08 00:31:54.910276948 +0000 UTC m=+20.383113542 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-server-audit-profiles" (UniqueName: "kubernetes.io/configmap/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-metrics-server-audit-profiles") pod "metrics-server-7b45f5889c-z48tj" (UID: "c26f36ee-5dd4-40b7-8cb9-7f4835f120fd") : failed to sync configmap cache: timed out waiting for the condition
Mar 08 00:31:53.910419 master-0 kubenswrapper[23041]: E0308 00:31:53.910291 23041 secret.go:189] Couldn't get secret openshift-monitoring/telemeter-client: failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:53.910419 master-0 kubenswrapper[23041]: E0308 00:31:53.910298 23041 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:53.910419 master-0 kubenswrapper[23041]: E0308 00:31:53.910337 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/795e6115-95cc-4c0a-a407-e0a6f14118e5-secret-telemeter-client podName:795e6115-95cc-4c0a-a407-e0a6f14118e5 nodeName:}" failed. No retries permitted until 2026-03-08 00:31:54.910320059 +0000 UTC m=+20.383156613 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-telemeter-client" (UniqueName: "kubernetes.io/secret/795e6115-95cc-4c0a-a407-e0a6f14118e5-secret-telemeter-client") pod "telemeter-client-6cfc594d97-x62fk" (UID: "795e6115-95cc-4c0a-a407-e0a6f14118e5") : failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:53.910419 master-0 kubenswrapper[23041]: E0308 00:31:53.910355 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e884e46e-e520-4e0a-9f15-43d4b74af63e-cert podName:e884e46e-e520-4e0a-9f15-43d4b74af63e nodeName:}" failed. No retries permitted until 2026-03-08 00:31:54.91034572 +0000 UTC m=+20.383182274 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e884e46e-e520-4e0a-9f15-43d4b74af63e-cert") pod "ingress-canary-5qffz" (UID: "e884e46e-e520-4e0a-9f15-43d4b74af63e") : failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:53.910864 master-0 kubenswrapper[23041]: E0308 00:31:53.910777 23041 configmap.go:193] Couldn't get configMap openshift-console-operator/console-operator-config: failed to sync configmap cache: timed out waiting for the condition
Mar 08 00:31:53.910864 master-0 kubenswrapper[23041]: E0308 00:31:53.910795 23041 secret.go:189] Couldn't get secret openshift-monitoring/telemeter-client-tls: failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:53.910864 master-0 kubenswrapper[23041]: E0308 00:31:53.910832 23041 secret.go:189] Couldn't get secret openshift-monitoring/federate-client-certs: failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:53.910864 master-0 kubenswrapper[23041]: E0308 00:31:53.910852 23041 secret.go:189] Couldn't get secret openshift-console-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:53.910864 master-0 kubenswrapper[23041]: E0308 00:31:53.910839 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b22c3046-5193-4c1d-91c0-7c15745265be-config podName:b22c3046-5193-4c1d-91c0-7c15745265be nodeName:}" failed. No retries permitted until 2026-03-08 00:31:54.910826471 +0000 UTC m=+20.383663025 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/b22c3046-5193-4c1d-91c0-7c15745265be-config") pod "console-operator-6c7fb6b958-db7d8" (UID: "b22c3046-5193-4c1d-91c0-7c15745265be") : failed to sync configmap cache: timed out waiting for the condition
Mar 08 00:31:53.911042 master-0 kubenswrapper[23041]: E0308 00:31:53.910862 23041 configmap.go:193] Couldn't get configMap openshift-monitoring/telemeter-trusted-ca-bundle-8i12ta5c71j38: failed to sync configmap cache: timed out waiting for the condition
Mar 08 00:31:53.911042 master-0 kubenswrapper[23041]: E0308 00:31:53.910929 23041 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-tls: failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:53.911042 master-0 kubenswrapper[23041]: E0308 00:31:53.910907 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/795e6115-95cc-4c0a-a407-e0a6f14118e5-telemeter-client-tls podName:795e6115-95cc-4c0a-a407-e0a6f14118e5 nodeName:}" failed. No retries permitted until 2026-03-08 00:31:54.910877353 +0000 UTC m=+20.383713907 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "telemeter-client-tls" (UniqueName: "kubernetes.io/secret/795e6115-95cc-4c0a-a407-e0a6f14118e5-telemeter-client-tls") pod "telemeter-client-6cfc594d97-x62fk" (UID: "795e6115-95cc-4c0a-a407-e0a6f14118e5") : failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:53.911042 master-0 kubenswrapper[23041]: E0308 00:31:53.910962 23041 secret.go:189] Couldn't get secret openshift-monitoring/telemeter-client-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:53.911042 master-0 kubenswrapper[23041]: E0308 00:31:53.910976 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/795e6115-95cc-4c0a-a407-e0a6f14118e5-federate-client-tls podName:795e6115-95cc-4c0a-a407-e0a6f14118e5 nodeName:}" failed. No retries permitted until 2026-03-08 00:31:54.910965975 +0000 UTC m=+20.383802639 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "federate-client-tls" (UniqueName: "kubernetes.io/secret/795e6115-95cc-4c0a-a407-e0a6f14118e5-federate-client-tls") pod "telemeter-client-6cfc594d97-x62fk" (UID: "795e6115-95cc-4c0a-a407-e0a6f14118e5") : failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:53.911042 master-0 kubenswrapper[23041]: E0308 00:31:53.910998 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b22c3046-5193-4c1d-91c0-7c15745265be-serving-cert podName:b22c3046-5193-4c1d-91c0-7c15745265be nodeName:}" failed. No retries permitted until 2026-03-08 00:31:54.910986255 +0000 UTC m=+20.383822929 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/b22c3046-5193-4c1d-91c0-7c15745265be-serving-cert") pod "console-operator-6c7fb6b958-db7d8" (UID: "b22c3046-5193-4c1d-91c0-7c15745265be") : failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:53.911042 master-0 kubenswrapper[23041]: E0308 00:31:53.911016 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/795e6115-95cc-4c0a-a407-e0a6f14118e5-telemeter-trusted-ca-bundle podName:795e6115-95cc-4c0a-a407-e0a6f14118e5 nodeName:}" failed. No retries permitted until 2026-03-08 00:31:54.911009546 +0000 UTC m=+20.383846100 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "telemeter-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/795e6115-95cc-4c0a-a407-e0a6f14118e5-telemeter-trusted-ca-bundle") pod "telemeter-client-6cfc594d97-x62fk" (UID: "795e6115-95cc-4c0a-a407-e0a6f14118e5") : failed to sync configmap cache: timed out waiting for the condition
Mar 08 00:31:53.911042 master-0 kubenswrapper[23041]: E0308 00:31:53.911027 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-secret-metrics-server-tls podName:c26f36ee-5dd4-40b7-8cb9-7f4835f120fd nodeName:}" failed. No retries permitted until 2026-03-08 00:31:54.911022286 +0000 UTC m=+20.383858840 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-metrics-server-tls" (UniqueName: "kubernetes.io/secret/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-secret-metrics-server-tls") pod "metrics-server-7b45f5889c-z48tj" (UID: "c26f36ee-5dd4-40b7-8cb9-7f4835f120fd") : failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:53.911042 master-0 kubenswrapper[23041]: E0308 00:31:53.911042 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/795e6115-95cc-4c0a-a407-e0a6f14118e5-secret-telemeter-client-kube-rbac-proxy-config podName:795e6115-95cc-4c0a-a407-e0a6f14118e5 nodeName:}" failed. No retries permitted until 2026-03-08 00:31:54.911033456 +0000 UTC m=+20.383870010 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-telemeter-client-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/795e6115-95cc-4c0a-a407-e0a6f14118e5-secret-telemeter-client-kube-rbac-proxy-config") pod "telemeter-client-6cfc594d97-x62fk" (UID: "795e6115-95cc-4c0a-a407-e0a6f14118e5") : failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:53.912376 master-0 kubenswrapper[23041]: E0308 00:31:53.911471 23041 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-5fe8510kelpgf: failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:53.912376 master-0 kubenswrapper[23041]: E0308 00:31:53.911515 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-client-ca-bundle podName:c26f36ee-5dd4-40b7-8cb9-7f4835f120fd nodeName:}" failed. No retries permitted until 2026-03-08 00:31:54.911500618 +0000 UTC m=+20.384337212 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca-bundle" (UniqueName: "kubernetes.io/secret/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-client-ca-bundle") pod "metrics-server-7b45f5889c-z48tj" (UID: "c26f36ee-5dd4-40b7-8cb9-7f4835f120fd") : failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:53.916328 master-0 kubenswrapper[23041]: I0308 00:31:53.916287 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 08 00:31:53.936141 master-0 kubenswrapper[23041]: I0308 00:31:53.936086 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-xsc4j"
Mar 08 00:31:53.956588 master-0 kubenswrapper[23041]: I0308 00:31:53.956542 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-8nl72"
Mar 08 00:31:53.976623 master-0 kubenswrapper[23041]: I0308 00:31:53.976525 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles"
Mar 08 00:31:53.998273 master-0 kubenswrapper[23041]: I0308 00:31:53.998177 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs"
Mar 08 00:31:54.013079 master-0 kubenswrapper[23041]: E0308 00:31:54.013000 23041 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:54.013377 master-0 kubenswrapper[23041]: E0308 00:31:54.013239 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2040e5dc-b314-46a9-a61b-e80f1a046ce3-monitoring-plugin-cert podName:2040e5dc-b314-46a9-a61b-e80f1a046ce3 nodeName:}" failed. No retries permitted until 2026-03-08 00:31:55.013180675 +0000 UTC m=+20.486017269 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/2040e5dc-b314-46a9-a61b-e80f1a046ce3-monitoring-plugin-cert") pod "monitoring-plugin-6db79546f6-gdz4k" (UID: "2040e5dc-b314-46a9-a61b-e80f1a046ce3") : failed to sync secret cache: timed out waiting for the condition
Mar 08 00:31:54.018041 master-0 kubenswrapper[23041]: I0308 00:31:54.017999 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls"
Mar 08 00:31:54.037879 master-0 kubenswrapper[23041]: I0308 00:31:54.037815 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-2mmjv"
Mar 08 00:31:54.057772 master-0 kubenswrapper[23041]: I0308 00:31:54.057707 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-ffspe3f0nbfal"
Mar 08 00:31:54.076331 master-0 kubenswrapper[23041]: I0308 00:31:54.076292 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls"
Mar 08 00:31:54.096181 master-0 kubenswrapper[23041]: I0308 00:31:54.096145 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle"
Mar 08 00:31:54.117535 master-0 kubenswrapper[23041]: I0308 00:31:54.117475 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config"
Mar 08 00:31:54.136790 master-0 kubenswrapper[23041]: I0308 00:31:54.136745 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-7rml7"
Mar 08 00:31:54.155974 master-0 kubenswrapper[23041]: I0308 00:31:54.155941 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Mar 08 00:31:54.175317 master-0 kubenswrapper[23041]: I0308 00:31:54.174555 23041 request.go:700] Waited for 2.989354419s due to client-side throttling, not priority and fairness, request: GET:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-ingress-canary/secrets?fieldSelector=metadata.name%3Dcanary-serving-cert&limit=500&resourceVersion=0
Mar 08 00:31:54.176342 master-0 kubenswrapper[23041]: I0308 00:31:54.176302 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 08 00:31:54.196297 master-0 kubenswrapper[23041]: I0308 00:31:54.196255 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 08 00:31:54.217049 master-0 kubenswrapper[23041]: I0308 00:31:54.217015 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 08 00:31:54.237285 master-0 kubenswrapper[23041]: I0308 00:31:54.237159 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 08 00:31:54.256492 master-0 kubenswrapper[23041]: I0308 00:31:54.256443 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 08 00:31:54.279864 master-0 kubenswrapper[23041]: I0308 00:31:54.279828 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 08 00:31:54.296994 master-0 kubenswrapper[23041]: I0308 00:31:54.296950 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 08 00:31:54.316843 master-0 kubenswrapper[23041]: I0308 00:31:54.316782 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 08 00:31:54.336732 master-0 kubenswrapper[23041]: I0308 00:31:54.336689 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-5fe8510kelpgf"
Mar 08 00:31:54.356029 master-0 kubenswrapper[23041]: I0308 00:31:54.355983 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client"
Mar 08 00:31:54.375721 master-0 kubenswrapper[23041]: I0308 00:31:54.375673 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"federate-client-certs"
Mar 08 00:31:54.395862 master-0 kubenswrapper[23041]: I0308 00:31:54.395827 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-tls"
Mar 08 00:31:54.416425 master-0 kubenswrapper[23041]: I0308 00:31:54.416383 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-client-serving-certs-ca-bundle"
Mar 08 00:31:54.436416 master-0 kubenswrapper[23041]: I0308 00:31:54.436351 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-dockercfg-cm5m6"
Mar 08 00:31:54.455769 master-0 kubenswrapper[23041]: I0308 00:31:54.455719 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-kube-rbac-proxy-config"
Mar 08 00:31:54.482869 master-0 kubenswrapper[23041]: I0308 00:31:54.482816 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-trusted-ca-bundle-8i12ta5c71j38"
Mar 08 00:31:54.496247 master-0 kubenswrapper[23041]: I0308 00:31:54.496185 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-hgdt2"
Mar 08 00:31:54.516469 master-0 kubenswrapper[23041]: I0308 00:31:54.516424 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert"
Mar 08 00:31:54.550036 master-0 kubenswrapper[23041]: I0308 00:31:54.549959 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlzcz\" (UniqueName: \"kubernetes.io/projected/ec2d22f2-c260-42a6-a9da-ee0f44f42303-kube-api-access-xlzcz\") pod \"network-operator-7c649bf6d4-st2sr\" (UID: \"ec2d22f2-c260-42a6-a9da-ee0f44f42303\") " pod="openshift-network-operator/network-operator-7c649bf6d4-st2sr"
Mar 08 00:31:54.567406 master-0 kubenswrapper[23041]: I0308 00:31:54.567362 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l22cn\" (UniqueName: \"kubernetes.io/projected/0f496486-70d5-4c5c-b4f3-6cc19427762f-kube-api-access-l22cn\") pod \"cluster-storage-operator-6fbfc8dc8f-sdsks\" (UID: \"0f496486-70d5-4c5c-b4f3-6cc19427762f\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-sdsks"
Mar 08 00:31:54.587383 master-0 kubenswrapper[23041]: I0308 00:31:54.587341 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tml5\" (UniqueName: \"kubernetes.io/projected/b94acad3-cf4e-443d-80fb-5e68a4074336-kube-api-access-7tml5\") pod \"catalog-operator-7d9c49f57b-8jr6f\" (UID: \"b94acad3-cf4e-443d-80fb-5e68a4074336\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-8jr6f"
Mar 08 00:31:54.607598 master-0 kubenswrapper[23041]: I0308 00:31:54.607545 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv5kd\" (UniqueName: \"kubernetes.io/projected/3d2e1686-3a30-4021-9c03-02e472bc6ff3-kube-api-access-qv5kd\") pod \"cluster-autoscaler-operator-69576476f7-dpg4q\" (UID: \"3d2e1686-3a30-4021-9c03-02e472bc6ff3\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dpg4q"
Mar 08 00:31:54.628067 master-0 kubenswrapper[23041]: I0308 00:31:54.628003 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t99pg\" (UniqueName: \"kubernetes.io/projected/e237ed52-5561-44c5-bcb1-de62691d6431-kube-api-access-t99pg\") pod \"prometheus-operator-5ff8674d55-qxpv9\" (UID: \"e237ed52-5561-44c5-bcb1-de62691d6431\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-qxpv9"
Mar 08 00:31:54.648643 master-0 kubenswrapper[23041]: I0308 00:31:54.648592 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q6hn\" (UniqueName: \"kubernetes.io/projected/c1abfb79-2c86-4ccb-bf91-7c48ad8c78d8-kube-api-access-5q6hn\") pod \"csi-snapshot-controller-operator-5685fbc7d-5v8g4\" (UID: \"c1abfb79-2c86-4ccb-bf91-7c48ad8c78d8\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-5v8g4"
Mar 08 00:31:54.666756 master-0 kubenswrapper[23041]: I0308 00:31:54.666719 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcl7q\" (UniqueName: \"kubernetes.io/projected/4f5539c1-fb87-42d6-b735-6de53421bb6b-kube-api-access-bcl7q\") pod \"service-ca-84bfdbbb7f-bc2m2\" (UID: \"4f5539c1-fb87-42d6-b735-6de53421bb6b\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-bc2m2"
Mar 08 00:31:54.688117 master-0 kubenswrapper[23041]: I0308 00:31:54.688034 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/365dc4ac-fbc8-4589-a799-8327b3ebd0a5-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-pfdrx\" (UID: \"365dc4ac-fbc8-4589-a799-8327b3ebd0a5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-pfdrx"
Mar 08 00:31:54.708001 master-0 kubenswrapper[23041]: I0308 00:31:54.707941 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bmgb\" (UniqueName: \"kubernetes.io/projected/e302bc0b-7560-4f84-813f-d966c2dbe47c-kube-api-access-9bmgb\") pod \"dns-default-jfjzg\" (UID: \"e302bc0b-7560-4f84-813f-d966c2dbe47c\") " pod="openshift-dns/dns-default-jfjzg"
Mar 08 00:31:54.727078 master-0 kubenswrapper[23041]: I0308 00:31:54.727021 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crfg9\" (UniqueName: \"kubernetes.io/projected/531e9339-968c-47bf-b8ea-c44d9ceef4b3-kube-api-access-crfg9\") pod \"apiserver-74444d8fbc-g7z4w\" (UID: \"531e9339-968c-47bf-b8ea-c44d9ceef4b3\") " pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w"
Mar 08 00:31:54.748609 master-0 kubenswrapper[23041]: I0308 00:31:54.748481 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj8sl\" (UniqueName: \"kubernetes.io/projected/460f09d8-a143-48d2-9db0-be247386984a-kube-api-access-vj8sl\") pod \"control-plane-machine-set-operator-6686554ddc-8krst\" (UID: \"460f09d8-a143-48d2-9db0-be247386984a\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-8krst"
Mar 08 00:31:54.767445 master-0 kubenswrapper[23041]: I0308 00:31:54.767405 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh6nz\" (UniqueName: \"kubernetes.io/projected/815fd565-0609-4d8f-ac05-8656f198b008-kube-api-access-sh6nz\") pod \"network-metrics-daemon-krv7c\" (UID: \"815fd565-0609-4d8f-ac05-8656f198b008\") " pod="openshift-multus/network-metrics-daemon-krv7c"
Mar 08 00:31:54.792850 master-0 kubenswrapper[23041]: I0308 00:31:54.792789 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txt48\" (UniqueName: \"kubernetes.io/projected/1da0c222-4b59-424f-9817-48673083df00-kube-api-access-txt48\") pod \"multus-admission-controller-7769569c45-5n69x\" (UID: \"1da0c222-4b59-424f-9817-48673083df00\") " pod="openshift-multus/multus-admission-controller-7769569c45-5n69x"
Mar 08 00:31:54.814973 master-0 kubenswrapper[23041]: I0308 00:31:54.814920 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht8zb\" (UniqueName: \"kubernetes.io/projected/84522c03-fd7b-4be7-9413-84e510b9dc5a-kube-api-access-ht8zb\") pod \"cluster-baremetal-operator-5cdb4c5598-qldx6\" (UID: \"84522c03-fd7b-4be7-9413-84e510b9dc5a\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6"
Mar 08 00:31:54.829791 master-0 kubenswrapper[23041]: I0308 00:31:54.829739 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt6w4\" (UniqueName: \"kubernetes.io/projected/d70f4efb-e61a-4e88-a271-2f4af21ecdf3-kube-api-access-pt6w4\") pod \"packageserver-9c44c86f9-rplwv\" (UID: \"d70f4efb-e61a-4e88-a271-2f4af21ecdf3\") " pod="openshift-operator-lifecycle-manager/packageserver-9c44c86f9-rplwv"
Mar 08 00:31:54.848402 master-0 kubenswrapper[23041]: I0308 00:31:54.848340 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwsqr\" (UniqueName: \"kubernetes.io/projected/6999cf38-e317-4727-98c9-d4e348e9e16a-kube-api-access-pwsqr\") pod \"cluster-image-registry-operator-86d6d77c7c-k7dp2\" (UID: \"6999cf38-e317-4727-98c9-d4e348e9e16a\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2"
Mar 08 00:31:54.850929 master-0 kubenswrapper[23041]: I0308 00:31:54.850901 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn"
Mar 08 00:31:54.851180 master-0 kubenswrapper[23041]: I0308 00:31:54.851138 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-secret-metrics-client-certs\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m"
Mar 08 00:31:54.851340 master-0 kubenswrapper[23041]: I0308 00:31:54.851317 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn"
Mar 08 00:31:54.851489 master-0 kubenswrapper[23041]: I0308 00:31:54.851446 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-node-exporter-tls\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn"
Mar 08 00:31:54.851567 master-0 kubenswrapper[23041]: I0308 00:31:54.851549 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1da0c222-4b59-424f-9817-48673083df00-webhook-certs\") pod \"multus-admission-controller-7769569c45-5n69x\" (UID: \"1da0c222-4b59-424f-9817-48673083df00\") " pod="openshift-multus/multus-admission-controller-7769569c45-5n69x"
Mar 08 00:31:54.851598 master-0 kubenswrapper[23041]: I0308 00:31:54.851458 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-secret-metrics-client-certs\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m"
Mar 08 00:31:54.851682 master-0 kubenswrapper[23041]: I0308 00:31:54.851666 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-node-exporter-tls\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn"
Mar 08 00:31:54.851749 master-0 kubenswrapper[23041]: I0308 00:31:54.851727 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-client-ca-bundle\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m"
Mar 08 00:31:54.851803 master-0 kubenswrapper[23041]: E0308 00:31:54.851785 23041 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-ffspe3f0nbfal: secret "metrics-server-ffspe3f0nbfal" not found
Mar 08 00:31:54.851895 master-0 kubenswrapper[23041]: I0308 00:31:54.851873 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/ae061e84-5e6a-415c-a735-fa14add7318a-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-68b88f8cb5-qjxhc\" (UID: \"ae061e84-5e6a-415c-a735-fa14add7318a\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc"
Mar 08 00:31:54.851930 master-0 kubenswrapper[23041]: I0308 00:31:54.851879 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1da0c222-4b59-424f-9817-48673083df00-webhook-certs\") pod \"multus-admission-controller-7769569c45-5n69x\" (UID: \"1da0c222-4b59-424f-9817-48673083df00\") " pod="openshift-multus/multus-admission-controller-7769569c45-5n69x"
Mar 08 00:31:54.851930 master-0 kubenswrapper[23041]: E0308 00:31:54.851908 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-client-ca-bundle podName:0101c4ce-fd58-4ddb-94f7-abb8b2293cdb nodeName:}" failed. No retries permitted until 2026-03-08 00:31:56.851888269 +0000 UTC m=+22.324724823 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "client-ca-bundle" (UniqueName: "kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-client-ca-bundle") pod "metrics-server-6474759988-dnw4m" (UID: "0101c4ce-fd58-4ddb-94f7-abb8b2293cdb") : secret "metrics-server-ffspe3f0nbfal" not found
Mar 08 00:31:54.852012 master-0 kubenswrapper[23041]: I0308 00:31:54.851991 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ae061e84-5e6a-415c-a735-fa14add7318a-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-68b88f8cb5-qjxhc\" (UID: \"ae061e84-5e6a-415c-a735-fa14add7318a\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc"
Mar 08 00:31:54.852130 master-0 kubenswrapper[23041]: I0308 00:31:54.852110 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-metrics-server-audit-profiles\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m"
Mar 08 00:31:54.852188 master-0 kubenswrapper[23041]: I0308 00:31:54.852172 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m"
Mar 08 00:31:54.852234 master-0 kubenswrapper[23041]: I0308 00:31:54.852216 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ae061e84-5e6a-415c-a735-fa14add7318a-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-68b88f8cb5-qjxhc\" (UID: \"ae061e84-5e6a-415c-a735-fa14add7318a\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc"
Mar 08 00:31:54.852234 master-0 kubenswrapper[23041]: I0308 00:31:54.852227 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ae061e84-5e6a-415c-a735-fa14add7318a-kube-state-metrics-tls\") pod \"kube-state-metrics-68b88f8cb5-qjxhc\" (UID: \"ae061e84-5e6a-415c-a735-fa14add7318a\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc"
Mar 08 00:31:54.852322 master-0 kubenswrapper[23041]: I0308 00:31:54.852301 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e237ed52-5561-44c5-bcb1-de62691d6431-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5ff8674d55-qxpv9\" (UID: \"e237ed52-5561-44c5-bcb1-de62691d6431\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-qxpv9"
Mar 08 00:31:54.852353 master-0 kubenswrapper[23041]: I0308 00:31:54.852313 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/ae061e84-5e6a-415c-a735-fa14add7318a-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-68b88f8cb5-qjxhc\" (UID: \"ae061e84-5e6a-415c-a735-fa14add7318a\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc"
Mar 08 00:31:54.852400 master-0 kubenswrapper[23041]: I0308 00:31:54.852385 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-metrics-server-audit-profiles\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m"
Mar 08 00:31:54.852433 master-0 kubenswrapper[23041]: I0308 00:31:54.852402 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ae061e84-5e6a-415c-a735-fa14add7318a-kube-state-metrics-tls\") pod \"kube-state-metrics-68b88f8cb5-qjxhc\" (UID: \"ae061e84-5e6a-415c-a735-fa14add7318a\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc"
Mar 08 00:31:54.852433 master-0 kubenswrapper[23041]: I0308 00:31:54.852390 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m"
Mar 08 00:31:54.852523 master-0 kubenswrapper[23041]: I0308 00:31:54.852502 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e237ed52-5561-44c5-bcb1-de62691d6431-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5ff8674d55-qxpv9\" (UID: \"e237ed52-5561-44c5-bcb1-de62691d6431\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-qxpv9"
Mar 08 00:31:54.852570 master-0 kubenswrapper[23041]: I0308 00:31:54.852541 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-secret-metrics-server-tls\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m"
Mar 08 00:31:54.852799 master-0 kubenswrapper[23041]: I0308 00:31:54.852777 23041 operation_generator.go:637] "MountVolume.SetUp succeeded
for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-secret-metrics-server-tls\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m" Mar 08 00:31:54.867577 master-0 kubenswrapper[23041]: I0308 00:31:54.867540 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chnhh\" (UniqueName: \"kubernetes.io/projected/2b1a69b5-c946-495d-ae02-c56f788279e8-kube-api-access-chnhh\") pod \"openshift-config-operator-64488f9d78-vnl28\" (UID: \"2b1a69b5-c946-495d-ae02-c56f788279e8\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" Mar 08 00:31:54.888439 master-0 kubenswrapper[23041]: I0308 00:31:54.888370 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9fv4\" (UniqueName: \"kubernetes.io/projected/6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6-kube-api-access-x9fv4\") pod \"router-default-79f8cd6fdd-r6nkv\" (UID: \"6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6\") " pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" Mar 08 00:31:54.907197 master-0 kubenswrapper[23041]: I0308 00:31:54.907152 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkl4m\" (UniqueName: \"kubernetes.io/projected/af391724-079a-4bac-a89e-978ffd471763-kube-api-access-gkl4m\") pod \"network-node-identity-m7549\" (UID: \"af391724-079a-4bac-a89e-978ffd471763\") " pod="openshift-network-node-identity/network-node-identity-m7549" Mar 08 00:31:54.933370 master-0 kubenswrapper[23041]: I0308 00:31:54.933336 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s99rr\" (UniqueName: \"kubernetes.io/projected/fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9-kube-api-access-s99rr\") pod \"ovnkube-node-2w9mf\" (UID: \"fdf0db9d-51bb-41c8-a2f8-3aaa0df679e9\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:54.953030 master-0 kubenswrapper[23041]: I0308 00:31:54.952979 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjcjb\" (UniqueName: \"kubernetes.io/projected/ac523956-c8a3-4794-a1fa-660cd14966bb-kube-api-access-wjcjb\") pod \"kube-storage-version-migrator-operator-7f65c457f5-st7mk\" (UID: \"ac523956-c8a3-4794-a1fa-660cd14966bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-st7mk" Mar 08 00:31:54.953467 master-0 kubenswrapper[23041]: I0308 00:31:54.953438 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/795e6115-95cc-4c0a-a407-e0a6f14118e5-serving-certs-ca-bundle\") pod \"telemeter-client-6cfc594d97-x62fk\" (UID: \"795e6115-95cc-4c0a-a407-e0a6f14118e5\") " pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk" Mar 08 00:31:54.953755 master-0 kubenswrapper[23041]: I0308 00:31:54.953697 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b22c3046-5193-4c1d-91c0-7c15745265be-trusted-ca\") pod \"console-operator-6c7fb6b958-db7d8\" (UID: \"b22c3046-5193-4c1d-91c0-7c15745265be\") " pod="openshift-console-operator/console-operator-6c7fb6b958-db7d8" Mar 08 00:31:54.954225 master-0 kubenswrapper[23041]: I0308 00:31:54.954156 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/795e6115-95cc-4c0a-a407-e0a6f14118e5-secret-telemeter-client\") pod \"telemeter-client-6cfc594d97-x62fk\" (UID: \"795e6115-95cc-4c0a-a407-e0a6f14118e5\") " pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk" Mar 08 00:31:54.954290 master-0 kubenswrapper[23041]: I0308 00:31:54.954222 23041 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-metrics-server-audit-profiles\") pod \"metrics-server-7b45f5889c-z48tj\" (UID: \"c26f36ee-5dd4-40b7-8cb9-7f4835f120fd\") " pod="openshift-monitoring/metrics-server-7b45f5889c-z48tj" Mar 08 00:31:54.954434 master-0 kubenswrapper[23041]: I0308 00:31:54.954395 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/795e6115-95cc-4c0a-a407-e0a6f14118e5-serving-certs-ca-bundle\") pod \"telemeter-client-6cfc594d97-x62fk\" (UID: \"795e6115-95cc-4c0a-a407-e0a6f14118e5\") " pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk" Mar 08 00:31:54.954576 master-0 kubenswrapper[23041]: I0308 00:31:54.954536 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e884e46e-e520-4e0a-9f15-43d4b74af63e-cert\") pod \"ingress-canary-5qffz\" (UID: \"e884e46e-e520-4e0a-9f15-43d4b74af63e\") " pod="openshift-ingress-canary/ingress-canary-5qffz" Mar 08 00:31:54.954679 master-0 kubenswrapper[23041]: I0308 00:31:54.954647 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/795e6115-95cc-4c0a-a407-e0a6f14118e5-federate-client-tls\") pod \"telemeter-client-6cfc594d97-x62fk\" (UID: \"795e6115-95cc-4c0a-a407-e0a6f14118e5\") " pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk" Mar 08 00:31:54.954729 master-0 kubenswrapper[23041]: I0308 00:31:54.954711 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/795e6115-95cc-4c0a-a407-e0a6f14118e5-telemeter-client-tls\") pod \"telemeter-client-6cfc594d97-x62fk\" (UID: \"795e6115-95cc-4c0a-a407-e0a6f14118e5\") " pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk" Mar 08 
00:31:54.954833 master-0 kubenswrapper[23041]: I0308 00:31:54.954806 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b22c3046-5193-4c1d-91c0-7c15745265be-config\") pod \"console-operator-6c7fb6b958-db7d8\" (UID: \"b22c3046-5193-4c1d-91c0-7c15745265be\") " pod="openshift-console-operator/console-operator-6c7fb6b958-db7d8" Mar 08 00:31:54.954903 master-0 kubenswrapper[23041]: I0308 00:31:54.954838 23041 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 08 00:31:54.954903 master-0 kubenswrapper[23041]: I0308 00:31:54.954875 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/795e6115-95cc-4c0a-a407-e0a6f14118e5-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6cfc594d97-x62fk\" (UID: \"795e6115-95cc-4c0a-a407-e0a6f14118e5\") " pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk" Mar 08 00:31:54.954986 master-0 kubenswrapper[23041]: I0308 00:31:54.954919 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b22c3046-5193-4c1d-91c0-7c15745265be-serving-cert\") pod \"console-operator-6c7fb6b958-db7d8\" (UID: \"b22c3046-5193-4c1d-91c0-7c15745265be\") " pod="openshift-console-operator/console-operator-6c7fb6b958-db7d8" Mar 08 00:31:54.955268 master-0 kubenswrapper[23041]: I0308 00:31:54.955098 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/795e6115-95cc-4c0a-a407-e0a6f14118e5-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6cfc594d97-x62fk\" (UID: \"795e6115-95cc-4c0a-a407-e0a6f14118e5\") " 
pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk" Mar 08 00:31:54.955268 master-0 kubenswrapper[23041]: I0308 00:31:54.955159 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-secret-metrics-server-tls\") pod \"metrics-server-7b45f5889c-z48tj\" (UID: \"c26f36ee-5dd4-40b7-8cb9-7f4835f120fd\") " pod="openshift-monitoring/metrics-server-7b45f5889c-z48tj" Mar 08 00:31:54.955268 master-0 kubenswrapper[23041]: I0308 00:31:54.955193 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b22c3046-5193-4c1d-91c0-7c15745265be-trusted-ca\") pod \"console-operator-6c7fb6b958-db7d8\" (UID: \"b22c3046-5193-4c1d-91c0-7c15745265be\") " pod="openshift-console-operator/console-operator-6c7fb6b958-db7d8" Mar 08 00:31:54.955268 master-0 kubenswrapper[23041]: I0308 00:31:54.955216 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-client-ca-bundle\") pod \"metrics-server-7b45f5889c-z48tj\" (UID: \"c26f36ee-5dd4-40b7-8cb9-7f4835f120fd\") " pod="openshift-monitoring/metrics-server-7b45f5889c-z48tj" Mar 08 00:31:54.955723 master-0 kubenswrapper[23041]: I0308 00:31:54.955640 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b22c3046-5193-4c1d-91c0-7c15745265be-config\") pod \"console-operator-6c7fb6b958-db7d8\" (UID: \"b22c3046-5193-4c1d-91c0-7c15745265be\") " pod="openshift-console-operator/console-operator-6c7fb6b958-db7d8" Mar 08 00:31:54.955919 master-0 kubenswrapper[23041]: I0308 00:31:54.955883 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7b45f5889c-z48tj\" (UID: \"c26f36ee-5dd4-40b7-8cb9-7f4835f120fd\") " pod="openshift-monitoring/metrics-server-7b45f5889c-z48tj" Mar 08 00:31:54.955999 master-0 kubenswrapper[23041]: I0308 00:31:54.955951 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-secret-metrics-client-certs\") pod \"metrics-server-7b45f5889c-z48tj\" (UID: \"c26f36ee-5dd4-40b7-8cb9-7f4835f120fd\") " pod="openshift-monitoring/metrics-server-7b45f5889c-z48tj" Mar 08 00:31:54.956253 master-0 kubenswrapper[23041]: I0308 00:31:54.956220 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/795e6115-95cc-4c0a-a407-e0a6f14118e5-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6cfc594d97-x62fk\" (UID: \"795e6115-95cc-4c0a-a407-e0a6f14118e5\") " pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk" Mar 08 00:31:54.956525 master-0 kubenswrapper[23041]: I0308 00:31:54.956492 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-metrics-server-audit-profiles\") pod \"metrics-server-7b45f5889c-z48tj\" (UID: \"c26f36ee-5dd4-40b7-8cb9-7f4835f120fd\") " pod="openshift-monitoring/metrics-server-7b45f5889c-z48tj" Mar 08 00:31:54.956713 master-0 kubenswrapper[23041]: I0308 00:31:54.956679 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7b45f5889c-z48tj\" (UID: \"c26f36ee-5dd4-40b7-8cb9-7f4835f120fd\") " 
pod="openshift-monitoring/metrics-server-7b45f5889c-z48tj" Mar 08 00:31:54.958103 master-0 kubenswrapper[23041]: I0308 00:31:54.957701 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/795e6115-95cc-4c0a-a407-e0a6f14118e5-federate-client-tls\") pod \"telemeter-client-6cfc594d97-x62fk\" (UID: \"795e6115-95cc-4c0a-a407-e0a6f14118e5\") " pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk" Mar 08 00:31:54.958103 master-0 kubenswrapper[23041]: I0308 00:31:54.958068 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/795e6115-95cc-4c0a-a407-e0a6f14118e5-telemeter-client-tls\") pod \"telemeter-client-6cfc594d97-x62fk\" (UID: \"795e6115-95cc-4c0a-a407-e0a6f14118e5\") " pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk" Mar 08 00:31:54.959048 master-0 kubenswrapper[23041]: I0308 00:31:54.959022 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/795e6115-95cc-4c0a-a407-e0a6f14118e5-secret-telemeter-client\") pod \"telemeter-client-6cfc594d97-x62fk\" (UID: \"795e6115-95cc-4c0a-a407-e0a6f14118e5\") " pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk" Mar 08 00:31:54.960681 master-0 kubenswrapper[23041]: I0308 00:31:54.960646 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-secret-metrics-client-certs\") pod \"metrics-server-7b45f5889c-z48tj\" (UID: \"c26f36ee-5dd4-40b7-8cb9-7f4835f120fd\") " pod="openshift-monitoring/metrics-server-7b45f5889c-z48tj" Mar 08 00:31:54.960681 master-0 kubenswrapper[23041]: I0308 00:31:54.960659 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/795e6115-95cc-4c0a-a407-e0a6f14118e5-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6cfc594d97-x62fk\" (UID: \"795e6115-95cc-4c0a-a407-e0a6f14118e5\") " pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk" Mar 08 00:31:54.960852 master-0 kubenswrapper[23041]: I0308 00:31:54.960811 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b22c3046-5193-4c1d-91c0-7c15745265be-serving-cert\") pod \"console-operator-6c7fb6b958-db7d8\" (UID: \"b22c3046-5193-4c1d-91c0-7c15745265be\") " pod="openshift-console-operator/console-operator-6c7fb6b958-db7d8" Mar 08 00:31:54.960907 master-0 kubenswrapper[23041]: I0308 00:31:54.960825 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e884e46e-e520-4e0a-9f15-43d4b74af63e-cert\") pod \"ingress-canary-5qffz\" (UID: \"e884e46e-e520-4e0a-9f15-43d4b74af63e\") " pod="openshift-ingress-canary/ingress-canary-5qffz" Mar 08 00:31:54.961005 master-0 kubenswrapper[23041]: I0308 00:31:54.960976 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-client-ca-bundle\") pod \"metrics-server-7b45f5889c-z48tj\" (UID: \"c26f36ee-5dd4-40b7-8cb9-7f4835f120fd\") " pod="openshift-monitoring/metrics-server-7b45f5889c-z48tj" Mar 08 00:31:54.961362 master-0 kubenswrapper[23041]: I0308 00:31:54.961294 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-secret-metrics-server-tls\") pod \"metrics-server-7b45f5889c-z48tj\" (UID: \"c26f36ee-5dd4-40b7-8cb9-7f4835f120fd\") " pod="openshift-monitoring/metrics-server-7b45f5889c-z48tj" Mar 08 00:31:54.969083 master-0 kubenswrapper[23041]: I0308 00:31:54.969036 23041 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw6xw\" (UniqueName: \"kubernetes.io/projected/7317ceda-df6f-4826-aa1a-15304c2b0fcd-kube-api-access-cw6xw\") pod \"machine-approver-754bdc9f9d-xpl2b\" (UID: \"7317ceda-df6f-4826-aa1a-15304c2b0fcd\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-xpl2b" Mar 08 00:31:54.992179 master-0 kubenswrapper[23041]: I0308 00:31:54.992136 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b100ce12-965e-409e-8cdb-8f99ef51a82b-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-7gtw2\" (UID: \"b100ce12-965e-409e-8cdb-8f99ef51a82b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-7gtw2" Mar 08 00:31:55.012417 master-0 kubenswrapper[23041]: I0308 00:31:55.012295 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qznbf\" (UniqueName: \"kubernetes.io/projected/ae061e84-5e6a-415c-a735-fa14add7318a-kube-api-access-qznbf\") pod \"kube-state-metrics-68b88f8cb5-qjxhc\" (UID: \"ae061e84-5e6a-415c-a735-fa14add7318a\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-qjxhc" Mar 08 00:31:55.028984 master-0 kubenswrapper[23041]: I0308 00:31:55.028930 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvvvn\" (UniqueName: \"kubernetes.io/projected/b2548aca-4a9d-4670-a60a-0d6361d1c441-kube-api-access-dvvvn\") pod \"redhat-operators-9j9zs\" (UID: \"b2548aca-4a9d-4670-a60a-0d6361d1c441\") " pod="openshift-marketplace/redhat-operators-9j9zs" Mar 08 00:31:55.045879 master-0 kubenswrapper[23041]: I0308 00:31:55.045821 23041 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 08 
00:31:55.045879 master-0 kubenswrapper[23041]: I0308 00:31:55.045881 23041 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": dial tcp [::1]:1936: connect: connection refused" Mar 08 00:31:55.046148 master-0 kubenswrapper[23041]: I0308 00:31:55.045998 23041 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-r6nkv container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 08 00:31:55.046148 master-0 kubenswrapper[23041]: I0308 00:31:55.046051 23041 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" podUID="6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 08 00:31:55.052547 master-0 kubenswrapper[23041]: I0308 00:31:55.052507 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkh52\" (UniqueName: \"kubernetes.io/projected/1bad9e63-1aa2-44a7-aaf8-a0e82f33ad6e-kube-api-access-gkh52\") pod \"node-resolver-l9pkr\" (UID: \"1bad9e63-1aa2-44a7-aaf8-a0e82f33ad6e\") " pod="openshift-dns/node-resolver-l9pkr" Mar 08 00:31:55.057283 master-0 kubenswrapper[23041]: I0308 00:31:55.057166 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2040e5dc-b314-46a9-a61b-e80f1a046ce3-monitoring-plugin-cert\") pod \"monitoring-plugin-6db79546f6-gdz4k\" (UID: \"2040e5dc-b314-46a9-a61b-e80f1a046ce3\") " pod="openshift-monitoring/monitoring-plugin-6db79546f6-gdz4k" Mar 08 00:31:55.069947 master-0 kubenswrapper[23041]: I0308 
00:31:55.060508 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2040e5dc-b314-46a9-a61b-e80f1a046ce3-monitoring-plugin-cert\") pod \"monitoring-plugin-6db79546f6-gdz4k\" (UID: \"2040e5dc-b314-46a9-a61b-e80f1a046ce3\") " pod="openshift-monitoring/monitoring-plugin-6db79546f6-gdz4k" Mar 08 00:31:55.069947 master-0 kubenswrapper[23041]: I0308 00:31:55.069248 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qshd\" (UniqueName: \"kubernetes.io/projected/1751db13-b792-43e2-8459-d1d4a0164dfb-kube-api-access-6qshd\") pod \"apiserver-85cb8cb9bb-bmx44\" (UID: \"1751db13-b792-43e2-8459-d1d4a0164dfb\") " pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:31:55.088043 master-0 kubenswrapper[23041]: I0308 00:31:55.088002 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stxt7\" (UniqueName: \"kubernetes.io/projected/5b9f4db1-3ba9-49a5-9a65-1d770ee59a65-kube-api-access-stxt7\") pod \"openshift-state-metrics-74cc79fd76-s9b9v\" (UID: \"5b9f4db1-3ba9-49a5-9a65-1d770ee59a65\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-s9b9v" Mar 08 00:31:55.108780 master-0 kubenswrapper[23041]: I0308 00:31:55.108747 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a229b84-65bd-493b-90dd-b8194f842dc8-kube-api-access\") pod \"cluster-version-operator-8c9c967c7-vm7rj\" (UID: \"5a229b84-65bd-493b-90dd-b8194f842dc8\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-vm7rj" Mar 08 00:31:55.128618 master-0 kubenswrapper[23041]: I0308 00:31:55.128581 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfdxc\" (UniqueName: \"kubernetes.io/projected/03f4bafb-c270-428a-bacf-8a424b3d1a05-kube-api-access-pfdxc\") pod \"dns-operator-589895fbb7-gmvnl\" (UID: 
\"03f4bafb-c270-428a-bacf-8a424b3d1a05\") " pod="openshift-dns-operator/dns-operator-589895fbb7-gmvnl" Mar 08 00:31:55.155269 master-0 kubenswrapper[23041]: I0308 00:31:55.155221 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxk5x\" (UniqueName: \"kubernetes.io/projected/7ad8b9ea-ba1c-4507-9b70-ce2da170d480-kube-api-access-bxk5x\") pod \"multus-additional-cni-plugins-d5jxb\" (UID: \"7ad8b9ea-ba1c-4507-9b70-ce2da170d480\") " pod="openshift-multus/multus-additional-cni-plugins-d5jxb" Mar 08 00:31:55.175259 master-0 kubenswrapper[23041]: I0308 00:31:55.175210 23041 request.go:700] Waited for 3.938046251s due to client-side throttling, not priority and fairness, request: POST:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cloud-credential-operator/serviceaccounts/cloud-credential-operator/token Mar 08 00:31:55.176089 master-0 kubenswrapper[23041]: I0308 00:31:55.176058 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-bound-sa-token\") pod \"ingress-operator-677db989d6-blw5x\" (UID: \"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x" Mar 08 00:31:55.185125 master-0 kubenswrapper[23041]: I0308 00:31:55.185080 23041 scope.go:117] "RemoveContainer" containerID="1fdc0977a8b34be93d33d2377b4810454b6ad9c4cfeec0c8fce160478572354d" Mar 08 00:31:55.188686 master-0 kubenswrapper[23041]: I0308 00:31:55.188653 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgqmb\" (UniqueName: \"kubernetes.io/projected/e78057cd-5120-4a12-934d-9fed51e1bdc0-kube-api-access-zgqmb\") pod \"cloud-credential-operator-55d85b7b47-nrb7q\" (UID: \"e78057cd-5120-4a12-934d-9fed51e1bdc0\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-nrb7q" Mar 08 00:31:55.209378 master-0 kubenswrapper[23041]: 
I0308 00:31:55.209312 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg5d9\" (UniqueName: \"kubernetes.io/projected/7da68e85-9170-499d-8050-139ecfac4600-kube-api-access-bg5d9\") pod \"multus-dllkj\" (UID: \"7da68e85-9170-499d-8050-139ecfac4600\") " pod="openshift-multus/multus-dllkj" Mar 08 00:31:55.212312 master-0 kubenswrapper[23041]: I0308 00:31:55.212280 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-6db79546f6-gdz4k" Mar 08 00:31:55.227942 master-0 kubenswrapper[23041]: I0308 00:31:55.227870 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb4n9\" (UniqueName: \"kubernetes.io/projected/3b4f8517-1e54-4b41-ba6b-6c56fe66831a-kube-api-access-vb4n9\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-nwttq\" (UID: \"3b4f8517-1e54-4b41-ba6b-6c56fe66831a\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq" Mar 08 00:31:55.271613 master-0 kubenswrapper[23041]: I0308 00:31:55.271570 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9l64\" (UniqueName: \"kubernetes.io/projected/4d0b9fbc-a1f8-4a98-99de-758734bd1a5b-kube-api-access-z9l64\") pod \"ingress-operator-677db989d6-blw5x\" (UID: \"4d0b9fbc-a1f8-4a98-99de-758734bd1a5b\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-blw5x" Mar 08 00:31:55.288910 master-0 kubenswrapper[23041]: I0308 00:31:55.288820 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ggmz\" (UniqueName: \"kubernetes.io/projected/ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7-kube-api-access-2ggmz\") pod \"machine-config-daemon-k7pnc\" (UID: \"ce9b1b97-d4f1-4e1f-9a96-8b67c3fd84f7\") " pod="openshift-machine-config-operator/machine-config-daemon-k7pnc" Mar 08 00:31:55.311842 master-0 kubenswrapper[23041]: I0308 00:31:55.308917 23041 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f9kl\" (UniqueName: \"kubernetes.io/projected/8f71fd39-a16b-47d2-b781-c8ce37bcb9b2-kube-api-access-2f9kl\") pod \"package-server-manager-854648ff6d-phgxj\" (UID: \"8f71fd39-a16b-47d2-b781-c8ce37bcb9b2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj" Mar 08 00:31:55.333059 master-0 kubenswrapper[23041]: I0308 00:31:55.333001 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh9cz\" (UniqueName: \"kubernetes.io/projected/1f63cb2f-779f-4fde-bf92-cf0414844a77-kube-api-access-wh9cz\") pod \"network-check-target-w5fjg\" (UID: \"1f63cb2f-779f-4fde-bf92-cf0414844a77\") " pod="openshift-network-diagnostics/network-check-target-w5fjg" Mar 08 00:31:55.353574 master-0 kubenswrapper[23041]: I0308 00:31:55.353487 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fstf\" (UniqueName: \"kubernetes.io/projected/ef0a3c84-98bb-4915-9010-d66fcbeafe09-kube-api-access-8fstf\") pod \"openshift-controller-manager-operator-8565d84698-49hzm\" (UID: \"ef0a3c84-98bb-4915-9010-d66fcbeafe09\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-49hzm" Mar 08 00:31:55.370540 master-0 kubenswrapper[23041]: I0308 00:31:55.370160 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wrq9\" (UniqueName: \"kubernetes.io/projected/4ad37f40-c533-4a1e-882a-2e0973eff86d-kube-api-access-6wrq9\") pod \"olm-operator-d64cfc9db-8qtmf\" (UID: \"4ad37f40-c533-4a1e-882a-2e0973eff86d\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8qtmf" Mar 08 00:31:55.396993 master-0 kubenswrapper[23041]: I0308 00:31:55.396636 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv9fl\" (UniqueName: 
\"kubernetes.io/projected/e97435ee-522e-427d-9efc-40bc3d2b0d02-kube-api-access-bv9fl\") pod \"csi-snapshot-controller-7577d6f48-vd52m\" (UID: \"e97435ee-522e-427d-9efc-40bc3d2b0d02\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-vd52m" Mar 08 00:31:55.409351 master-0 kubenswrapper[23041]: I0308 00:31:55.409300 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6999cf38-e317-4727-98c9-d4e348e9e16a-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-k7dp2\" (UID: \"6999cf38-e317-4727-98c9-d4e348e9e16a\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-k7dp2" Mar 08 00:31:55.428282 master-0 kubenswrapper[23041]: I0308 00:31:55.428227 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll99v\" (UniqueName: \"kubernetes.io/projected/a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b-kube-api-access-ll99v\") pod \"community-operators-6t5lg\" (UID: \"a63fbd8b-bad8-49fb-b2f5-917a2ea47b3b\") " pod="openshift-marketplace/community-operators-6t5lg" Mar 08 00:31:55.447873 master-0 kubenswrapper[23041]: I0308 00:31:55.447790 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqkqn\" (UniqueName: \"kubernetes.io/projected/0e52cbdc-1d46-4cc9-85ee-535aa449992f-kube-api-access-xqkqn\") pod \"iptables-alerter-rfnqf\" (UID: \"0e52cbdc-1d46-4cc9-85ee-535aa449992f\") " pod="openshift-network-operator/iptables-alerter-rfnqf" Mar 08 00:31:55.473287 master-0 kubenswrapper[23041]: I0308 00:31:55.473226 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqjt7\" (UniqueName: \"kubernetes.io/projected/70892c23-554d-466c-a526-90a799439fe0-kube-api-access-kqjt7\") pod \"route-controller-manager-544c885f6d-dr4gh\" (UID: \"70892c23-554d-466c-a526-90a799439fe0\") " 
pod="openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh" Mar 08 00:31:55.494426 master-0 kubenswrapper[23041]: I0308 00:31:55.494366 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvglb\" (UniqueName: \"kubernetes.io/projected/786e30f1-d30a-43e1-85cb-d8ea1495422e-kube-api-access-dvglb\") pod \"network-check-source-7c67b67d47-sctv9\" (UID: \"786e30f1-d30a-43e1-85cb-d8ea1495422e\") " pod="openshift-network-diagnostics/network-check-source-7c67b67d47-sctv9" Mar 08 00:31:55.509767 master-0 kubenswrapper[23041]: I0308 00:31:55.509127 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e76bc134-2a88-4f92-9aa7-f6854941b98f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-bh886\" (UID: \"e76bc134-2a88-4f92-9aa7-f6854941b98f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-bh886" Mar 08 00:31:55.530287 master-0 kubenswrapper[23041]: I0308 00:31:55.528135 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b66xq\" (UniqueName: \"kubernetes.io/projected/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-kube-api-access-b66xq\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m" Mar 08 00:31:55.551931 master-0 kubenswrapper[23041]: I0308 00:31:55.551881 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5knc\" (UniqueName: \"kubernetes.io/projected/2fbed2b8-f4c5-4f52-b29c-1907a2034f6f-kube-api-access-d5knc\") pod \"etcd-operator-5884b9cd56-27phk\" (UID: \"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk" Mar 08 00:31:55.572846 master-0 kubenswrapper[23041]: I0308 00:31:55.572788 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4t8np\" (UniqueName: \"kubernetes.io/projected/cbcb0196-be5c-44a4-9749-5df9fbeaa718-kube-api-access-4t8np\") pod \"controller-manager-5b4bdf67b6-8rdjs\" (UID: \"cbcb0196-be5c-44a4-9749-5df9fbeaa718\") " pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" Mar 08 00:31:55.586805 master-0 kubenswrapper[23041]: I0308 00:31:55.586757 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz874\" (UniqueName: \"kubernetes.io/projected/9d810f7f-258a-47ce-9f99-7b1d93388aee-kube-api-access-dz874\") pod \"machine-config-operator-fdb5c78b5-5nbfk\" (UID: \"9d810f7f-258a-47ce-9f99-7b1d93388aee\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-5nbfk" Mar 08 00:31:55.616059 master-0 kubenswrapper[23041]: I0308 00:31:55.616006 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncncc\" (UniqueName: \"kubernetes.io/projected/a68ad726-392e-4a7a-a384-409108df9c8b-kube-api-access-ncncc\") pod \"machine-config-server-wkt98\" (UID: \"a68ad726-392e-4a7a-a384-409108df9c8b\") " pod="openshift-machine-config-operator/machine-config-server-wkt98" Mar 08 00:31:55.636694 master-0 kubenswrapper[23041]: I0308 00:31:55.636645 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcqn9\" (UniqueName: \"kubernetes.io/projected/401bbef2-684c-4f55-b2c7-e6184c789e40-kube-api-access-mcqn9\") pod \"tuned-67jx5\" (UID: \"401bbef2-684c-4f55-b2c7-e6184c789e40\") " pod="openshift-cluster-node-tuning-operator/tuned-67jx5" Mar 08 00:31:55.637592 master-0 kubenswrapper[23041]: W0308 00:31:55.637548 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2040e5dc_b314_46a9_a61b_e80f1a046ce3.slice/crio-77a356e3b4891ba56b47dd96fd3f64dc7520220834f327f99b1a107e9fc05b6c WatchSource:0}: Error finding container 
77a356e3b4891ba56b47dd96fd3f64dc7520220834f327f99b1a107e9fc05b6c: Status 404 returned error can't find the container with id 77a356e3b4891ba56b47dd96fd3f64dc7520220834f327f99b1a107e9fc05b6c Mar 08 00:31:55.640491 master-0 kubenswrapper[23041]: I0308 00:31:55.640454 23041 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 00:31:55.654334 master-0 kubenswrapper[23041]: I0308 00:31:55.654170 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d4xz\" (UniqueName: \"kubernetes.io/projected/2ac55f03-dd6f-4ead-bacc-c69aeca146dc-kube-api-access-8d4xz\") pod \"migrator-57ccdf9b5-tbcsh\" (UID: \"2ac55f03-dd6f-4ead-bacc-c69aeca146dc\") " pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-tbcsh" Mar 08 00:31:55.670597 master-0 kubenswrapper[23041]: I0308 00:31:55.670539 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4a829558-a672-4dc5-ae20-69884213482f-kube-api-access\") pod \"installer-2-master-0\" (UID: \"4a829558-a672-4dc5-ae20-69884213482f\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 08 00:31:55.688320 master-0 kubenswrapper[23041]: I0308 00:31:55.688275 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh2h6\" (UniqueName: \"kubernetes.io/projected/1bb8fea7-71ca-43a3-839d-9c1459bf8dfa-kube-api-access-gh2h6\") pod \"operator-controller-controller-manager-6598bfb6c4-7nhvs\" (UID: \"1bb8fea7-71ca-43a3-839d-9c1459bf8dfa\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" Mar 08 00:31:55.709155 master-0 kubenswrapper[23041]: I0308 00:31:55.708903 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smnrc\" (UniqueName: \"kubernetes.io/projected/e3f42081-387d-4798-b981-ac232e851bb4-kube-api-access-smnrc\") pod 
\"cluster-samples-operator-664cb58b85-8lf4q\" (UID: \"e3f42081-387d-4798-b981-ac232e851bb4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-8lf4q" Mar 08 00:31:55.728879 master-0 kubenswrapper[23041]: I0308 00:31:55.728822 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qpkj\" (UniqueName: \"kubernetes.io/projected/c2ce2ea7-bd25-4294-8f3a-11ce53577830-kube-api-access-9qpkj\") pod \"service-ca-operator-69b6fc6b88-p8hlq\" (UID: \"c2ce2ea7-bd25-4294-8f3a-11ce53577830\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-p8hlq" Mar 08 00:31:55.758296 master-0 kubenswrapper[23041]: I0308 00:31:55.758263 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wllt8\" (UniqueName: \"kubernetes.io/projected/24ef1fb7-c8a1-4b50-b89f-2a81848ebb25-kube-api-access-wllt8\") pod \"node-exporter-bx9dn\" (UID: \"24ef1fb7-c8a1-4b50-b89f-2a81848ebb25\") " pod="openshift-monitoring/node-exporter-bx9dn" Mar 08 00:31:55.767305 master-0 kubenswrapper[23041]: I0308 00:31:55.767263 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44jml\" (UniqueName: \"kubernetes.io/projected/3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab-kube-api-access-44jml\") pod \"openshift-apiserver-operator-799b6db4d7-rj9cl\" (UID: \"3cc3e3a1-57ce-4806-a5c7-ccfbd96ad5ab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-rj9cl" Mar 08 00:31:55.789503 master-0 kubenswrapper[23041]: I0308 00:31:55.789360 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntks9\" (UniqueName: \"kubernetes.io/projected/3fee96d7-75a7-46e4-9707-7bd292f10b84-kube-api-access-ntks9\") pod \"ovnkube-control-plane-66b55d57d-m77x2\" (UID: \"3fee96d7-75a7-46e4-9707-7bd292f10b84\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2" Mar 08 00:31:55.810935 master-0 kubenswrapper[23041]: I0308 
00:31:55.810873 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v5hl\" (UniqueName: \"kubernetes.io/projected/614f0a0f-5853-4cf6-bd3d-174141f0f1e2-kube-api-access-8v5hl\") pod \"insights-operator-8f89dfddd-brq9l\" (UID: \"614f0a0f-5853-4cf6-bd3d-174141f0f1e2\") " pod="openshift-insights/insights-operator-8f89dfddd-brq9l" Mar 08 00:31:55.828944 master-0 kubenswrapper[23041]: I0308 00:31:55.828894 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nljwf\" (UniqueName: \"kubernetes.io/projected/55c8d406-5448-4056-ab3c-c8399217c024-kube-api-access-nljwf\") pod \"redhat-marketplace-4fjw9\" (UID: \"55c8d406-5448-4056-ab3c-c8399217c024\") " pod="openshift-marketplace/redhat-marketplace-4fjw9" Mar 08 00:31:55.850013 master-0 kubenswrapper[23041]: I0308 00:31:55.849948 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhckc\" (UniqueName: \"kubernetes.io/projected/58333089-2456-4a25-8ba7-6d557eefa177-kube-api-access-hhckc\") pod \"authentication-operator-7c6989d6c4-dkqc4\" (UID: \"58333089-2456-4a25-8ba7-6d557eefa177\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-dkqc4" Mar 08 00:31:55.880536 master-0 kubenswrapper[23041]: I0308 00:31:55.880475 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt9pm\" (UniqueName: \"kubernetes.io/projected/d01c21a1-6c2c-49a7-9d85-254662851838-kube-api-access-rt9pm\") pod \"catalogd-controller-manager-7f8b8b6f4c-w2q2q\" (UID: \"d01c21a1-6c2c-49a7-9d85-254662851838\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" Mar 08 00:31:55.895343 master-0 kubenswrapper[23041]: I0308 00:31:55.895298 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rfqt\" (UniqueName: \"kubernetes.io/projected/6d770808-d390-41c1-a9d9-fc12b99fa9a9-kube-api-access-6rfqt\") pod 
\"cluster-monitoring-operator-674cbfbd9d-cxs8s\" (UID: \"6d770808-d390-41c1-a9d9-fc12b99fa9a9\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-cxs8s" Mar 08 00:31:55.908248 master-0 kubenswrapper[23041]: I0308 00:31:55.908179 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k88m9\" (UniqueName: \"kubernetes.io/projected/5cf5a2ef-2498-40a0-a189-0753076fd3b6-kube-api-access-k88m9\") pod \"marketplace-operator-64bf9778cb-mgb5v\" (UID: \"5cf5a2ef-2498-40a0-a189-0753076fd3b6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" Mar 08 00:31:55.933766 master-0 kubenswrapper[23041]: I0308 00:31:55.933723 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmxq9\" (UniqueName: \"kubernetes.io/projected/56e11e7e-6946-4e11-bce9-e91a721fe4a7-kube-api-access-kmxq9\") pod \"certified-operators-9nqqp\" (UID: \"56e11e7e-6946-4e11-bce9-e91a721fe4a7\") " pod="openshift-marketplace/certified-operators-9nqqp" Mar 08 00:31:55.956587 master-0 kubenswrapper[23041]: I0308 00:31:55.950126 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h65c2\" (UniqueName: \"kubernetes.io/projected/2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0-kube-api-access-h65c2\") pod \"machine-config-controller-ff46b7bdf-z5fkp\" (UID: \"2b20c0f9-a6d7-47e7-af0b-f8ea126ef7a0\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-z5fkp" Mar 08 00:31:55.972217 master-0 kubenswrapper[23041]: I0308 00:31:55.972156 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbdd4\" (UniqueName: \"kubernetes.io/projected/1abf904b-0b8d-4d61-8231-0e8d00933192-kube-api-access-dbdd4\") pod \"cluster-node-tuning-operator-66c7586884-9vjl9\" (UID: \"1abf904b-0b8d-4d61-8231-0e8d00933192\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vjl9" Mar 08 00:31:55.987471 master-0 
kubenswrapper[23041]: I0308 00:31:55.987418 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvhx4\" (UniqueName: \"kubernetes.io/projected/c7097f64-1709-4f76-a725-5a6c6cc5919b-kube-api-access-zvhx4\") pod \"machine-api-operator-84bf6db4f9-bncfj\" (UID: \"c7097f64-1709-4f76-a725-5a6c6cc5919b\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-bncfj" Mar 08 00:31:56.007988 master-0 kubenswrapper[23041]: I0308 00:31:56.007936 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89wj5\" (UniqueName: \"kubernetes.io/projected/db164b32-e20e-4d07-a9ae-98720321621d-kube-api-access-89wj5\") pod \"cluster-olm-operator-77899cf6d-r9zcq\" (UID: \"db164b32-e20e-4d07-a9ae-98720321621d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-r9zcq" Mar 08 00:31:56.030313 master-0 kubenswrapper[23041]: I0308 00:31:56.030272 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg9bq\" (UniqueName: \"kubernetes.io/projected/e884e46e-e520-4e0a-9f15-43d4b74af63e-kube-api-access-wg9bq\") pod \"ingress-canary-5qffz\" (UID: \"e884e46e-e520-4e0a-9f15-43d4b74af63e\") " pod="openshift-ingress-canary/ingress-canary-5qffz" Mar 08 00:31:56.051516 master-0 kubenswrapper[23041]: I0308 00:31:56.051416 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jd2n\" (UniqueName: \"kubernetes.io/projected/b22c3046-5193-4c1d-91c0-7c15745265be-kube-api-access-2jd2n\") pod \"console-operator-6c7fb6b958-db7d8\" (UID: \"b22c3046-5193-4c1d-91c0-7c15745265be\") " pod="openshift-console-operator/console-operator-6c7fb6b958-db7d8" Mar 08 00:31:56.071036 master-0 kubenswrapper[23041]: I0308 00:31:56.070993 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc78c\" (UniqueName: \"kubernetes.io/projected/795e6115-95cc-4c0a-a407-e0a6f14118e5-kube-api-access-kc78c\") pod 
\"telemeter-client-6cfc594d97-x62fk\" (UID: \"795e6115-95cc-4c0a-a407-e0a6f14118e5\") " pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk" Mar 08 00:31:56.089321 master-0 kubenswrapper[23041]: E0308 00:31:56.089277 23041 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 08 00:31:56.089321 master-0 kubenswrapper[23041]: E0308 00:31:56.089315 23041 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-retry-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 08 00:31:56.089538 master-0 kubenswrapper[23041]: E0308 00:31:56.089371 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/66915251-1fdd-40f3-a59b-054776b214df-kube-api-access podName:66915251-1fdd-40f3-a59b-054776b214df nodeName:}" failed. No retries permitted until 2026-03-08 00:31:56.589351739 +0000 UTC m=+22.062188293 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/66915251-1fdd-40f3-a59b-054776b214df-kube-api-access") pod "installer-1-retry-1-master-0" (UID: "66915251-1fdd-40f3-a59b-054776b214df") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 08 00:31:56.092121 master-0 kubenswrapper[23041]: I0308 00:31:56.092086 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk" Mar 08 00:31:56.121455 master-0 kubenswrapper[23041]: I0308 00:31:56.121404 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4t5k\" (UniqueName: \"kubernetes.io/projected/c26f36ee-5dd4-40b7-8cb9-7f4835f120fd-kube-api-access-r4t5k\") pod \"metrics-server-7b45f5889c-z48tj\" (UID: \"c26f36ee-5dd4-40b7-8cb9-7f4835f120fd\") " pod="openshift-monitoring/metrics-server-7b45f5889c-z48tj" Mar 08 00:31:56.140666 master-0 kubenswrapper[23041]: E0308 00:31:56.138474 23041 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="3.331s" Mar 08 00:31:56.140666 master-0 kubenswrapper[23041]: I0308 00:31:56.138516 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"] Mar 08 00:31:56.140666 master-0 kubenswrapper[23041]: I0308 00:31:56.138546 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" Mar 08 00:31:56.140666 master-0 kubenswrapper[23041]: I0308 00:31:56.138583 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-8jr6f" Mar 08 00:31:56.140666 master-0 kubenswrapper[23041]: I0308 00:31:56.138616 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-8jr6f" Mar 08 00:31:56.140666 master-0 kubenswrapper[23041]: I0308 00:31:56.138628 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-6db79546f6-gdz4k"] Mar 08 00:31:56.140666 master-0 kubenswrapper[23041]: I0308 00:31:56.138641 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" Mar 08 00:31:56.140666 
master-0 kubenswrapper[23041]: I0308 00:31:56.138661 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-jfjzg" Mar 08 00:31:56.140666 master-0 kubenswrapper[23041]: I0308 00:31:56.138680 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w" Mar 08 00:31:56.140666 master-0 kubenswrapper[23041]: I0308 00:31:56.138690 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" Mar 08 00:31:56.140666 master-0 kubenswrapper[23041]: I0308 00:31:56.138709 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-jfjzg" Mar 08 00:31:56.153005 master-0 kubenswrapper[23041]: I0308 00:31:56.152928 23041 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="" Mar 08 00:31:56.179395 master-0 kubenswrapper[23041]: I0308 00:31:56.179223 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Mar 08 00:31:56.179395 master-0 kubenswrapper[23041]: I0308 00:31:56.179267 23041 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="8b087322-b76a-4293-8e6b-786c5f01f37f" Mar 08 00:31:56.179576 master-0 kubenswrapper[23041]: I0308 00:31:56.179549 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Mar 08 00:31:56.179618 master-0 kubenswrapper[23041]: I0308 00:31:56.179574 23041 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="8b087322-b76a-4293-8e6b-786c5f01f37f" Mar 08 00:31:56.179618 master-0 kubenswrapper[23041]: I0308 00:31:56.179594 23041 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9j9zs" Mar 08 00:31:56.179701 master-0 kubenswrapper[23041]: I0308 00:31:56.179685 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:31:56.179761 master-0 kubenswrapper[23041]: I0308 00:31:56.179748 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6t5lg" Mar 08 00:31:56.179834 master-0 kubenswrapper[23041]: I0308 00:31:56.179821 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" Mar 08 00:31:56.179872 master-0 kubenswrapper[23041]: I0308 00:31:56.179849 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vnl28" Mar 08 00:31:56.179918 master-0 kubenswrapper[23041]: I0308 00:31:56.179892 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:56.179956 master-0 kubenswrapper[23041]: I0308 00:31:56.179927 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:56.180026 master-0 kubenswrapper[23041]: I0308 00:31:56.179992 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4fjw9" Mar 08 00:31:56.180026 master-0 kubenswrapper[23041]: I0308 00:31:56.180017 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" Mar 08 00:31:56.180094 master-0 kubenswrapper[23041]: I0308 00:31:56.180065 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9nqqp" Mar 08 
00:31:56.180094 master-0 kubenswrapper[23041]: I0308 00:31:56.180077 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" Mar 08 00:31:56.180154 master-0 kubenswrapper[23041]: I0308 00:31:56.180095 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-9c44c86f9-rplwv" Mar 08 00:31:56.180154 master-0 kubenswrapper[23041]: I0308 00:31:56.180107 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9j9zs" Mar 08 00:31:56.180154 master-0 kubenswrapper[23041]: I0308 00:31:56.180123 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-9c44c86f9-rplwv" Mar 08 00:31:56.180154 master-0 kubenswrapper[23041]: I0308 00:31:56.180151 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:56.180362 master-0 kubenswrapper[23041]: I0308 00:31:56.180180 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:31:56.180362 master-0 kubenswrapper[23041]: I0308 00:31:56.180284 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9j9zs" Mar 08 00:31:56.180362 master-0 kubenswrapper[23041]: I0308 00:31:56.180308 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj" Mar 08 00:31:56.180362 master-0 kubenswrapper[23041]: I0308 00:31:56.180327 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj" Mar 08 00:31:56.180473 master-0 kubenswrapper[23041]: I0308 00:31:56.180376 23041 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-w5fjg" Mar 08 00:31:56.180473 master-0 kubenswrapper[23041]: I0308 00:31:56.180400 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:31:56.180473 master-0 kubenswrapper[23041]: I0308 00:31:56.180417 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-w5fjg" Mar 08 00:31:56.180473 master-0 kubenswrapper[23041]: I0308 00:31:56.180442 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8qtmf" Mar 08 00:31:56.180473 master-0 kubenswrapper[23041]: I0308 00:31:56.180462 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" Mar 08 00:31:56.180610 master-0 kubenswrapper[23041]: I0308 00:31:56.180480 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh" Mar 08 00:31:56.180610 master-0 kubenswrapper[23041]: I0308 00:31:56.180507 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6t5lg" Mar 08 00:31:56.180610 master-0 kubenswrapper[23041]: I0308 00:31:56.180539 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" Mar 08 00:31:56.180610 master-0 kubenswrapper[23041]: I0308 00:31:56.180559 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" Mar 08 00:31:56.180610 master-0 kubenswrapper[23041]: I0308 00:31:56.180580 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" Mar 08 00:31:56.180610 master-0 kubenswrapper[23041]: I0308 00:31:56.180598 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4fjw9" Mar 08 00:31:56.180789 master-0 kubenswrapper[23041]: I0308 00:31:56.180616 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9nqqp" Mar 08 00:31:56.180789 master-0 kubenswrapper[23041]: I0308 00:31:56.180635 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8qtmf" Mar 08 00:31:56.180789 master-0 kubenswrapper[23041]: I0308 00:31:56.180652 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" Mar 08 00:31:56.180789 master-0 kubenswrapper[23041]: I0308 00:31:56.180670 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44" Mar 08 00:31:56.181545 master-0 kubenswrapper[23041]: I0308 00:31:56.180991 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh" Mar 08 00:31:56.188465 master-0 kubenswrapper[23041]: I0308 00:31:56.188420 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" Mar 08 00:31:56.188604 master-0 kubenswrapper[23041]: I0308 00:31:56.188490 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" Mar 08 00:31:56.188604 master-0 kubenswrapper[23041]: I0308 00:31:56.188526 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" Mar 08 00:31:56.188821 master-0 kubenswrapper[23041]: I0308 00:31:56.188796 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6t5lg" Mar 08 00:31:56.202131 master-0 kubenswrapper[23041]: I0308 00:31:56.201984 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-6db79546f6-gdz4k" event={"ID":"2040e5dc-b314-46a9-a61b-e80f1a046ce3","Type":"ContainerStarted","Data":"77a356e3b4891ba56b47dd96fd3f64dc7520220834f327f99b1a107e9fc05b6c"} Mar 08 00:31:56.214699 master-0 kubenswrapper[23041]: I0308 00:31:56.213476 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" event={"ID":"6aa7c49e-2a6e-4a4c-aa1e-e912eedd81c6","Type":"ContainerStarted","Data":"b89c3ad415e652143fd33efb656b878092a86441799adba60adccf36532447f1"} Mar 08 00:31:56.214699 master-0 kubenswrapper[23041]: I0308 00:31:56.214583 23041 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 00:31:56.234981 master-0 kubenswrapper[23041]: I0308 00:31:56.234704 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9nqqp" Mar 08 00:31:56.317290 master-0 kubenswrapper[23041]: I0308 00:31:56.314960 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5qffz" Mar 08 00:31:56.338430 master-0 kubenswrapper[23041]: I0308 00:31:56.338388 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-6c7fb6b958-db7d8" Mar 08 00:31:56.339151 master-0 kubenswrapper[23041]: I0308 00:31:56.339125 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4fjw9" Mar 08 00:31:56.359478 master-0 kubenswrapper[23041]: I0308 00:31:56.357804 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7b45f5889c-z48tj" Mar 08 00:31:56.387328 master-0 kubenswrapper[23041]: I0308 00:31:56.386820 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 08 00:31:56.391119 master-0 kubenswrapper[23041]: E0308 00:31:56.388981 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66915251-1fdd-40f3-a59b-054776b214df" containerName="installer" Mar 08 00:31:56.391119 master-0 kubenswrapper[23041]: I0308 00:31:56.389044 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="66915251-1fdd-40f3-a59b-054776b214df" containerName="installer" Mar 08 00:31:56.391119 master-0 kubenswrapper[23041]: I0308 00:31:56.389329 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="66915251-1fdd-40f3-a59b-054776b214df" containerName="installer" Mar 08 00:31:56.423331 master-0 kubenswrapper[23041]: I0308 00:31:56.420128 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 08 00:31:56.423331 master-0 kubenswrapper[23041]: I0308 00:31:56.420453 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 08 00:31:56.454766 master-0 kubenswrapper[23041]: I0308 00:31:56.454690 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"] Mar 08 00:31:56.463756 master-0 kubenswrapper[23041]: I0308 00:31:56.463706 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Mar 08 00:31:56.465193 master-0 kubenswrapper[23041]: I0308 00:31:56.465151 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:31:56.477333 master-0 kubenswrapper[23041]: I0308 00:31:56.477281 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Mar 08 00:31:56.504241 master-0 kubenswrapper[23041]: I0308 00:31:56.502456 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/51c724a5-de89-4fde-b596-70157d2d19b6-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 00:31:56.504241 master-0 kubenswrapper[23041]: I0308 00:31:56.502498 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/51c724a5-de89-4fde-b596-70157d2d19b6-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 00:31:56.504241 master-0 kubenswrapper[23041]: I0308 00:31:56.502524 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/51c724a5-de89-4fde-b596-70157d2d19b6-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 00:31:56.504241 master-0 kubenswrapper[23041]: I0308 00:31:56.502540 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/51c724a5-de89-4fde-b596-70157d2d19b6-web-config\") pod \"alertmanager-main-0\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 00:31:56.504241 master-0 kubenswrapper[23041]: I0308 00:31:56.502561 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/51c724a5-de89-4fde-b596-70157d2d19b6-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 00:31:56.504241 master-0 kubenswrapper[23041]: I0308 00:31:56.502589 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/51c724a5-de89-4fde-b596-70157d2d19b6-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 00:31:56.504241 master-0 kubenswrapper[23041]: I0308 00:31:56.502645 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/51c724a5-de89-4fde-b596-70157d2d19b6-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 
00:31:56.504241 master-0 kubenswrapper[23041]: I0308 00:31:56.502672 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/51c724a5-de89-4fde-b596-70157d2d19b6-tls-assets\") pod \"alertmanager-main-0\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 00:31:56.504241 master-0 kubenswrapper[23041]: I0308 00:31:56.502691 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjqpt\" (UniqueName: \"kubernetes.io/projected/51c724a5-de89-4fde-b596-70157d2d19b6-kube-api-access-wjqpt\") pod \"alertmanager-main-0\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 00:31:56.504241 master-0 kubenswrapper[23041]: I0308 00:31:56.502709 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/51c724a5-de89-4fde-b596-70157d2d19b6-config-volume\") pod \"alertmanager-main-0\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 00:31:56.504241 master-0 kubenswrapper[23041]: I0308 00:31:56.502725 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/51c724a5-de89-4fde-b596-70157d2d19b6-config-out\") pod \"alertmanager-main-0\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 00:31:56.504241 master-0 kubenswrapper[23041]: I0308 00:31:56.502748 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/51c724a5-de89-4fde-b596-70157d2d19b6-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: 
\"51c724a5-de89-4fde-b596-70157d2d19b6\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 00:31:56.513654 master-0 kubenswrapper[23041]: I0308 00:31:56.513609 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Mar 08 00:31:56.525543 master-0 kubenswrapper[23041]: I0308 00:31:56.523357 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Mar 08 00:31:56.537165 master-0 kubenswrapper[23041]: I0308 00:31:56.536807 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-fsd5q" Mar 08 00:31:56.558414 master-0 kubenswrapper[23041]: I0308 00:31:56.558278 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Mar 08 00:31:56.572538 master-0 kubenswrapper[23041]: I0308 00:31:56.570386 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6cfc594d97-x62fk"] Mar 08 00:31:56.580884 master-0 kubenswrapper[23041]: I0308 00:31:56.578762 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Mar 08 00:31:56.596807 master-0 kubenswrapper[23041]: I0308 00:31:56.596766 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Mar 08 00:31:56.604721 master-0 kubenswrapper[23041]: I0308 00:31:56.604464 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjqpt\" (UniqueName: \"kubernetes.io/projected/51c724a5-de89-4fde-b596-70157d2d19b6-kube-api-access-wjqpt\") pod \"alertmanager-main-0\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 00:31:56.604721 master-0 kubenswrapper[23041]: I0308 00:31:56.604543 23041 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/51c724a5-de89-4fde-b596-70157d2d19b6-config-volume\") pod \"alertmanager-main-0\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 00:31:56.604721 master-0 kubenswrapper[23041]: I0308 00:31:56.604571 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/51c724a5-de89-4fde-b596-70157d2d19b6-config-out\") pod \"alertmanager-main-0\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 00:31:56.604721 master-0 kubenswrapper[23041]: I0308 00:31:56.604609 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/51c724a5-de89-4fde-b596-70157d2d19b6-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 00:31:56.604721 master-0 kubenswrapper[23041]: I0308 00:31:56.604650 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/51c724a5-de89-4fde-b596-70157d2d19b6-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 00:31:56.604721 master-0 kubenswrapper[23041]: I0308 00:31:56.604673 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/51c724a5-de89-4fde-b596-70157d2d19b6-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 00:31:56.604721 master-0 
kubenswrapper[23041]: I0308 00:31:56.604702 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51c724a5-de89-4fde-b596-70157d2d19b6-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 00:31:56.604721 master-0 kubenswrapper[23041]: I0308 00:31:56.604727 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/51c724a5-de89-4fde-b596-70157d2d19b6-web-config\") pod \"alertmanager-main-0\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 00:31:56.605144 master-0 kubenswrapper[23041]: I0308 00:31:56.604756 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/51c724a5-de89-4fde-b596-70157d2d19b6-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 00:31:56.605144 master-0 kubenswrapper[23041]: I0308 00:31:56.604793 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/51c724a5-de89-4fde-b596-70157d2d19b6-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 00:31:56.605144 master-0 kubenswrapper[23041]: I0308 00:31:56.604830 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66915251-1fdd-40f3-a59b-054776b214df-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: 
\"66915251-1fdd-40f3-a59b-054776b214df\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 00:31:56.605144 master-0 kubenswrapper[23041]: I0308 00:31:56.604955 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/51c724a5-de89-4fde-b596-70157d2d19b6-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 00:31:56.605144 master-0 kubenswrapper[23041]: I0308 00:31:56.605004 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/51c724a5-de89-4fde-b596-70157d2d19b6-tls-assets\") pod \"alertmanager-main-0\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 00:31:56.607406 master-0 kubenswrapper[23041]: E0308 00:31:56.606019 23041 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 08 00:31:56.607406 master-0 kubenswrapper[23041]: E0308 00:31:56.606057 23041 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-retry-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 08 00:31:56.607406 master-0 kubenswrapper[23041]: E0308 00:31:56.606112 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/66915251-1fdd-40f3-a59b-054776b214df-kube-api-access podName:66915251-1fdd-40f3-a59b-054776b214df nodeName:}" failed. No retries permitted until 2026-03-08 00:31:57.606091385 +0000 UTC m=+23.078927939 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/66915251-1fdd-40f3-a59b-054776b214df-kube-api-access") pod "installer-1-retry-1-master-0" (UID: "66915251-1fdd-40f3-a59b-054776b214df") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 08 00:31:56.607910 master-0 kubenswrapper[23041]: I0308 00:31:56.607098 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/51c724a5-de89-4fde-b596-70157d2d19b6-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 00:31:56.608351 master-0 kubenswrapper[23041]: I0308 00:31:56.608313 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/51c724a5-de89-4fde-b596-70157d2d19b6-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 00:31:56.610266 master-0 kubenswrapper[23041]: I0308 00:31:56.610234 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/51c724a5-de89-4fde-b596-70157d2d19b6-config-out\") pod \"alertmanager-main-0\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 00:31:56.617888 master-0 kubenswrapper[23041]: I0308 00:31:56.613909 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/51c724a5-de89-4fde-b596-70157d2d19b6-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 00:31:56.617888 master-0 kubenswrapper[23041]: I0308 00:31:56.614341 23041 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/51c724a5-de89-4fde-b596-70157d2d19b6-tls-assets\") pod \"alertmanager-main-0\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 00:31:56.617888 master-0 kubenswrapper[23041]: I0308 00:31:56.615889 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/51c724a5-de89-4fde-b596-70157d2d19b6-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 00:31:56.623664 master-0 kubenswrapper[23041]: I0308 00:31:56.623465 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/51c724a5-de89-4fde-b596-70157d2d19b6-config-volume\") pod \"alertmanager-main-0\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 00:31:56.624384 master-0 kubenswrapper[23041]: I0308 00:31:56.624355 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/51c724a5-de89-4fde-b596-70157d2d19b6-web-config\") pod \"alertmanager-main-0\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 00:31:56.626132 master-0 kubenswrapper[23041]: I0308 00:31:56.626092 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/51c724a5-de89-4fde-b596-70157d2d19b6-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 00:31:56.626562 master-0 kubenswrapper[23041]: I0308 00:31:56.626521 23041 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Mar 08 00:31:56.633360 master-0 kubenswrapper[23041]: I0308 00:31:56.632770 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/51c724a5-de89-4fde-b596-70157d2d19b6-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 00:31:56.633523 master-0 kubenswrapper[23041]: I0308 00:31:56.633482 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51c724a5-de89-4fde-b596-70157d2d19b6-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 00:31:56.674693 master-0 kubenswrapper[23041]: I0308 00:31:56.674650 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjqpt\" (UniqueName: \"kubernetes.io/projected/51c724a5-de89-4fde-b596-70157d2d19b6-kube-api-access-wjqpt\") pod \"alertmanager-main-0\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 00:31:56.779177 master-0 kubenswrapper[23041]: I0308 00:31:56.779092 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 08 00:31:56.787046 master-0 kubenswrapper[23041]: I0308 00:31:56.787004 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Mar 08 00:31:56.788026 master-0 kubenswrapper[23041]: I0308 00:31:56.788001 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Mar 08 00:31:56.802442 master-0 kubenswrapper[23041]: I0308 00:31:56.800754 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Mar 08 00:31:56.906658 master-0 kubenswrapper[23041]: I0308 00:31:56.906567 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-6c7fb6b958-db7d8"] Mar 08 00:31:56.914588 master-0 kubenswrapper[23041]: W0308 00:31:56.914451 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb22c3046_5193_4c1d_91c0_7c15745265be.slice/crio-893318ee132d49d8f7df7056d178f6bb6bd0d6901143b29c56a56fd719042a1f WatchSource:0}: Error finding container 893318ee132d49d8f7df7056d178f6bb6bd0d6901143b29c56a56fd719042a1f: Status 404 returned error can't find the container with id 893318ee132d49d8f7df7056d178f6bb6bd0d6901143b29c56a56fd719042a1f Mar 08 00:31:56.918811 master-0 kubenswrapper[23041]: I0308 00:31:56.918770 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/48fcdccb-478e-4027-b4b9-9a061439f0e2-var-lock\") pod \"installer-3-master-0\" (UID: \"48fcdccb-478e-4027-b4b9-9a061439f0e2\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 08 00:31:56.919787 master-0 kubenswrapper[23041]: I0308 00:31:56.919760 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-client-ca-bundle\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m" Mar 08 00:31:56.921095 master-0 kubenswrapper[23041]: I0308 00:31:56.920132 23041 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/48fcdccb-478e-4027-b4b9-9a061439f0e2-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"48fcdccb-478e-4027-b4b9-9a061439f0e2\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 08 00:31:56.921095 master-0 kubenswrapper[23041]: I0308 00:31:56.920560 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48fcdccb-478e-4027-b4b9-9a061439f0e2-kube-api-access\") pod \"installer-3-master-0\" (UID: \"48fcdccb-478e-4027-b4b9-9a061439f0e2\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 08 00:31:56.921095 master-0 kubenswrapper[23041]: E0308 00:31:56.920660 23041 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-ffspe3f0nbfal: secret "metrics-server-ffspe3f0nbfal" not found Mar 08 00:31:56.921095 master-0 kubenswrapper[23041]: E0308 00:31:56.920724 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-client-ca-bundle podName:0101c4ce-fd58-4ddb-94f7-abb8b2293cdb nodeName:}" failed. No retries permitted until 2026-03-08 00:32:00.920704216 +0000 UTC m=+26.393540770 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "client-ca-bundle" (UniqueName: "kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-client-ca-bundle") pod "metrics-server-6474759988-dnw4m" (UID: "0101c4ce-fd58-4ddb-94f7-abb8b2293cdb") : secret "metrics-server-ffspe3f0nbfal" not found Mar 08 00:31:57.011837 master-0 kubenswrapper[23041]: I0308 00:31:57.009867 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5qffz"] Mar 08 00:31:57.018103 master-0 kubenswrapper[23041]: I0308 00:31:57.017887 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7b45f5889c-z48tj"] Mar 08 00:31:57.020129 master-0 kubenswrapper[23041]: W0308 00:31:57.020092 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode884e46e_e520_4e0a_9f15_43d4b74af63e.slice/crio-72644d8a366c2e21c1771d55bf9802c22c5d6540af2b3968387026078b9fd4ee WatchSource:0}: Error finding container 72644d8a366c2e21c1771d55bf9802c22c5d6540af2b3968387026078b9fd4ee: Status 404 returned error can't find the container with id 72644d8a366c2e21c1771d55bf9802c22c5d6540af2b3968387026078b9fd4ee Mar 08 00:31:57.021932 master-0 kubenswrapper[23041]: I0308 00:31:57.021889 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48fcdccb-478e-4027-b4b9-9a061439f0e2-kube-api-access\") pod \"installer-3-master-0\" (UID: \"48fcdccb-478e-4027-b4b9-9a061439f0e2\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 08 00:31:57.022027 master-0 kubenswrapper[23041]: I0308 00:31:57.021960 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/48fcdccb-478e-4027-b4b9-9a061439f0e2-var-lock\") pod \"installer-3-master-0\" (UID: \"48fcdccb-478e-4027-b4b9-9a061439f0e2\") " 
pod="openshift-kube-controller-manager/installer-3-master-0" Mar 08 00:31:57.022358 master-0 kubenswrapper[23041]: I0308 00:31:57.022303 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/48fcdccb-478e-4027-b4b9-9a061439f0e2-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"48fcdccb-478e-4027-b4b9-9a061439f0e2\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 08 00:31:57.022435 master-0 kubenswrapper[23041]: I0308 00:31:57.022397 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/48fcdccb-478e-4027-b4b9-9a061439f0e2-var-lock\") pod \"installer-3-master-0\" (UID: \"48fcdccb-478e-4027-b4b9-9a061439f0e2\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 08 00:31:57.022491 master-0 kubenswrapper[23041]: I0308 00:31:57.022471 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/48fcdccb-478e-4027-b4b9-9a061439f0e2-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"48fcdccb-478e-4027-b4b9-9a061439f0e2\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 08 00:31:57.028112 master-0 kubenswrapper[23041]: W0308 00:31:57.028074 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc26f36ee_5dd4_40b7_8cb9_7f4835f120fd.slice/crio-2146e3628c3b289a05626da4bd106905bf2d803f0f99577e48cff1ca642d3b40 WatchSource:0}: Error finding container 2146e3628c3b289a05626da4bd106905bf2d803f0f99577e48cff1ca642d3b40: Status 404 returned error can't find the container with id 2146e3628c3b289a05626da4bd106905bf2d803f0f99577e48cff1ca642d3b40 Mar 08 00:31:57.046123 master-0 kubenswrapper[23041]: I0308 00:31:57.045773 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" Mar 08 00:31:57.051791 master-0 kubenswrapper[23041]: I0308 00:31:57.050409 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" Mar 08 00:31:57.088048 master-0 kubenswrapper[23041]: I0308 00:31:57.087468 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48fcdccb-478e-4027-b4b9-9a061439f0e2-kube-api-access\") pod \"installer-3-master-0\" (UID: \"48fcdccb-478e-4027-b4b9-9a061439f0e2\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 08 00:31:57.124881 master-0 kubenswrapper[23041]: I0308 00:31:57.124843 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Mar 08 00:31:57.245760 master-0 kubenswrapper[23041]: I0308 00:31:57.245185 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 08 00:31:57.263130 master-0 kubenswrapper[23041]: W0308 00:31:57.263082 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51c724a5_de89_4fde_b596_70157d2d19b6.slice/crio-c8e2a3844909fda8d886c3f9a2f45898e07fcb80d21afe699dd2bc976f511106 WatchSource:0}: Error finding container c8e2a3844909fda8d886c3f9a2f45898e07fcb80d21afe699dd2bc976f511106: Status 404 returned error can't find the container with id c8e2a3844909fda8d886c3f9a2f45898e07fcb80d21afe699dd2bc976f511106 Mar 08 00:31:57.276663 master-0 kubenswrapper[23041]: I0308 00:31:57.272264 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7b45f5889c-z48tj" event={"ID":"c26f36ee-5dd4-40b7-8cb9-7f4835f120fd","Type":"ContainerStarted","Data":"2146e3628c3b289a05626da4bd106905bf2d803f0f99577e48cff1ca642d3b40"} Mar 08 00:31:57.281296 master-0 
kubenswrapper[23041]: I0308 00:31:57.281258 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk" event={"ID":"795e6115-95cc-4c0a-a407-e0a6f14118e5","Type":"ContainerStarted","Data":"70a2b9ee50ec271466c9ac6f8251185879005d0116c70819eb39440abe033c87"} Mar 08 00:31:57.282584 master-0 kubenswrapper[23041]: I0308 00:31:57.282558 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-6c7fb6b958-db7d8" event={"ID":"b22c3046-5193-4c1d-91c0-7c15745265be","Type":"ContainerStarted","Data":"893318ee132d49d8f7df7056d178f6bb6bd0d6901143b29c56a56fd719042a1f"} Mar 08 00:31:57.284456 master-0 kubenswrapper[23041]: I0308 00:31:57.284412 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5qffz" event={"ID":"e884e46e-e520-4e0a-9f15-43d4b74af63e","Type":"ContainerStarted","Data":"72644d8a366c2e21c1771d55bf9802c22c5d6540af2b3968387026078b9fd4ee"} Mar 08 00:31:57.286675 master-0 kubenswrapper[23041]: I0308 00:31:57.286193 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/installer-2-master-0" podUID="4a829558-a672-4dc5-ae20-69884213482f" containerName="installer" containerID="cri-o://75e221d268f8334bee9d063ac79605ca72f10402851cefdf7624001eae8cbb17" gracePeriod=30 Mar 08 00:31:57.287539 master-0 kubenswrapper[23041]: I0308 00:31:57.287432 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" Mar 08 00:31:57.287539 master-0 kubenswrapper[23041]: I0308 00:31:57.287474 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-79f8cd6fdd-r6nkv" Mar 08 00:31:57.321145 master-0 kubenswrapper[23041]: I0308 00:31:57.305900 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"] Mar 08 
00:31:57.321145 master-0 kubenswrapper[23041]: I0308 00:31:57.312003 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"] Mar 08 00:31:57.428243 master-0 kubenswrapper[23041]: I0308 00:31:57.428155 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66915251-1fdd-40f3-a59b-054776b214df-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 00:31:57.552484 master-0 kubenswrapper[23041]: I0308 00:31:57.552422 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5cd89459d5-wwnjs"] Mar 08 00:31:57.556348 master-0 kubenswrapper[23041]: I0308 00:31:57.556002 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5cd89459d5-wwnjs" Mar 08 00:31:57.579872 master-0 kubenswrapper[23041]: I0308 00:31:57.579595 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5cd89459d5-wwnjs"] Mar 08 00:31:57.582818 master-0 kubenswrapper[23041]: I0308 00:31:57.582774 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Mar 08 00:31:57.604945 master-0 kubenswrapper[23041]: I0308 00:31:57.604716 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-94fb4" Mar 08 00:31:57.623137 master-0 kubenswrapper[23041]: I0308 00:31:57.622564 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Mar 08 00:31:57.624314 master-0 kubenswrapper[23041]: I0308 00:31:57.624267 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Mar 08 00:31:57.631438 master-0 kubenswrapper[23041]: I0308 00:31:57.631380 23041 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/1aec9660-eaf0-48c1-8d83-1a89982f9804-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5cd89459d5-wwnjs\" (UID: \"1aec9660-eaf0-48c1-8d83-1a89982f9804\") " pod="openshift-monitoring/thanos-querier-5cd89459d5-wwnjs"
Mar 08 00:31:57.631438 master-0 kubenswrapper[23041]: I0308 00:31:57.631429 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1aec9660-eaf0-48c1-8d83-1a89982f9804-metrics-client-ca\") pod \"thanos-querier-5cd89459d5-wwnjs\" (UID: \"1aec9660-eaf0-48c1-8d83-1a89982f9804\") " pod="openshift-monitoring/thanos-querier-5cd89459d5-wwnjs"
Mar 08 00:31:57.631680 master-0 kubenswrapper[23041]: I0308 00:31:57.631453 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/1aec9660-eaf0-48c1-8d83-1a89982f9804-secret-thanos-querier-tls\") pod \"thanos-querier-5cd89459d5-wwnjs\" (UID: \"1aec9660-eaf0-48c1-8d83-1a89982f9804\") " pod="openshift-monitoring/thanos-querier-5cd89459d5-wwnjs"
Mar 08 00:31:57.631680 master-0 kubenswrapper[23041]: I0308 00:31:57.631479 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1aec9660-eaf0-48c1-8d83-1a89982f9804-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5cd89459d5-wwnjs\" (UID: \"1aec9660-eaf0-48c1-8d83-1a89982f9804\") " pod="openshift-monitoring/thanos-querier-5cd89459d5-wwnjs"
Mar 08 00:31:57.631680 master-0 kubenswrapper[23041]: I0308 00:31:57.631496 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1aec9660-eaf0-48c1-8d83-1a89982f9804-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5cd89459d5-wwnjs\" (UID: \"1aec9660-eaf0-48c1-8d83-1a89982f9804\") " pod="openshift-monitoring/thanos-querier-5cd89459d5-wwnjs"
Mar 08 00:31:57.631680 master-0 kubenswrapper[23041]: I0308 00:31:57.631521 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5xmh\" (UniqueName: \"kubernetes.io/projected/1aec9660-eaf0-48c1-8d83-1a89982f9804-kube-api-access-b5xmh\") pod \"thanos-querier-5cd89459d5-wwnjs\" (UID: \"1aec9660-eaf0-48c1-8d83-1a89982f9804\") " pod="openshift-monitoring/thanos-querier-5cd89459d5-wwnjs"
Mar 08 00:31:57.631680 master-0 kubenswrapper[23041]: I0308 00:31:57.631627 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1aec9660-eaf0-48c1-8d83-1a89982f9804-secret-grpc-tls\") pod \"thanos-querier-5cd89459d5-wwnjs\" (UID: \"1aec9660-eaf0-48c1-8d83-1a89982f9804\") " pod="openshift-monitoring/thanos-querier-5cd89459d5-wwnjs"
Mar 08 00:31:57.631825 master-0 kubenswrapper[23041]: I0308 00:31:57.631781 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/1aec9660-eaf0-48c1-8d83-1a89982f9804-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5cd89459d5-wwnjs\" (UID: \"1aec9660-eaf0-48c1-8d83-1a89982f9804\") " pod="openshift-monitoring/thanos-querier-5cd89459d5-wwnjs"
Mar 08 00:31:57.640838 master-0 kubenswrapper[23041]: I0308 00:31:57.638884 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics"
Mar 08 00:31:57.656166 master-0 kubenswrapper[23041]: I0308 00:31:57.656113 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-2m7s0hn4nptd"
Mar 08 00:31:57.676017 master-0 kubenswrapper[23041]: I0308 00:31:57.675971 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web"
Mar 08 00:31:57.698845 master-0 kubenswrapper[23041]: I0308 00:31:57.698278 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules"
Mar 08 00:31:57.733118 master-0 kubenswrapper[23041]: I0308 00:31:57.733051 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1aec9660-eaf0-48c1-8d83-1a89982f9804-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5cd89459d5-wwnjs\" (UID: \"1aec9660-eaf0-48c1-8d83-1a89982f9804\") " pod="openshift-monitoring/thanos-querier-5cd89459d5-wwnjs"
Mar 08 00:31:57.733118 master-0 kubenswrapper[23041]: I0308 00:31:57.733116 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5xmh\" (UniqueName: \"kubernetes.io/projected/1aec9660-eaf0-48c1-8d83-1a89982f9804-kube-api-access-b5xmh\") pod \"thanos-querier-5cd89459d5-wwnjs\" (UID: \"1aec9660-eaf0-48c1-8d83-1a89982f9804\") " pod="openshift-monitoring/thanos-querier-5cd89459d5-wwnjs"
Mar 08 00:31:57.733396 master-0 kubenswrapper[23041]: I0308 00:31:57.733364 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1aec9660-eaf0-48c1-8d83-1a89982f9804-secret-grpc-tls\") pod \"thanos-querier-5cd89459d5-wwnjs\" (UID: \"1aec9660-eaf0-48c1-8d83-1a89982f9804\") " pod="openshift-monitoring/thanos-querier-5cd89459d5-wwnjs"
Mar 08 00:31:57.733526 master-0 kubenswrapper[23041]: I0308 00:31:57.733489 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/1aec9660-eaf0-48c1-8d83-1a89982f9804-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5cd89459d5-wwnjs\" (UID: \"1aec9660-eaf0-48c1-8d83-1a89982f9804\") " pod="openshift-monitoring/thanos-querier-5cd89459d5-wwnjs"
Mar 08 00:31:57.735010 master-0 kubenswrapper[23041]: I0308 00:31:57.733666 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/1aec9660-eaf0-48c1-8d83-1a89982f9804-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5cd89459d5-wwnjs\" (UID: \"1aec9660-eaf0-48c1-8d83-1a89982f9804\") " pod="openshift-monitoring/thanos-querier-5cd89459d5-wwnjs"
Mar 08 00:31:57.735010 master-0 kubenswrapper[23041]: I0308 00:31:57.733936 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1aec9660-eaf0-48c1-8d83-1a89982f9804-metrics-client-ca\") pod \"thanos-querier-5cd89459d5-wwnjs\" (UID: \"1aec9660-eaf0-48c1-8d83-1a89982f9804\") " pod="openshift-monitoring/thanos-querier-5cd89459d5-wwnjs"
Mar 08 00:31:57.735010 master-0 kubenswrapper[23041]: I0308 00:31:57.734016 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/1aec9660-eaf0-48c1-8d83-1a89982f9804-secret-thanos-querier-tls\") pod \"thanos-querier-5cd89459d5-wwnjs\" (UID: \"1aec9660-eaf0-48c1-8d83-1a89982f9804\") " pod="openshift-monitoring/thanos-querier-5cd89459d5-wwnjs"
Mar 08 00:31:57.735010 master-0 kubenswrapper[23041]: I0308 00:31:57.734080 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1aec9660-eaf0-48c1-8d83-1a89982f9804-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5cd89459d5-wwnjs\" (UID: \"1aec9660-eaf0-48c1-8d83-1a89982f9804\") " pod="openshift-monitoring/thanos-querier-5cd89459d5-wwnjs"
Mar 08 00:31:57.735010 master-0 kubenswrapper[23041]: I0308 00:31:57.734637 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1aec9660-eaf0-48c1-8d83-1a89982f9804-metrics-client-ca\") pod \"thanos-querier-5cd89459d5-wwnjs\" (UID: \"1aec9660-eaf0-48c1-8d83-1a89982f9804\") " pod="openshift-monitoring/thanos-querier-5cd89459d5-wwnjs"
Mar 08 00:31:57.736049 master-0 kubenswrapper[23041]: I0308 00:31:57.736017 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1aec9660-eaf0-48c1-8d83-1a89982f9804-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5cd89459d5-wwnjs\" (UID: \"1aec9660-eaf0-48c1-8d83-1a89982f9804\") " pod="openshift-monitoring/thanos-querier-5cd89459d5-wwnjs"
Mar 08 00:31:57.737641 master-0 kubenswrapper[23041]: I0308 00:31:57.737599 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/1aec9660-eaf0-48c1-8d83-1a89982f9804-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5cd89459d5-wwnjs\" (UID: \"1aec9660-eaf0-48c1-8d83-1a89982f9804\") " pod="openshift-monitoring/thanos-querier-5cd89459d5-wwnjs"
Mar 08 00:31:57.739935 master-0 kubenswrapper[23041]: I0308 00:31:57.739889 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1aec9660-eaf0-48c1-8d83-1a89982f9804-secret-grpc-tls\") pod \"thanos-querier-5cd89459d5-wwnjs\" (UID: \"1aec9660-eaf0-48c1-8d83-1a89982f9804\") " pod="openshift-monitoring/thanos-querier-5cd89459d5-wwnjs"
Mar 08 00:31:57.740721 master-0 kubenswrapper[23041]: I0308 00:31:57.740679 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/1aec9660-eaf0-48c1-8d83-1a89982f9804-secret-thanos-querier-tls\") pod \"thanos-querier-5cd89459d5-wwnjs\" (UID: \"1aec9660-eaf0-48c1-8d83-1a89982f9804\") " pod="openshift-monitoring/thanos-querier-5cd89459d5-wwnjs"
Mar 08 00:31:57.741770 master-0 kubenswrapper[23041]: I0308 00:31:57.741740 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/1aec9660-eaf0-48c1-8d83-1a89982f9804-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5cd89459d5-wwnjs\" (UID: \"1aec9660-eaf0-48c1-8d83-1a89982f9804\") " pod="openshift-monitoring/thanos-querier-5cd89459d5-wwnjs"
Mar 08 00:31:57.749702 master-0 kubenswrapper[23041]: I0308 00:31:57.748730 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1aec9660-eaf0-48c1-8d83-1a89982f9804-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5cd89459d5-wwnjs\" (UID: \"1aec9660-eaf0-48c1-8d83-1a89982f9804\") " pod="openshift-monitoring/thanos-querier-5cd89459d5-wwnjs"
Mar 08 00:31:57.779896 master-0 kubenswrapper[23041]: I0308 00:31:57.779824 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5xmh\" (UniqueName: \"kubernetes.io/projected/1aec9660-eaf0-48c1-8d83-1a89982f9804-kube-api-access-b5xmh\") pod \"thanos-querier-5cd89459d5-wwnjs\" (UID: \"1aec9660-eaf0-48c1-8d83-1a89982f9804\") " pod="openshift-monitoring/thanos-querier-5cd89459d5-wwnjs"
Mar 08 00:31:57.887296 master-0 kubenswrapper[23041]: I0308 00:31:57.887119 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5cd89459d5-wwnjs"
Mar 08 00:31:58.293194 master-0 kubenswrapper[23041]: I0308 00:31:58.293152 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"48fcdccb-478e-4027-b4b9-9a061439f0e2","Type":"ContainerStarted","Data":"9048ec3f692e19da41bf1e5b754fc6525e0e4cd99b83b30eb7a20ba35e882ab7"}
Mar 08 00:31:58.298044 master-0 kubenswrapper[23041]: I0308 00:31:58.297993 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"51c724a5-de89-4fde-b596-70157d2d19b6","Type":"ContainerStarted","Data":"c8e2a3844909fda8d886c3f9a2f45898e07fcb80d21afe699dd2bc976f511106"}
Mar 08 00:31:58.302385 master-0 kubenswrapper[23041]: I0308 00:31:58.300187 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5qffz" event={"ID":"e884e46e-e520-4e0a-9f15-43d4b74af63e","Type":"ContainerStarted","Data":"287aac1180259f9bd2c3d5e6cd63d192c79d3bff4cf5a26dd89a151c4e852ab0"}
Mar 08 00:31:58.306319 master-0 kubenswrapper[23041]: I0308 00:31:58.306147 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_4a829558-a672-4dc5-ae20-69884213482f/installer/0.log"
Mar 08 00:31:58.306319 master-0 kubenswrapper[23041]: I0308 00:31:58.306219 23041 generic.go:334] "Generic (PLEG): container finished" podID="4a829558-a672-4dc5-ae20-69884213482f" containerID="75e221d268f8334bee9d063ac79605ca72f10402851cefdf7624001eae8cbb17" exitCode=1
Mar 08 00:31:58.306561 master-0 kubenswrapper[23041]: I0308 00:31:58.306363 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"4a829558-a672-4dc5-ae20-69884213482f","Type":"ContainerDied","Data":"75e221d268f8334bee9d063ac79605ca72f10402851cefdf7624001eae8cbb17"}
Mar 08 00:31:58.328921 master-0 kubenswrapper[23041]: I0308 00:31:58.328828 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7b45f5889c-z48tj" event={"ID":"c26f36ee-5dd4-40b7-8cb9-7f4835f120fd","Type":"ContainerStarted","Data":"2af089c60faa314f2de9990a5eb09ea5d14fe4926436c88807d71224e105b163"}
Mar 08 00:31:58.646359 master-0 kubenswrapper[23041]: I0308 00:31:58.645605 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5cd89459d5-wwnjs"]
Mar 08 00:31:58.666418 master-0 kubenswrapper[23041]: W0308 00:31:58.664540 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1aec9660_eaf0_48c1_8d83_1a89982f9804.slice/crio-e996210f21346953b3c04588461f0b36b127888fe46d29566867f66f8d523ba4 WatchSource:0}: Error finding container e996210f21346953b3c04588461f0b36b127888fe46d29566867f66f8d523ba4: Status 404 returned error can't find the container with id e996210f21346953b3c04588461f0b36b127888fe46d29566867f66f8d523ba4
Mar 08 00:31:58.707290 master-0 kubenswrapper[23041]: I0308 00:31:58.707239 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_4a829558-a672-4dc5-ae20-69884213482f/installer/0.log"
Mar 08 00:31:58.707511 master-0 kubenswrapper[23041]: I0308 00:31:58.707331 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 08 00:31:58.764537 master-0 kubenswrapper[23041]: I0308 00:31:58.764282 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4a829558-a672-4dc5-ae20-69884213482f-var-lock\") pod \"4a829558-a672-4dc5-ae20-69884213482f\" (UID: \"4a829558-a672-4dc5-ae20-69884213482f\") "
Mar 08 00:31:58.764811 master-0 kubenswrapper[23041]: I0308 00:31:58.764583 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4a829558-a672-4dc5-ae20-69884213482f-kube-api-access\") pod \"4a829558-a672-4dc5-ae20-69884213482f\" (UID: \"4a829558-a672-4dc5-ae20-69884213482f\") "
Mar 08 00:31:58.764811 master-0 kubenswrapper[23041]: I0308 00:31:58.764649 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4a829558-a672-4dc5-ae20-69884213482f-kubelet-dir\") pod \"4a829558-a672-4dc5-ae20-69884213482f\" (UID: \"4a829558-a672-4dc5-ae20-69884213482f\") "
Mar 08 00:31:58.765604 master-0 kubenswrapper[23041]: I0308 00:31:58.764969 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4a829558-a672-4dc5-ae20-69884213482f-var-lock" (OuterVolumeSpecName: "var-lock") pod "4a829558-a672-4dc5-ae20-69884213482f" (UID: "4a829558-a672-4dc5-ae20-69884213482f"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:31:58.765604 master-0 kubenswrapper[23041]: I0308 00:31:58.765028 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4a829558-a672-4dc5-ae20-69884213482f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4a829558-a672-4dc5-ae20-69884213482f" (UID: "4a829558-a672-4dc5-ae20-69884213482f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:31:58.770456 master-0 kubenswrapper[23041]: I0308 00:31:58.770323 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a829558-a672-4dc5-ae20-69884213482f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4a829558-a672-4dc5-ae20-69884213482f" (UID: "4a829558-a672-4dc5-ae20-69884213482f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:31:58.774769 master-0 kubenswrapper[23041]: I0308 00:31:58.773714 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=31.773697919 podStartE2EDuration="31.773697919s" podCreationTimestamp="2026-03-08 00:31:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:31:58.771529916 +0000 UTC m=+24.244366490" watchObservedRunningTime="2026-03-08 00:31:58.773697919 +0000 UTC m=+24.246534473"
Mar 08 00:31:58.826047 master-0 kubenswrapper[23041]: I0308 00:31:58.825951 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66915251-1fdd-40f3-a59b-054776b214df" path="/var/lib/kubelet/pods/66915251-1fdd-40f3-a59b-054776b214df/volumes"
Mar 08 00:31:58.867253 master-0 kubenswrapper[23041]: I0308 00:31:58.867180 23041 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4a829558-a672-4dc5-ae20-69884213482f-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 08 00:31:58.867253 master-0 kubenswrapper[23041]: I0308 00:31:58.867247 23041 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4a829558-a672-4dc5-ae20-69884213482f-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 08 00:31:58.867253 master-0 kubenswrapper[23041]: I0308 00:31:58.867266 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4a829558-a672-4dc5-ae20-69884213482f-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 08 00:31:58.887735 master-0 kubenswrapper[23041]: I0308 00:31:58.887668 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podStartSLOduration=7.887642941 podStartE2EDuration="7.887642941s" podCreationTimestamp="2026-03-08 00:31:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:31:58.886963624 +0000 UTC m=+24.359800188" watchObservedRunningTime="2026-03-08 00:31:58.887642941 +0000 UTC m=+24.360479495"
Mar 08 00:31:59.339980 master-0 kubenswrapper[23041]: I0308 00:31:59.339913 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_4a829558-a672-4dc5-ae20-69884213482f/installer/0.log"
Mar 08 00:31:59.340329 master-0 kubenswrapper[23041]: I0308 00:31:59.340112 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 08 00:31:59.340707 master-0 kubenswrapper[23041]: I0308 00:31:59.340391 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"4a829558-a672-4dc5-ae20-69884213482f","Type":"ContainerDied","Data":"388b509d4fc31b4d0508a9d9464942cef558c545f646f2395c6df6984fdeb45b"}
Mar 08 00:31:59.340707 master-0 kubenswrapper[23041]: I0308 00:31:59.340472 23041 scope.go:117] "RemoveContainer" containerID="75e221d268f8334bee9d063ac79605ca72f10402851cefdf7624001eae8cbb17"
Mar 08 00:31:59.362251 master-0 kubenswrapper[23041]: I0308 00:31:59.342692 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-6db79546f6-gdz4k" event={"ID":"2040e5dc-b314-46a9-a61b-e80f1a046ce3","Type":"ContainerStarted","Data":"4058c62444d759380ac633ae1b7a9d11b49919ab9e7ff9b2ba99895baec71de0"}
Mar 08 00:31:59.362251 master-0 kubenswrapper[23041]: I0308 00:31:59.355805 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-6db79546f6-gdz4k"
Mar 08 00:31:59.362251 master-0 kubenswrapper[23041]: I0308 00:31:59.357546 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-6db79546f6-gdz4k"
Mar 08 00:31:59.362251 master-0 kubenswrapper[23041]: I0308 00:31:59.357567 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"48fcdccb-478e-4027-b4b9-9a061439f0e2","Type":"ContainerStarted","Data":"0cbd1fb0e210283b6ccd115d8b0aae824719f69eb7b32ccc55a017669c4605fa"}
Mar 08 00:31:59.362251 master-0 kubenswrapper[23041]: I0308 00:31:59.358755 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5cd89459d5-wwnjs" event={"ID":"1aec9660-eaf0-48c1-8d83-1a89982f9804","Type":"ContainerStarted","Data":"e996210f21346953b3c04588461f0b36b127888fe46d29566867f66f8d523ba4"}
Mar 08 00:31:59.379972 master-0 kubenswrapper[23041]: I0308 00:31:59.379569 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-7b45f5889c-z48tj" podStartSLOduration=14.379537549 podStartE2EDuration="14.379537549s" podCreationTimestamp="2026-03-08 00:31:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:31:59.371636168 +0000 UTC m=+24.844472742" watchObservedRunningTime="2026-03-08 00:31:59.379537549 +0000 UTC m=+24.852374103"
Mar 08 00:31:59.414964 master-0 kubenswrapper[23041]: I0308 00:31:59.414479 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-5qffz" podStartSLOduration=17.414453999 podStartE2EDuration="17.414453999s" podCreationTimestamp="2026-03-08 00:31:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:31:59.411002046 +0000 UTC m=+24.883838600" watchObservedRunningTime="2026-03-08 00:31:59.414453999 +0000 UTC m=+24.887290553"
Mar 08 00:31:59.585789 master-0 kubenswrapper[23041]: I0308 00:31:59.585689 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-6db79546f6-gdz4k" podStartSLOduration=7.932168382 podStartE2EDuration="10.585664439s" podCreationTimestamp="2026-03-08 00:31:49 +0000 UTC" firstStartedPulling="2026-03-08 00:31:55.640378964 +0000 UTC m=+21.113215518" lastFinishedPulling="2026-03-08 00:31:58.293875031 +0000 UTC m=+23.766711575" observedRunningTime="2026-03-08 00:31:59.585259299 +0000 UTC m=+25.058095853" watchObservedRunningTime="2026-03-08 00:31:59.585664439 +0000 UTC m=+25.058500993"
Mar 08 00:31:59.633293 master-0 kubenswrapper[23041]: I0308 00:31:59.632435 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-3-master-0" podStartSLOduration=3.6324088740000002 podStartE2EDuration="3.632408874s" podCreationTimestamp="2026-03-08 00:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:31:59.627592298 +0000 UTC m=+25.100428852" watchObservedRunningTime="2026-03-08 00:31:59.632408874 +0000 UTC m=+25.105245428"
Mar 08 00:31:59.651712 master-0 kubenswrapper[23041]: I0308 00:31:59.651656 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"]
Mar 08 00:31:59.668739 master-0 kubenswrapper[23041]: I0308 00:31:59.668673 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"]
Mar 08 00:31:59.748783 master-0 kubenswrapper[23041]: I0308 00:31:59.748727 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-74444d8fbc-g7z4w"
Mar 08 00:31:59.851309 master-0 kubenswrapper[23041]: I0308 00:31:59.849724 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"]
Mar 08 00:31:59.851309 master-0 kubenswrapper[23041]: E0308 00:31:59.849991 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a829558-a672-4dc5-ae20-69884213482f" containerName="installer"
Mar 08 00:31:59.851309 master-0 kubenswrapper[23041]: I0308 00:31:59.850004 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a829558-a672-4dc5-ae20-69884213482f" containerName="installer"
Mar 08 00:31:59.851309 master-0 kubenswrapper[23041]: I0308 00:31:59.850165 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a829558-a672-4dc5-ae20-69884213482f" containerName="installer"
Mar 08 00:31:59.851309 master-0 kubenswrapper[23041]: I0308 00:31:59.850633 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0"
Mar 08 00:31:59.853590 master-0 kubenswrapper[23041]: I0308 00:31:59.853543 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-4dxb7"
Mar 08 00:31:59.853825 master-0 kubenswrapper[23041]: I0308 00:31:59.853796 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 08 00:31:59.861956 master-0 kubenswrapper[23041]: I0308 00:31:59.861900 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"]
Mar 08 00:31:59.897002 master-0 kubenswrapper[23041]: I0308 00:31:59.896871 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0133db83-1083-4458-86d4-49e431dd4365-var-lock\") pod \"installer-2-master-0\" (UID: \"0133db83-1083-4458-86d4-49e431dd4365\") " pod="openshift-kube-apiserver/installer-2-master-0"
Mar 08 00:31:59.897002 master-0 kubenswrapper[23041]: I0308 00:31:59.896949 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0133db83-1083-4458-86d4-49e431dd4365-kube-api-access\") pod \"installer-2-master-0\" (UID: \"0133db83-1083-4458-86d4-49e431dd4365\") " pod="openshift-kube-apiserver/installer-2-master-0"
Mar 08 00:31:59.897002 master-0 kubenswrapper[23041]: I0308 00:31:59.896994 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0133db83-1083-4458-86d4-49e431dd4365-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"0133db83-1083-4458-86d4-49e431dd4365\") " pod="openshift-kube-apiserver/installer-2-master-0"
Mar 08 00:31:59.999019 master-0 kubenswrapper[23041]: I0308 00:31:59.998356 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0133db83-1083-4458-86d4-49e431dd4365-var-lock\") pod \"installer-2-master-0\" (UID: \"0133db83-1083-4458-86d4-49e431dd4365\") " pod="openshift-kube-apiserver/installer-2-master-0"
Mar 08 00:31:59.999019 master-0 kubenswrapper[23041]: I0308 00:31:59.998423 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0133db83-1083-4458-86d4-49e431dd4365-kube-api-access\") pod \"installer-2-master-0\" (UID: \"0133db83-1083-4458-86d4-49e431dd4365\") " pod="openshift-kube-apiserver/installer-2-master-0"
Mar 08 00:31:59.999019 master-0 kubenswrapper[23041]: I0308 00:31:59.998465 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0133db83-1083-4458-86d4-49e431dd4365-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"0133db83-1083-4458-86d4-49e431dd4365\") " pod="openshift-kube-apiserver/installer-2-master-0"
Mar 08 00:31:59.999019 master-0 kubenswrapper[23041]: I0308 00:31:59.998615 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0133db83-1083-4458-86d4-49e431dd4365-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"0133db83-1083-4458-86d4-49e431dd4365\") " pod="openshift-kube-apiserver/installer-2-master-0"
Mar 08 00:31:59.999019 master-0 kubenswrapper[23041]: I0308 00:31:59.998656 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0133db83-1083-4458-86d4-49e431dd4365-var-lock\") pod \"installer-2-master-0\" (UID: \"0133db83-1083-4458-86d4-49e431dd4365\") " pod="openshift-kube-apiserver/installer-2-master-0"
Mar 08 00:32:00.021246 master-0 kubenswrapper[23041]: I0308 00:32:00.021181 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0133db83-1083-4458-86d4-49e431dd4365-kube-api-access\") pod \"installer-2-master-0\" (UID: \"0133db83-1083-4458-86d4-49e431dd4365\") " pod="openshift-kube-apiserver/installer-2-master-0"
Mar 08 00:32:00.170501 master-0 kubenswrapper[23041]: I0308 00:32:00.170364 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0"
Mar 08 00:32:00.381162 master-0 kubenswrapper[23041]: I0308 00:32:00.381103 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-85cb8cb9bb-bmx44"
Mar 08 00:32:00.823221 master-0 kubenswrapper[23041]: I0308 00:32:00.822548 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a829558-a672-4dc5-ae20-69884213482f" path="/var/lib/kubelet/pods/4a829558-a672-4dc5-ae20-69884213482f/volumes"
Mar 08 00:32:01.023228 master-0 kubenswrapper[23041]: I0308 00:32:01.023146 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-client-ca-bundle\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m"
Mar 08 00:32:01.023452 master-0 kubenswrapper[23041]: E0308 00:32:01.023403 23041 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-ffspe3f0nbfal: secret "metrics-server-ffspe3f0nbfal" not found
Mar 08 00:32:01.023522 master-0 kubenswrapper[23041]: E0308 00:32:01.023504 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-client-ca-bundle podName:0101c4ce-fd58-4ddb-94f7-abb8b2293cdb nodeName:}" failed. No retries permitted until 2026-03-08 00:32:09.02348131 +0000 UTC m=+34.496317864 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "client-ca-bundle" (UniqueName: "kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-client-ca-bundle") pod "metrics-server-6474759988-dnw4m" (UID: "0101c4ce-fd58-4ddb-94f7-abb8b2293cdb") : secret "metrics-server-ffspe3f0nbfal" not found
Mar 08 00:32:01.469416 master-0 kubenswrapper[23041]: I0308 00:32:01.469153 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0"
Mar 08 00:32:01.545174 master-0 kubenswrapper[23041]: I0308 00:32:01.545103 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 00:32:01.550797 master-0 kubenswrapper[23041]: I0308 00:32:01.550750 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 00:32:01.558400 master-0 kubenswrapper[23041]: I0308 00:32:01.558334 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"]
Mar 08 00:32:02.347826 master-0 kubenswrapper[23041]: I0308 00:32:02.347772 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-84f57b9877-8g27w"]
Mar 08 00:32:02.351812 master-0 kubenswrapper[23041]: I0308 00:32:02.349186 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-84f57b9877-8g27w"
Mar 08 00:32:02.352010 master-0 kubenswrapper[23041]: I0308 00:32:02.351982 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-49cm5"
Mar 08 00:32:02.353058 master-0 kubenswrapper[23041]: I0308 00:32:02.352216 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 08 00:32:02.353058 master-0 kubenswrapper[23041]: I0308 00:32:02.352566 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 08 00:32:02.375790 master-0 kubenswrapper[23041]: I0308 00:32:02.375739 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-84f57b9877-8g27w"]
Mar 08 00:32:02.399681 master-0 kubenswrapper[23041]: I0308 00:32:02.399618 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"0133db83-1083-4458-86d4-49e431dd4365","Type":"ContainerStarted","Data":"6ba57f2db9726259ae712e088285735c826dee0e1e18c3454e04e70e18447493"}
Mar 08 00:32:02.399681 master-0 kubenswrapper[23041]: I0308 00:32:02.399680 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"0133db83-1083-4458-86d4-49e431dd4365","Type":"ContainerStarted","Data":"d99bde344c0c744884322f50bc262e862d206379d4253c8485cb44806af03448"}
Mar 08 00:32:02.401721 master-0 kubenswrapper[23041]: I0308 00:32:02.401690 23041 generic.go:334] "Generic (PLEG): container finished" podID="51c724a5-de89-4fde-b596-70157d2d19b6" containerID="85445a369f529f5269a8863acca07ead4a125225ffabe9eb1f61166cb663a719" exitCode=0
Mar 08 00:32:02.401823 master-0 kubenswrapper[23041]: I0308 00:32:02.401796 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"51c724a5-de89-4fde-b596-70157d2d19b6","Type":"ContainerDied","Data":"85445a369f529f5269a8863acca07ead4a125225ffabe9eb1f61166cb663a719"}
Mar 08 00:32:02.403746 master-0 kubenswrapper[23041]: I0308 00:32:02.403580 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-6c7fb6b958-db7d8" event={"ID":"b22c3046-5193-4c1d-91c0-7c15745265be","Type":"ContainerStarted","Data":"64250f1195c6c507058b16ede5d89fd886cd01b60d5254b9067e91c0ff2f6e7e"}
Mar 08 00:32:02.404022 master-0 kubenswrapper[23041]: I0308 00:32:02.404001 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-6c7fb6b958-db7d8"
Mar 08 00:32:02.410131 master-0 kubenswrapper[23041]: I0308 00:32:02.410094 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk" event={"ID":"795e6115-95cc-4c0a-a407-e0a6f14118e5","Type":"ContainerStarted","Data":"83e18342bd43af53e6ece31e0938e84efd6d601562a26925dcfbe2b0fcc9941d"}
Mar 08 00:32:02.410178 master-0 kubenswrapper[23041]: I0308 00:32:02.410132 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk" event={"ID":"795e6115-95cc-4c0a-a407-e0a6f14118e5","Type":"ContainerStarted","Data":"bcdd411da81b3d40c83f8dc5fd86cf2b0275b921a39ad084e2bfeca31951dd69"}
Mar 08 00:32:02.410178 master-0 kubenswrapper[23041]: I0308 00:32:02.410145 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk" event={"ID":"795e6115-95cc-4c0a-a407-e0a6f14118e5","Type":"ContainerStarted","Data":"26c5cc81d865c2a720045488d8c27fd2ff866becd870fabc5f875ce3c289276e"}
Mar 08 00:32:02.410730 master-0 kubenswrapper[23041]: I0308 00:32:02.410710 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-6c7fb6b958-db7d8"
Mar 08 00:32:02.422835 master-0 kubenswrapper[23041]: I0308 00:32:02.422783 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-2-master-0" podStartSLOduration=3.422768055 podStartE2EDuration="3.422768055s" podCreationTimestamp="2026-03-08 00:31:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:32:02.421611587 +0000 UTC m=+27.894448161" watchObservedRunningTime="2026-03-08 00:32:02.422768055 +0000 UTC m=+27.895604609"
Mar 08 00:32:02.447193 master-0 kubenswrapper[23041]: I0308 00:32:02.447146 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6qdx\" (UniqueName: \"kubernetes.io/projected/414dbe5d-16a5-4765-9dc5-d50c0784ace7-kube-api-access-n6qdx\") pod \"downloads-84f57b9877-8g27w\" (UID: \"414dbe5d-16a5-4765-9dc5-d50c0784ace7\") " pod="openshift-console/downloads-84f57b9877-8g27w"
Mar 08 00:32:02.477665 master-0 kubenswrapper[23041]: I0308 00:32:02.476395 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-6cfc594d97-x62fk" podStartSLOduration=9.790886068 podStartE2EDuration="14.476375205s" podCreationTimestamp="2026-03-08 00:31:48 +0000 UTC" firstStartedPulling="2026-03-08 00:31:56.613494423 +0000 UTC m=+22.086330977" lastFinishedPulling="2026-03-08 00:32:01.29898356 +0000 UTC m=+26.771820114" observedRunningTime="2026-03-08 00:32:02.46827871 +0000 UTC m=+27.941115294" watchObservedRunningTime="2026-03-08 00:32:02.476375205 +0000 UTC m=+27.949211759"
Mar 08 00:32:02.493769 master-0 kubenswrapper[23041]: I0308 00:32:02.492735 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-6c7fb6b958-db7d8" podStartSLOduration=13.094794361 podStartE2EDuration="17.492711928s" podCreationTimestamp="2026-03-08 00:31:45 +0000 UTC" firstStartedPulling="2026-03-08 00:31:56.921741451 +0000 UTC m=+22.394578005" lastFinishedPulling="2026-03-08 00:32:01.319659018 +0000 UTC m=+26.792495572" observedRunningTime="2026-03-08 00:32:02.492588395 +0000 UTC m=+27.965424949" watchObservedRunningTime="2026-03-08 00:32:02.492711928 +0000 UTC m=+27.965548492"
Mar 08 00:32:02.549975 master-0 kubenswrapper[23041]: I0308 00:32:02.549915 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6qdx\" (UniqueName: \"kubernetes.io/projected/414dbe5d-16a5-4765-9dc5-d50c0784ace7-kube-api-access-n6qdx\") pod \"downloads-84f57b9877-8g27w\" (UID: \"414dbe5d-16a5-4765-9dc5-d50c0784ace7\") " pod="openshift-console/downloads-84f57b9877-8g27w"
Mar 08 00:32:02.573519 master-0 kubenswrapper[23041]: I0308 00:32:02.573399 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6qdx\" (UniqueName: \"kubernetes.io/projected/414dbe5d-16a5-4765-9dc5-d50c0784ace7-kube-api-access-n6qdx\") pod \"downloads-84f57b9877-8g27w\" (UID: \"414dbe5d-16a5-4765-9dc5-d50c0784ace7\") " pod="openshift-console/downloads-84f57b9877-8g27w"
Mar 08 00:32:02.670562 master-0 kubenswrapper[23041]: I0308 00:32:02.670455 23041 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-console/downloads-84f57b9877-8g27w" Mar 08 00:32:03.189648 master-0 kubenswrapper[23041]: I0308 00:32:03.188556 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:32:03.193852 master-0 kubenswrapper[23041]: I0308 00:32:03.189882 23041 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 00:32:03.226546 master-0 kubenswrapper[23041]: I0308 00:32:03.226382 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2w9mf" Mar 08 00:32:03.438819 master-0 kubenswrapper[23041]: I0308 00:32:03.435217 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5cd89459d5-wwnjs" event={"ID":"1aec9660-eaf0-48c1-8d83-1a89982f9804","Type":"ContainerStarted","Data":"313f825452e93e653737da497e84c921f1709e4a2676cad1d8e9ac19c6324e4f"} Mar 08 00:32:03.484999 master-0 kubenswrapper[23041]: I0308 00:32:03.484953 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-84f57b9877-8g27w"] Mar 08 00:32:03.491544 master-0 kubenswrapper[23041]: W0308 00:32:03.491512 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod414dbe5d_16a5_4765_9dc5_d50c0784ace7.slice/crio-bafb2dd4f4bcb370ea80def0d7c1266dec902efe9666e795f539cea29ad9e59b WatchSource:0}: Error finding container bafb2dd4f4bcb370ea80def0d7c1266dec902efe9666e795f539cea29ad9e59b: Status 404 returned error can't find the container with id bafb2dd4f4bcb370ea80def0d7c1266dec902efe9666e795f539cea29ad9e59b Mar 08 00:32:04.447986 master-0 kubenswrapper[23041]: I0308 00:32:04.447911 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-84f57b9877-8g27w" 
event={"ID":"414dbe5d-16a5-4765-9dc5-d50c0784ace7","Type":"ContainerStarted","Data":"bafb2dd4f4bcb370ea80def0d7c1266dec902efe9666e795f539cea29ad9e59b"} Mar 08 00:32:04.452616 master-0 kubenswrapper[23041]: I0308 00:32:04.452574 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5cd89459d5-wwnjs" event={"ID":"1aec9660-eaf0-48c1-8d83-1a89982f9804","Type":"ContainerStarted","Data":"a794953b2d288655aa6f3493820344f0e9d563e0cfa6c35d91bb163b65a576f9"} Mar 08 00:32:04.452684 master-0 kubenswrapper[23041]: I0308 00:32:04.452628 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5cd89459d5-wwnjs" event={"ID":"1aec9660-eaf0-48c1-8d83-1a89982f9804","Type":"ContainerStarted","Data":"bca932cb700c8ff7e54f8ac4dbdd257bddcb21c42ac0c0b6478d915eb27acb7a"} Mar 08 00:32:05.042571 master-0 kubenswrapper[23041]: I0308 00:32:05.042514 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6df5fc69d-thc6n"] Mar 08 00:32:05.049700 master-0 kubenswrapper[23041]: I0308 00:32:05.043417 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:05.058018 master-0 kubenswrapper[23041]: I0308 00:32:05.056319 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 08 00:32:05.058018 master-0 kubenswrapper[23041]: I0308 00:32:05.056573 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 08 00:32:05.058018 master-0 kubenswrapper[23041]: I0308 00:32:05.056796 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 08 00:32:05.058018 master-0 kubenswrapper[23041]: I0308 00:32:05.057260 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 08 00:32:05.058018 master-0 kubenswrapper[23041]: I0308 00:32:05.057500 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 08 00:32:05.058018 master-0 kubenswrapper[23041]: I0308 00:32:05.057660 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 08 00:32:05.058018 master-0 kubenswrapper[23041]: I0308 00:32:05.057866 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 08 00:32:05.058018 master-0 kubenswrapper[23041]: I0308 00:32:05.057969 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 08 00:32:05.071492 master-0 kubenswrapper[23041]: I0308 00:32:05.059304 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-7gfkc" Mar 08 00:32:05.071492 master-0 kubenswrapper[23041]: 
I0308 00:32:05.059670 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6df5fc69d-thc6n"] Mar 08 00:32:05.080619 master-0 kubenswrapper[23041]: I0308 00:32:05.078970 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 08 00:32:05.118342 master-0 kubenswrapper[23041]: I0308 00:32:05.114156 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 08 00:32:05.118342 master-0 kubenswrapper[23041]: I0308 00:32:05.114420 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 08 00:32:05.118342 master-0 kubenswrapper[23041]: I0308 00:32:05.114533 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 08 00:32:05.118342 master-0 kubenswrapper[23041]: I0308 00:32:05.117673 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 08 00:32:05.144758 master-0 kubenswrapper[23041]: I0308 00:32:05.143024 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-user-template-error\") pod \"oauth-openshift-6df5fc69d-thc6n\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") " pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:05.144758 master-0 kubenswrapper[23041]: I0308 00:32:05.143070 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6df5fc69d-thc6n\" (UID: 
\"1f437239-11ff-4333-bdea-8a48b8ac95e8\") " pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:05.144758 master-0 kubenswrapper[23041]: I0308 00:32:05.143094 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-router-certs\") pod \"oauth-openshift-6df5fc69d-thc6n\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") " pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:05.144758 master-0 kubenswrapper[23041]: I0308 00:32:05.143116 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-service-ca\") pod \"oauth-openshift-6df5fc69d-thc6n\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") " pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:05.144758 master-0 kubenswrapper[23041]: I0308 00:32:05.143134 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f437239-11ff-4333-bdea-8a48b8ac95e8-audit-dir\") pod \"oauth-openshift-6df5fc69d-thc6n\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") " pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:05.144758 master-0 kubenswrapper[23041]: I0308 00:32:05.143165 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6df5fc69d-thc6n\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") " pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:05.144758 master-0 
kubenswrapper[23041]: I0308 00:32:05.143194 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6df5fc69d-thc6n\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") " pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:05.144758 master-0 kubenswrapper[23041]: I0308 00:32:05.143230 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-session\") pod \"oauth-openshift-6df5fc69d-thc6n\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") " pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:05.144758 master-0 kubenswrapper[23041]: I0308 00:32:05.143260 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-user-template-login\") pod \"oauth-openshift-6df5fc69d-thc6n\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") " pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:05.144758 master-0 kubenswrapper[23041]: I0308 00:32:05.143284 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1f437239-11ff-4333-bdea-8a48b8ac95e8-audit-policies\") pod \"oauth-openshift-6df5fc69d-thc6n\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") " pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:05.144758 master-0 kubenswrapper[23041]: I0308 00:32:05.143301 23041 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6df5fc69d-thc6n\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") " pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:05.144758 master-0 kubenswrapper[23041]: I0308 00:32:05.143331 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6df5fc69d-thc6n\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") " pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:05.144758 master-0 kubenswrapper[23041]: I0308 00:32:05.143350 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24dmg\" (UniqueName: \"kubernetes.io/projected/1f437239-11ff-4333-bdea-8a48b8ac95e8-kube-api-access-24dmg\") pod \"oauth-openshift-6df5fc69d-thc6n\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") " pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:05.207546 master-0 kubenswrapper[23041]: I0308 00:32:05.207311 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9j9zs" Mar 08 00:32:05.244377 master-0 kubenswrapper[23041]: I0308 00:32:05.244312 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6df5fc69d-thc6n\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") " pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 
00:32:05.244585 master-0 kubenswrapper[23041]: I0308 00:32:05.244448 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6df5fc69d-thc6n\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") " pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:05.244585 master-0 kubenswrapper[23041]: I0308 00:32:05.244471 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-session\") pod \"oauth-openshift-6df5fc69d-thc6n\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") " pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:05.244585 master-0 kubenswrapper[23041]: I0308 00:32:05.244503 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-user-template-login\") pod \"oauth-openshift-6df5fc69d-thc6n\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") " pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:05.244585 master-0 kubenswrapper[23041]: I0308 00:32:05.244526 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1f437239-11ff-4333-bdea-8a48b8ac95e8-audit-policies\") pod \"oauth-openshift-6df5fc69d-thc6n\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") " pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:05.244585 master-0 kubenswrapper[23041]: I0308 00:32:05.244546 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6df5fc69d-thc6n\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") " pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:05.244585 master-0 kubenswrapper[23041]: I0308 00:32:05.244580 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6df5fc69d-thc6n\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") " pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:05.244928 master-0 kubenswrapper[23041]: I0308 00:32:05.244602 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24dmg\" (UniqueName: \"kubernetes.io/projected/1f437239-11ff-4333-bdea-8a48b8ac95e8-kube-api-access-24dmg\") pod \"oauth-openshift-6df5fc69d-thc6n\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") " pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:05.245309 master-0 kubenswrapper[23041]: I0308 00:32:05.245275 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-user-template-error\") pod \"oauth-openshift-6df5fc69d-thc6n\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") " pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:05.245356 master-0 kubenswrapper[23041]: I0308 00:32:05.245303 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6df5fc69d-thc6n\" (UID: 
\"1f437239-11ff-4333-bdea-8a48b8ac95e8\") " pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:05.245426 master-0 kubenswrapper[23041]: I0308 00:32:05.245356 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-router-certs\") pod \"oauth-openshift-6df5fc69d-thc6n\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") " pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:05.245426 master-0 kubenswrapper[23041]: I0308 00:32:05.245389 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-service-ca\") pod \"oauth-openshift-6df5fc69d-thc6n\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") " pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:05.246048 master-0 kubenswrapper[23041]: I0308 00:32:05.245424 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f437239-11ff-4333-bdea-8a48b8ac95e8-audit-dir\") pod \"oauth-openshift-6df5fc69d-thc6n\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") " pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:05.246048 master-0 kubenswrapper[23041]: I0308 00:32:05.245515 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f437239-11ff-4333-bdea-8a48b8ac95e8-audit-dir\") pod \"oauth-openshift-6df5fc69d-thc6n\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") " pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:05.247942 master-0 kubenswrapper[23041]: I0308 00:32:05.247885 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1f437239-11ff-4333-bdea-8a48b8ac95e8-audit-policies\") pod \"oauth-openshift-6df5fc69d-thc6n\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") " pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:05.248739 master-0 kubenswrapper[23041]: I0308 00:32:05.248501 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6df5fc69d-thc6n\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") " pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:05.248913 master-0 kubenswrapper[23041]: I0308 00:32:05.248751 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6df5fc69d-thc6n\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") " pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:05.250422 master-0 kubenswrapper[23041]: I0308 00:32:05.249954 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-service-ca\") pod \"oauth-openshift-6df5fc69d-thc6n\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") " pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:05.250422 master-0 kubenswrapper[23041]: I0308 00:32:05.250235 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-user-template-login\") pod \"oauth-openshift-6df5fc69d-thc6n\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") " 
pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:05.250422 master-0 kubenswrapper[23041]: I0308 00:32:05.250337 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6df5fc69d-thc6n\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") " pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:05.250650 master-0 kubenswrapper[23041]: I0308 00:32:05.250544 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-session\") pod \"oauth-openshift-6df5fc69d-thc6n\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") " pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:05.250783 master-0 kubenswrapper[23041]: I0308 00:32:05.250756 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-user-template-error\") pod \"oauth-openshift-6df5fc69d-thc6n\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") " pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:05.253780 master-0 kubenswrapper[23041]: I0308 00:32:05.253739 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6df5fc69d-thc6n\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") " pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:05.255598 master-0 kubenswrapper[23041]: I0308 00:32:05.254228 23041 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-router-certs\") pod \"oauth-openshift-6df5fc69d-thc6n\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") " pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:05.255598 master-0 kubenswrapper[23041]: I0308 00:32:05.255118 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6df5fc69d-thc6n\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") " pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:05.266460 master-0 kubenswrapper[23041]: I0308 00:32:05.266392 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24dmg\" (UniqueName: \"kubernetes.io/projected/1f437239-11ff-4333-bdea-8a48b8ac95e8-kube-api-access-24dmg\") pod \"oauth-openshift-6df5fc69d-thc6n\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") " pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:05.454269 master-0 kubenswrapper[23041]: I0308 00:32:05.453801 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5c84b9c874-8xl2l"] Mar 08 00:32:05.456858 master-0 kubenswrapper[23041]: I0308 00:32:05.456805 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c84b9c874-8xl2l" Mar 08 00:32:05.478154 master-0 kubenswrapper[23041]: I0308 00:32:05.478003 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 08 00:32:05.478531 master-0 kubenswrapper[23041]: I0308 00:32:05.478245 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 08 00:32:05.478531 master-0 kubenswrapper[23041]: I0308 00:32:05.478484 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-kbbf9" Mar 08 00:32:05.481345 master-0 kubenswrapper[23041]: I0308 00:32:05.478485 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 08 00:32:05.481345 master-0 kubenswrapper[23041]: I0308 00:32:05.478481 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 08 00:32:05.481767 master-0 kubenswrapper[23041]: I0308 00:32:05.481648 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c84b9c874-8xl2l"] Mar 08 00:32:05.482122 master-0 kubenswrapper[23041]: I0308 00:32:05.482075 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 08 00:32:05.482642 master-0 kubenswrapper[23041]: I0308 00:32:05.482604 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:05.485792 master-0 kubenswrapper[23041]: I0308 00:32:05.485684 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"51c724a5-de89-4fde-b596-70157d2d19b6","Type":"ContainerStarted","Data":"c66564a5256e588a016eb2dcc31a2ec52ebbfbabd58f97c462377ecf84dc302b"} Mar 08 00:32:05.485792 master-0 kubenswrapper[23041]: I0308 00:32:05.485724 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"51c724a5-de89-4fde-b596-70157d2d19b6","Type":"ContainerStarted","Data":"3935ac49d115df026834e687281350cf07c7cb60c028378e1b0733fb958a3103"} Mar 08 00:32:05.552466 master-0 kubenswrapper[23041]: I0308 00:32:05.552403 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3baca04d-be92-4f02-8ea9-94cc37fc00b4-console-oauth-config\") pod \"console-5c84b9c874-8xl2l\" (UID: \"3baca04d-be92-4f02-8ea9-94cc37fc00b4\") " pod="openshift-console/console-5c84b9c874-8xl2l" Mar 08 00:32:05.552699 master-0 kubenswrapper[23041]: I0308 00:32:05.552592 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3baca04d-be92-4f02-8ea9-94cc37fc00b4-service-ca\") pod \"console-5c84b9c874-8xl2l\" (UID: \"3baca04d-be92-4f02-8ea9-94cc37fc00b4\") " pod="openshift-console/console-5c84b9c874-8xl2l" Mar 08 00:32:05.552749 master-0 kubenswrapper[23041]: I0308 00:32:05.552711 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3baca04d-be92-4f02-8ea9-94cc37fc00b4-oauth-serving-cert\") pod \"console-5c84b9c874-8xl2l\" (UID: \"3baca04d-be92-4f02-8ea9-94cc37fc00b4\") " 
pod="openshift-console/console-5c84b9c874-8xl2l" Mar 08 00:32:05.552945 master-0 kubenswrapper[23041]: I0308 00:32:05.552900 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3baca04d-be92-4f02-8ea9-94cc37fc00b4-console-config\") pod \"console-5c84b9c874-8xl2l\" (UID: \"3baca04d-be92-4f02-8ea9-94cc37fc00b4\") " pod="openshift-console/console-5c84b9c874-8xl2l" Mar 08 00:32:05.553026 master-0 kubenswrapper[23041]: I0308 00:32:05.553002 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3baca04d-be92-4f02-8ea9-94cc37fc00b4-console-serving-cert\") pod \"console-5c84b9c874-8xl2l\" (UID: \"3baca04d-be92-4f02-8ea9-94cc37fc00b4\") " pod="openshift-console/console-5c84b9c874-8xl2l" Mar 08 00:32:05.553078 master-0 kubenswrapper[23041]: I0308 00:32:05.553030 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk28b\" (UniqueName: \"kubernetes.io/projected/3baca04d-be92-4f02-8ea9-94cc37fc00b4-kube-api-access-xk28b\") pod \"console-5c84b9c874-8xl2l\" (UID: \"3baca04d-be92-4f02-8ea9-94cc37fc00b4\") " pod="openshift-console/console-5c84b9c874-8xl2l" Mar 08 00:32:05.654641 master-0 kubenswrapper[23041]: I0308 00:32:05.654600 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3baca04d-be92-4f02-8ea9-94cc37fc00b4-console-oauth-config\") pod \"console-5c84b9c874-8xl2l\" (UID: \"3baca04d-be92-4f02-8ea9-94cc37fc00b4\") " pod="openshift-console/console-5c84b9c874-8xl2l" Mar 08 00:32:05.654868 master-0 kubenswrapper[23041]: I0308 00:32:05.654658 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/3baca04d-be92-4f02-8ea9-94cc37fc00b4-service-ca\") pod \"console-5c84b9c874-8xl2l\" (UID: \"3baca04d-be92-4f02-8ea9-94cc37fc00b4\") " pod="openshift-console/console-5c84b9c874-8xl2l" Mar 08 00:32:05.654868 master-0 kubenswrapper[23041]: I0308 00:32:05.654841 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3baca04d-be92-4f02-8ea9-94cc37fc00b4-oauth-serving-cert\") pod \"console-5c84b9c874-8xl2l\" (UID: \"3baca04d-be92-4f02-8ea9-94cc37fc00b4\") " pod="openshift-console/console-5c84b9c874-8xl2l" Mar 08 00:32:05.654941 master-0 kubenswrapper[23041]: I0308 00:32:05.654883 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3baca04d-be92-4f02-8ea9-94cc37fc00b4-console-config\") pod \"console-5c84b9c874-8xl2l\" (UID: \"3baca04d-be92-4f02-8ea9-94cc37fc00b4\") " pod="openshift-console/console-5c84b9c874-8xl2l" Mar 08 00:32:05.654941 master-0 kubenswrapper[23041]: I0308 00:32:05.654924 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3baca04d-be92-4f02-8ea9-94cc37fc00b4-console-serving-cert\") pod \"console-5c84b9c874-8xl2l\" (UID: \"3baca04d-be92-4f02-8ea9-94cc37fc00b4\") " pod="openshift-console/console-5c84b9c874-8xl2l" Mar 08 00:32:05.655007 master-0 kubenswrapper[23041]: I0308 00:32:05.654948 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk28b\" (UniqueName: \"kubernetes.io/projected/3baca04d-be92-4f02-8ea9-94cc37fc00b4-kube-api-access-xk28b\") pod \"console-5c84b9c874-8xl2l\" (UID: \"3baca04d-be92-4f02-8ea9-94cc37fc00b4\") " pod="openshift-console/console-5c84b9c874-8xl2l" Mar 08 00:32:05.656383 master-0 kubenswrapper[23041]: I0308 00:32:05.656314 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3baca04d-be92-4f02-8ea9-94cc37fc00b4-service-ca\") pod \"console-5c84b9c874-8xl2l\" (UID: \"3baca04d-be92-4f02-8ea9-94cc37fc00b4\") " pod="openshift-console/console-5c84b9c874-8xl2l" Mar 08 00:32:05.656704 master-0 kubenswrapper[23041]: I0308 00:32:05.656652 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3baca04d-be92-4f02-8ea9-94cc37fc00b4-console-config\") pod \"console-5c84b9c874-8xl2l\" (UID: \"3baca04d-be92-4f02-8ea9-94cc37fc00b4\") " pod="openshift-console/console-5c84b9c874-8xl2l" Mar 08 00:32:05.658431 master-0 kubenswrapper[23041]: I0308 00:32:05.658397 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3baca04d-be92-4f02-8ea9-94cc37fc00b4-oauth-serving-cert\") pod \"console-5c84b9c874-8xl2l\" (UID: \"3baca04d-be92-4f02-8ea9-94cc37fc00b4\") " pod="openshift-console/console-5c84b9c874-8xl2l" Mar 08 00:32:05.658503 master-0 kubenswrapper[23041]: I0308 00:32:05.658434 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3baca04d-be92-4f02-8ea9-94cc37fc00b4-console-serving-cert\") pod \"console-5c84b9c874-8xl2l\" (UID: \"3baca04d-be92-4f02-8ea9-94cc37fc00b4\") " pod="openshift-console/console-5c84b9c874-8xl2l" Mar 08 00:32:05.660754 master-0 kubenswrapper[23041]: I0308 00:32:05.660709 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3baca04d-be92-4f02-8ea9-94cc37fc00b4-console-oauth-config\") pod \"console-5c84b9c874-8xl2l\" (UID: \"3baca04d-be92-4f02-8ea9-94cc37fc00b4\") " pod="openshift-console/console-5c84b9c874-8xl2l" Mar 08 00:32:05.674542 master-0 kubenswrapper[23041]: I0308 00:32:05.674482 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-xk28b\" (UniqueName: \"kubernetes.io/projected/3baca04d-be92-4f02-8ea9-94cc37fc00b4-kube-api-access-xk28b\") pod \"console-5c84b9c874-8xl2l\" (UID: \"3baca04d-be92-4f02-8ea9-94cc37fc00b4\") " pod="openshift-console/console-5c84b9c874-8xl2l" Mar 08 00:32:05.694418 master-0 kubenswrapper[23041]: I0308 00:32:05.694276 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6t5lg" Mar 08 00:32:05.802248 master-0 kubenswrapper[23041]: I0308 00:32:05.802012 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c84b9c874-8xl2l" Mar 08 00:32:06.004059 master-0 kubenswrapper[23041]: I0308 00:32:06.003987 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4fjw9" Mar 08 00:32:06.020247 master-0 kubenswrapper[23041]: I0308 00:32:06.017763 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9nqqp" Mar 08 00:32:06.078407 master-0 kubenswrapper[23041]: I0308 00:32:06.077335 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6df5fc69d-thc6n"] Mar 08 00:32:06.267095 master-0 kubenswrapper[23041]: I0308 00:32:06.267013 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c84b9c874-8xl2l"] Mar 08 00:32:06.280361 master-0 kubenswrapper[23041]: W0308 00:32:06.280321 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3baca04d_be92_4f02_8ea9_94cc37fc00b4.slice/crio-7dedc3a693168997bbdf5b41d54540ad18fe948792d8dc4a9093bd38445a352b WatchSource:0}: Error finding container 7dedc3a693168997bbdf5b41d54540ad18fe948792d8dc4a9093bd38445a352b: Status 404 returned error can't find the container with id 
7dedc3a693168997bbdf5b41d54540ad18fe948792d8dc4a9093bd38445a352b Mar 08 00:32:06.500919 master-0 kubenswrapper[23041]: I0308 00:32:06.500837 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5cd89459d5-wwnjs" event={"ID":"1aec9660-eaf0-48c1-8d83-1a89982f9804","Type":"ContainerStarted","Data":"a0874f8819241e99e50857b38e4607e2013b57b718af4a91644289a81df9cff6"} Mar 08 00:32:06.500919 master-0 kubenswrapper[23041]: I0308 00:32:06.500922 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5cd89459d5-wwnjs" event={"ID":"1aec9660-eaf0-48c1-8d83-1a89982f9804","Type":"ContainerStarted","Data":"38c0f3b220d54c018e60a11fcc264ccc73cb3f6f3afd0179356ad30307c5a971"} Mar 08 00:32:06.500919 master-0 kubenswrapper[23041]: I0308 00:32:06.500946 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5cd89459d5-wwnjs" event={"ID":"1aec9660-eaf0-48c1-8d83-1a89982f9804","Type":"ContainerStarted","Data":"ddf66973c5e9c4c1a3f55c4aa399bc393bb99d43c0ea23781a44a592c099c6ce"} Mar 08 00:32:06.502146 master-0 kubenswrapper[23041]: I0308 00:32:06.501099 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-5cd89459d5-wwnjs" Mar 08 00:32:06.508026 master-0 kubenswrapper[23041]: I0308 00:32:06.507616 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"51c724a5-de89-4fde-b596-70157d2d19b6","Type":"ContainerStarted","Data":"77c439a84aa164ee07c9e5c171aea2eb6918060c5491a7828582c33b62bf08b7"} Mar 08 00:32:06.508026 master-0 kubenswrapper[23041]: I0308 00:32:06.507810 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"51c724a5-de89-4fde-b596-70157d2d19b6","Type":"ContainerStarted","Data":"3aa4df6d988af0a0fbf0ca91c3c8199b19f529f21f24176f890bfab0c3b25b24"} Mar 08 00:32:06.508026 master-0 
kubenswrapper[23041]: I0308 00:32:06.507870 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"51c724a5-de89-4fde-b596-70157d2d19b6","Type":"ContainerStarted","Data":"4c868aa7418a56f30943c51f9eac6d8da436a1691ad5c481be6a7490d78054a2"} Mar 08 00:32:06.508026 master-0 kubenswrapper[23041]: I0308 00:32:06.507903 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"51c724a5-de89-4fde-b596-70157d2d19b6","Type":"ContainerStarted","Data":"6b4cdbb2f93609d97943c84882bcd164dd02ff5a4b44c594a695ac0540ea519d"} Mar 08 00:32:06.509192 master-0 kubenswrapper[23041]: I0308 00:32:06.509134 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c84b9c874-8xl2l" event={"ID":"3baca04d-be92-4f02-8ea9-94cc37fc00b4","Type":"ContainerStarted","Data":"7dedc3a693168997bbdf5b41d54540ad18fe948792d8dc4a9093bd38445a352b"} Mar 08 00:32:06.510137 master-0 kubenswrapper[23041]: I0308 00:32:06.510110 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" event={"ID":"1f437239-11ff-4333-bdea-8a48b8ac95e8","Type":"ContainerStarted","Data":"cbc291543364b486a08967ccc4804b1fae56cde3e2f919af998f09716f2e7340"} Mar 08 00:32:06.535931 master-0 kubenswrapper[23041]: I0308 00:32:06.535693 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5cd89459d5-wwnjs" podStartSLOduration=2.509886536 podStartE2EDuration="9.535670964s" podCreationTimestamp="2026-03-08 00:31:57 +0000 UTC" firstStartedPulling="2026-03-08 00:31:58.672264968 +0000 UTC m=+24.145101522" lastFinishedPulling="2026-03-08 00:32:05.698049396 +0000 UTC m=+31.170885950" observedRunningTime="2026-03-08 00:32:06.533491751 +0000 UTC m=+32.006328365" watchObservedRunningTime="2026-03-08 00:32:06.535670964 +0000 UTC m=+32.008507518" Mar 08 00:32:06.575789 master-0 
kubenswrapper[23041]: I0308 00:32:06.575630 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.940824251 podStartE2EDuration="10.575599065s" podCreationTimestamp="2026-03-08 00:31:56 +0000 UTC" firstStartedPulling="2026-03-08 00:31:57.271467066 +0000 UTC m=+22.744303620" lastFinishedPulling="2026-03-08 00:32:04.90624188 +0000 UTC m=+30.379078434" observedRunningTime="2026-03-08 00:32:06.575169974 +0000 UTC m=+32.048006538" watchObservedRunningTime="2026-03-08 00:32:06.575599065 +0000 UTC m=+32.048435619" Mar 08 00:32:08.502476 master-0 kubenswrapper[23041]: I0308 00:32:08.502433 23041 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 08 00:32:08.503171 master-0 kubenswrapper[23041]: I0308 00:32:08.502702 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="f417e14665db2ffffa887ce21c9ff0ed" containerName="startup-monitor" containerID="cri-o://fa423e54fafba3982d7bb2d5466fcee2c23cbdcb2db478a9c800bb36094dd0d1" gracePeriod=5 Mar 08 00:32:09.034311 master-0 kubenswrapper[23041]: I0308 00:32:09.034186 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-client-ca-bundle\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m" Mar 08 00:32:09.034528 master-0 kubenswrapper[23041]: E0308 00:32:09.034415 23041 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-ffspe3f0nbfal: secret "metrics-server-ffspe3f0nbfal" not found Mar 08 00:32:09.034528 master-0 kubenswrapper[23041]: E0308 00:32:09.034504 23041 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-client-ca-bundle podName:0101c4ce-fd58-4ddb-94f7-abb8b2293cdb nodeName:}" failed. No retries permitted until 2026-03-08 00:32:25.034478018 +0000 UTC m=+50.507314572 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "client-ca-bundle" (UniqueName: "kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-client-ca-bundle") pod "metrics-server-6474759988-dnw4m" (UID: "0101c4ce-fd58-4ddb-94f7-abb8b2293cdb") : secret "metrics-server-ffspe3f0nbfal" not found Mar 08 00:32:09.547301 master-0 kubenswrapper[23041]: I0308 00:32:09.547179 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" event={"ID":"1f437239-11ff-4333-bdea-8a48b8ac95e8","Type":"ContainerStarted","Data":"4d97b024b5f9364312d8782f69ef5377574cad76ca1feb7a9771cf6068169e82"} Mar 08 00:32:09.548663 master-0 kubenswrapper[23041]: I0308 00:32:09.548632 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:09.556328 master-0 kubenswrapper[23041]: I0308 00:32:09.556297 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:09.605748 master-0 kubenswrapper[23041]: I0308 00:32:09.605669 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" podStartSLOduration=3.379798438 podStartE2EDuration="5.605609693s" podCreationTimestamp="2026-03-08 00:32:04 +0000 UTC" firstStartedPulling="2026-03-08 00:32:06.091163616 +0000 UTC m=+31.564000170" lastFinishedPulling="2026-03-08 00:32:08.316974881 +0000 UTC m=+33.789811425" observedRunningTime="2026-03-08 00:32:09.574277249 +0000 UTC m=+35.047113863" watchObservedRunningTime="2026-03-08 00:32:09.605609693 +0000 UTC m=+35.078446247" Mar 08 00:32:10.121718 
master-0 kubenswrapper[23041]: I0308 00:32:10.117586 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 08 00:32:10.121718 master-0 kubenswrapper[23041]: E0308 00:32:10.117953 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f417e14665db2ffffa887ce21c9ff0ed" containerName="startup-monitor" Mar 08 00:32:10.121718 master-0 kubenswrapper[23041]: I0308 00:32:10.117966 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="f417e14665db2ffffa887ce21c9ff0ed" containerName="startup-monitor" Mar 08 00:32:10.121718 master-0 kubenswrapper[23041]: I0308 00:32:10.118149 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="f417e14665db2ffffa887ce21c9ff0ed" containerName="startup-monitor" Mar 08 00:32:10.122174 master-0 kubenswrapper[23041]: I0308 00:32:10.121813 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.128877 master-0 kubenswrapper[23041]: I0308 00:32:10.127186 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Mar 08 00:32:10.128877 master-0 kubenswrapper[23041]: I0308 00:32:10.127434 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-68jqh" Mar 08 00:32:10.128877 master-0 kubenswrapper[23041]: I0308 00:32:10.127515 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Mar 08 00:32:10.128877 master-0 kubenswrapper[23041]: I0308 00:32:10.127635 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-48mqvdnajl6js" Mar 08 00:32:10.128877 master-0 kubenswrapper[23041]: I0308 00:32:10.127694 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Mar 08 
00:32:10.128877 master-0 kubenswrapper[23041]: I0308 00:32:10.128154 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Mar 08 00:32:10.128877 master-0 kubenswrapper[23041]: I0308 00:32:10.128155 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Mar 08 00:32:10.135488 master-0 kubenswrapper[23041]: I0308 00:32:10.135431 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Mar 08 00:32:10.136036 master-0 kubenswrapper[23041]: I0308 00:32:10.135740 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Mar 08 00:32:10.136296 master-0 kubenswrapper[23041]: I0308 00:32:10.136263 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Mar 08 00:32:10.136602 master-0 kubenswrapper[23041]: I0308 00:32:10.136540 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Mar 08 00:32:10.143899 master-0 kubenswrapper[23041]: I0308 00:32:10.143868 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Mar 08 00:32:10.152397 master-0 kubenswrapper[23041]: I0308 00:32:10.152253 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Mar 08 00:32:10.163462 master-0 kubenswrapper[23041]: I0308 00:32:10.162523 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 08 00:32:10.265455 master-0 kubenswrapper[23041]: I0308 00:32:10.265381 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-config\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.265833 master-0 kubenswrapper[23041]: I0308 00:32:10.265818 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4b9b4180-fc41-4072-9c61-0a35390a7ff3-config-out\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.265961 master-0 kubenswrapper[23041]: I0308 00:32:10.265945 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.266091 master-0 kubenswrapper[23041]: I0308 00:32:10.266076 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/4b9b4180-fc41-4072-9c61-0a35390a7ff3-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.266171 master-0 kubenswrapper[23041]: I0308 00:32:10.266159 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b9b4180-fc41-4072-9c61-0a35390a7ff3-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.266348 master-0 kubenswrapper[23041]: I0308 00:32:10.266322 23041 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.266513 master-0 kubenswrapper[23041]: I0308 00:32:10.266493 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.266645 master-0 kubenswrapper[23041]: I0308 00:32:10.266632 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl98m\" (UniqueName: \"kubernetes.io/projected/4b9b4180-fc41-4072-9c61-0a35390a7ff3-kube-api-access-fl98m\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.266725 master-0 kubenswrapper[23041]: I0308 00:32:10.266711 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-web-config\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.266835 master-0 kubenswrapper[23041]: I0308 00:32:10.266821 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b9b4180-fc41-4072-9c61-0a35390a7ff3-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.267123 master-0 kubenswrapper[23041]: I0308 00:32:10.267107 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.267224 master-0 kubenswrapper[23041]: I0308 00:32:10.267194 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.267339 master-0 kubenswrapper[23041]: I0308 00:32:10.267312 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4b9b4180-fc41-4072-9c61-0a35390a7ff3-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.267479 master-0 kubenswrapper[23041]: I0308 00:32:10.267465 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4b9b4180-fc41-4072-9c61-0a35390a7ff3-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.267663 master-0 kubenswrapper[23041]: I0308 00:32:10.267644 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.267814 master-0 kubenswrapper[23041]: I0308 00:32:10.267798 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.268011 master-0 kubenswrapper[23041]: I0308 00:32:10.267919 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4b9b4180-fc41-4072-9c61-0a35390a7ff3-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.268169 master-0 kubenswrapper[23041]: I0308 00:32:10.268126 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b9b4180-fc41-4072-9c61-0a35390a7ff3-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.371248 master-0 kubenswrapper[23041]: I0308 00:32:10.370236 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-config\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.371598 master-0 kubenswrapper[23041]: I0308 
00:32:10.371504 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4b9b4180-fc41-4072-9c61-0a35390a7ff3-config-out\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.371598 master-0 kubenswrapper[23041]: I0308 00:32:10.371550 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.371598 master-0 kubenswrapper[23041]: I0308 00:32:10.371587 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/4b9b4180-fc41-4072-9c61-0a35390a7ff3-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.372667 master-0 kubenswrapper[23041]: I0308 00:32:10.372234 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b9b4180-fc41-4072-9c61-0a35390a7ff3-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.372667 master-0 kubenswrapper[23041]: I0308 00:32:10.372358 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.372667 master-0 kubenswrapper[23041]: I0308 00:32:10.372404 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.374460 master-0 kubenswrapper[23041]: I0308 00:32:10.372889 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/4b9b4180-fc41-4072-9c61-0a35390a7ff3-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.374460 master-0 kubenswrapper[23041]: I0308 00:32:10.373460 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b9b4180-fc41-4072-9c61-0a35390a7ff3-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.374800 master-0 kubenswrapper[23041]: I0308 00:32:10.374751 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4b9b4180-fc41-4072-9c61-0a35390a7ff3-config-out\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.376190 master-0 kubenswrapper[23041]: I0308 00:32:10.376041 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: 
\"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.376807 master-0 kubenswrapper[23041]: I0308 00:32:10.376647 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.377512 master-0 kubenswrapper[23041]: I0308 00:32:10.377396 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.379971 master-0 kubenswrapper[23041]: I0308 00:32:10.379924 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl98m\" (UniqueName: \"kubernetes.io/projected/4b9b4180-fc41-4072-9c61-0a35390a7ff3-kube-api-access-fl98m\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.380516 master-0 kubenswrapper[23041]: I0308 00:32:10.380439 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-web-config\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.380595 master-0 kubenswrapper[23041]: I0308 00:32:10.380560 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/4b9b4180-fc41-4072-9c61-0a35390a7ff3-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.380595 master-0 kubenswrapper[23041]: I0308 00:32:10.380586 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.380788 master-0 kubenswrapper[23041]: I0308 00:32:10.380638 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4b9b4180-fc41-4072-9c61-0a35390a7ff3-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.380788 master-0 kubenswrapper[23041]: I0308 00:32:10.380676 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.380788 master-0 kubenswrapper[23041]: I0308 00:32:10.380733 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4b9b4180-fc41-4072-9c61-0a35390a7ff3-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.380788 master-0 kubenswrapper[23041]: I0308 00:32:10.380769 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.380911 master-0 kubenswrapper[23041]: I0308 00:32:10.380793 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.386327 master-0 kubenswrapper[23041]: I0308 00:32:10.381551 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-config\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.386327 master-0 kubenswrapper[23041]: I0308 00:32:10.382222 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4b9b4180-fc41-4072-9c61-0a35390a7ff3-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.386327 master-0 kubenswrapper[23041]: I0308 00:32:10.384762 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.386327 master-0 kubenswrapper[23041]: I0308 00:32:10.384815 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4b9b4180-fc41-4072-9c61-0a35390a7ff3-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.386327 master-0 kubenswrapper[23041]: I0308 00:32:10.384865 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b9b4180-fc41-4072-9c61-0a35390a7ff3-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.387356 master-0 kubenswrapper[23041]: I0308 00:32:10.387306 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b9b4180-fc41-4072-9c61-0a35390a7ff3-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.387842 master-0 kubenswrapper[23041]: I0308 00:32:10.387779 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.390698 master-0 kubenswrapper[23041]: I0308 00:32:10.390658 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b9b4180-fc41-4072-9c61-0a35390a7ff3-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.392510 master-0 kubenswrapper[23041]: I0308 00:32:10.392471 
23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4b9b4180-fc41-4072-9c61-0a35390a7ff3-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.393381 master-0 kubenswrapper[23041]: I0308 00:32:10.393325 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.393529 master-0 kubenswrapper[23041]: I0308 00:32:10.393491 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-web-config\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.394414 master-0 kubenswrapper[23041]: I0308 00:32:10.394371 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.394778 master-0 kubenswrapper[23041]: I0308 00:32:10.394734 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4b9b4180-fc41-4072-9c61-0a35390a7ff3-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.396164 master-0 kubenswrapper[23041]: I0308 00:32:10.396122 23041 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-fl98m\" (UniqueName: \"kubernetes.io/projected/4b9b4180-fc41-4072-9c61-0a35390a7ff3-kube-api-access-fl98m\") pod \"prometheus-k8s-0\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:10.465669 master-0 kubenswrapper[23041]: I0308 00:32:10.465597 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:12.194252 master-0 kubenswrapper[23041]: I0308 00:32:12.193731 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 08 00:32:12.194861 master-0 kubenswrapper[23041]: W0308 00:32:12.194817 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b9b4180_fc41_4072_9c61_0a35390a7ff3.slice/crio-39df2361ce85bd74018d4b51eb5800f820b46418602e00b8ff5ae478baf2f2d2 WatchSource:0}: Error finding container 39df2361ce85bd74018d4b51eb5800f820b46418602e00b8ff5ae478baf2f2d2: Status 404 returned error can't find the container with id 39df2361ce85bd74018d4b51eb5800f820b46418602e00b8ff5ae478baf2f2d2 Mar 08 00:32:12.589395 master-0 kubenswrapper[23041]: I0308 00:32:12.589317 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c84b9c874-8xl2l" event={"ID":"3baca04d-be92-4f02-8ea9-94cc37fc00b4","Type":"ContainerStarted","Data":"0a7b7a4bad54508072eb15ff1d869aaa4adb172a39c8eccea3f65180f4f8b0c7"} Mar 08 00:32:12.592026 master-0 kubenswrapper[23041]: I0308 00:32:12.591975 23041 generic.go:334] "Generic (PLEG): container finished" podID="4b9b4180-fc41-4072-9c61-0a35390a7ff3" containerID="e07552f51a5a17213c6ec3773800f0fdc0f57ef49b0e59da2864ad1ee82537b0" exitCode=0 Mar 08 00:32:12.592026 master-0 kubenswrapper[23041]: I0308 00:32:12.592022 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"4b9b4180-fc41-4072-9c61-0a35390a7ff3","Type":"ContainerDied","Data":"e07552f51a5a17213c6ec3773800f0fdc0f57ef49b0e59da2864ad1ee82537b0"} Mar 08 00:32:12.592026 master-0 kubenswrapper[23041]: I0308 00:32:12.592041 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4b9b4180-fc41-4072-9c61-0a35390a7ff3","Type":"ContainerStarted","Data":"39df2361ce85bd74018d4b51eb5800f820b46418602e00b8ff5ae478baf2f2d2"} Mar 08 00:32:12.625471 master-0 kubenswrapper[23041]: I0308 00:32:12.625356 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5c84b9c874-8xl2l" podStartSLOduration=2.308101794 podStartE2EDuration="7.625327294s" podCreationTimestamp="2026-03-08 00:32:05 +0000 UTC" firstStartedPulling="2026-03-08 00:32:06.283936556 +0000 UTC m=+31.756773110" lastFinishedPulling="2026-03-08 00:32:11.601162056 +0000 UTC m=+37.073998610" observedRunningTime="2026-03-08 00:32:12.61935853 +0000 UTC m=+38.092195114" watchObservedRunningTime="2026-03-08 00:32:12.625327294 +0000 UTC m=+38.098163848" Mar 08 00:32:12.892870 master-0 kubenswrapper[23041]: I0308 00:32:12.892675 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-5cd89459d5-wwnjs" Mar 08 00:32:13.604652 master-0 kubenswrapper[23041]: I0308 00:32:13.604578 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_f417e14665db2ffffa887ce21c9ff0ed/startup-monitor/0.log" Mar 08 00:32:13.604652 master-0 kubenswrapper[23041]: I0308 00:32:13.604643 23041 generic.go:334] "Generic (PLEG): container finished" podID="f417e14665db2ffffa887ce21c9ff0ed" containerID="fa423e54fafba3982d7bb2d5466fcee2c23cbdcb2db478a9c800bb36094dd0d1" exitCode=137 Mar 08 00:32:14.096660 master-0 kubenswrapper[23041]: I0308 00:32:14.096604 23041 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_f417e14665db2ffffa887ce21c9ff0ed/startup-monitor/0.log" Mar 08 00:32:14.096938 master-0 kubenswrapper[23041]: I0308 00:32:14.096697 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:32:14.264898 master-0 kubenswrapper[23041]: I0308 00:32:14.264791 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod \"f417e14665db2ffffa887ce21c9ff0ed\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " Mar 08 00:32:14.264898 master-0 kubenswrapper[23041]: I0308 00:32:14.264877 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"f417e14665db2ffffa887ce21c9ff0ed\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " Mar 08 00:32:14.265274 master-0 kubenswrapper[23041]: I0308 00:32:14.264995 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") pod \"f417e14665db2ffffa887ce21c9ff0ed\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " Mar 08 00:32:14.265274 master-0 kubenswrapper[23041]: I0308 00:32:14.265053 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"f417e14665db2ffffa887ce21c9ff0ed\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " Mar 08 00:32:14.265274 master-0 kubenswrapper[23041]: I0308 00:32:14.265160 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"f417e14665db2ffffa887ce21c9ff0ed\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " Mar 08 00:32:14.266063 master-0 kubenswrapper[23041]: I0308 00:32:14.265471 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests" (OuterVolumeSpecName: "manifests") pod "f417e14665db2ffffa887ce21c9ff0ed" (UID: "f417e14665db2ffffa887ce21c9ff0ed"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:32:14.266063 master-0 kubenswrapper[23041]: I0308 00:32:14.265545 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock" (OuterVolumeSpecName: "var-lock") pod "f417e14665db2ffffa887ce21c9ff0ed" (UID: "f417e14665db2ffffa887ce21c9ff0ed"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:32:14.266063 master-0 kubenswrapper[23041]: I0308 00:32:14.265633 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log" (OuterVolumeSpecName: "var-log") pod "f417e14665db2ffffa887ce21c9ff0ed" (UID: "f417e14665db2ffffa887ce21c9ff0ed"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:32:14.266063 master-0 kubenswrapper[23041]: I0308 00:32:14.265670 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f417e14665db2ffffa887ce21c9ff0ed" (UID: "f417e14665db2ffffa887ce21c9ff0ed"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:32:14.266063 master-0 kubenswrapper[23041]: I0308 00:32:14.265768 23041 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 00:32:14.266063 master-0 kubenswrapper[23041]: I0308 00:32:14.265784 23041 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") on node \"master-0\" DevicePath \"\"" Mar 08 00:32:14.266063 master-0 kubenswrapper[23041]: I0308 00:32:14.265796 23041 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") on node \"master-0\" DevicePath \"\"" Mar 08 00:32:14.266063 master-0 kubenswrapper[23041]: I0308 00:32:14.265807 23041 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 00:32:14.276737 master-0 kubenswrapper[23041]: I0308 00:32:14.276672 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f417e14665db2ffffa887ce21c9ff0ed" (UID: "f417e14665db2ffffa887ce21c9ff0ed"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:32:14.367725 master-0 kubenswrapper[23041]: I0308 00:32:14.367673 23041 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 00:32:14.473328 master-0 kubenswrapper[23041]: I0308 00:32:14.470963 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs"] Mar 08 00:32:14.473328 master-0 kubenswrapper[23041]: I0308 00:32:14.471341 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" podUID="cbcb0196-be5c-44a4-9749-5df9fbeaa718" containerName="controller-manager" containerID="cri-o://a89aafabc1e522f342463d98f2fa1cfd6a92e881b88c10677cf22bc178649255" gracePeriod=30 Mar 08 00:32:14.492732 master-0 kubenswrapper[23041]: I0308 00:32:14.492633 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh"] Mar 08 00:32:14.493074 master-0 kubenswrapper[23041]: I0308 00:32:14.492881 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh" podUID="70892c23-554d-466c-a526-90a799439fe0" containerName="route-controller-manager" containerID="cri-o://ae7bb35d674e364ba7abb3d8a4e36b86062b8a56cb462417c0258160c034b1cd" gracePeriod=30 Mar 08 00:32:14.614552 master-0 kubenswrapper[23041]: I0308 00:32:14.614493 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_f417e14665db2ffffa887ce21c9ff0ed/startup-monitor/0.log" Mar 08 00:32:14.616025 master-0 kubenswrapper[23041]: I0308 00:32:14.614624 23041 scope.go:117] "RemoveContainer" 
containerID="fa423e54fafba3982d7bb2d5466fcee2c23cbdcb2db478a9c800bb36094dd0d1" Mar 08 00:32:14.616025 master-0 kubenswrapper[23041]: I0308 00:32:14.614915 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:32:14.622496 master-0 kubenswrapper[23041]: I0308 00:32:14.621916 23041 generic.go:334] "Generic (PLEG): container finished" podID="cbcb0196-be5c-44a4-9749-5df9fbeaa718" containerID="a89aafabc1e522f342463d98f2fa1cfd6a92e881b88c10677cf22bc178649255" exitCode=0 Mar 08 00:32:14.622496 master-0 kubenswrapper[23041]: I0308 00:32:14.622003 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" event={"ID":"cbcb0196-be5c-44a4-9749-5df9fbeaa718","Type":"ContainerDied","Data":"a89aafabc1e522f342463d98f2fa1cfd6a92e881b88c10677cf22bc178649255"} Mar 08 00:32:14.672403 master-0 kubenswrapper[23041]: I0308 00:32:14.672187 23041 scope.go:117] "RemoveContainer" containerID="92c985a5a70112d59265249efbf6fce7869432625027fbf9a567a14e08ff9807" Mar 08 00:32:14.684393 master-0 kubenswrapper[23041]: I0308 00:32:14.684325 23041 kubelet.go:2706] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="bce57c31-dc1d-4def-9d9b-f8f73dee97f1" Mar 08 00:32:14.822176 master-0 kubenswrapper[23041]: I0308 00:32:14.822021 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f417e14665db2ffffa887ce21c9ff0ed" path="/var/lib/kubelet/pods/f417e14665db2ffffa887ce21c9ff0ed/volumes" Mar 08 00:32:14.822474 master-0 kubenswrapper[23041]: I0308 00:32:14.822364 23041 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="" Mar 08 00:32:14.850165 master-0 kubenswrapper[23041]: I0308 00:32:14.850088 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 08 00:32:14.850593 master-0 kubenswrapper[23041]: I0308 00:32:14.850568 23041 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="bce57c31-dc1d-4def-9d9b-f8f73dee97f1" Mar 08 00:32:14.855470 master-0 kubenswrapper[23041]: I0308 00:32:14.855388 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 08 00:32:14.857404 master-0 kubenswrapper[23041]: I0308 00:32:14.857344 23041 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="bce57c31-dc1d-4def-9d9b-f8f73dee97f1" Mar 08 00:32:15.163593 master-0 kubenswrapper[23041]: I0308 00:32:15.163522 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" Mar 08 00:32:15.234085 master-0 kubenswrapper[23041]: I0308 00:32:15.234026 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh" Mar 08 00:32:15.282567 master-0 kubenswrapper[23041]: I0308 00:32:15.282497 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cbcb0196-be5c-44a4-9749-5df9fbeaa718-proxy-ca-bundles\") pod \"cbcb0196-be5c-44a4-9749-5df9fbeaa718\" (UID: \"cbcb0196-be5c-44a4-9749-5df9fbeaa718\") " Mar 08 00:32:15.282833 master-0 kubenswrapper[23041]: I0308 00:32:15.282794 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbcb0196-be5c-44a4-9749-5df9fbeaa718-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "cbcb0196-be5c-44a4-9749-5df9fbeaa718" (UID: "cbcb0196-be5c-44a4-9749-5df9fbeaa718"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:32:15.283417 master-0 kubenswrapper[23041]: I0308 00:32:15.283386 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbcb0196-be5c-44a4-9749-5df9fbeaa718-config" (OuterVolumeSpecName: "config") pod "cbcb0196-be5c-44a4-9749-5df9fbeaa718" (UID: "cbcb0196-be5c-44a4-9749-5df9fbeaa718"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:32:15.288329 master-0 kubenswrapper[23041]: I0308 00:32:15.282672 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbcb0196-be5c-44a4-9749-5df9fbeaa718-config\") pod \"cbcb0196-be5c-44a4-9749-5df9fbeaa718\" (UID: \"cbcb0196-be5c-44a4-9749-5df9fbeaa718\") " Mar 08 00:32:15.288407 master-0 kubenswrapper[23041]: I0308 00:32:15.288361 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70892c23-554d-466c-a526-90a799439fe0-serving-cert\") pod \"70892c23-554d-466c-a526-90a799439fe0\" (UID: \"70892c23-554d-466c-a526-90a799439fe0\") " Mar 08 00:32:15.288517 master-0 kubenswrapper[23041]: I0308 00:32:15.288491 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cbcb0196-be5c-44a4-9749-5df9fbeaa718-client-ca\") pod \"cbcb0196-be5c-44a4-9749-5df9fbeaa718\" (UID: \"cbcb0196-be5c-44a4-9749-5df9fbeaa718\") " Mar 08 00:32:15.288579 master-0 kubenswrapper[23041]: I0308 00:32:15.288562 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/70892c23-554d-466c-a526-90a799439fe0-client-ca\") pod \"70892c23-554d-466c-a526-90a799439fe0\" (UID: \"70892c23-554d-466c-a526-90a799439fe0\") " Mar 08 00:32:15.288641 master-0 kubenswrapper[23041]: I0308 00:32:15.288626 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70892c23-554d-466c-a526-90a799439fe0-config\") pod \"70892c23-554d-466c-a526-90a799439fe0\" (UID: \"70892c23-554d-466c-a526-90a799439fe0\") " Mar 08 00:32:15.289865 master-0 kubenswrapper[23041]: I0308 00:32:15.289031 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-4t8np\" (UniqueName: \"kubernetes.io/projected/cbcb0196-be5c-44a4-9749-5df9fbeaa718-kube-api-access-4t8np\") pod \"cbcb0196-be5c-44a4-9749-5df9fbeaa718\" (UID: \"cbcb0196-be5c-44a4-9749-5df9fbeaa718\") " Mar 08 00:32:15.289865 master-0 kubenswrapper[23041]: I0308 00:32:15.289053 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70892c23-554d-466c-a526-90a799439fe0-client-ca" (OuterVolumeSpecName: "client-ca") pod "70892c23-554d-466c-a526-90a799439fe0" (UID: "70892c23-554d-466c-a526-90a799439fe0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:32:15.289865 master-0 kubenswrapper[23041]: I0308 00:32:15.289073 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cbcb0196-be5c-44a4-9749-5df9fbeaa718-serving-cert\") pod \"cbcb0196-be5c-44a4-9749-5df9fbeaa718\" (UID: \"cbcb0196-be5c-44a4-9749-5df9fbeaa718\") " Mar 08 00:32:15.289865 master-0 kubenswrapper[23041]: I0308 00:32:15.289155 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbcb0196-be5c-44a4-9749-5df9fbeaa718-client-ca" (OuterVolumeSpecName: "client-ca") pod "cbcb0196-be5c-44a4-9749-5df9fbeaa718" (UID: "cbcb0196-be5c-44a4-9749-5df9fbeaa718"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:32:15.289865 master-0 kubenswrapper[23041]: I0308 00:32:15.289723 23041 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cbcb0196-be5c-44a4-9749-5df9fbeaa718-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Mar 08 00:32:15.289865 master-0 kubenswrapper[23041]: I0308 00:32:15.289739 23041 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbcb0196-be5c-44a4-9749-5df9fbeaa718-config\") on node \"master-0\" DevicePath \"\"" Mar 08 00:32:15.289865 master-0 kubenswrapper[23041]: I0308 00:32:15.289748 23041 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cbcb0196-be5c-44a4-9749-5df9fbeaa718-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 00:32:15.289865 master-0 kubenswrapper[23041]: I0308 00:32:15.289758 23041 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/70892c23-554d-466c-a526-90a799439fe0-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 00:32:15.289865 master-0 kubenswrapper[23041]: I0308 00:32:15.289817 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70892c23-554d-466c-a526-90a799439fe0-config" (OuterVolumeSpecName: "config") pod "70892c23-554d-466c-a526-90a799439fe0" (UID: "70892c23-554d-466c-a526-90a799439fe0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:32:15.291473 master-0 kubenswrapper[23041]: I0308 00:32:15.291402 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70892c23-554d-466c-a526-90a799439fe0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "70892c23-554d-466c-a526-90a799439fe0" (UID: "70892c23-554d-466c-a526-90a799439fe0"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:32:15.292037 master-0 kubenswrapper[23041]: I0308 00:32:15.292011 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbcb0196-be5c-44a4-9749-5df9fbeaa718-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cbcb0196-be5c-44a4-9749-5df9fbeaa718" (UID: "cbcb0196-be5c-44a4-9749-5df9fbeaa718"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:32:15.295718 master-0 kubenswrapper[23041]: I0308 00:32:15.295164 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbcb0196-be5c-44a4-9749-5df9fbeaa718-kube-api-access-4t8np" (OuterVolumeSpecName: "kube-api-access-4t8np") pod "cbcb0196-be5c-44a4-9749-5df9fbeaa718" (UID: "cbcb0196-be5c-44a4-9749-5df9fbeaa718"). InnerVolumeSpecName "kube-api-access-4t8np". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:32:15.392117 master-0 kubenswrapper[23041]: I0308 00:32:15.391990 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqjt7\" (UniqueName: \"kubernetes.io/projected/70892c23-554d-466c-a526-90a799439fe0-kube-api-access-kqjt7\") pod \"70892c23-554d-466c-a526-90a799439fe0\" (UID: \"70892c23-554d-466c-a526-90a799439fe0\") " Mar 08 00:32:15.392524 master-0 kubenswrapper[23041]: I0308 00:32:15.392501 23041 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70892c23-554d-466c-a526-90a799439fe0-config\") on node \"master-0\" DevicePath \"\"" Mar 08 00:32:15.392524 master-0 kubenswrapper[23041]: I0308 00:32:15.392519 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t8np\" (UniqueName: \"kubernetes.io/projected/cbcb0196-be5c-44a4-9749-5df9fbeaa718-kube-api-access-4t8np\") on node \"master-0\" DevicePath \"\"" Mar 08 00:32:15.392603 master-0 kubenswrapper[23041]: I0308 
00:32:15.392532 23041 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cbcb0196-be5c-44a4-9749-5df9fbeaa718-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 08 00:32:15.392603 master-0 kubenswrapper[23041]: I0308 00:32:15.392544 23041 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70892c23-554d-466c-a526-90a799439fe0-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 08 00:32:15.396216 master-0 kubenswrapper[23041]: I0308 00:32:15.396173 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70892c23-554d-466c-a526-90a799439fe0-kube-api-access-kqjt7" (OuterVolumeSpecName: "kube-api-access-kqjt7") pod "70892c23-554d-466c-a526-90a799439fe0" (UID: "70892c23-554d-466c-a526-90a799439fe0"). InnerVolumeSpecName "kube-api-access-kqjt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:32:15.494013 master-0 kubenswrapper[23041]: I0308 00:32:15.493950 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqjt7\" (UniqueName: \"kubernetes.io/projected/70892c23-554d-466c-a526-90a799439fe0-kube-api-access-kqjt7\") on node \"master-0\" DevicePath \"\""
Mar 08 00:32:15.634733 master-0 kubenswrapper[23041]: I0308 00:32:15.634633 23041 generic.go:334] "Generic (PLEG): container finished" podID="70892c23-554d-466c-a526-90a799439fe0" containerID="ae7bb35d674e364ba7abb3d8a4e36b86062b8a56cb462417c0258160c034b1cd" exitCode=0
Mar 08 00:32:15.635507 master-0 kubenswrapper[23041]: I0308 00:32:15.634735 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh"
Mar 08 00:32:15.635507 master-0 kubenswrapper[23041]: I0308 00:32:15.634746 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh" event={"ID":"70892c23-554d-466c-a526-90a799439fe0","Type":"ContainerDied","Data":"ae7bb35d674e364ba7abb3d8a4e36b86062b8a56cb462417c0258160c034b1cd"}
Mar 08 00:32:15.635507 master-0 kubenswrapper[23041]: I0308 00:32:15.634844 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh" event={"ID":"70892c23-554d-466c-a526-90a799439fe0","Type":"ContainerDied","Data":"1647ce1acf481d17be37f6cfd515be4f74eaddbda6620f025db77860f5acbd00"}
Mar 08 00:32:15.635507 master-0 kubenswrapper[23041]: I0308 00:32:15.634877 23041 scope.go:117] "RemoveContainer" containerID="ae7bb35d674e364ba7abb3d8a4e36b86062b8a56cb462417c0258160c034b1cd"
Mar 08 00:32:15.642831 master-0 kubenswrapper[23041]: I0308 00:32:15.642741 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs" event={"ID":"cbcb0196-be5c-44a4-9749-5df9fbeaa718","Type":"ContainerDied","Data":"ddbc9d4d3c5ffe04f1f188d461103a088e60e8f552f5a7337527098fe0216d97"}
Mar 08 00:32:15.642831 master-0 kubenswrapper[23041]: I0308 00:32:15.642825 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs"
Mar 08 00:32:15.684554 master-0 kubenswrapper[23041]: I0308 00:32:15.684457 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh"]
Mar 08 00:32:15.694835 master-0 kubenswrapper[23041]: I0308 00:32:15.692664 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-544c885f6d-dr4gh"]
Mar 08 00:32:15.706328 master-0 kubenswrapper[23041]: I0308 00:32:15.705840 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs"]
Mar 08 00:32:15.707564 master-0 kubenswrapper[23041]: I0308 00:32:15.707508 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5b4bdf67b6-8rdjs"]
Mar 08 00:32:15.809817 master-0 kubenswrapper[23041]: I0308 00:32:15.802794 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5c84b9c874-8xl2l"
Mar 08 00:32:15.809817 master-0 kubenswrapper[23041]: I0308 00:32:15.803606 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5c84b9c874-8xl2l"
Mar 08 00:32:15.809817 master-0 kubenswrapper[23041]: I0308 00:32:15.805250 23041 patch_prober.go:28] interesting pod/console-5c84b9c874-8xl2l container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body=
Mar 08 00:32:15.809817 master-0 kubenswrapper[23041]: I0308 00:32:15.805324 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-5c84b9c874-8xl2l" podUID="3baca04d-be92-4f02-8ea9-94cc37fc00b4" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused"
Mar 08 00:32:15.853406 master-0 kubenswrapper[23041]: I0308 00:32:15.850493 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-6df5fc69d-thc6n"]
Mar 08 00:32:16.147908 master-0 kubenswrapper[23041]: I0308 00:32:16.147841 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5ddc94864c-7nwdc"]
Mar 08 00:32:16.148278 master-0 kubenswrapper[23041]: E0308 00:32:16.148222 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbcb0196-be5c-44a4-9749-5df9fbeaa718" containerName="controller-manager"
Mar 08 00:32:16.148278 master-0 kubenswrapper[23041]: I0308 00:32:16.148238 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbcb0196-be5c-44a4-9749-5df9fbeaa718" containerName="controller-manager"
Mar 08 00:32:16.148278 master-0 kubenswrapper[23041]: E0308 00:32:16.148257 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbcb0196-be5c-44a4-9749-5df9fbeaa718" containerName="controller-manager"
Mar 08 00:32:16.148278 master-0 kubenswrapper[23041]: I0308 00:32:16.148264 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbcb0196-be5c-44a4-9749-5df9fbeaa718" containerName="controller-manager"
Mar 08 00:32:16.148278 master-0 kubenswrapper[23041]: E0308 00:32:16.148282 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70892c23-554d-466c-a526-90a799439fe0" containerName="route-controller-manager"
Mar 08 00:32:16.148278 master-0 kubenswrapper[23041]: I0308 00:32:16.148291 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="70892c23-554d-466c-a526-90a799439fe0" containerName="route-controller-manager"
Mar 08 00:32:16.148552 master-0 kubenswrapper[23041]: I0308 00:32:16.148458 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="70892c23-554d-466c-a526-90a799439fe0" containerName="route-controller-manager"
Mar 08 00:32:16.148552 master-0 kubenswrapper[23041]: I0308 00:32:16.148476 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbcb0196-be5c-44a4-9749-5df9fbeaa718" containerName="controller-manager"
Mar 08 00:32:16.152123 master-0 kubenswrapper[23041]: I0308 00:32:16.149031 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5ddc94864c-7nwdc"
Mar 08 00:32:16.157606 master-0 kubenswrapper[23041]: I0308 00:32:16.157531 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 08 00:32:16.158131 master-0 kubenswrapper[23041]: I0308 00:32:16.158096 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 08 00:32:16.160358 master-0 kubenswrapper[23041]: I0308 00:32:16.160279 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 08 00:32:16.160679 master-0 kubenswrapper[23041]: I0308 00:32:16.160391 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-7jhj9"
Mar 08 00:32:16.160679 master-0 kubenswrapper[23041]: I0308 00:32:16.160520 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 08 00:32:16.164189 master-0 kubenswrapper[23041]: I0308 00:32:16.162471 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 08 00:32:16.166889 master-0 kubenswrapper[23041]: I0308 00:32:16.164931 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 08 00:32:16.170576 master-0 kubenswrapper[23041]: I0308 00:32:16.169349 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d647dccbb-6cz8b"]
Mar 08 00:32:16.176048 master-0 kubenswrapper[23041]: I0308 00:32:16.176001 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbcb0196-be5c-44a4-9749-5df9fbeaa718" containerName="controller-manager"
Mar 08 00:32:16.176663 master-0 kubenswrapper[23041]: I0308 00:32:16.176644 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5ddc94864c-7nwdc"]
Mar 08 00:32:16.176663 master-0 kubenswrapper[23041]: I0308 00:32:16.176669 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d647dccbb-6cz8b"]
Mar 08 00:32:16.176802 master-0 kubenswrapper[23041]: I0308 00:32:16.176770 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d647dccbb-6cz8b"
Mar 08 00:32:16.183313 master-0 kubenswrapper[23041]: I0308 00:32:16.181904 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 08 00:32:16.183313 master-0 kubenswrapper[23041]: I0308 00:32:16.182128 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 08 00:32:16.183313 master-0 kubenswrapper[23041]: I0308 00:32:16.182274 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 08 00:32:16.184384 master-0 kubenswrapper[23041]: I0308 00:32:16.183449 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 08 00:32:16.184384 master-0 kubenswrapper[23041]: I0308 00:32:16.183493 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-ldbrl"
Mar 08 00:32:16.186090 master-0 kubenswrapper[23041]: I0308 00:32:16.185260 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 08 00:32:16.215838 master-0 kubenswrapper[23041]: I0308 00:32:16.207269 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08cae160-6592-4936-aa7b-0880f243d35d-config\") pod \"route-controller-manager-5d647dccbb-6cz8b\" (UID: \"08cae160-6592-4936-aa7b-0880f243d35d\") " pod="openshift-route-controller-manager/route-controller-manager-5d647dccbb-6cz8b"
Mar 08 00:32:16.215838 master-0 kubenswrapper[23041]: I0308 00:32:16.207315 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/08cae160-6592-4936-aa7b-0880f243d35d-client-ca\") pod \"route-controller-manager-5d647dccbb-6cz8b\" (UID: \"08cae160-6592-4936-aa7b-0880f243d35d\") " pod="openshift-route-controller-manager/route-controller-manager-5d647dccbb-6cz8b"
Mar 08 00:32:16.215838 master-0 kubenswrapper[23041]: I0308 00:32:16.207334 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ef25237-ab1c-41a6-a0a7-07642094de26-serving-cert\") pod \"controller-manager-5ddc94864c-7nwdc\" (UID: \"2ef25237-ab1c-41a6-a0a7-07642094de26\") " pod="openshift-controller-manager/controller-manager-5ddc94864c-7nwdc"
Mar 08 00:32:16.215838 master-0 kubenswrapper[23041]: I0308 00:32:16.207351 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ef25237-ab1c-41a6-a0a7-07642094de26-client-ca\") pod \"controller-manager-5ddc94864c-7nwdc\" (UID: \"2ef25237-ab1c-41a6-a0a7-07642094de26\") " pod="openshift-controller-manager/controller-manager-5ddc94864c-7nwdc"
Mar 08 00:32:16.215838 master-0 kubenswrapper[23041]: I0308 00:32:16.207375 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08cae160-6592-4936-aa7b-0880f243d35d-serving-cert\") pod \"route-controller-manager-5d647dccbb-6cz8b\" (UID: \"08cae160-6592-4936-aa7b-0880f243d35d\") " pod="openshift-route-controller-manager/route-controller-manager-5d647dccbb-6cz8b"
Mar 08 00:32:16.215838 master-0 kubenswrapper[23041]: I0308 00:32:16.207392 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ef25237-ab1c-41a6-a0a7-07642094de26-config\") pod \"controller-manager-5ddc94864c-7nwdc\" (UID: \"2ef25237-ab1c-41a6-a0a7-07642094de26\") " pod="openshift-controller-manager/controller-manager-5ddc94864c-7nwdc"
Mar 08 00:32:16.215838 master-0 kubenswrapper[23041]: I0308 00:32:16.207427 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2ef25237-ab1c-41a6-a0a7-07642094de26-proxy-ca-bundles\") pod \"controller-manager-5ddc94864c-7nwdc\" (UID: \"2ef25237-ab1c-41a6-a0a7-07642094de26\") " pod="openshift-controller-manager/controller-manager-5ddc94864c-7nwdc"
Mar 08 00:32:16.215838 master-0 kubenswrapper[23041]: I0308 00:32:16.207457 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kh8t\" (UniqueName: \"kubernetes.io/projected/2ef25237-ab1c-41a6-a0a7-07642094de26-kube-api-access-4kh8t\") pod \"controller-manager-5ddc94864c-7nwdc\" (UID: \"2ef25237-ab1c-41a6-a0a7-07642094de26\") " pod="openshift-controller-manager/controller-manager-5ddc94864c-7nwdc"
Mar 08 00:32:16.215838 master-0 kubenswrapper[23041]: I0308 00:32:16.207496 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr6zr\" (UniqueName: \"kubernetes.io/projected/08cae160-6592-4936-aa7b-0880f243d35d-kube-api-access-kr6zr\") pod \"route-controller-manager-5d647dccbb-6cz8b\" (UID: \"08cae160-6592-4936-aa7b-0880f243d35d\") " pod="openshift-route-controller-manager/route-controller-manager-5d647dccbb-6cz8b"
Mar 08 00:32:16.336566 master-0 kubenswrapper[23041]: I0308 00:32:16.332466 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08cae160-6592-4936-aa7b-0880f243d35d-config\") pod \"route-controller-manager-5d647dccbb-6cz8b\" (UID: \"08cae160-6592-4936-aa7b-0880f243d35d\") " pod="openshift-route-controller-manager/route-controller-manager-5d647dccbb-6cz8b"
Mar 08 00:32:16.336566 master-0 kubenswrapper[23041]: I0308 00:32:16.332580 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/08cae160-6592-4936-aa7b-0880f243d35d-client-ca\") pod \"route-controller-manager-5d647dccbb-6cz8b\" (UID: \"08cae160-6592-4936-aa7b-0880f243d35d\") " pod="openshift-route-controller-manager/route-controller-manager-5d647dccbb-6cz8b"
Mar 08 00:32:16.336566 master-0 kubenswrapper[23041]: I0308 00:32:16.332605 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ef25237-ab1c-41a6-a0a7-07642094de26-serving-cert\") pod \"controller-manager-5ddc94864c-7nwdc\" (UID: \"2ef25237-ab1c-41a6-a0a7-07642094de26\") " pod="openshift-controller-manager/controller-manager-5ddc94864c-7nwdc"
Mar 08 00:32:16.336566 master-0 kubenswrapper[23041]: I0308 00:32:16.332642 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ef25237-ab1c-41a6-a0a7-07642094de26-client-ca\") pod \"controller-manager-5ddc94864c-7nwdc\" (UID: \"2ef25237-ab1c-41a6-a0a7-07642094de26\") " pod="openshift-controller-manager/controller-manager-5ddc94864c-7nwdc"
Mar 08 00:32:16.336566 master-0 kubenswrapper[23041]: I0308 00:32:16.332688 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08cae160-6592-4936-aa7b-0880f243d35d-serving-cert\") pod \"route-controller-manager-5d647dccbb-6cz8b\" (UID: \"08cae160-6592-4936-aa7b-0880f243d35d\") " pod="openshift-route-controller-manager/route-controller-manager-5d647dccbb-6cz8b"
Mar 08 00:32:16.336566 master-0 kubenswrapper[23041]: I0308 00:32:16.332714 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ef25237-ab1c-41a6-a0a7-07642094de26-config\") pod \"controller-manager-5ddc94864c-7nwdc\" (UID: \"2ef25237-ab1c-41a6-a0a7-07642094de26\") " pod="openshift-controller-manager/controller-manager-5ddc94864c-7nwdc"
Mar 08 00:32:16.336566 master-0 kubenswrapper[23041]: I0308 00:32:16.332807 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2ef25237-ab1c-41a6-a0a7-07642094de26-proxy-ca-bundles\") pod \"controller-manager-5ddc94864c-7nwdc\" (UID: \"2ef25237-ab1c-41a6-a0a7-07642094de26\") " pod="openshift-controller-manager/controller-manager-5ddc94864c-7nwdc"
Mar 08 00:32:16.336566 master-0 kubenswrapper[23041]: I0308 00:32:16.332881 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kh8t\" (UniqueName: \"kubernetes.io/projected/2ef25237-ab1c-41a6-a0a7-07642094de26-kube-api-access-4kh8t\") pod \"controller-manager-5ddc94864c-7nwdc\" (UID: \"2ef25237-ab1c-41a6-a0a7-07642094de26\") " pod="openshift-controller-manager/controller-manager-5ddc94864c-7nwdc"
Mar 08 00:32:16.336566 master-0 kubenswrapper[23041]: I0308 00:32:16.332982 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr6zr\" (UniqueName: \"kubernetes.io/projected/08cae160-6592-4936-aa7b-0880f243d35d-kube-api-access-kr6zr\") pod \"route-controller-manager-5d647dccbb-6cz8b\" (UID: \"08cae160-6592-4936-aa7b-0880f243d35d\") " pod="openshift-route-controller-manager/route-controller-manager-5d647dccbb-6cz8b"
Mar 08 00:32:16.336566 master-0 kubenswrapper[23041]: I0308 00:32:16.334897 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08cae160-6592-4936-aa7b-0880f243d35d-config\") pod \"route-controller-manager-5d647dccbb-6cz8b\" (UID: \"08cae160-6592-4936-aa7b-0880f243d35d\") " pod="openshift-route-controller-manager/route-controller-manager-5d647dccbb-6cz8b"
Mar 08 00:32:16.338005 master-0 kubenswrapper[23041]: I0308 00:32:16.337974 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/08cae160-6592-4936-aa7b-0880f243d35d-client-ca\") pod \"route-controller-manager-5d647dccbb-6cz8b\" (UID: \"08cae160-6592-4936-aa7b-0880f243d35d\") " pod="openshift-route-controller-manager/route-controller-manager-5d647dccbb-6cz8b"
Mar 08 00:32:16.340027 master-0 kubenswrapper[23041]: I0308 00:32:16.339961 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ef25237-ab1c-41a6-a0a7-07642094de26-client-ca\") pod \"controller-manager-5ddc94864c-7nwdc\" (UID: \"2ef25237-ab1c-41a6-a0a7-07642094de26\") " pod="openshift-controller-manager/controller-manager-5ddc94864c-7nwdc"
Mar 08 00:32:16.342583 master-0 kubenswrapper[23041]: I0308 00:32:16.340758 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08cae160-6592-4936-aa7b-0880f243d35d-serving-cert\") pod \"route-controller-manager-5d647dccbb-6cz8b\" (UID: \"08cae160-6592-4936-aa7b-0880f243d35d\") " pod="openshift-route-controller-manager/route-controller-manager-5d647dccbb-6cz8b"
Mar 08 00:32:16.342583 master-0 kubenswrapper[23041]: I0308 00:32:16.340819 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2ef25237-ab1c-41a6-a0a7-07642094de26-proxy-ca-bundles\") pod \"controller-manager-5ddc94864c-7nwdc\" (UID: \"2ef25237-ab1c-41a6-a0a7-07642094de26\") " pod="openshift-controller-manager/controller-manager-5ddc94864c-7nwdc"
Mar 08 00:32:16.342583 master-0 kubenswrapper[23041]: I0308 00:32:16.341549 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ef25237-ab1c-41a6-a0a7-07642094de26-config\") pod \"controller-manager-5ddc94864c-7nwdc\" (UID: \"2ef25237-ab1c-41a6-a0a7-07642094de26\") " pod="openshift-controller-manager/controller-manager-5ddc94864c-7nwdc"
Mar 08 00:32:16.344126 master-0 kubenswrapper[23041]: I0308 00:32:16.343957 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ef25237-ab1c-41a6-a0a7-07642094de26-serving-cert\") pod \"controller-manager-5ddc94864c-7nwdc\" (UID: \"2ef25237-ab1c-41a6-a0a7-07642094de26\") " pod="openshift-controller-manager/controller-manager-5ddc94864c-7nwdc"
Mar 08 00:32:16.362234 master-0 kubenswrapper[23041]: I0308 00:32:16.359634 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7b45f5889c-z48tj"
Mar 08 00:32:16.362234 master-0 kubenswrapper[23041]: I0308 00:32:16.359807 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-7b45f5889c-z48tj"
Mar 08 00:32:16.365540 master-0 kubenswrapper[23041]: I0308 00:32:16.365497 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr6zr\" (UniqueName: \"kubernetes.io/projected/08cae160-6592-4936-aa7b-0880f243d35d-kube-api-access-kr6zr\") pod \"route-controller-manager-5d647dccbb-6cz8b\" (UID: \"08cae160-6592-4936-aa7b-0880f243d35d\") " pod="openshift-route-controller-manager/route-controller-manager-5d647dccbb-6cz8b"
Mar 08 00:32:16.371600 master-0 kubenswrapper[23041]: I0308 00:32:16.370693 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kh8t\" (UniqueName: \"kubernetes.io/projected/2ef25237-ab1c-41a6-a0a7-07642094de26-kube-api-access-4kh8t\") pod \"controller-manager-5ddc94864c-7nwdc\" (UID: \"2ef25237-ab1c-41a6-a0a7-07642094de26\") " pod="openshift-controller-manager/controller-manager-5ddc94864c-7nwdc"
Mar 08 00:32:16.404063 master-0 kubenswrapper[23041]: I0308 00:32:16.404015 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d647dccbb-6cz8b"
Mar 08 00:32:16.507104 master-0 kubenswrapper[23041]: I0308 00:32:16.505809 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-76c777474b-n9mhf"]
Mar 08 00:32:16.507525 master-0 kubenswrapper[23041]: I0308 00:32:16.507381 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76c777474b-n9mhf"
Mar 08 00:32:16.511160 master-0 kubenswrapper[23041]: I0308 00:32:16.511103 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5ddc94864c-7nwdc"
Mar 08 00:32:16.522777 master-0 kubenswrapper[23041]: I0308 00:32:16.522725 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 08 00:32:16.530244 master-0 kubenswrapper[23041]: I0308 00:32:16.529686 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76c777474b-n9mhf"]
Mar 08 00:32:16.539067 master-0 kubenswrapper[23041]: I0308 00:32:16.537949 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-trusted-ca-bundle\") pod \"console-76c777474b-n9mhf\" (UID: \"136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca\") " pod="openshift-console/console-76c777474b-n9mhf"
Mar 08 00:32:16.539067 master-0 kubenswrapper[23041]: I0308 00:32:16.538075 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-oauth-serving-cert\") pod \"console-76c777474b-n9mhf\" (UID: \"136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca\") " pod="openshift-console/console-76c777474b-n9mhf"
Mar 08 00:32:16.539067 master-0 kubenswrapper[23041]: I0308 00:32:16.538112 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-console-config\") pod \"console-76c777474b-n9mhf\" (UID: \"136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca\") " pod="openshift-console/console-76c777474b-n9mhf"
Mar 08 00:32:16.539067 master-0 kubenswrapper[23041]: I0308 00:32:16.538186 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxzhf\" (UniqueName: \"kubernetes.io/projected/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-kube-api-access-hxzhf\") pod \"console-76c777474b-n9mhf\" (UID: \"136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca\") " pod="openshift-console/console-76c777474b-n9mhf"
Mar 08 00:32:16.539067 master-0 kubenswrapper[23041]: I0308 00:32:16.538263 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-console-oauth-config\") pod \"console-76c777474b-n9mhf\" (UID: \"136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca\") " pod="openshift-console/console-76c777474b-n9mhf"
Mar 08 00:32:16.539067 master-0 kubenswrapper[23041]: I0308 00:32:16.538383 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-console-serving-cert\") pod \"console-76c777474b-n9mhf\" (UID: \"136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca\") " pod="openshift-console/console-76c777474b-n9mhf"
Mar 08 00:32:16.539067 master-0 kubenswrapper[23041]: I0308 00:32:16.538454 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-service-ca\") pod \"console-76c777474b-n9mhf\" (UID: \"136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca\") " pod="openshift-console/console-76c777474b-n9mhf"
Mar 08 00:32:16.641289 master-0 kubenswrapper[23041]: I0308 00:32:16.640161 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-console-serving-cert\") pod \"console-76c777474b-n9mhf\" (UID: \"136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca\") " pod="openshift-console/console-76c777474b-n9mhf"
Mar 08 00:32:16.641289 master-0 kubenswrapper[23041]: I0308 00:32:16.640297 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-service-ca\") pod \"console-76c777474b-n9mhf\" (UID: \"136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca\") " pod="openshift-console/console-76c777474b-n9mhf"
Mar 08 00:32:16.641780 master-0 kubenswrapper[23041]: I0308 00:32:16.641442 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-service-ca\") pod \"console-76c777474b-n9mhf\" (UID: \"136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca\") " pod="openshift-console/console-76c777474b-n9mhf"
Mar 08 00:32:16.654305 master-0 kubenswrapper[23041]: I0308 00:32:16.642754 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-trusted-ca-bundle\") pod \"console-76c777474b-n9mhf\" (UID: \"136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca\") " pod="openshift-console/console-76c777474b-n9mhf"
Mar 08 00:32:16.654305 master-0 kubenswrapper[23041]: I0308 00:32:16.643859 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-trusted-ca-bundle\") pod \"console-76c777474b-n9mhf\" (UID: \"136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca\") " pod="openshift-console/console-76c777474b-n9mhf"
Mar 08 00:32:16.654305 master-0 kubenswrapper[23041]: I0308 00:32:16.644056 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-oauth-serving-cert\") pod \"console-76c777474b-n9mhf\" (UID: \"136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca\") " pod="openshift-console/console-76c777474b-n9mhf"
Mar 08 00:32:16.654305 master-0 kubenswrapper[23041]: I0308 00:32:16.644089 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-console-config\") pod \"console-76c777474b-n9mhf\" (UID: \"136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca\") " pod="openshift-console/console-76c777474b-n9mhf"
Mar 08 00:32:16.654305 master-0 kubenswrapper[23041]: I0308 00:32:16.644144 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxzhf\" (UniqueName: \"kubernetes.io/projected/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-kube-api-access-hxzhf\") pod \"console-76c777474b-n9mhf\" (UID: \"136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca\") " pod="openshift-console/console-76c777474b-n9mhf"
Mar 08 00:32:16.654305 master-0 kubenswrapper[23041]: I0308 00:32:16.644193 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-console-oauth-config\") pod \"console-76c777474b-n9mhf\" (UID: \"136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca\") " pod="openshift-console/console-76c777474b-n9mhf"
Mar 08 00:32:16.654305 master-0 kubenswrapper[23041]: I0308 00:32:16.645385 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-console-config\") pod \"console-76c777474b-n9mhf\" (UID: \"136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca\") " pod="openshift-console/console-76c777474b-n9mhf"
Mar 08 00:32:16.654305 master-0 kubenswrapper[23041]: I0308 00:32:16.646608 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-oauth-serving-cert\") pod \"console-76c777474b-n9mhf\" (UID: \"136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca\") " pod="openshift-console/console-76c777474b-n9mhf"
Mar 08 00:32:16.654305 master-0 kubenswrapper[23041]: I0308 00:32:16.649534 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-console-serving-cert\") pod \"console-76c777474b-n9mhf\" (UID: \"136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca\") " pod="openshift-console/console-76c777474b-n9mhf"
Mar 08 00:32:16.654305 master-0 kubenswrapper[23041]: I0308 00:32:16.653416 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-console-oauth-config\") pod \"console-76c777474b-n9mhf\" (UID: \"136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca\") " pod="openshift-console/console-76c777474b-n9mhf"
Mar 08 00:32:16.673731 master-0 kubenswrapper[23041]: I0308 00:32:16.673681 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxzhf\" (UniqueName: \"kubernetes.io/projected/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-kube-api-access-hxzhf\") pod \"console-76c777474b-n9mhf\" (UID: \"136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca\") " pod="openshift-console/console-76c777474b-n9mhf"
Mar 08 00:32:16.825023 master-0 kubenswrapper[23041]: I0308 00:32:16.824958 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70892c23-554d-466c-a526-90a799439fe0" path="/var/lib/kubelet/pods/70892c23-554d-466c-a526-90a799439fe0/volumes"
Mar 08 00:32:16.825613 master-0 kubenswrapper[23041]: I0308 00:32:16.825592 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbcb0196-be5c-44a4-9749-5df9fbeaa718" path="/var/lib/kubelet/pods/cbcb0196-be5c-44a4-9749-5df9fbeaa718/volumes"
Mar 08 00:32:16.837314 master-0 kubenswrapper[23041]: I0308 00:32:16.837264 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76c777474b-n9mhf"
Mar 08 00:32:17.385728 master-0 kubenswrapper[23041]: I0308 00:32:17.385677 23041 scope.go:117] "RemoveContainer" containerID="ae7bb35d674e364ba7abb3d8a4e36b86062b8a56cb462417c0258160c034b1cd"
Mar 08 00:32:17.386355 master-0 kubenswrapper[23041]: E0308 00:32:17.386316 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae7bb35d674e364ba7abb3d8a4e36b86062b8a56cb462417c0258160c034b1cd\": container with ID starting with ae7bb35d674e364ba7abb3d8a4e36b86062b8a56cb462417c0258160c034b1cd not found: ID does not exist" containerID="ae7bb35d674e364ba7abb3d8a4e36b86062b8a56cb462417c0258160c034b1cd"
Mar 08 00:32:17.386440 master-0 kubenswrapper[23041]: I0308 00:32:17.386371 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae7bb35d674e364ba7abb3d8a4e36b86062b8a56cb462417c0258160c034b1cd"} err="failed to get container status \"ae7bb35d674e364ba7abb3d8a4e36b86062b8a56cb462417c0258160c034b1cd\": rpc error: code = NotFound desc = could not find container \"ae7bb35d674e364ba7abb3d8a4e36b86062b8a56cb462417c0258160c034b1cd\": container with ID starting with ae7bb35d674e364ba7abb3d8a4e36b86062b8a56cb462417c0258160c034b1cd not found: ID does not exist"
Mar 08 00:32:17.386440 master-0 kubenswrapper[23041]: I0308 00:32:17.386413 23041 scope.go:117] "RemoveContainer" containerID="a89aafabc1e522f342463d98f2fa1cfd6a92e881b88c10677cf22bc178649255"
Mar 08 00:32:18.544146 master-0 kubenswrapper[23041]: I0308 00:32:18.544083 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d647dccbb-6cz8b"]
Mar 08 00:32:18.680527 master-0 kubenswrapper[23041]: I0308 00:32:18.680445 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d647dccbb-6cz8b" event={"ID":"08cae160-6592-4936-aa7b-0880f243d35d","Type":"ContainerStarted","Data":"bb7b336d6d05de5a0ac8d090a08f4f55ea9d1d7aae3f3ddbdb494f56c94eadbe"}
Mar 08 00:32:18.683092 master-0 kubenswrapper[23041]: I0308 00:32:18.683042 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4b9b4180-fc41-4072-9c61-0a35390a7ff3","Type":"ContainerStarted","Data":"8677cf1f170f4b7104c5bee538c75b0cce6cadf668dc8807ee7dd7ead37ad2bc"}
Mar 08 00:32:18.683180 master-0 kubenswrapper[23041]: I0308 00:32:18.683098 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4b9b4180-fc41-4072-9c61-0a35390a7ff3","Type":"ContainerStarted","Data":"4a516d7f08e2c00a73c82a6468c6733deb15cfb5496c5dc1c985cc5b2f878c39"}
Mar 08 00:32:19.145388 master-0 kubenswrapper[23041]: I0308 00:32:19.143807 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5ddc94864c-7nwdc"]
Mar 08 00:32:19.157260 master-0 kubenswrapper[23041]: I0308 00:32:19.154746 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76c777474b-n9mhf"]
Mar 08 00:32:19.703070 master-0 kubenswrapper[23041]: I0308 00:32:19.702979 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4b9b4180-fc41-4072-9c61-0a35390a7ff3","Type":"ContainerStarted","Data":"1f43fdb6c0cc7204b6150506553af3176763f0599837371ad3b14f3a6dd6e9c3"}
Mar 08 00:32:19.704902 master-0 kubenswrapper[23041]: I0308 00:32:19.704848 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76c777474b-n9mhf" event={"ID":"136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca","Type":"ContainerStarted","Data":"408aa627a4a08d14d63ec8aea7bfc7777355f299cda7fec6881e53324a2338e6"}
Mar 08 00:32:19.706750 master-0 kubenswrapper[23041]: I0308 00:32:19.706700 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d647dccbb-6cz8b" event={"ID":"08cae160-6592-4936-aa7b-0880f243d35d","Type":"ContainerStarted","Data":"da5e3c1311411f7f0de945be6dbaadde3cd76eb4fd1a973e31ce6269b5ad37e8"}
Mar 08 00:32:19.707027 master-0 kubenswrapper[23041]: I0308 00:32:19.706983 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5d647dccbb-6cz8b"
Mar 08 00:32:19.708408 master-0 kubenswrapper[23041]: I0308 00:32:19.708109 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5ddc94864c-7nwdc" event={"ID":"2ef25237-ab1c-41a6-a0a7-07642094de26","Type":"ContainerStarted","Data":"f6365e505366ec41e1d8493468c3de2f623d6298fe0f596459357802845842ee"}
Mar 08 00:32:19.708408 master-0 kubenswrapper[23041]: I0308 00:32:19.708135 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5ddc94864c-7nwdc" event={"ID":"2ef25237-ab1c-41a6-a0a7-07642094de26","Type":"ContainerStarted","Data":"890b52d483f8c7cf49ca7cc0a2f03f26149e72d192d9e4d83cfb04307f9f3175"}
Mar 08 00:32:20.473241 master-0 kubenswrapper[23041]: I0308 00:32:20.468258 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5d647dccbb-6cz8b" podStartSLOduration=6.468241428 podStartE2EDuration="6.468241428s" podCreationTimestamp="2026-03-08 00:32:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:32:20.467688544 +0000 UTC m=+45.940525118" watchObservedRunningTime="2026-03-08 00:32:20.468241428 +0000 UTC m=+45.941077982"
Mar 08 00:32:20.707451 master-0 kubenswrapper[23041]: I0308 00:32:20.707373 23041 patch_prober.go:28] interesting pod/route-controller-manager-5d647dccbb-6cz8b container/route-controller-manager 
namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.94:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 00:32:20.707451 master-0 kubenswrapper[23041]: I0308 00:32:20.707448 23041 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5d647dccbb-6cz8b" podUID="08cae160-6592-4936-aa7b-0880f243d35d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.94:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 00:32:20.723912 master-0 kubenswrapper[23041]: I0308 00:32:20.723750 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76c777474b-n9mhf" event={"ID":"136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca","Type":"ContainerStarted","Data":"fb16edc75fe138e856a4f392208e9dde4e0eff1ea9fd011ed9da97c48fdc468f"} Mar 08 00:32:20.724597 master-0 kubenswrapper[23041]: I0308 00:32:20.724557 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5ddc94864c-7nwdc" Mar 08 00:32:20.729777 master-0 kubenswrapper[23041]: I0308 00:32:20.729736 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5ddc94864c-7nwdc" Mar 08 00:32:21.724250 master-0 kubenswrapper[23041]: I0308 00:32:21.724131 23041 patch_prober.go:28] interesting pod/route-controller-manager-5d647dccbb-6cz8b container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.94:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 00:32:21.724780 master-0 kubenswrapper[23041]: 
I0308 00:32:21.724275 23041 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5d647dccbb-6cz8b" podUID="08cae160-6592-4936-aa7b-0880f243d35d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.94:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 00:32:21.738853 master-0 kubenswrapper[23041]: I0308 00:32:21.738775 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4b9b4180-fc41-4072-9c61-0a35390a7ff3","Type":"ContainerStarted","Data":"dd24c4f307672a1f949696856245eca4408f949ed84c121951e5aefd95a43d9c"} Mar 08 00:32:21.738853 master-0 kubenswrapper[23041]: I0308 00:32:21.738857 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4b9b4180-fc41-4072-9c61-0a35390a7ff3","Type":"ContainerStarted","Data":"4848d19ba0776826a9f40c021f719b52c6a6530dc063a074b5193621f8201b9a"} Mar 08 00:32:22.772109 master-0 kubenswrapper[23041]: I0308 00:32:22.772059 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4b9b4180-fc41-4072-9c61-0a35390a7ff3","Type":"ContainerStarted","Data":"341c117fc24206d0e3839ca57ba039836d1114be36d12fb34ba2f5b26e89d32d"} Mar 08 00:32:23.237076 master-0 kubenswrapper[23041]: I0308 00:32:23.236872 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-76c777474b-n9mhf" podStartSLOduration=7.236845046 podStartE2EDuration="7.236845046s" podCreationTimestamp="2026-03-08 00:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:32:22.357897079 +0000 UTC m=+47.830733653" watchObservedRunningTime="2026-03-08 00:32:23.236845046 +0000 UTC m=+48.709681610" Mar 
08 00:32:24.570704 master-0 kubenswrapper[23041]: I0308 00:32:24.570609 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5ddc94864c-7nwdc" podStartSLOduration=10.570584556 podStartE2EDuration="10.570584556s" podCreationTimestamp="2026-03-08 00:32:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:32:23.249999345 +0000 UTC m=+48.722835919" watchObservedRunningTime="2026-03-08 00:32:24.570584556 +0000 UTC m=+50.043421110" Mar 08 00:32:25.128255 master-0 kubenswrapper[23041]: I0308 00:32:25.127915 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-client-ca-bundle\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m" Mar 08 00:32:25.130350 master-0 kubenswrapper[23041]: E0308 00:32:25.128815 23041 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-ffspe3f0nbfal: secret "metrics-server-ffspe3f0nbfal" not found Mar 08 00:32:25.130350 master-0 kubenswrapper[23041]: E0308 00:32:25.128900 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-client-ca-bundle podName:0101c4ce-fd58-4ddb-94f7-abb8b2293cdb nodeName:}" failed. No retries permitted until 2026-03-08 00:32:57.128866978 +0000 UTC m=+82.601703532 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "client-ca-bundle" (UniqueName: "kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-client-ca-bundle") pod "metrics-server-6474759988-dnw4m" (UID: "0101c4ce-fd58-4ddb-94f7-abb8b2293cdb") : secret "metrics-server-ffspe3f0nbfal" not found Mar 08 00:32:25.468824 master-0 kubenswrapper[23041]: I0308 00:32:25.467467 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:25.809974 master-0 kubenswrapper[23041]: I0308 00:32:25.808576 23041 patch_prober.go:28] interesting pod/console-5c84b9c874-8xl2l container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 08 00:32:25.809974 master-0 kubenswrapper[23041]: I0308 00:32:25.808684 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-5c84b9c874-8xl2l" podUID="3baca04d-be92-4f02-8ea9-94cc37fc00b4" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 08 00:32:26.409463 master-0 kubenswrapper[23041]: I0308 00:32:26.409395 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5d647dccbb-6cz8b" Mar 08 00:32:26.433282 master-0 kubenswrapper[23041]: I0308 00:32:26.433148 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=11.544064911 podStartE2EDuration="16.433118648s" podCreationTimestamp="2026-03-08 00:32:10 +0000 UTC" firstStartedPulling="2026-03-08 00:32:12.593551139 +0000 UTC m=+38.066387693" lastFinishedPulling="2026-03-08 00:32:17.482604876 +0000 UTC m=+42.955441430" observedRunningTime="2026-03-08 00:32:24.684488956 +0000 UTC m=+50.157325530" 
watchObservedRunningTime="2026-03-08 00:32:26.433118648 +0000 UTC m=+51.905955202" Mar 08 00:32:26.838725 master-0 kubenswrapper[23041]: I0308 00:32:26.838656 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-76c777474b-n9mhf" Mar 08 00:32:26.839476 master-0 kubenswrapper[23041]: I0308 00:32:26.838761 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-76c777474b-n9mhf" Mar 08 00:32:26.841568 master-0 kubenswrapper[23041]: I0308 00:32:26.841511 23041 patch_prober.go:28] interesting pod/console-76c777474b-n9mhf container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.95:8443/health\": dial tcp 10.128.0.95:8443: connect: connection refused" start-of-body= Mar 08 00:32:26.841666 master-0 kubenswrapper[23041]: I0308 00:32:26.841610 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-76c777474b-n9mhf" podUID="136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca" containerName="console" probeResult="failure" output="Get \"https://10.128.0.95:8443/health\": dial tcp 10.128.0.95:8443: connect: connection refused" Mar 08 00:32:27.109611 master-0 kubenswrapper[23041]: I0308 00:32:27.109474 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:27.140134 master-0 kubenswrapper[23041]: I0308 00:32:27.140078 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:27.863762 master-0 kubenswrapper[23041]: I0308 00:32:27.863666 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:32:28.453790 master-0 kubenswrapper[23041]: I0308 00:32:28.453701 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Mar 08 00:32:28.454152 
master-0 kubenswrapper[23041]: I0308 00:32:28.454020 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/installer-2-master-0" podUID="0133db83-1083-4458-86d4-49e431dd4365" containerName="installer" containerID="cri-o://6ba57f2db9726259ae712e088285735c826dee0e1e18c3454e04e70e18447493" gracePeriod=30 Mar 08 00:32:33.471599 master-0 kubenswrapper[23041]: I0308 00:32:33.471497 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Mar 08 00:32:33.472543 master-0 kubenswrapper[23041]: I0308 00:32:33.472492 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 08 00:32:33.483713 master-0 kubenswrapper[23041]: I0308 00:32:33.483324 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Mar 08 00:32:33.530338 master-0 kubenswrapper[23041]: I0308 00:32:33.530280 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8b0395d1-7cb0-4857-891a-68f88a6fd468-var-lock\") pod \"installer-3-master-0\" (UID: \"8b0395d1-7cb0-4857-891a-68f88a6fd468\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 08 00:32:33.530620 master-0 kubenswrapper[23041]: I0308 00:32:33.530352 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b0395d1-7cb0-4857-891a-68f88a6fd468-kube-api-access\") pod \"installer-3-master-0\" (UID: \"8b0395d1-7cb0-4857-891a-68f88a6fd468\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 08 00:32:33.530620 master-0 kubenswrapper[23041]: I0308 00:32:33.530533 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/8b0395d1-7cb0-4857-891a-68f88a6fd468-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"8b0395d1-7cb0-4857-891a-68f88a6fd468\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 08 00:32:33.634146 master-0 kubenswrapper[23041]: I0308 00:32:33.634050 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b0395d1-7cb0-4857-891a-68f88a6fd468-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"8b0395d1-7cb0-4857-891a-68f88a6fd468\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 08 00:32:33.634461 master-0 kubenswrapper[23041]: I0308 00:32:33.634187 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8b0395d1-7cb0-4857-891a-68f88a6fd468-var-lock\") pod \"installer-3-master-0\" (UID: \"8b0395d1-7cb0-4857-891a-68f88a6fd468\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 08 00:32:33.634461 master-0 kubenswrapper[23041]: I0308 00:32:33.634232 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b0395d1-7cb0-4857-891a-68f88a6fd468-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"8b0395d1-7cb0-4857-891a-68f88a6fd468\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 08 00:32:33.634461 master-0 kubenswrapper[23041]: I0308 00:32:33.634249 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b0395d1-7cb0-4857-891a-68f88a6fd468-kube-api-access\") pod \"installer-3-master-0\" (UID: \"8b0395d1-7cb0-4857-891a-68f88a6fd468\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 08 00:32:33.634461 master-0 kubenswrapper[23041]: I0308 00:32:33.634338 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/8b0395d1-7cb0-4857-891a-68f88a6fd468-var-lock\") pod \"installer-3-master-0\" (UID: \"8b0395d1-7cb0-4857-891a-68f88a6fd468\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 08 00:32:33.654699 master-0 kubenswrapper[23041]: I0308 00:32:33.654651 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b0395d1-7cb0-4857-891a-68f88a6fd468-kube-api-access\") pod \"installer-3-master-0\" (UID: \"8b0395d1-7cb0-4857-891a-68f88a6fd468\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 08 00:32:33.810576 master-0 kubenswrapper[23041]: I0308 00:32:33.810505 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 08 00:32:35.803715 master-0 kubenswrapper[23041]: I0308 00:32:35.803351 23041 patch_prober.go:28] interesting pod/console-5c84b9c874-8xl2l container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 08 00:32:35.803715 master-0 kubenswrapper[23041]: I0308 00:32:35.803425 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-5c84b9c874-8xl2l" podUID="3baca04d-be92-4f02-8ea9-94cc37fc00b4" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 08 00:32:36.364367 master-0 kubenswrapper[23041]: I0308 00:32:36.364281 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7b45f5889c-z48tj" Mar 08 00:32:36.370915 master-0 kubenswrapper[23041]: I0308 00:32:36.370866 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7b45f5889c-z48tj" Mar 08 00:32:36.838910 master-0 kubenswrapper[23041]: 
I0308 00:32:36.838855 23041 patch_prober.go:28] interesting pod/console-76c777474b-n9mhf container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.95:8443/health\": dial tcp 10.128.0.95:8443: connect: connection refused" start-of-body= Mar 08 00:32:36.840256 master-0 kubenswrapper[23041]: I0308 00:32:36.838920 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-76c777474b-n9mhf" podUID="136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca" containerName="console" probeResult="failure" output="Get \"https://10.128.0.95:8443/health\": dial tcp 10.128.0.95:8443: connect: connection refused" Mar 08 00:32:40.878374 master-0 kubenswrapper[23041]: I0308 00:32:40.878272 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" podUID="1f437239-11ff-4333-bdea-8a48b8ac95e8" containerName="oauth-openshift" containerID="cri-o://4d97b024b5f9364312d8782f69ef5377574cad76ca1feb7a9771cf6068169e82" gracePeriod=15 Mar 08 00:32:42.116702 master-0 kubenswrapper[23041]: I0308 00:32:42.116642 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 08 00:32:42.117785 master-0 kubenswrapper[23041]: I0308 00:32:42.117716 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="51c724a5-de89-4fde-b596-70157d2d19b6" containerName="alertmanager" containerID="cri-o://3935ac49d115df026834e687281350cf07c7cb60c028378e1b0733fb958a3103" gracePeriod=120 Mar 08 00:32:42.118188 master-0 kubenswrapper[23041]: I0308 00:32:42.118148 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="51c724a5-de89-4fde-b596-70157d2d19b6" containerName="kube-rbac-proxy" containerID="cri-o://4c868aa7418a56f30943c51f9eac6d8da436a1691ad5c481be6a7490d78054a2" gracePeriod=120 Mar 
08 00:32:42.118307 master-0 kubenswrapper[23041]: I0308 00:32:42.118243 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="51c724a5-de89-4fde-b596-70157d2d19b6" containerName="prom-label-proxy" containerID="cri-o://77c439a84aa164ee07c9e5c171aea2eb6918060c5491a7828582c33b62bf08b7" gracePeriod=120 Mar 08 00:32:42.118307 master-0 kubenswrapper[23041]: I0308 00:32:42.118209 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="51c724a5-de89-4fde-b596-70157d2d19b6" containerName="config-reloader" containerID="cri-o://c66564a5256e588a016eb2dcc31a2ec52ebbfbabd58f97c462377ecf84dc302b" gracePeriod=120 Mar 08 00:32:42.118307 master-0 kubenswrapper[23041]: I0308 00:32:42.118280 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="51c724a5-de89-4fde-b596-70157d2d19b6" containerName="kube-rbac-proxy-metric" containerID="cri-o://3aa4df6d988af0a0fbf0ca91c3c8199b19f529f21f24176f890bfab0c3b25b24" gracePeriod=120 Mar 08 00:32:42.118446 master-0 kubenswrapper[23041]: I0308 00:32:42.118329 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="51c724a5-de89-4fde-b596-70157d2d19b6" containerName="kube-rbac-proxy-web" containerID="cri-o://6b4cdbb2f93609d97943c84882bcd164dd02ff5a4b44c594a695ac0540ea519d" gracePeriod=120 Mar 08 00:32:42.162759 master-0 kubenswrapper[23041]: I0308 00:32:42.159403 23041 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Mar 08 00:32:42.162759 master-0 kubenswrapper[23041]: I0308 00:32:42.159479 23041 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 08 00:32:42.162759 master-0 kubenswrapper[23041]: E0308 
00:32:42.160793 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78c05e1499b533b83f091333d61f045" containerName="cluster-policy-controller" Mar 08 00:32:42.162759 master-0 kubenswrapper[23041]: I0308 00:32:42.160893 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78c05e1499b533b83f091333d61f045" containerName="cluster-policy-controller" Mar 08 00:32:42.162759 master-0 kubenswrapper[23041]: E0308 00:32:42.160941 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 00:32:42.162759 master-0 kubenswrapper[23041]: I0308 00:32:42.160951 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 00:32:42.164448 master-0 kubenswrapper[23041]: I0308 00:32:42.164361 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" containerName="cluster-policy-controller" containerID="cri-o://09882f77899e1a73f2e7f7b1d393cad387349597cd777096a1f2accf4684e1d0" gracePeriod=30 Mar 08 00:32:42.164626 master-0 kubenswrapper[23041]: I0308 00:32:42.164600 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" containerID="cri-o://0ff7bbc7a985bc357be1e5b9c59697e1d623b2b3f3a45fbbbeea036170f5e8b0" gracePeriod=30 Mar 08 00:32:42.173473 master-0 kubenswrapper[23041]: I0308 00:32:42.173283 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 00:32:42.173473 master-0 kubenswrapper[23041]: I0308 00:32:42.173338 23041 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f78c05e1499b533b83f091333d61f045" containerName="cluster-policy-controller" Mar 08 00:32:42.173473 master-0 kubenswrapper[23041]: I0308 00:32:42.173365 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 00:32:42.173775 master-0 kubenswrapper[23041]: E0308 00:32:42.173748 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 00:32:42.173775 master-0 kubenswrapper[23041]: I0308 00:32:42.173767 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 00:32:42.191117 master-0 kubenswrapper[23041]: I0308 00:32:42.191069 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:32:42.296402 master-0 kubenswrapper[23041]: I0308 00:32:42.294330 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2ab662059bb326d13a07bf5700e4f545-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"2ab662059bb326d13a07bf5700e4f545\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:32:42.296402 master-0 kubenswrapper[23041]: I0308 00:32:42.294409 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2ab662059bb326d13a07bf5700e4f545-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"2ab662059bb326d13a07bf5700e4f545\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:32:42.392498 master-0 kubenswrapper[23041]: I0308 00:32:42.392463 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:32:42.396368 master-0 kubenswrapper[23041]: I0308 00:32:42.396326 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2ab662059bb326d13a07bf5700e4f545-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"2ab662059bb326d13a07bf5700e4f545\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:32:42.396564 master-0 kubenswrapper[23041]: I0308 00:32:42.396394 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2ab662059bb326d13a07bf5700e4f545-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"2ab662059bb326d13a07bf5700e4f545\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:32:42.396564 master-0 kubenswrapper[23041]: I0308 00:32:42.396519 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2ab662059bb326d13a07bf5700e4f545-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"2ab662059bb326d13a07bf5700e4f545\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:32:42.396764 master-0 kubenswrapper[23041]: I0308 00:32:42.396593 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2ab662059bb326d13a07bf5700e4f545-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"2ab662059bb326d13a07bf5700e4f545\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:32:42.400265 master-0 kubenswrapper[23041]: I0308 00:32:42.400158 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 08 00:32:42.417628 master-0 
kubenswrapper[23041]: I0308 00:32:42.417588 23041 kubelet.go:2706] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="cbacbe5f-8168-4f00-84c1-d35e4cd4bddb"
Mar 08 00:32:42.497595 master-0 kubenswrapper[23041]: I0308 00:32:42.497535 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"f78c05e1499b533b83f091333d61f045\" (UID: \"f78c05e1499b533b83f091333d61f045\") "
Mar 08 00:32:42.497595 master-0 kubenswrapper[23041]: I0308 00:32:42.497585 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"f78c05e1499b533b83f091333d61f045\" (UID: \"f78c05e1499b533b83f091333d61f045\") "
Mar 08 00:32:42.497962 master-0 kubenswrapper[23041]: I0308 00:32:42.497643 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"f78c05e1499b533b83f091333d61f045\" (UID: \"f78c05e1499b533b83f091333d61f045\") "
Mar 08 00:32:42.497962 master-0 kubenswrapper[23041]: I0308 00:32:42.497666 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"f78c05e1499b533b83f091333d61f045\" (UID: \"f78c05e1499b533b83f091333d61f045\") "
Mar 08 00:32:42.497962 master-0 kubenswrapper[23041]: I0308 00:32:42.497730 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"f78c05e1499b533b83f091333d61f045\" (UID: \"f78c05e1499b533b83f091333d61f045\") "
Mar 08 00:32:42.498148 master-0 kubenswrapper[23041]: I0308 00:32:42.498118 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud" (OuterVolumeSpecName: "etc-kubernetes-cloud") pod "f78c05e1499b533b83f091333d61f045" (UID: "f78c05e1499b533b83f091333d61f045"). InnerVolumeSpecName "etc-kubernetes-cloud". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:32:42.498216 master-0 kubenswrapper[23041]: I0308 00:32:42.498152 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs" (OuterVolumeSpecName: "logs") pod "f78c05e1499b533b83f091333d61f045" (UID: "f78c05e1499b533b83f091333d61f045"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:32:42.498216 master-0 kubenswrapper[23041]: I0308 00:32:42.498172 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets" (OuterVolumeSpecName: "secrets") pod "f78c05e1499b533b83f091333d61f045" (UID: "f78c05e1499b533b83f091333d61f045"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:32:42.498318 master-0 kubenswrapper[23041]: I0308 00:32:42.498196 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config" (OuterVolumeSpecName: "config") pod "f78c05e1499b533b83f091333d61f045" (UID: "f78c05e1499b533b83f091333d61f045"). InnerVolumeSpecName "config". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:32:42.498318 master-0 kubenswrapper[23041]: I0308 00:32:42.498243 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host" (OuterVolumeSpecName: "ssl-certs-host") pod "f78c05e1499b533b83f091333d61f045" (UID: "f78c05e1499b533b83f091333d61f045"). InnerVolumeSpecName "ssl-certs-host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:32:42.501442 master-0 kubenswrapper[23041]: I0308 00:32:42.501402 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n"
Mar 08 00:32:42.581428 master-0 kubenswrapper[23041]: I0308 00:32:42.581375 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj"]
Mar 08 00:32:42.584072 master-0 kubenswrapper[23041]: E0308 00:32:42.581726 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f437239-11ff-4333-bdea-8a48b8ac95e8" containerName="oauth-openshift"
Mar 08 00:32:42.584072 master-0 kubenswrapper[23041]: I0308 00:32:42.581742 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f437239-11ff-4333-bdea-8a48b8ac95e8" containerName="oauth-openshift"
Mar 08 00:32:42.584072 master-0 kubenswrapper[23041]: I0308 00:32:42.581932 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f437239-11ff-4333-bdea-8a48b8ac95e8" containerName="oauth-openshift"
Mar 08 00:32:42.584072 master-0 kubenswrapper[23041]: I0308 00:32:42.582541 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj"
Mar 08 00:32:42.586907 master-0 kubenswrapper[23041]: I0308 00:32:42.586877 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-2-master-0_0133db83-1083-4458-86d4-49e431dd4365/installer/0.log"
Mar 08 00:32:42.587144 master-0 kubenswrapper[23041]: I0308 00:32:42.586947 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0"
Mar 08 00:32:42.598272 master-0 kubenswrapper[23041]: I0308 00:32:42.597978 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj"]
Mar 08 00:32:42.602195 master-0 kubenswrapper[23041]: I0308 00:32:42.601580 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24dmg\" (UniqueName: \"kubernetes.io/projected/1f437239-11ff-4333-bdea-8a48b8ac95e8-kube-api-access-24dmg\") pod \"1f437239-11ff-4333-bdea-8a48b8ac95e8\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") "
Mar 08 00:32:42.602195 master-0 kubenswrapper[23041]: I0308 00:32:42.601702 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-session\") pod \"1f437239-11ff-4333-bdea-8a48b8ac95e8\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") "
Mar 08 00:32:42.602195 master-0 kubenswrapper[23041]: I0308 00:32:42.601770 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-trusted-ca-bundle\") pod \"1f437239-11ff-4333-bdea-8a48b8ac95e8\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") "
Mar 08 00:32:42.602195 master-0 kubenswrapper[23041]: I0308 00:32:42.601810 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f437239-11ff-4333-bdea-8a48b8ac95e8-audit-dir\") pod \"1f437239-11ff-4333-bdea-8a48b8ac95e8\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") "
Mar 08 00:32:42.602195 master-0 kubenswrapper[23041]: I0308 00:32:42.601878 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-serving-cert\") pod \"1f437239-11ff-4333-bdea-8a48b8ac95e8\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") "
Mar 08 00:32:42.602195 master-0 kubenswrapper[23041]: I0308 00:32:42.601904 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-router-certs\") pod \"1f437239-11ff-4333-bdea-8a48b8ac95e8\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") "
Mar 08 00:32:42.602195 master-0 kubenswrapper[23041]: I0308 00:32:42.601935 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-service-ca\") pod \"1f437239-11ff-4333-bdea-8a48b8ac95e8\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") "
Mar 08 00:32:42.602195 master-0 kubenswrapper[23041]: I0308 00:32:42.601970 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-cliconfig\") pod \"1f437239-11ff-4333-bdea-8a48b8ac95e8\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") "
Mar 08 00:32:42.602195 master-0 kubenswrapper[23041]: I0308 00:32:42.602028 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-ocp-branding-template\") pod \"1f437239-11ff-4333-bdea-8a48b8ac95e8\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") "
Mar 08 00:32:42.602195 master-0 kubenswrapper[23041]: I0308 00:32:42.602099 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-user-template-error\") pod \"1f437239-11ff-4333-bdea-8a48b8ac95e8\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") "
Mar 08 00:32:42.602195 master-0 kubenswrapper[23041]: I0308 00:32:42.602130 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-user-template-provider-selection\") pod \"1f437239-11ff-4333-bdea-8a48b8ac95e8\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") "
Mar 08 00:32:42.602195 master-0 kubenswrapper[23041]: I0308 00:32:42.602154 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1f437239-11ff-4333-bdea-8a48b8ac95e8-audit-policies\") pod \"1f437239-11ff-4333-bdea-8a48b8ac95e8\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") "
Mar 08 00:32:42.602195 master-0 kubenswrapper[23041]: I0308 00:32:42.602191 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-user-template-login\") pod \"1f437239-11ff-4333-bdea-8a48b8ac95e8\" (UID: \"1f437239-11ff-4333-bdea-8a48b8ac95e8\") "
Mar 08 00:32:42.604015 master-0 kubenswrapper[23041]: I0308 00:32:42.603152 23041 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") on node \"master-0\" DevicePath \"\""
Mar 08 00:32:42.604015 master-0 kubenswrapper[23041]: I0308 00:32:42.603181 23041 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") on node \"master-0\" DevicePath \"\""
Mar 08 00:32:42.604015 master-0 kubenswrapper[23041]: I0308 00:32:42.603193 23041 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") on node \"master-0\" DevicePath \"\""
Mar 08 00:32:42.604015 master-0 kubenswrapper[23041]: I0308 00:32:42.603238 23041 reconciler_common.go:293] "Volume detached for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") on node \"master-0\" DevicePath \"\""
Mar 08 00:32:42.604015 master-0 kubenswrapper[23041]: I0308 00:32:42.603251 23041 reconciler_common.go:293] "Volume detached for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") on node \"master-0\" DevicePath \"\""
Mar 08 00:32:42.612341 master-0 kubenswrapper[23041]: I0308 00:32:42.611797 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "1f437239-11ff-4333-bdea-8a48b8ac95e8" (UID: "1f437239-11ff-4333-bdea-8a48b8ac95e8"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:32:42.613008 master-0 kubenswrapper[23041]: I0308 00:32:42.612970 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"]
Mar 08 00:32:42.613884 master-0 kubenswrapper[23041]: I0308 00:32:42.613837 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "1f437239-11ff-4333-bdea-8a48b8ac95e8" (UID: "1f437239-11ff-4333-bdea-8a48b8ac95e8"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:32:42.614170 master-0 kubenswrapper[23041]: I0308 00:32:42.614129 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "1f437239-11ff-4333-bdea-8a48b8ac95e8" (UID: "1f437239-11ff-4333-bdea-8a48b8ac95e8"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:32:42.614521 master-0 kubenswrapper[23041]: I0308 00:32:42.614490 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f437239-11ff-4333-bdea-8a48b8ac95e8-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "1f437239-11ff-4333-bdea-8a48b8ac95e8" (UID: "1f437239-11ff-4333-bdea-8a48b8ac95e8"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:32:42.614589 master-0 kubenswrapper[23041]: I0308 00:32:42.614528 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1f437239-11ff-4333-bdea-8a48b8ac95e8-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "1f437239-11ff-4333-bdea-8a48b8ac95e8" (UID: "1f437239-11ff-4333-bdea-8a48b8ac95e8"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:32:42.614627 master-0 kubenswrapper[23041]: I0308 00:32:42.614590 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "1f437239-11ff-4333-bdea-8a48b8ac95e8" (UID: "1f437239-11ff-4333-bdea-8a48b8ac95e8"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:32:42.619616 master-0 kubenswrapper[23041]: I0308 00:32:42.615622 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f437239-11ff-4333-bdea-8a48b8ac95e8-kube-api-access-24dmg" (OuterVolumeSpecName: "kube-api-access-24dmg") pod "1f437239-11ff-4333-bdea-8a48b8ac95e8" (UID: "1f437239-11ff-4333-bdea-8a48b8ac95e8"). InnerVolumeSpecName "kube-api-access-24dmg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:32:42.619616 master-0 kubenswrapper[23041]: I0308 00:32:42.619149 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "1f437239-11ff-4333-bdea-8a48b8ac95e8" (UID: "1f437239-11ff-4333-bdea-8a48b8ac95e8"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:32:42.620030 master-0 kubenswrapper[23041]: I0308 00:32:42.619853 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "1f437239-11ff-4333-bdea-8a48b8ac95e8" (UID: "1f437239-11ff-4333-bdea-8a48b8ac95e8"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:32:42.620030 master-0 kubenswrapper[23041]: I0308 00:32:42.619872 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "1f437239-11ff-4333-bdea-8a48b8ac95e8" (UID: "1f437239-11ff-4333-bdea-8a48b8ac95e8"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:32:42.623584 master-0 kubenswrapper[23041]: W0308 00:32:42.623416 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8b0395d1_7cb0_4857_891a_68f88a6fd468.slice/crio-12650fbaafb1ba4d82ecfbd4886d512f14868b88e1328c1261518868d687ae5b WatchSource:0}: Error finding container 12650fbaafb1ba4d82ecfbd4886d512f14868b88e1328c1261518868d687ae5b: Status 404 returned error can't find the container with id 12650fbaafb1ba4d82ecfbd4886d512f14868b88e1328c1261518868d687ae5b
Mar 08 00:32:42.625878 master-0 kubenswrapper[23041]: I0308 00:32:42.625074 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "1f437239-11ff-4333-bdea-8a48b8ac95e8" (UID: "1f437239-11ff-4333-bdea-8a48b8ac95e8"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:32:42.625878 master-0 kubenswrapper[23041]: I0308 00:32:42.625229 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "1f437239-11ff-4333-bdea-8a48b8ac95e8" (UID: "1f437239-11ff-4333-bdea-8a48b8ac95e8"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:32:42.632611 master-0 kubenswrapper[23041]: I0308 00:32:42.632560 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "1f437239-11ff-4333-bdea-8a48b8ac95e8" (UID: "1f437239-11ff-4333-bdea-8a48b8ac95e8"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:32:42.691709 master-0 kubenswrapper[23041]: I0308 00:32:42.691652 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 08 00:32:42.703939 master-0 kubenswrapper[23041]: I0308 00:32:42.703847 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0133db83-1083-4458-86d4-49e431dd4365-kubelet-dir\") pod \"0133db83-1083-4458-86d4-49e431dd4365\" (UID: \"0133db83-1083-4458-86d4-49e431dd4365\") "
Mar 08 00:32:42.704088 master-0 kubenswrapper[23041]: I0308 00:32:42.703937 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0133db83-1083-4458-86d4-49e431dd4365-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0133db83-1083-4458-86d4-49e431dd4365" (UID: "0133db83-1083-4458-86d4-49e431dd4365"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:32:42.704088 master-0 kubenswrapper[23041]: I0308 00:32:42.703996 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0133db83-1083-4458-86d4-49e431dd4365-kube-api-access\") pod \"0133db83-1083-4458-86d4-49e431dd4365\" (UID: \"0133db83-1083-4458-86d4-49e431dd4365\") "
Mar 08 00:32:42.704174 master-0 kubenswrapper[23041]: I0308 00:32:42.704161 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0133db83-1083-4458-86d4-49e431dd4365-var-lock\") pod \"0133db83-1083-4458-86d4-49e431dd4365\" (UID: \"0133db83-1083-4458-86d4-49e431dd4365\") "
Mar 08 00:32:42.704421 master-0 kubenswrapper[23041]: I0308 00:32:42.704394 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/432f3e5b-122a-44f1-91b9-7c1399927a7a-v4-0-config-user-template-login\") pod \"oauth-openshift-5b6fc868c6-zc2fj\" (UID: \"432f3e5b-122a-44f1-91b9-7c1399927a7a\") " pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj"
Mar 08 00:32:42.704460 master-0 kubenswrapper[23041]: I0308 00:32:42.704446 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/432f3e5b-122a-44f1-91b9-7c1399927a7a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5b6fc868c6-zc2fj\" (UID: \"432f3e5b-122a-44f1-91b9-7c1399927a7a\") " pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj"
Mar 08 00:32:42.704495 master-0 kubenswrapper[23041]: I0308 00:32:42.704467 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/432f3e5b-122a-44f1-91b9-7c1399927a7a-v4-0-config-system-router-certs\") pod \"oauth-openshift-5b6fc868c6-zc2fj\" (UID: \"432f3e5b-122a-44f1-91b9-7c1399927a7a\") " pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj"
Mar 08 00:32:42.704528 master-0 kubenswrapper[23041]: I0308 00:32:42.704502 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/432f3e5b-122a-44f1-91b9-7c1399927a7a-v4-0-config-system-service-ca\") pod \"oauth-openshift-5b6fc868c6-zc2fj\" (UID: \"432f3e5b-122a-44f1-91b9-7c1399927a7a\") " pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj"
Mar 08 00:32:42.704528 master-0 kubenswrapper[23041]: I0308 00:32:42.704522 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/432f3e5b-122a-44f1-91b9-7c1399927a7a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5b6fc868c6-zc2fj\" (UID: \"432f3e5b-122a-44f1-91b9-7c1399927a7a\") " pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj"
Mar 08 00:32:42.704585 master-0 kubenswrapper[23041]: I0308 00:32:42.704549 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/432f3e5b-122a-44f1-91b9-7c1399927a7a-v4-0-config-user-template-error\") pod \"oauth-openshift-5b6fc868c6-zc2fj\" (UID: \"432f3e5b-122a-44f1-91b9-7c1399927a7a\") " pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj"
Mar 08 00:32:42.704585 master-0 kubenswrapper[23041]: I0308 00:32:42.704570 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/432f3e5b-122a-44f1-91b9-7c1399927a7a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5b6fc868c6-zc2fj\" (UID: \"432f3e5b-122a-44f1-91b9-7c1399927a7a\") " pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj"
Mar 08 00:32:42.704645 master-0 kubenswrapper[23041]: I0308 00:32:42.704587 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/432f3e5b-122a-44f1-91b9-7c1399927a7a-audit-dir\") pod \"oauth-openshift-5b6fc868c6-zc2fj\" (UID: \"432f3e5b-122a-44f1-91b9-7c1399927a7a\") " pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj"
Mar 08 00:32:42.704645 master-0 kubenswrapper[23041]: I0308 00:32:42.704612 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkc8b\" (UniqueName: \"kubernetes.io/projected/432f3e5b-122a-44f1-91b9-7c1399927a7a-kube-api-access-tkc8b\") pod \"oauth-openshift-5b6fc868c6-zc2fj\" (UID: \"432f3e5b-122a-44f1-91b9-7c1399927a7a\") " pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj"
Mar 08 00:32:42.704645 master-0 kubenswrapper[23041]: I0308 00:32:42.704630 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/432f3e5b-122a-44f1-91b9-7c1399927a7a-audit-policies\") pod \"oauth-openshift-5b6fc868c6-zc2fj\" (UID: \"432f3e5b-122a-44f1-91b9-7c1399927a7a\") " pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj"
Mar 08 00:32:42.704734 master-0 kubenswrapper[23041]: I0308 00:32:42.704653 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/432f3e5b-122a-44f1-91b9-7c1399927a7a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5b6fc868c6-zc2fj\" (UID: \"432f3e5b-122a-44f1-91b9-7c1399927a7a\") " pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj"
Mar 08 00:32:42.704734 master-0 kubenswrapper[23041]: I0308 00:32:42.704672 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/432f3e5b-122a-44f1-91b9-7c1399927a7a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5b6fc868c6-zc2fj\" (UID: \"432f3e5b-122a-44f1-91b9-7c1399927a7a\") " pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj"
Mar 08 00:32:42.704734 master-0 kubenswrapper[23041]: I0308 00:32:42.704698 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/432f3e5b-122a-44f1-91b9-7c1399927a7a-v4-0-config-system-session\") pod \"oauth-openshift-5b6fc868c6-zc2fj\" (UID: \"432f3e5b-122a-44f1-91b9-7c1399927a7a\") " pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj"
Mar 08 00:32:42.704814 master-0 kubenswrapper[23041]: I0308 00:32:42.704740 23041 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-ocp-branding-template\") on node \"master-0\" DevicePath \"\""
Mar 08 00:32:42.704814 master-0 kubenswrapper[23041]: I0308 00:32:42.704752 23041 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-user-template-error\") on node \"master-0\" DevicePath \"\""
Mar 08 00:32:42.704814 master-0 kubenswrapper[23041]: I0308 00:32:42.704762 23041 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1f437239-11ff-4333-bdea-8a48b8ac95e8-audit-policies\") on node \"master-0\" DevicePath \"\""
Mar 08 00:32:42.704814 master-0 kubenswrapper[23041]: I0308 00:32:42.704772 23041 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-user-template-provider-selection\") on node \"master-0\" DevicePath \"\""
Mar 08 00:32:42.704814 master-0 kubenswrapper[23041]: I0308 00:32:42.704782 23041 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-user-template-login\") on node \"master-0\" DevicePath \"\""
Mar 08 00:32:42.704814 master-0 kubenswrapper[23041]: I0308 00:32:42.704793 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24dmg\" (UniqueName: \"kubernetes.io/projected/1f437239-11ff-4333-bdea-8a48b8ac95e8-kube-api-access-24dmg\") on node \"master-0\" DevicePath \"\""
Mar 08 00:32:42.704814 master-0 kubenswrapper[23041]: I0308 00:32:42.704804 23041 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-session\") on node \"master-0\" DevicePath \"\""
Mar 08 00:32:42.704814 master-0 kubenswrapper[23041]: I0308 00:32:42.704814 23041 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 00:32:42.705069 master-0 kubenswrapper[23041]: I0308 00:32:42.704824 23041 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f437239-11ff-4333-bdea-8a48b8ac95e8-audit-dir\") on node \"master-0\" DevicePath \"\""
Mar 08 00:32:42.705069 master-0 kubenswrapper[23041]: I0308 00:32:42.704834 23041 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 08 00:32:42.705069 master-0 kubenswrapper[23041]: I0308 00:32:42.704843 23041 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-router-certs\") on node \"master-0\" DevicePath \"\""
Mar 08 00:32:42.705069 master-0 kubenswrapper[23041]: I0308 00:32:42.704852 23041 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0133db83-1083-4458-86d4-49e431dd4365-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 08 00:32:42.705069 master-0 kubenswrapper[23041]: I0308 00:32:42.704861 23041 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-service-ca\") on node \"master-0\" DevicePath \"\""
Mar 08 00:32:42.705069 master-0 kubenswrapper[23041]: I0308 00:32:42.704869 23041 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1f437239-11ff-4333-bdea-8a48b8ac95e8-v4-0-config-system-cliconfig\") on node \"master-0\" DevicePath \"\""
Mar 08 00:32:42.705324 master-0 kubenswrapper[23041]: I0308 00:32:42.705219 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0133db83-1083-4458-86d4-49e431dd4365-var-lock" (OuterVolumeSpecName: "var-lock") pod "0133db83-1083-4458-86d4-49e431dd4365" (UID: "0133db83-1083-4458-86d4-49e431dd4365"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:32:42.707635 master-0 kubenswrapper[23041]: I0308 00:32:42.707608 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0133db83-1083-4458-86d4-49e431dd4365-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0133db83-1083-4458-86d4-49e431dd4365" (UID: "0133db83-1083-4458-86d4-49e431dd4365"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:32:42.824210 master-0 kubenswrapper[23041]: I0308 00:32:42.824133 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkc8b\" (UniqueName: \"kubernetes.io/projected/432f3e5b-122a-44f1-91b9-7c1399927a7a-kube-api-access-tkc8b\") pod \"oauth-openshift-5b6fc868c6-zc2fj\" (UID: \"432f3e5b-122a-44f1-91b9-7c1399927a7a\") " pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj"
Mar 08 00:32:42.824210 master-0 kubenswrapper[23041]: I0308 00:32:42.824193 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/432f3e5b-122a-44f1-91b9-7c1399927a7a-audit-policies\") pod \"oauth-openshift-5b6fc868c6-zc2fj\" (UID: \"432f3e5b-122a-44f1-91b9-7c1399927a7a\") " pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj"
Mar 08 00:32:42.824460 master-0 kubenswrapper[23041]: I0308 00:32:42.824248 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/432f3e5b-122a-44f1-91b9-7c1399927a7a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5b6fc868c6-zc2fj\" (UID: \"432f3e5b-122a-44f1-91b9-7c1399927a7a\") " pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj"
Mar 08 00:32:42.824460 master-0 kubenswrapper[23041]: I0308 00:32:42.824281 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/432f3e5b-122a-44f1-91b9-7c1399927a7a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5b6fc868c6-zc2fj\" (UID: \"432f3e5b-122a-44f1-91b9-7c1399927a7a\") " pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj"
Mar 08 00:32:42.825110 master-0 kubenswrapper[23041]: I0308 00:32:42.825059 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/432f3e5b-122a-44f1-91b9-7c1399927a7a-audit-policies\") pod \"oauth-openshift-5b6fc868c6-zc2fj\" (UID: \"432f3e5b-122a-44f1-91b9-7c1399927a7a\") " pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj"
Mar 08 00:32:42.825172 master-0 kubenswrapper[23041]: I0308 00:32:42.825147 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/432f3e5b-122a-44f1-91b9-7c1399927a7a-v4-0-config-system-session\") pod \"oauth-openshift-5b6fc868c6-zc2fj\" (UID: \"432f3e5b-122a-44f1-91b9-7c1399927a7a\") " pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj"
Mar 08 00:32:42.825342 master-0 kubenswrapper[23041]: I0308 00:32:42.825314 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/432f3e5b-122a-44f1-91b9-7c1399927a7a-v4-0-config-user-template-login\") pod \"oauth-openshift-5b6fc868c6-zc2fj\" (UID: \"432f3e5b-122a-44f1-91b9-7c1399927a7a\") " pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj"
Mar 08 00:32:42.825447 master-0 kubenswrapper[23041]: I0308 00:32:42.825419 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/432f3e5b-122a-44f1-91b9-7c1399927a7a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5b6fc868c6-zc2fj\" (UID: \"432f3e5b-122a-44f1-91b9-7c1399927a7a\") " pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj"
Mar 08 00:32:42.825498 master-0 kubenswrapper[23041]: I0308 00:32:42.825469 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/432f3e5b-122a-44f1-91b9-7c1399927a7a-v4-0-config-system-router-certs\") pod \"oauth-openshift-5b6fc868c6-zc2fj\" (UID: \"432f3e5b-122a-44f1-91b9-7c1399927a7a\") " pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj"
Mar 08 00:32:42.825547 master-0 kubenswrapper[23041]: I0308 00:32:42.825525 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/432f3e5b-122a-44f1-91b9-7c1399927a7a-v4-0-config-system-service-ca\") pod \"oauth-openshift-5b6fc868c6-zc2fj\" (UID: \"432f3e5b-122a-44f1-91b9-7c1399927a7a\") " pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj"
Mar 08 00:32:42.825591 master-0 kubenswrapper[23041]: I0308 00:32:42.825566 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/432f3e5b-122a-44f1-91b9-7c1399927a7a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5b6fc868c6-zc2fj\" (UID: \"432f3e5b-122a-44f1-91b9-7c1399927a7a\") " pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj"
Mar 08 00:32:42.825633 master-0 kubenswrapper[23041]: I0308 00:32:42.825610 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/432f3e5b-122a-44f1-91b9-7c1399927a7a-v4-0-config-user-template-error\") pod \"oauth-openshift-5b6fc868c6-zc2fj\" (UID: \"432f3e5b-122a-44f1-91b9-7c1399927a7a\") " pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj"
Mar
08 00:32:42.825682 master-0 kubenswrapper[23041]: I0308 00:32:42.825651 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/432f3e5b-122a-44f1-91b9-7c1399927a7a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5b6fc868c6-zc2fj\" (UID: \"432f3e5b-122a-44f1-91b9-7c1399927a7a\") " pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj" Mar 08 00:32:42.825732 master-0 kubenswrapper[23041]: I0308 00:32:42.825683 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/432f3e5b-122a-44f1-91b9-7c1399927a7a-audit-dir\") pod \"oauth-openshift-5b6fc868c6-zc2fj\" (UID: \"432f3e5b-122a-44f1-91b9-7c1399927a7a\") " pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj" Mar 08 00:32:42.825773 master-0 kubenswrapper[23041]: I0308 00:32:42.825764 23041 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0133db83-1083-4458-86d4-49e431dd4365-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 00:32:42.825813 master-0 kubenswrapper[23041]: I0308 00:32:42.825782 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0133db83-1083-4458-86d4-49e431dd4365-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 00:32:42.830632 master-0 kubenswrapper[23041]: I0308 00:32:42.830591 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/432f3e5b-122a-44f1-91b9-7c1399927a7a-v4-0-config-user-template-error\") pod \"oauth-openshift-5b6fc868c6-zc2fj\" (UID: \"432f3e5b-122a-44f1-91b9-7c1399927a7a\") " pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj" Mar 08 00:32:42.833366 master-0 kubenswrapper[23041]: I0308 00:32:42.833275 23041 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/432f3e5b-122a-44f1-91b9-7c1399927a7a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5b6fc868c6-zc2fj\" (UID: \"432f3e5b-122a-44f1-91b9-7c1399927a7a\") " pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj" Mar 08 00:32:42.835679 master-0 kubenswrapper[23041]: I0308 00:32:42.835640 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/432f3e5b-122a-44f1-91b9-7c1399927a7a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5b6fc868c6-zc2fj\" (UID: \"432f3e5b-122a-44f1-91b9-7c1399927a7a\") " pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj" Mar 08 00:32:42.836839 master-0 kubenswrapper[23041]: I0308 00:32:42.836804 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/432f3e5b-122a-44f1-91b9-7c1399927a7a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5b6fc868c6-zc2fj\" (UID: \"432f3e5b-122a-44f1-91b9-7c1399927a7a\") " pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj" Mar 08 00:32:42.837303 master-0 kubenswrapper[23041]: I0308 00:32:42.837267 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/432f3e5b-122a-44f1-91b9-7c1399927a7a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5b6fc868c6-zc2fj\" (UID: \"432f3e5b-122a-44f1-91b9-7c1399927a7a\") " pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj" Mar 08 00:32:42.843457 master-0 kubenswrapper[23041]: I0308 00:32:42.843352 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/432f3e5b-122a-44f1-91b9-7c1399927a7a-audit-dir\") pod 
\"oauth-openshift-5b6fc868c6-zc2fj\" (UID: \"432f3e5b-122a-44f1-91b9-7c1399927a7a\") " pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj" Mar 08 00:32:42.844120 master-0 kubenswrapper[23041]: I0308 00:32:42.844075 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/432f3e5b-122a-44f1-91b9-7c1399927a7a-v4-0-config-system-service-ca\") pod \"oauth-openshift-5b6fc868c6-zc2fj\" (UID: \"432f3e5b-122a-44f1-91b9-7c1399927a7a\") " pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj" Mar 08 00:32:42.847001 master-0 kubenswrapper[23041]: I0308 00:32:42.846950 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/432f3e5b-122a-44f1-91b9-7c1399927a7a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5b6fc868c6-zc2fj\" (UID: \"432f3e5b-122a-44f1-91b9-7c1399927a7a\") " pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj" Mar 08 00:32:42.847681 master-0 kubenswrapper[23041]: I0308 00:32:42.847649 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/432f3e5b-122a-44f1-91b9-7c1399927a7a-v4-0-config-system-session\") pod \"oauth-openshift-5b6fc868c6-zc2fj\" (UID: \"432f3e5b-122a-44f1-91b9-7c1399927a7a\") " pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj" Mar 08 00:32:42.855955 master-0 kubenswrapper[23041]: I0308 00:32:42.855917 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/432f3e5b-122a-44f1-91b9-7c1399927a7a-v4-0-config-user-template-login\") pod \"oauth-openshift-5b6fc868c6-zc2fj\" (UID: \"432f3e5b-122a-44f1-91b9-7c1399927a7a\") " pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj" Mar 08 00:32:42.856125 master-0 
kubenswrapper[23041]: I0308 00:32:42.856097 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/432f3e5b-122a-44f1-91b9-7c1399927a7a-v4-0-config-system-router-certs\") pod \"oauth-openshift-5b6fc868c6-zc2fj\" (UID: \"432f3e5b-122a-44f1-91b9-7c1399927a7a\") " pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj" Mar 08 00:32:42.858315 master-0 kubenswrapper[23041]: I0308 00:32:42.858132 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f78c05e1499b533b83f091333d61f045" path="/var/lib/kubelet/pods/f78c05e1499b533b83f091333d61f045/volumes" Mar 08 00:32:42.858791 master-0 kubenswrapper[23041]: I0308 00:32:42.858757 23041 mirror_client.go:130] "Deleting a mirror pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="" Mar 08 00:32:42.931422 master-0 kubenswrapper[23041]: I0308 00:32:42.931379 23041 generic.go:334] "Generic (PLEG): container finished" podID="1f437239-11ff-4333-bdea-8a48b8ac95e8" containerID="4d97b024b5f9364312d8782f69ef5377574cad76ca1feb7a9771cf6068169e82" exitCode=0 Mar 08 00:32:42.931581 master-0 kubenswrapper[23041]: I0308 00:32:42.931488 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" Mar 08 00:32:42.933348 master-0 kubenswrapper[23041]: I0308 00:32:42.933314 23041 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="0ff7bbc7a985bc357be1e5b9c59697e1d623b2b3f3a45fbbbeea036170f5e8b0" exitCode=0 Mar 08 00:32:42.933348 master-0 kubenswrapper[23041]: I0308 00:32:42.933345 23041 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="09882f77899e1a73f2e7f7b1d393cad387349597cd777096a1f2accf4684e1d0" exitCode=0 Mar 08 00:32:42.933496 master-0 kubenswrapper[23041]: I0308 00:32:42.933455 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 00:32:42.940377 master-0 kubenswrapper[23041]: I0308 00:32:42.940330 23041 patch_prober.go:28] interesting pod/downloads-84f57b9877-8g27w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.89:8080/\": dial tcp 10.128.0.89:8080: connect: connection refused" start-of-body= Mar 08 00:32:42.940551 master-0 kubenswrapper[23041]: I0308 00:32:42.940408 23041 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-84f57b9877-8g27w" podUID="414dbe5d-16a5-4765-9dc5-d50c0784ace7" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.89:8080/\": dial tcp 10.128.0.89:8080: connect: connection refused" Mar 08 00:32:42.958413 master-0 kubenswrapper[23041]: I0308 00:32:42.958267 23041 generic.go:334] "Generic (PLEG): container finished" podID="51c724a5-de89-4fde-b596-70157d2d19b6" containerID="77c439a84aa164ee07c9e5c171aea2eb6918060c5491a7828582c33b62bf08b7" exitCode=0 Mar 08 00:32:42.958413 master-0 kubenswrapper[23041]: I0308 00:32:42.958407 23041 generic.go:334] "Generic (PLEG): container finished" 
podID="51c724a5-de89-4fde-b596-70157d2d19b6" containerID="3aa4df6d988af0a0fbf0ca91c3c8199b19f529f21f24176f890bfab0c3b25b24" exitCode=0 Mar 08 00:32:42.958648 master-0 kubenswrapper[23041]: I0308 00:32:42.958419 23041 generic.go:334] "Generic (PLEG): container finished" podID="51c724a5-de89-4fde-b596-70157d2d19b6" containerID="4c868aa7418a56f30943c51f9eac6d8da436a1691ad5c481be6a7490d78054a2" exitCode=0 Mar 08 00:32:42.958648 master-0 kubenswrapper[23041]: I0308 00:32:42.958430 23041 generic.go:334] "Generic (PLEG): container finished" podID="51c724a5-de89-4fde-b596-70157d2d19b6" containerID="c66564a5256e588a016eb2dcc31a2ec52ebbfbabd58f97c462377ecf84dc302b" exitCode=0 Mar 08 00:32:42.958648 master-0 kubenswrapper[23041]: I0308 00:32:42.958542 23041 generic.go:334] "Generic (PLEG): container finished" podID="51c724a5-de89-4fde-b596-70157d2d19b6" containerID="3935ac49d115df026834e687281350cf07c7cb60c028378e1b0733fb958a3103" exitCode=0 Mar 08 00:32:42.961896 master-0 kubenswrapper[23041]: I0308 00:32:42.961867 23041 generic.go:334] "Generic (PLEG): container finished" podID="48fcdccb-478e-4027-b4b9-9a061439f0e2" containerID="0cbd1fb0e210283b6ccd115d8b0aae824719f69eb7b32ccc55a017669c4605fa" exitCode=0 Mar 08 00:32:42.964828 master-0 kubenswrapper[23041]: I0308 00:32:42.964793 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-2-master-0_0133db83-1083-4458-86d4-49e431dd4365/installer/0.log" Mar 08 00:32:42.964933 master-0 kubenswrapper[23041]: I0308 00:32:42.964845 23041 generic.go:334] "Generic (PLEG): container finished" podID="0133db83-1083-4458-86d4-49e431dd4365" containerID="6ba57f2db9726259ae712e088285735c826dee0e1e18c3454e04e70e18447493" exitCode=1 Mar 08 00:32:42.965037 master-0 kubenswrapper[23041]: I0308 00:32:42.965011 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Mar 08 00:32:43.419004 master-0 kubenswrapper[23041]: I0308 00:32:43.418962 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkc8b\" (UniqueName: \"kubernetes.io/projected/432f3e5b-122a-44f1-91b9-7c1399927a7a-kube-api-access-tkc8b\") pod \"oauth-openshift-5b6fc868c6-zc2fj\" (UID: \"432f3e5b-122a-44f1-91b9-7c1399927a7a\") " pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj" Mar 08 00:32:43.523959 master-0 kubenswrapper[23041]: I0308 00:32:43.523905 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj" Mar 08 00:32:43.975358 master-0 kubenswrapper[23041]: I0308 00:32:43.975290 23041 generic.go:334] "Generic (PLEG): container finished" podID="51c724a5-de89-4fde-b596-70157d2d19b6" containerID="6b4cdbb2f93609d97943c84882bcd164dd02ff5a4b44c594a695ac0540ea519d" exitCode=0 Mar 08 00:32:44.302467 master-0 kubenswrapper[23041]: I0308 00:32:44.302420 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 08 00:32:44.387884 master-0 kubenswrapper[23041]: E0308 00:32:44.387743 23041 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.563s" Mar 08 00:32:44.387884 master-0 kubenswrapper[23041]: I0308 00:32:44.387839 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Mar 08 00:32:44.387884 master-0 kubenswrapper[23041]: I0308 00:32:44.387873 23041 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="cbacbe5f-8168-4f00-84c1-d35e4cd4bddb" Mar 08 00:32:44.388107 master-0 kubenswrapper[23041]: I0308 00:32:44.387956 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"8b0395d1-7cb0-4857-891a-68f88a6fd468","Type":"ContainerStarted","Data":"12650fbaafb1ba4d82ecfbd4886d512f14868b88e1328c1261518868d687ae5b"} Mar 08 00:32:44.388107 master-0 kubenswrapper[23041]: I0308 00:32:44.388025 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" event={"ID":"1f437239-11ff-4333-bdea-8a48b8ac95e8","Type":"ContainerDied","Data":"4d97b024b5f9364312d8782f69ef5377574cad76ca1feb7a9771cf6068169e82"} Mar 08 00:32:44.388107 master-0 kubenswrapper[23041]: I0308 00:32:44.388075 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6df5fc69d-thc6n" event={"ID":"1f437239-11ff-4333-bdea-8a48b8ac95e8","Type":"ContainerDied","Data":"cbc291543364b486a08967ccc4804b1fae56cde3e2f919af998f09716f2e7340"} Mar 08 00:32:44.388273 master-0 kubenswrapper[23041]: I0308 00:32:44.388132 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-84f57b9877-8g27w" Mar 08 00:32:44.388273 master-0 
kubenswrapper[23041]: I0308 00:32:44.388169 23041 scope.go:117] "RemoveContainer" containerID="4d97b024b5f9364312d8782f69ef5377574cad76ca1feb7a9771cf6068169e82" Mar 08 00:32:44.388423 master-0 kubenswrapper[23041]: I0308 00:32:44.388185 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-84f57b9877-8g27w" event={"ID":"414dbe5d-16a5-4765-9dc5-d50c0784ace7","Type":"ContainerStarted","Data":"a94e90375d7c0e47f489dd8b3bb19d72423ed21d1862696817f0b5d82db0a258"} Mar 08 00:32:44.388470 master-0 kubenswrapper[23041]: I0308 00:32:44.388450 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"51c724a5-de89-4fde-b596-70157d2d19b6","Type":"ContainerDied","Data":"77c439a84aa164ee07c9e5c171aea2eb6918060c5491a7828582c33b62bf08b7"} Mar 08 00:32:44.388503 master-0 kubenswrapper[23041]: I0308 00:32:44.388488 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"51c724a5-de89-4fde-b596-70157d2d19b6","Type":"ContainerDied","Data":"3aa4df6d988af0a0fbf0ca91c3c8199b19f529f21f24176f890bfab0c3b25b24"} Mar 08 00:32:44.388539 master-0 kubenswrapper[23041]: I0308 00:32:44.388509 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"51c724a5-de89-4fde-b596-70157d2d19b6","Type":"ContainerDied","Data":"4c868aa7418a56f30943c51f9eac6d8da436a1691ad5c481be6a7490d78054a2"} Mar 08 00:32:44.388539 master-0 kubenswrapper[23041]: I0308 00:32:44.388533 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"51c724a5-de89-4fde-b596-70157d2d19b6","Type":"ContainerDied","Data":"c66564a5256e588a016eb2dcc31a2ec52ebbfbabd58f97c462377ecf84dc302b"} Mar 08 00:32:44.388597 master-0 kubenswrapper[23041]: I0308 00:32:44.388549 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"51c724a5-de89-4fde-b596-70157d2d19b6","Type":"ContainerDied","Data":"3935ac49d115df026834e687281350cf07c7cb60c028378e1b0733fb958a3103"} Mar 08 00:32:44.388597 master-0 kubenswrapper[23041]: I0308 00:32:44.388565 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2ab662059bb326d13a07bf5700e4f545","Type":"ContainerStarted","Data":"c07a90a6c73bfcda370b5c481ce0b2be0b62854bd05e9b2afd0f6e1c4f249694"} Mar 08 00:32:44.388597 master-0 kubenswrapper[23041]: I0308 00:32:44.388580 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"48fcdccb-478e-4027-b4b9-9a061439f0e2","Type":"ContainerDied","Data":"0cbd1fb0e210283b6ccd115d8b0aae824719f69eb7b32ccc55a017669c4605fa"} Mar 08 00:32:44.388597 master-0 kubenswrapper[23041]: I0308 00:32:44.388596 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"0133db83-1083-4458-86d4-49e431dd4365","Type":"ContainerDied","Data":"6ba57f2db9726259ae712e088285735c826dee0e1e18c3454e04e70e18447493"} Mar 08 00:32:44.388701 master-0 kubenswrapper[23041]: I0308 00:32:44.388612 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"0133db83-1083-4458-86d4-49e431dd4365","Type":"ContainerDied","Data":"d99bde344c0c744884322f50bc262e862d206379d4253c8485cb44806af03448"} Mar 08 00:32:44.388701 master-0 kubenswrapper[23041]: I0308 00:32:44.388626 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"51c724a5-de89-4fde-b596-70157d2d19b6","Type":"ContainerDied","Data":"6b4cdbb2f93609d97943c84882bcd164dd02ff5a4b44c594a695ac0540ea519d"} Mar 08 00:32:44.388701 master-0 kubenswrapper[23041]: I0308 00:32:44.388640 23041 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"6b4cdbb2f93609d97943c84882bcd164dd02ff5a4b44c594a695ac0540ea519d"} Mar 08 00:32:44.388701 master-0 kubenswrapper[23041]: I0308 00:32:44.388652 23041 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c66564a5256e588a016eb2dcc31a2ec52ebbfbabd58f97c462377ecf84dc302b"} Mar 08 00:32:44.388701 master-0 kubenswrapper[23041]: I0308 00:32:44.388659 23041 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3935ac49d115df026834e687281350cf07c7cb60c028378e1b0733fb958a3103"} Mar 08 00:32:44.388701 master-0 kubenswrapper[23041]: I0308 00:32:44.388666 23041 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"85445a369f529f5269a8863acca07ead4a125225ffabe9eb1f61166cb663a719"} Mar 08 00:32:44.388701 master-0 kubenswrapper[23041]: I0308 00:32:44.388677 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"8b0395d1-7cb0-4857-891a-68f88a6fd468","Type":"ContainerStarted","Data":"4026150a6114ab872a133f085acdad1899f7a88076a111da9b83e693d0a1a888"} Mar 08 00:32:44.388701 master-0 kubenswrapper[23041]: I0308 00:32:44.388689 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2ab662059bb326d13a07bf5700e4f545","Type":"ContainerStarted","Data":"098d17f749452ad5be8665c35010c152a45df97f56bb2ecbb202549c56ee2e8a"} Mar 08 00:32:44.390685 master-0 kubenswrapper[23041]: I0308 00:32:44.389421 23041 patch_prober.go:28] interesting pod/downloads-84f57b9877-8g27w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.89:8080/\": dial tcp 10.128.0.89:8080: connect: connection refused" start-of-body= Mar 08 00:32:44.390685 master-0 kubenswrapper[23041]: I0308 00:32:44.389486 
23041 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-84f57b9877-8g27w" podUID="414dbe5d-16a5-4765-9dc5-d50c0784ace7" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.89:8080/\": dial tcp 10.128.0.89:8080: connect: connection refused" Mar 08 00:32:44.392847 master-0 kubenswrapper[23041]: I0308 00:32:44.392347 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Mar 08 00:32:44.392847 master-0 kubenswrapper[23041]: I0308 00:32:44.392407 23041 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="cbacbe5f-8168-4f00-84c1-d35e4cd4bddb" Mar 08 00:32:44.411384 master-0 kubenswrapper[23041]: I0308 00:32:44.411339 23041 scope.go:117] "RemoveContainer" containerID="4d97b024b5f9364312d8782f69ef5377574cad76ca1feb7a9771cf6068169e82" Mar 08 00:32:44.411931 master-0 kubenswrapper[23041]: E0308 00:32:44.411877 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d97b024b5f9364312d8782f69ef5377574cad76ca1feb7a9771cf6068169e82\": container with ID starting with 4d97b024b5f9364312d8782f69ef5377574cad76ca1feb7a9771cf6068169e82 not found: ID does not exist" containerID="4d97b024b5f9364312d8782f69ef5377574cad76ca1feb7a9771cf6068169e82" Mar 08 00:32:44.411997 master-0 kubenswrapper[23041]: I0308 00:32:44.411945 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d97b024b5f9364312d8782f69ef5377574cad76ca1feb7a9771cf6068169e82"} err="failed to get container status \"4d97b024b5f9364312d8782f69ef5377574cad76ca1feb7a9771cf6068169e82\": rpc error: code = NotFound desc = could not find container \"4d97b024b5f9364312d8782f69ef5377574cad76ca1feb7a9771cf6068169e82\": container with ID starting with 
4d97b024b5f9364312d8782f69ef5377574cad76ca1feb7a9771cf6068169e82 not found: ID does not exist" Mar 08 00:32:44.411997 master-0 kubenswrapper[23041]: I0308 00:32:44.411982 23041 scope.go:117] "RemoveContainer" containerID="0ff7bbc7a985bc357be1e5b9c59697e1d623b2b3f3a45fbbbeea036170f5e8b0" Mar 08 00:32:44.426121 master-0 kubenswrapper[23041]: I0308 00:32:44.426073 23041 scope.go:117] "RemoveContainer" containerID="1218edf42145af942d644b560a08c25d071edefe5ebdbdbb1dda99cfd07700fd" Mar 08 00:32:44.440543 master-0 kubenswrapper[23041]: I0308 00:32:44.440498 23041 scope.go:117] "RemoveContainer" containerID="09882f77899e1a73f2e7f7b1d393cad387349597cd777096a1f2accf4684e1d0" Mar 08 00:32:44.453728 master-0 kubenswrapper[23041]: I0308 00:32:44.453696 23041 scope.go:117] "RemoveContainer" containerID="0ff7bbc7a985bc357be1e5b9c59697e1d623b2b3f3a45fbbbeea036170f5e8b0" Mar 08 00:32:44.454176 master-0 kubenswrapper[23041]: E0308 00:32:44.454127 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ff7bbc7a985bc357be1e5b9c59697e1d623b2b3f3a45fbbbeea036170f5e8b0\": container with ID starting with 0ff7bbc7a985bc357be1e5b9c59697e1d623b2b3f3a45fbbbeea036170f5e8b0 not found: ID does not exist" containerID="0ff7bbc7a985bc357be1e5b9c59697e1d623b2b3f3a45fbbbeea036170f5e8b0" Mar 08 00:32:44.454289 master-0 kubenswrapper[23041]: I0308 00:32:44.454168 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ff7bbc7a985bc357be1e5b9c59697e1d623b2b3f3a45fbbbeea036170f5e8b0"} err="failed to get container status \"0ff7bbc7a985bc357be1e5b9c59697e1d623b2b3f3a45fbbbeea036170f5e8b0\": rpc error: code = NotFound desc = could not find container \"0ff7bbc7a985bc357be1e5b9c59697e1d623b2b3f3a45fbbbeea036170f5e8b0\": container with ID starting with 0ff7bbc7a985bc357be1e5b9c59697e1d623b2b3f3a45fbbbeea036170f5e8b0 not found: ID does not exist" Mar 08 00:32:44.454289 master-0 
kubenswrapper[23041]: I0308 00:32:44.454190 23041 scope.go:117] "RemoveContainer" containerID="1218edf42145af942d644b560a08c25d071edefe5ebdbdbb1dda99cfd07700fd" Mar 08 00:32:44.454517 master-0 kubenswrapper[23041]: E0308 00:32:44.454489 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1218edf42145af942d644b560a08c25d071edefe5ebdbdbb1dda99cfd07700fd\": container with ID starting with 1218edf42145af942d644b560a08c25d071edefe5ebdbdbb1dda99cfd07700fd not found: ID does not exist" containerID="1218edf42145af942d644b560a08c25d071edefe5ebdbdbb1dda99cfd07700fd" Mar 08 00:32:44.454586 master-0 kubenswrapper[23041]: I0308 00:32:44.454517 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1218edf42145af942d644b560a08c25d071edefe5ebdbdbb1dda99cfd07700fd"} err="failed to get container status \"1218edf42145af942d644b560a08c25d071edefe5ebdbdbb1dda99cfd07700fd\": rpc error: code = NotFound desc = could not find container \"1218edf42145af942d644b560a08c25d071edefe5ebdbdbb1dda99cfd07700fd\": container with ID starting with 1218edf42145af942d644b560a08c25d071edefe5ebdbdbb1dda99cfd07700fd not found: ID does not exist" Mar 08 00:32:44.454586 master-0 kubenswrapper[23041]: I0308 00:32:44.454542 23041 scope.go:117] "RemoveContainer" containerID="09882f77899e1a73f2e7f7b1d393cad387349597cd777096a1f2accf4684e1d0" Mar 08 00:32:44.454586 master-0 kubenswrapper[23041]: I0308 00:32:44.454539 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51c724a5-de89-4fde-b596-70157d2d19b6-alertmanager-trusted-ca-bundle\") pod \"51c724a5-de89-4fde-b596-70157d2d19b6\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " Mar 08 00:32:44.454734 master-0 kubenswrapper[23041]: I0308 00:32:44.454593 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"web-config\" (UniqueName: \"kubernetes.io/secret/51c724a5-de89-4fde-b596-70157d2d19b6-web-config\") pod \"51c724a5-de89-4fde-b596-70157d2d19b6\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " Mar 08 00:32:44.454734 master-0 kubenswrapper[23041]: I0308 00:32:44.454714 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/51c724a5-de89-4fde-b596-70157d2d19b6-secret-alertmanager-kube-rbac-proxy-metric\") pod \"51c724a5-de89-4fde-b596-70157d2d19b6\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " Mar 08 00:32:44.454828 master-0 kubenswrapper[23041]: E0308 00:32:44.454778 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09882f77899e1a73f2e7f7b1d393cad387349597cd777096a1f2accf4684e1d0\": container with ID starting with 09882f77899e1a73f2e7f7b1d393cad387349597cd777096a1f2accf4684e1d0 not found: ID does not exist" containerID="09882f77899e1a73f2e7f7b1d393cad387349597cd777096a1f2accf4684e1d0" Mar 08 00:32:44.454828 master-0 kubenswrapper[23041]: I0308 00:32:44.454800 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09882f77899e1a73f2e7f7b1d393cad387349597cd777096a1f2accf4684e1d0"} err="failed to get container status \"09882f77899e1a73f2e7f7b1d393cad387349597cd777096a1f2accf4684e1d0\": rpc error: code = NotFound desc = could not find container \"09882f77899e1a73f2e7f7b1d393cad387349597cd777096a1f2accf4684e1d0\": container with ID starting with 09882f77899e1a73f2e7f7b1d393cad387349597cd777096a1f2accf4684e1d0 not found: ID does not exist" Mar 08 00:32:44.454828 master-0 kubenswrapper[23041]: I0308 00:32:44.454818 23041 scope.go:117] "RemoveContainer" containerID="0ff7bbc7a985bc357be1e5b9c59697e1d623b2b3f3a45fbbbeea036170f5e8b0" Mar 08 00:32:44.454828 master-0 kubenswrapper[23041]: I0308 00:32:44.454823 23041 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/51c724a5-de89-4fde-b596-70157d2d19b6-metrics-client-ca\") pod \"51c724a5-de89-4fde-b596-70157d2d19b6\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " Mar 08 00:32:44.455000 master-0 kubenswrapper[23041]: I0308 00:32:44.454851 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/51c724a5-de89-4fde-b596-70157d2d19b6-config-out\") pod \"51c724a5-de89-4fde-b596-70157d2d19b6\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " Mar 08 00:32:44.455000 master-0 kubenswrapper[23041]: I0308 00:32:44.454904 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/51c724a5-de89-4fde-b596-70157d2d19b6-config-volume\") pod \"51c724a5-de89-4fde-b596-70157d2d19b6\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " Mar 08 00:32:44.455000 master-0 kubenswrapper[23041]: I0308 00:32:44.454934 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/51c724a5-de89-4fde-b596-70157d2d19b6-secret-alertmanager-main-tls\") pod \"51c724a5-de89-4fde-b596-70157d2d19b6\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " Mar 08 00:32:44.455000 master-0 kubenswrapper[23041]: I0308 00:32:44.454955 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjqpt\" (UniqueName: \"kubernetes.io/projected/51c724a5-de89-4fde-b596-70157d2d19b6-kube-api-access-wjqpt\") pod \"51c724a5-de89-4fde-b596-70157d2d19b6\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") " Mar 08 00:32:44.455000 master-0 kubenswrapper[23041]: I0308 00:32:44.454978 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/51c724a5-de89-4fde-b596-70157d2d19b6-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "51c724a5-de89-4fde-b596-70157d2d19b6" (UID: "51c724a5-de89-4fde-b596-70157d2d19b6"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:32:44.455240 master-0 kubenswrapper[23041]: I0308 00:32:44.455028 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/51c724a5-de89-4fde-b596-70157d2d19b6-secret-alertmanager-kube-rbac-proxy\") pod \"51c724a5-de89-4fde-b596-70157d2d19b6\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") "
Mar 08 00:32:44.455240 master-0 kubenswrapper[23041]: I0308 00:32:44.455047 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ff7bbc7a985bc357be1e5b9c59697e1d623b2b3f3a45fbbbeea036170f5e8b0"} err="failed to get container status \"0ff7bbc7a985bc357be1e5b9c59697e1d623b2b3f3a45fbbbeea036170f5e8b0\": rpc error: code = NotFound desc = could not find container \"0ff7bbc7a985bc357be1e5b9c59697e1d623b2b3f3a45fbbbeea036170f5e8b0\": container with ID starting with 0ff7bbc7a985bc357be1e5b9c59697e1d623b2b3f3a45fbbbeea036170f5e8b0 not found: ID does not exist"
Mar 08 00:32:44.455240 master-0 kubenswrapper[23041]: I0308 00:32:44.455066 23041 scope.go:117] "RemoveContainer" containerID="1218edf42145af942d644b560a08c25d071edefe5ebdbdbb1dda99cfd07700fd"
Mar 08 00:32:44.455240 master-0 kubenswrapper[23041]: I0308 00:32:44.455055 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/51c724a5-de89-4fde-b596-70157d2d19b6-tls-assets\") pod \"51c724a5-de89-4fde-b596-70157d2d19b6\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") "
Mar 08 00:32:44.455240 master-0 kubenswrapper[23041]: I0308 00:32:44.455134 23041
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/51c724a5-de89-4fde-b596-70157d2d19b6-alertmanager-main-db\") pod \"51c724a5-de89-4fde-b596-70157d2d19b6\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") "
Mar 08 00:32:44.455240 master-0 kubenswrapper[23041]: I0308 00:32:44.455218 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/51c724a5-de89-4fde-b596-70157d2d19b6-secret-alertmanager-kube-rbac-proxy-web\") pod \"51c724a5-de89-4fde-b596-70157d2d19b6\" (UID: \"51c724a5-de89-4fde-b596-70157d2d19b6\") "
Mar 08 00:32:44.456019 master-0 kubenswrapper[23041]: I0308 00:32:44.455745 23041 reconciler_common.go:293] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51c724a5-de89-4fde-b596-70157d2d19b6-alertmanager-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 00:32:44.456664 master-0 kubenswrapper[23041]: I0308 00:32:44.456633 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51c724a5-de89-4fde-b596-70157d2d19b6-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "51c724a5-de89-4fde-b596-70157d2d19b6" (UID: "51c724a5-de89-4fde-b596-70157d2d19b6"). InnerVolumeSpecName "alertmanager-main-db".
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:32:44.456742 master-0 kubenswrapper[23041]: I0308 00:32:44.456650 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1218edf42145af942d644b560a08c25d071edefe5ebdbdbb1dda99cfd07700fd"} err="failed to get container status \"1218edf42145af942d644b560a08c25d071edefe5ebdbdbb1dda99cfd07700fd\": rpc error: code = NotFound desc = could not find container \"1218edf42145af942d644b560a08c25d071edefe5ebdbdbb1dda99cfd07700fd\": container with ID starting with 1218edf42145af942d644b560a08c25d071edefe5ebdbdbb1dda99cfd07700fd not found: ID does not exist"
Mar 08 00:32:44.456742 master-0 kubenswrapper[23041]: I0308 00:32:44.456705 23041 scope.go:117] "RemoveContainer" containerID="09882f77899e1a73f2e7f7b1d393cad387349597cd777096a1f2accf4684e1d0"
Mar 08 00:32:44.456838 master-0 kubenswrapper[23041]: I0308 00:32:44.456788 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51c724a5-de89-4fde-b596-70157d2d19b6-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "51c724a5-de89-4fde-b596-70157d2d19b6" (UID: "51c724a5-de89-4fde-b596-70157d2d19b6"). InnerVolumeSpecName "metrics-client-ca".
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:32:44.457061 master-0 kubenswrapper[23041]: I0308 00:32:44.457017 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09882f77899e1a73f2e7f7b1d393cad387349597cd777096a1f2accf4684e1d0"} err="failed to get container status \"09882f77899e1a73f2e7f7b1d393cad387349597cd777096a1f2accf4684e1d0\": rpc error: code = NotFound desc = could not find container \"09882f77899e1a73f2e7f7b1d393cad387349597cd777096a1f2accf4684e1d0\": container with ID starting with 09882f77899e1a73f2e7f7b1d393cad387349597cd777096a1f2accf4684e1d0 not found: ID does not exist"
Mar 08 00:32:44.457061 master-0 kubenswrapper[23041]: I0308 00:32:44.457055 23041 scope.go:117] "RemoveContainer" containerID="77c439a84aa164ee07c9e5c171aea2eb6918060c5491a7828582c33b62bf08b7"
Mar 08 00:32:44.458687 master-0 kubenswrapper[23041]: I0308 00:32:44.458647 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51c724a5-de89-4fde-b596-70157d2d19b6-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "51c724a5-de89-4fde-b596-70157d2d19b6" (UID: "51c724a5-de89-4fde-b596-70157d2d19b6"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:32:44.459188 master-0 kubenswrapper[23041]: I0308 00:32:44.459159 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51c724a5-de89-4fde-b596-70157d2d19b6-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "51c724a5-de89-4fde-b596-70157d2d19b6" (UID: "51c724a5-de89-4fde-b596-70157d2d19b6"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:32:44.459313 master-0 kubenswrapper[23041]: I0308 00:32:44.459210 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51c724a5-de89-4fde-b596-70157d2d19b6-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "51c724a5-de89-4fde-b596-70157d2d19b6" (UID: "51c724a5-de89-4fde-b596-70157d2d19b6"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:32:44.459446 master-0 kubenswrapper[23041]: I0308 00:32:44.459413 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51c724a5-de89-4fde-b596-70157d2d19b6-config-out" (OuterVolumeSpecName: "config-out") pod "51c724a5-de89-4fde-b596-70157d2d19b6" (UID: "51c724a5-de89-4fde-b596-70157d2d19b6"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:32:44.459704 master-0 kubenswrapper[23041]: I0308 00:32:44.459655 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51c724a5-de89-4fde-b596-70157d2d19b6-config-volume" (OuterVolumeSpecName: "config-volume") pod "51c724a5-de89-4fde-b596-70157d2d19b6" (UID: "51c724a5-de89-4fde-b596-70157d2d19b6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:32:44.460026 master-0 kubenswrapper[23041]: I0308 00:32:44.459991 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51c724a5-de89-4fde-b596-70157d2d19b6-kube-api-access-wjqpt" (OuterVolumeSpecName: "kube-api-access-wjqpt") pod "51c724a5-de89-4fde-b596-70157d2d19b6" (UID: "51c724a5-de89-4fde-b596-70157d2d19b6"). InnerVolumeSpecName "kube-api-access-wjqpt".
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:32:44.462495 master-0 kubenswrapper[23041]: I0308 00:32:44.462446 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51c724a5-de89-4fde-b596-70157d2d19b6-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "51c724a5-de89-4fde-b596-70157d2d19b6" (UID: "51c724a5-de89-4fde-b596-70157d2d19b6"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:32:44.463635 master-0 kubenswrapper[23041]: I0308 00:32:44.463583 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51c724a5-de89-4fde-b596-70157d2d19b6-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "51c724a5-de89-4fde-b596-70157d2d19b6" (UID: "51c724a5-de89-4fde-b596-70157d2d19b6"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:32:44.511272 master-0 kubenswrapper[23041]: I0308 00:32:44.511198 23041 scope.go:117] "RemoveContainer" containerID="3aa4df6d988af0a0fbf0ca91c3c8199b19f529f21f24176f890bfab0c3b25b24"
Mar 08 00:32:44.519033 master-0 kubenswrapper[23041]: I0308 00:32:44.518978 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51c724a5-de89-4fde-b596-70157d2d19b6-web-config" (OuterVolumeSpecName: "web-config") pod "51c724a5-de89-4fde-b596-70157d2d19b6" (UID: "51c724a5-de89-4fde-b596-70157d2d19b6"). InnerVolumeSpecName "web-config".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:32:44.533503 master-0 kubenswrapper[23041]: I0308 00:32:44.533450 23041 scope.go:117] "RemoveContainer" containerID="4c868aa7418a56f30943c51f9eac6d8da436a1691ad5c481be6a7490d78054a2"
Mar 08 00:32:44.558362 master-0 kubenswrapper[23041]: I0308 00:32:44.557954 23041 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/51c724a5-de89-4fde-b596-70157d2d19b6-secret-alertmanager-main-tls\") on node \"master-0\" DevicePath \"\""
Mar 08 00:32:44.558362 master-0 kubenswrapper[23041]: I0308 00:32:44.558002 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjqpt\" (UniqueName: \"kubernetes.io/projected/51c724a5-de89-4fde-b596-70157d2d19b6-kube-api-access-wjqpt\") on node \"master-0\" DevicePath \"\""
Mar 08 00:32:44.558362 master-0 kubenswrapper[23041]: I0308 00:32:44.558026 23041 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/51c724a5-de89-4fde-b596-70157d2d19b6-secret-alertmanager-kube-rbac-proxy\") on node \"master-0\" DevicePath \"\""
Mar 08 00:32:44.558362 master-0 kubenswrapper[23041]: I0308 00:32:44.558040 23041 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/51c724a5-de89-4fde-b596-70157d2d19b6-tls-assets\") on node \"master-0\" DevicePath \"\""
Mar 08 00:32:44.558362 master-0 kubenswrapper[23041]: I0308 00:32:44.558055 23041 reconciler_common.go:293] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/51c724a5-de89-4fde-b596-70157d2d19b6-alertmanager-main-db\") on node \"master-0\" DevicePath \"\""
Mar 08 00:32:44.558362 master-0 kubenswrapper[23041]: I0308 00:32:44.558067 23041 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName:
\"kubernetes.io/secret/51c724a5-de89-4fde-b596-70157d2d19b6-secret-alertmanager-kube-rbac-proxy-web\") on node \"master-0\" DevicePath \"\""
Mar 08 00:32:44.558362 master-0 kubenswrapper[23041]: I0308 00:32:44.558081 23041 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/51c724a5-de89-4fde-b596-70157d2d19b6-web-config\") on node \"master-0\" DevicePath \"\""
Mar 08 00:32:44.558362 master-0 kubenswrapper[23041]: I0308 00:32:44.558092 23041 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/51c724a5-de89-4fde-b596-70157d2d19b6-secret-alertmanager-kube-rbac-proxy-metric\") on node \"master-0\" DevicePath \"\""
Mar 08 00:32:44.558362 master-0 kubenswrapper[23041]: I0308 00:32:44.558107 23041 reconciler_common.go:293] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/51c724a5-de89-4fde-b596-70157d2d19b6-metrics-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 08 00:32:44.558362 master-0 kubenswrapper[23041]: I0308 00:32:44.558119 23041 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/51c724a5-de89-4fde-b596-70157d2d19b6-config-out\") on node \"master-0\" DevicePath \"\""
Mar 08 00:32:44.558362 master-0 kubenswrapper[23041]: I0308 00:32:44.558130 23041 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/51c724a5-de89-4fde-b596-70157d2d19b6-config-volume\") on node \"master-0\" DevicePath \"\""
Mar 08 00:32:44.561184 master-0 kubenswrapper[23041]: I0308 00:32:44.561053 23041 scope.go:117] "RemoveContainer" containerID="6b4cdbb2f93609d97943c84882bcd164dd02ff5a4b44c594a695ac0540ea519d"
Mar 08 00:32:44.576622 master-0 kubenswrapper[23041]: I0308 00:32:44.576578 23041 scope.go:117] "RemoveContainer" containerID="c66564a5256e588a016eb2dcc31a2ec52ebbfbabd58f97c462377ecf84dc302b"
Mar 08 00:32:44.589731 master-0 kubenswrapper[23041]: I0308 00:32:44.589683 23041 scope.go:117] "RemoveContainer" containerID="3935ac49d115df026834e687281350cf07c7cb60c028378e1b0733fb958a3103"
Mar 08 00:32:44.610950 master-0 kubenswrapper[23041]: I0308 00:32:44.610914 23041 scope.go:117] "RemoveContainer" containerID="85445a369f529f5269a8863acca07ead4a125225ffabe9eb1f61166cb663a719"
Mar 08 00:32:44.636112 master-0 kubenswrapper[23041]: I0308 00:32:44.635959 23041 scope.go:117] "RemoveContainer" containerID="77c439a84aa164ee07c9e5c171aea2eb6918060c5491a7828582c33b62bf08b7"
Mar 08 00:32:44.636528 master-0 kubenswrapper[23041]: E0308 00:32:44.636400 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77c439a84aa164ee07c9e5c171aea2eb6918060c5491a7828582c33b62bf08b7\": container with ID starting with 77c439a84aa164ee07c9e5c171aea2eb6918060c5491a7828582c33b62bf08b7 not found: ID does not exist" containerID="77c439a84aa164ee07c9e5c171aea2eb6918060c5491a7828582c33b62bf08b7"
Mar 08 00:32:44.636528 master-0 kubenswrapper[23041]: I0308 00:32:44.636430 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77c439a84aa164ee07c9e5c171aea2eb6918060c5491a7828582c33b62bf08b7"} err="failed to get container status \"77c439a84aa164ee07c9e5c171aea2eb6918060c5491a7828582c33b62bf08b7\": rpc error: code = NotFound desc = could not find container \"77c439a84aa164ee07c9e5c171aea2eb6918060c5491a7828582c33b62bf08b7\": container with ID starting with 77c439a84aa164ee07c9e5c171aea2eb6918060c5491a7828582c33b62bf08b7 not found: ID does not exist"
Mar 08 00:32:44.636528 master-0 kubenswrapper[23041]: I0308 00:32:44.636482 23041 scope.go:117] "RemoveContainer" containerID="3aa4df6d988af0a0fbf0ca91c3c8199b19f529f21f24176f890bfab0c3b25b24"
Mar 08 00:32:44.636917 master-0 kubenswrapper[23041]: E0308 00:32:44.636837 23041 log.go:32] "ContainerStatus from runtime service failed"
err="rpc error: code = NotFound desc = could not find container \"3aa4df6d988af0a0fbf0ca91c3c8199b19f529f21f24176f890bfab0c3b25b24\": container with ID starting with 3aa4df6d988af0a0fbf0ca91c3c8199b19f529f21f24176f890bfab0c3b25b24 not found: ID does not exist" containerID="3aa4df6d988af0a0fbf0ca91c3c8199b19f529f21f24176f890bfab0c3b25b24"
Mar 08 00:32:44.636917 master-0 kubenswrapper[23041]: I0308 00:32:44.636858 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aa4df6d988af0a0fbf0ca91c3c8199b19f529f21f24176f890bfab0c3b25b24"} err="failed to get container status \"3aa4df6d988af0a0fbf0ca91c3c8199b19f529f21f24176f890bfab0c3b25b24\": rpc error: code = NotFound desc = could not find container \"3aa4df6d988af0a0fbf0ca91c3c8199b19f529f21f24176f890bfab0c3b25b24\": container with ID starting with 3aa4df6d988af0a0fbf0ca91c3c8199b19f529f21f24176f890bfab0c3b25b24 not found: ID does not exist"
Mar 08 00:32:44.636917 master-0 kubenswrapper[23041]: I0308 00:32:44.636873 23041 scope.go:117] "RemoveContainer" containerID="4c868aa7418a56f30943c51f9eac6d8da436a1691ad5c481be6a7490d78054a2"
Mar 08 00:32:44.637364 master-0 kubenswrapper[23041]: E0308 00:32:44.637279 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c868aa7418a56f30943c51f9eac6d8da436a1691ad5c481be6a7490d78054a2\": container with ID starting with 4c868aa7418a56f30943c51f9eac6d8da436a1691ad5c481be6a7490d78054a2 not found: ID does not exist" containerID="4c868aa7418a56f30943c51f9eac6d8da436a1691ad5c481be6a7490d78054a2"
Mar 08 00:32:44.637364 master-0 kubenswrapper[23041]: I0308 00:32:44.637301 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c868aa7418a56f30943c51f9eac6d8da436a1691ad5c481be6a7490d78054a2"} err="failed to get container status \"4c868aa7418a56f30943c51f9eac6d8da436a1691ad5c481be6a7490d78054a2\": rpc error: code = NotFound
desc = could not find container \"4c868aa7418a56f30943c51f9eac6d8da436a1691ad5c481be6a7490d78054a2\": container with ID starting with 4c868aa7418a56f30943c51f9eac6d8da436a1691ad5c481be6a7490d78054a2 not found: ID does not exist"
Mar 08 00:32:44.637364 master-0 kubenswrapper[23041]: I0308 00:32:44.637319 23041 scope.go:117] "RemoveContainer" containerID="6b4cdbb2f93609d97943c84882bcd164dd02ff5a4b44c594a695ac0540ea519d"
Mar 08 00:32:44.637775 master-0 kubenswrapper[23041]: E0308 00:32:44.637683 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b4cdbb2f93609d97943c84882bcd164dd02ff5a4b44c594a695ac0540ea519d\": container with ID starting with 6b4cdbb2f93609d97943c84882bcd164dd02ff5a4b44c594a695ac0540ea519d not found: ID does not exist" containerID="6b4cdbb2f93609d97943c84882bcd164dd02ff5a4b44c594a695ac0540ea519d"
Mar 08 00:32:44.637775 master-0 kubenswrapper[23041]: I0308 00:32:44.637704 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b4cdbb2f93609d97943c84882bcd164dd02ff5a4b44c594a695ac0540ea519d"} err="failed to get container status \"6b4cdbb2f93609d97943c84882bcd164dd02ff5a4b44c594a695ac0540ea519d\": rpc error: code = NotFound desc = could not find container \"6b4cdbb2f93609d97943c84882bcd164dd02ff5a4b44c594a695ac0540ea519d\": container with ID starting with 6b4cdbb2f93609d97943c84882bcd164dd02ff5a4b44c594a695ac0540ea519d not found: ID does not exist"
Mar 08 00:32:44.637775 master-0 kubenswrapper[23041]: I0308 00:32:44.637717 23041 scope.go:117] "RemoveContainer" containerID="c66564a5256e588a016eb2dcc31a2ec52ebbfbabd58f97c462377ecf84dc302b"
Mar 08 00:32:44.638077 master-0 kubenswrapper[23041]: E0308 00:32:44.638010 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c66564a5256e588a016eb2dcc31a2ec52ebbfbabd58f97c462377ecf84dc302b\": container with ID
starting with c66564a5256e588a016eb2dcc31a2ec52ebbfbabd58f97c462377ecf84dc302b not found: ID does not exist" containerID="c66564a5256e588a016eb2dcc31a2ec52ebbfbabd58f97c462377ecf84dc302b"
Mar 08 00:32:44.638124 master-0 kubenswrapper[23041]: I0308 00:32:44.638087 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c66564a5256e588a016eb2dcc31a2ec52ebbfbabd58f97c462377ecf84dc302b"} err="failed to get container status \"c66564a5256e588a016eb2dcc31a2ec52ebbfbabd58f97c462377ecf84dc302b\": rpc error: code = NotFound desc = could not find container \"c66564a5256e588a016eb2dcc31a2ec52ebbfbabd58f97c462377ecf84dc302b\": container with ID starting with c66564a5256e588a016eb2dcc31a2ec52ebbfbabd58f97c462377ecf84dc302b not found: ID does not exist"
Mar 08 00:32:44.638166 master-0 kubenswrapper[23041]: I0308 00:32:44.638127 23041 scope.go:117] "RemoveContainer" containerID="3935ac49d115df026834e687281350cf07c7cb60c028378e1b0733fb958a3103"
Mar 08 00:32:44.639066 master-0 kubenswrapper[23041]: E0308 00:32:44.638962 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3935ac49d115df026834e687281350cf07c7cb60c028378e1b0733fb958a3103\": container with ID starting with 3935ac49d115df026834e687281350cf07c7cb60c028378e1b0733fb958a3103 not found: ID does not exist" containerID="3935ac49d115df026834e687281350cf07c7cb60c028378e1b0733fb958a3103"
Mar 08 00:32:44.639066 master-0 kubenswrapper[23041]: I0308 00:32:44.638985 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3935ac49d115df026834e687281350cf07c7cb60c028378e1b0733fb958a3103"} err="failed to get container status \"3935ac49d115df026834e687281350cf07c7cb60c028378e1b0733fb958a3103\": rpc error: code = NotFound desc = could not find container \"3935ac49d115df026834e687281350cf07c7cb60c028378e1b0733fb958a3103\": container with ID starting with
3935ac49d115df026834e687281350cf07c7cb60c028378e1b0733fb958a3103 not found: ID does not exist"
Mar 08 00:32:44.639066 master-0 kubenswrapper[23041]: I0308 00:32:44.639000 23041 scope.go:117] "RemoveContainer" containerID="85445a369f529f5269a8863acca07ead4a125225ffabe9eb1f61166cb663a719"
Mar 08 00:32:44.639353 master-0 kubenswrapper[23041]: E0308 00:32:44.639322 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85445a369f529f5269a8863acca07ead4a125225ffabe9eb1f61166cb663a719\": container with ID starting with 85445a369f529f5269a8863acca07ead4a125225ffabe9eb1f61166cb663a719 not found: ID does not exist" containerID="85445a369f529f5269a8863acca07ead4a125225ffabe9eb1f61166cb663a719"
Mar 08 00:32:44.639421 master-0 kubenswrapper[23041]: I0308 00:32:44.639358 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85445a369f529f5269a8863acca07ead4a125225ffabe9eb1f61166cb663a719"} err="failed to get container status \"85445a369f529f5269a8863acca07ead4a125225ffabe9eb1f61166cb663a719\": rpc error: code = NotFound desc = could not find container \"85445a369f529f5269a8863acca07ead4a125225ffabe9eb1f61166cb663a719\": container with ID starting with 85445a369f529f5269a8863acca07ead4a125225ffabe9eb1f61166cb663a719 not found: ID does not exist"
Mar 08 00:32:44.639421 master-0 kubenswrapper[23041]: I0308 00:32:44.639381 23041 scope.go:117] "RemoveContainer" containerID="77c439a84aa164ee07c9e5c171aea2eb6918060c5491a7828582c33b62bf08b7"
Mar 08 00:32:44.641158 master-0 kubenswrapper[23041]: I0308 00:32:44.639866 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77c439a84aa164ee07c9e5c171aea2eb6918060c5491a7828582c33b62bf08b7"} err="failed to get container status \"77c439a84aa164ee07c9e5c171aea2eb6918060c5491a7828582c33b62bf08b7\": rpc error: code = NotFound desc = could not find container
\"77c439a84aa164ee07c9e5c171aea2eb6918060c5491a7828582c33b62bf08b7\": container with ID starting with 77c439a84aa164ee07c9e5c171aea2eb6918060c5491a7828582c33b62bf08b7 not found: ID does not exist"
Mar 08 00:32:44.641158 master-0 kubenswrapper[23041]: I0308 00:32:44.639895 23041 scope.go:117] "RemoveContainer" containerID="3aa4df6d988af0a0fbf0ca91c3c8199b19f529f21f24176f890bfab0c3b25b24"
Mar 08 00:32:44.641158 master-0 kubenswrapper[23041]: I0308 00:32:44.640344 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aa4df6d988af0a0fbf0ca91c3c8199b19f529f21f24176f890bfab0c3b25b24"} err="failed to get container status \"3aa4df6d988af0a0fbf0ca91c3c8199b19f529f21f24176f890bfab0c3b25b24\": rpc error: code = NotFound desc = could not find container \"3aa4df6d988af0a0fbf0ca91c3c8199b19f529f21f24176f890bfab0c3b25b24\": container with ID starting with 3aa4df6d988af0a0fbf0ca91c3c8199b19f529f21f24176f890bfab0c3b25b24 not found: ID does not exist"
Mar 08 00:32:44.641158 master-0 kubenswrapper[23041]: I0308 00:32:44.640385 23041 scope.go:117] "RemoveContainer" containerID="4c868aa7418a56f30943c51f9eac6d8da436a1691ad5c481be6a7490d78054a2"
Mar 08 00:32:44.641158 master-0 kubenswrapper[23041]: I0308 00:32:44.640703 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c868aa7418a56f30943c51f9eac6d8da436a1691ad5c481be6a7490d78054a2"} err="failed to get container status \"4c868aa7418a56f30943c51f9eac6d8da436a1691ad5c481be6a7490d78054a2\": rpc error: code = NotFound desc = could not find container \"4c868aa7418a56f30943c51f9eac6d8da436a1691ad5c481be6a7490d78054a2\": container with ID starting with 4c868aa7418a56f30943c51f9eac6d8da436a1691ad5c481be6a7490d78054a2 not found: ID does not exist"
Mar 08 00:32:44.641158 master-0 kubenswrapper[23041]: I0308 00:32:44.640724 23041 scope.go:117] "RemoveContainer" containerID="6b4cdbb2f93609d97943c84882bcd164dd02ff5a4b44c594a695ac0540ea519d"
Mar 08
00:32:44.641158 master-0 kubenswrapper[23041]: I0308 00:32:44.641150 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b4cdbb2f93609d97943c84882bcd164dd02ff5a4b44c594a695ac0540ea519d"} err="failed to get container status \"6b4cdbb2f93609d97943c84882bcd164dd02ff5a4b44c594a695ac0540ea519d\": rpc error: code = NotFound desc = could not find container \"6b4cdbb2f93609d97943c84882bcd164dd02ff5a4b44c594a695ac0540ea519d\": container with ID starting with 6b4cdbb2f93609d97943c84882bcd164dd02ff5a4b44c594a695ac0540ea519d not found: ID does not exist"
Mar 08 00:32:44.641647 master-0 kubenswrapper[23041]: I0308 00:32:44.641172 23041 scope.go:117] "RemoveContainer" containerID="c66564a5256e588a016eb2dcc31a2ec52ebbfbabd58f97c462377ecf84dc302b"
Mar 08 00:32:44.641647 master-0 kubenswrapper[23041]: I0308 00:32:44.641567 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c66564a5256e588a016eb2dcc31a2ec52ebbfbabd58f97c462377ecf84dc302b"} err="failed to get container status \"c66564a5256e588a016eb2dcc31a2ec52ebbfbabd58f97c462377ecf84dc302b\": rpc error: code = NotFound desc = could not find container \"c66564a5256e588a016eb2dcc31a2ec52ebbfbabd58f97c462377ecf84dc302b\": container with ID starting with c66564a5256e588a016eb2dcc31a2ec52ebbfbabd58f97c462377ecf84dc302b not found: ID does not exist"
Mar 08 00:32:44.641647 master-0 kubenswrapper[23041]: I0308 00:32:44.641588 23041 scope.go:117] "RemoveContainer" containerID="3935ac49d115df026834e687281350cf07c7cb60c028378e1b0733fb958a3103"
Mar 08 00:32:44.642253 master-0 kubenswrapper[23041]: I0308 00:32:44.641902 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3935ac49d115df026834e687281350cf07c7cb60c028378e1b0733fb958a3103"} err="failed to get container status \"3935ac49d115df026834e687281350cf07c7cb60c028378e1b0733fb958a3103\": rpc error: code = NotFound desc = could not find
container \"3935ac49d115df026834e687281350cf07c7cb60c028378e1b0733fb958a3103\": container with ID starting with 3935ac49d115df026834e687281350cf07c7cb60c028378e1b0733fb958a3103 not found: ID does not exist"
Mar 08 00:32:44.642253 master-0 kubenswrapper[23041]: I0308 00:32:44.641926 23041 scope.go:117] "RemoveContainer" containerID="85445a369f529f5269a8863acca07ead4a125225ffabe9eb1f61166cb663a719"
Mar 08 00:32:44.642367 master-0 kubenswrapper[23041]: I0308 00:32:44.642311 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85445a369f529f5269a8863acca07ead4a125225ffabe9eb1f61166cb663a719"} err="failed to get container status \"85445a369f529f5269a8863acca07ead4a125225ffabe9eb1f61166cb663a719\": rpc error: code = NotFound desc = could not find container \"85445a369f529f5269a8863acca07ead4a125225ffabe9eb1f61166cb663a719\": container with ID starting with 85445a369f529f5269a8863acca07ead4a125225ffabe9eb1f61166cb663a719 not found: ID does not exist"
Mar 08 00:32:44.642367 master-0 kubenswrapper[23041]: I0308 00:32:44.642333 23041 scope.go:117] "RemoveContainer" containerID="77c439a84aa164ee07c9e5c171aea2eb6918060c5491a7828582c33b62bf08b7"
Mar 08 00:32:44.642574 master-0 kubenswrapper[23041]: I0308 00:32:44.642544 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77c439a84aa164ee07c9e5c171aea2eb6918060c5491a7828582c33b62bf08b7"} err="failed to get container status \"77c439a84aa164ee07c9e5c171aea2eb6918060c5491a7828582c33b62bf08b7\": rpc error: code = NotFound desc = could not find container \"77c439a84aa164ee07c9e5c171aea2eb6918060c5491a7828582c33b62bf08b7\": container with ID starting with 77c439a84aa164ee07c9e5c171aea2eb6918060c5491a7828582c33b62bf08b7 not found: ID does not exist"
Mar 08 00:32:44.642640 master-0 kubenswrapper[23041]: I0308 00:32:44.642623 23041 scope.go:117] "RemoveContainer" containerID="3aa4df6d988af0a0fbf0ca91c3c8199b19f529f21f24176f890bfab0c3b25b24"
Mar 08 00:32:44.643073 master-0 kubenswrapper[23041]: I0308 00:32:44.643047 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aa4df6d988af0a0fbf0ca91c3c8199b19f529f21f24176f890bfab0c3b25b24"} err="failed to get container status \"3aa4df6d988af0a0fbf0ca91c3c8199b19f529f21f24176f890bfab0c3b25b24\": rpc error: code = NotFound desc = could not find container \"3aa4df6d988af0a0fbf0ca91c3c8199b19f529f21f24176f890bfab0c3b25b24\": container with ID starting with 3aa4df6d988af0a0fbf0ca91c3c8199b19f529f21f24176f890bfab0c3b25b24 not found: ID does not exist"
Mar 08 00:32:44.643073 master-0 kubenswrapper[23041]: I0308 00:32:44.643072 23041 scope.go:117] "RemoveContainer" containerID="4c868aa7418a56f30943c51f9eac6d8da436a1691ad5c481be6a7490d78054a2"
Mar 08 00:32:44.643556 master-0 kubenswrapper[23041]: I0308 00:32:44.643529 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c868aa7418a56f30943c51f9eac6d8da436a1691ad5c481be6a7490d78054a2"} err="failed to get container status \"4c868aa7418a56f30943c51f9eac6d8da436a1691ad5c481be6a7490d78054a2\": rpc error: code = NotFound desc = could not find container \"4c868aa7418a56f30943c51f9eac6d8da436a1691ad5c481be6a7490d78054a2\": container with ID starting with 4c868aa7418a56f30943c51f9eac6d8da436a1691ad5c481be6a7490d78054a2 not found: ID does not exist"
Mar 08 00:32:44.643556 master-0 kubenswrapper[23041]: I0308 00:32:44.643553 23041 scope.go:117] "RemoveContainer" containerID="6b4cdbb2f93609d97943c84882bcd164dd02ff5a4b44c594a695ac0540ea519d"
Mar 08 00:32:44.643953 master-0 kubenswrapper[23041]: I0308 00:32:44.643822 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b4cdbb2f93609d97943c84882bcd164dd02ff5a4b44c594a695ac0540ea519d"} err="failed to get container status \"6b4cdbb2f93609d97943c84882bcd164dd02ff5a4b44c594a695ac0540ea519d\": rpc error: code = NotFound desc = could not find
container \"6b4cdbb2f93609d97943c84882bcd164dd02ff5a4b44c594a695ac0540ea519d\": container with ID starting with 6b4cdbb2f93609d97943c84882bcd164dd02ff5a4b44c594a695ac0540ea519d not found: ID does not exist"
Mar 08 00:32:44.643953 master-0 kubenswrapper[23041]: I0308 00:32:44.643874 23041 scope.go:117] "RemoveContainer" containerID="c66564a5256e588a016eb2dcc31a2ec52ebbfbabd58f97c462377ecf84dc302b"
Mar 08 00:32:44.644342 master-0 kubenswrapper[23041]: I0308 00:32:44.644185 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c66564a5256e588a016eb2dcc31a2ec52ebbfbabd58f97c462377ecf84dc302b"} err="failed to get container status \"c66564a5256e588a016eb2dcc31a2ec52ebbfbabd58f97c462377ecf84dc302b\": rpc error: code = NotFound desc = could not find container \"c66564a5256e588a016eb2dcc31a2ec52ebbfbabd58f97c462377ecf84dc302b\": container with ID starting with c66564a5256e588a016eb2dcc31a2ec52ebbfbabd58f97c462377ecf84dc302b not found: ID does not exist"
Mar 08 00:32:44.644342 master-0 kubenswrapper[23041]: I0308 00:32:44.644263 23041 scope.go:117] "RemoveContainer" containerID="3935ac49d115df026834e687281350cf07c7cb60c028378e1b0733fb958a3103"
Mar 08 00:32:44.644580 master-0 kubenswrapper[23041]: I0308 00:32:44.644546 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3935ac49d115df026834e687281350cf07c7cb60c028378e1b0733fb958a3103"} err="failed to get container status \"3935ac49d115df026834e687281350cf07c7cb60c028378e1b0733fb958a3103\": rpc error: code = NotFound desc = could not find container \"3935ac49d115df026834e687281350cf07c7cb60c028378e1b0733fb958a3103\": container with ID starting with 3935ac49d115df026834e687281350cf07c7cb60c028378e1b0733fb958a3103 not found: ID does not exist"
Mar 08 00:32:44.644580 master-0 kubenswrapper[23041]: I0308 00:32:44.644578 23041 scope.go:117] "RemoveContainer" containerID="85445a369f529f5269a8863acca07ead4a125225ffabe9eb1f61166cb663a719"
Mar 08 00:32:44.644973 master-0 kubenswrapper[23041]: I0308 00:32:44.644813 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85445a369f529f5269a8863acca07ead4a125225ffabe9eb1f61166cb663a719"} err="failed to get container status \"85445a369f529f5269a8863acca07ead4a125225ffabe9eb1f61166cb663a719\": rpc error: code = NotFound desc = could not find container \"85445a369f529f5269a8863acca07ead4a125225ffabe9eb1f61166cb663a719\": container with ID starting with 85445a369f529f5269a8863acca07ead4a125225ffabe9eb1f61166cb663a719 not found: ID does not exist" Mar 08 00:32:44.644973 master-0 kubenswrapper[23041]: I0308 00:32:44.644837 23041 scope.go:117] "RemoveContainer" containerID="77c439a84aa164ee07c9e5c171aea2eb6918060c5491a7828582c33b62bf08b7" Mar 08 00:32:44.645138 master-0 kubenswrapper[23041]: I0308 00:32:44.645113 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77c439a84aa164ee07c9e5c171aea2eb6918060c5491a7828582c33b62bf08b7"} err="failed to get container status \"77c439a84aa164ee07c9e5c171aea2eb6918060c5491a7828582c33b62bf08b7\": rpc error: code = NotFound desc = could not find container \"77c439a84aa164ee07c9e5c171aea2eb6918060c5491a7828582c33b62bf08b7\": container with ID starting with 77c439a84aa164ee07c9e5c171aea2eb6918060c5491a7828582c33b62bf08b7 not found: ID does not exist" Mar 08 00:32:44.645260 master-0 kubenswrapper[23041]: I0308 00:32:44.645137 23041 scope.go:117] "RemoveContainer" containerID="3aa4df6d988af0a0fbf0ca91c3c8199b19f529f21f24176f890bfab0c3b25b24" Mar 08 00:32:44.645641 master-0 kubenswrapper[23041]: I0308 00:32:44.645539 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aa4df6d988af0a0fbf0ca91c3c8199b19f529f21f24176f890bfab0c3b25b24"} err="failed to get container status \"3aa4df6d988af0a0fbf0ca91c3c8199b19f529f21f24176f890bfab0c3b25b24\": rpc error: code = NotFound desc = could not find 
container \"3aa4df6d988af0a0fbf0ca91c3c8199b19f529f21f24176f890bfab0c3b25b24\": container with ID starting with 3aa4df6d988af0a0fbf0ca91c3c8199b19f529f21f24176f890bfab0c3b25b24 not found: ID does not exist" Mar 08 00:32:44.645641 master-0 kubenswrapper[23041]: I0308 00:32:44.645564 23041 scope.go:117] "RemoveContainer" containerID="4c868aa7418a56f30943c51f9eac6d8da436a1691ad5c481be6a7490d78054a2" Mar 08 00:32:44.645858 master-0 kubenswrapper[23041]: I0308 00:32:44.645830 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c868aa7418a56f30943c51f9eac6d8da436a1691ad5c481be6a7490d78054a2"} err="failed to get container status \"4c868aa7418a56f30943c51f9eac6d8da436a1691ad5c481be6a7490d78054a2\": rpc error: code = NotFound desc = could not find container \"4c868aa7418a56f30943c51f9eac6d8da436a1691ad5c481be6a7490d78054a2\": container with ID starting with 4c868aa7418a56f30943c51f9eac6d8da436a1691ad5c481be6a7490d78054a2 not found: ID does not exist" Mar 08 00:32:44.645858 master-0 kubenswrapper[23041]: I0308 00:32:44.645857 23041 scope.go:117] "RemoveContainer" containerID="6b4cdbb2f93609d97943c84882bcd164dd02ff5a4b44c594a695ac0540ea519d" Mar 08 00:32:44.646179 master-0 kubenswrapper[23041]: I0308 00:32:44.646081 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b4cdbb2f93609d97943c84882bcd164dd02ff5a4b44c594a695ac0540ea519d"} err="failed to get container status \"6b4cdbb2f93609d97943c84882bcd164dd02ff5a4b44c594a695ac0540ea519d\": rpc error: code = NotFound desc = could not find container \"6b4cdbb2f93609d97943c84882bcd164dd02ff5a4b44c594a695ac0540ea519d\": container with ID starting with 6b4cdbb2f93609d97943c84882bcd164dd02ff5a4b44c594a695ac0540ea519d not found: ID does not exist" Mar 08 00:32:44.646179 master-0 kubenswrapper[23041]: I0308 00:32:44.646107 23041 scope.go:117] "RemoveContainer" containerID="c66564a5256e588a016eb2dcc31a2ec52ebbfbabd58f97c462377ecf84dc302b" 
Mar 08 00:32:44.646414 master-0 kubenswrapper[23041]: I0308 00:32:44.646384 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c66564a5256e588a016eb2dcc31a2ec52ebbfbabd58f97c462377ecf84dc302b"} err="failed to get container status \"c66564a5256e588a016eb2dcc31a2ec52ebbfbabd58f97c462377ecf84dc302b\": rpc error: code = NotFound desc = could not find container \"c66564a5256e588a016eb2dcc31a2ec52ebbfbabd58f97c462377ecf84dc302b\": container with ID starting with c66564a5256e588a016eb2dcc31a2ec52ebbfbabd58f97c462377ecf84dc302b not found: ID does not exist" Mar 08 00:32:44.646471 master-0 kubenswrapper[23041]: I0308 00:32:44.646412 23041 scope.go:117] "RemoveContainer" containerID="3935ac49d115df026834e687281350cf07c7cb60c028378e1b0733fb958a3103" Mar 08 00:32:44.646677 master-0 kubenswrapper[23041]: I0308 00:32:44.646651 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3935ac49d115df026834e687281350cf07c7cb60c028378e1b0733fb958a3103"} err="failed to get container status \"3935ac49d115df026834e687281350cf07c7cb60c028378e1b0733fb958a3103\": rpc error: code = NotFound desc = could not find container \"3935ac49d115df026834e687281350cf07c7cb60c028378e1b0733fb958a3103\": container with ID starting with 3935ac49d115df026834e687281350cf07c7cb60c028378e1b0733fb958a3103 not found: ID does not exist" Mar 08 00:32:44.646721 master-0 kubenswrapper[23041]: I0308 00:32:44.646674 23041 scope.go:117] "RemoveContainer" containerID="85445a369f529f5269a8863acca07ead4a125225ffabe9eb1f61166cb663a719" Mar 08 00:32:44.646919 master-0 kubenswrapper[23041]: I0308 00:32:44.646893 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85445a369f529f5269a8863acca07ead4a125225ffabe9eb1f61166cb663a719"} err="failed to get container status \"85445a369f529f5269a8863acca07ead4a125225ffabe9eb1f61166cb663a719\": rpc error: code = NotFound desc = could not find 
container \"85445a369f529f5269a8863acca07ead4a125225ffabe9eb1f61166cb663a719\": container with ID starting with 85445a369f529f5269a8863acca07ead4a125225ffabe9eb1f61166cb663a719 not found: ID does not exist" Mar 08 00:32:44.646967 master-0 kubenswrapper[23041]: I0308 00:32:44.646918 23041 scope.go:117] "RemoveContainer" containerID="6ba57f2db9726259ae712e088285735c826dee0e1e18c3454e04e70e18447493" Mar 08 00:32:44.666533 master-0 kubenswrapper[23041]: I0308 00:32:44.666486 23041 scope.go:117] "RemoveContainer" containerID="6ba57f2db9726259ae712e088285735c826dee0e1e18c3454e04e70e18447493" Mar 08 00:32:44.667996 master-0 kubenswrapper[23041]: E0308 00:32:44.667911 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ba57f2db9726259ae712e088285735c826dee0e1e18c3454e04e70e18447493\": container with ID starting with 6ba57f2db9726259ae712e088285735c826dee0e1e18c3454e04e70e18447493 not found: ID does not exist" containerID="6ba57f2db9726259ae712e088285735c826dee0e1e18c3454e04e70e18447493" Mar 08 00:32:44.668051 master-0 kubenswrapper[23041]: I0308 00:32:44.667994 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ba57f2db9726259ae712e088285735c826dee0e1e18c3454e04e70e18447493"} err="failed to get container status \"6ba57f2db9726259ae712e088285735c826dee0e1e18c3454e04e70e18447493\": rpc error: code = NotFound desc = could not find container \"6ba57f2db9726259ae712e088285735c826dee0e1e18c3454e04e70e18447493\": container with ID starting with 6ba57f2db9726259ae712e088285735c826dee0e1e18c3454e04e70e18447493 not found: ID does not exist" Mar 08 00:32:44.668051 master-0 kubenswrapper[23041]: I0308 00:32:44.668047 23041 scope.go:117] "RemoveContainer" containerID="77c439a84aa164ee07c9e5c171aea2eb6918060c5491a7828582c33b62bf08b7" Mar 08 00:32:44.668451 master-0 kubenswrapper[23041]: I0308 00:32:44.668389 23041 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"77c439a84aa164ee07c9e5c171aea2eb6918060c5491a7828582c33b62bf08b7"} err="failed to get container status \"77c439a84aa164ee07c9e5c171aea2eb6918060c5491a7828582c33b62bf08b7\": rpc error: code = NotFound desc = could not find container \"77c439a84aa164ee07c9e5c171aea2eb6918060c5491a7828582c33b62bf08b7\": container with ID starting with 77c439a84aa164ee07c9e5c171aea2eb6918060c5491a7828582c33b62bf08b7 not found: ID does not exist" Mar 08 00:32:44.668451 master-0 kubenswrapper[23041]: I0308 00:32:44.668443 23041 scope.go:117] "RemoveContainer" containerID="3aa4df6d988af0a0fbf0ca91c3c8199b19f529f21f24176f890bfab0c3b25b24" Mar 08 00:32:44.668884 master-0 kubenswrapper[23041]: I0308 00:32:44.668828 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aa4df6d988af0a0fbf0ca91c3c8199b19f529f21f24176f890bfab0c3b25b24"} err="failed to get container status \"3aa4df6d988af0a0fbf0ca91c3c8199b19f529f21f24176f890bfab0c3b25b24\": rpc error: code = NotFound desc = could not find container \"3aa4df6d988af0a0fbf0ca91c3c8199b19f529f21f24176f890bfab0c3b25b24\": container with ID starting with 3aa4df6d988af0a0fbf0ca91c3c8199b19f529f21f24176f890bfab0c3b25b24 not found: ID does not exist" Mar 08 00:32:44.668884 master-0 kubenswrapper[23041]: I0308 00:32:44.668878 23041 scope.go:117] "RemoveContainer" containerID="4c868aa7418a56f30943c51f9eac6d8da436a1691ad5c481be6a7490d78054a2" Mar 08 00:32:44.669215 master-0 kubenswrapper[23041]: I0308 00:32:44.669146 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c868aa7418a56f30943c51f9eac6d8da436a1691ad5c481be6a7490d78054a2"} err="failed to get container status \"4c868aa7418a56f30943c51f9eac6d8da436a1691ad5c481be6a7490d78054a2\": rpc error: code = NotFound desc = could not find container \"4c868aa7418a56f30943c51f9eac6d8da436a1691ad5c481be6a7490d78054a2\": container with ID starting with 
4c868aa7418a56f30943c51f9eac6d8da436a1691ad5c481be6a7490d78054a2 not found: ID does not exist" Mar 08 00:32:44.995170 master-0 kubenswrapper[23041]: I0308 00:32:44.995118 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2ab662059bb326d13a07bf5700e4f545","Type":"ContainerStarted","Data":"68d2d556561da5978f104c45f8ecd8a09a79c04161083561a003cabefd7b6ac9"} Mar 08 00:32:45.001220 master-0 kubenswrapper[23041]: I0308 00:32:45.001172 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"51c724a5-de89-4fde-b596-70157d2d19b6","Type":"ContainerDied","Data":"c8e2a3844909fda8d886c3f9a2f45898e07fcb80d21afe699dd2bc976f511106"} Mar 08 00:32:45.002103 master-0 kubenswrapper[23041]: I0308 00:32:45.002014 23041 patch_prober.go:28] interesting pod/downloads-84f57b9877-8g27w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.89:8080/\": dial tcp 10.128.0.89:8080: connect: connection refused" start-of-body= Mar 08 00:32:45.002103 master-0 kubenswrapper[23041]: I0308 00:32:45.002055 23041 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-84f57b9877-8g27w" podUID="414dbe5d-16a5-4765-9dc5-d50c0784ace7" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.89:8080/\": dial tcp 10.128.0.89:8080: connect: connection refused" Mar 08 00:32:45.002573 master-0 kubenswrapper[23041]: I0308 00:32:45.002503 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 08 00:32:45.385702 master-0 kubenswrapper[23041]: I0308 00:32:45.385622 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-84f57b9877-8g27w" podStartSLOduration=4.643342277 podStartE2EDuration="43.385595985s" podCreationTimestamp="2026-03-08 00:32:02 +0000 UTC" firstStartedPulling="2026-03-08 00:32:03.498995475 +0000 UTC m=+28.971832029" lastFinishedPulling="2026-03-08 00:32:42.241249183 +0000 UTC m=+67.714085737" observedRunningTime="2026-03-08 00:32:45.382461147 +0000 UTC m=+70.855297711" watchObservedRunningTime="2026-03-08 00:32:45.385595985 +0000 UTC m=+70.858432549" Mar 08 00:32:45.465649 master-0 kubenswrapper[23041]: I0308 00:32:45.465599 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Mar 08 00:32:45.574512 master-0 kubenswrapper[23041]: I0308 00:32:45.574442 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48fcdccb-478e-4027-b4b9-9a061439f0e2-var-lock" (OuterVolumeSpecName: "var-lock") pod "48fcdccb-478e-4027-b4b9-9a061439f0e2" (UID: "48fcdccb-478e-4027-b4b9-9a061439f0e2"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:32:45.574814 master-0 kubenswrapper[23041]: I0308 00:32:45.574741 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/48fcdccb-478e-4027-b4b9-9a061439f0e2-var-lock\") pod \"48fcdccb-478e-4027-b4b9-9a061439f0e2\" (UID: \"48fcdccb-478e-4027-b4b9-9a061439f0e2\") " Mar 08 00:32:45.574981 master-0 kubenswrapper[23041]: I0308 00:32:45.574959 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48fcdccb-478e-4027-b4b9-9a061439f0e2-kube-api-access\") pod \"48fcdccb-478e-4027-b4b9-9a061439f0e2\" (UID: \"48fcdccb-478e-4027-b4b9-9a061439f0e2\") " Mar 08 00:32:45.575497 master-0 kubenswrapper[23041]: I0308 00:32:45.575042 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/48fcdccb-478e-4027-b4b9-9a061439f0e2-kubelet-dir\") pod \"48fcdccb-478e-4027-b4b9-9a061439f0e2\" (UID: \"48fcdccb-478e-4027-b4b9-9a061439f0e2\") " Mar 08 00:32:45.575497 master-0 kubenswrapper[23041]: I0308 00:32:45.575242 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48fcdccb-478e-4027-b4b9-9a061439f0e2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "48fcdccb-478e-4027-b4b9-9a061439f0e2" (UID: "48fcdccb-478e-4027-b4b9-9a061439f0e2"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:32:45.575691 master-0 kubenswrapper[23041]: I0308 00:32:45.575658 23041 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/48fcdccb-478e-4027-b4b9-9a061439f0e2-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 00:32:45.575755 master-0 kubenswrapper[23041]: I0308 00:32:45.575694 23041 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/48fcdccb-478e-4027-b4b9-9a061439f0e2-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 00:32:45.578646 master-0 kubenswrapper[23041]: I0308 00:32:45.578558 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48fcdccb-478e-4027-b4b9-9a061439f0e2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "48fcdccb-478e-4027-b4b9-9a061439f0e2" (UID: "48fcdccb-478e-4027-b4b9-9a061439f0e2"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:32:45.678012 master-0 kubenswrapper[23041]: I0308 00:32:45.677644 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48fcdccb-478e-4027-b4b9-9a061439f0e2-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 00:32:45.810968 master-0 kubenswrapper[23041]: I0308 00:32:45.804277 23041 patch_prober.go:28] interesting pod/console-5c84b9c874-8xl2l container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 08 00:32:45.810968 master-0 kubenswrapper[23041]: I0308 00:32:45.804336 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-5c84b9c874-8xl2l" podUID="3baca04d-be92-4f02-8ea9-94cc37fc00b4" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 08 00:32:45.812082 master-0 kubenswrapper[23041]: I0308 00:32:45.811645 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj"] Mar 08 00:32:45.846617 master-0 kubenswrapper[23041]: W0308 00:32:45.846568 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod432f3e5b_122a_44f1_91b9_7c1399927a7a.slice/crio-b22451207887695f2373aae32ca873a80c787d0381a6db953a68005425e6e50e WatchSource:0}: Error finding container b22451207887695f2373aae32ca873a80c787d0381a6db953a68005425e6e50e: Status 404 returned error can't find the container with id b22451207887695f2373aae32ca873a80c787d0381a6db953a68005425e6e50e Mar 08 00:32:46.009218 master-0 kubenswrapper[23041]: I0308 00:32:46.009165 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"48fcdccb-478e-4027-b4b9-9a061439f0e2","Type":"ContainerDied","Data":"9048ec3f692e19da41bf1e5b754fc6525e0e4cd99b83b30eb7a20ba35e882ab7"} Mar 08 00:32:46.009309 master-0 kubenswrapper[23041]: I0308 00:32:46.009226 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9048ec3f692e19da41bf1e5b754fc6525e0e4cd99b83b30eb7a20ba35e882ab7" Mar 08 00:32:46.009309 master-0 kubenswrapper[23041]: I0308 00:32:46.009282 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Mar 08 00:32:46.015691 master-0 kubenswrapper[23041]: I0308 00:32:46.015650 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj" event={"ID":"432f3e5b-122a-44f1-91b9-7c1399927a7a","Type":"ContainerStarted","Data":"b22451207887695f2373aae32ca873a80c787d0381a6db953a68005425e6e50e"} Mar 08 00:32:46.019258 master-0 kubenswrapper[23041]: I0308 00:32:46.018630 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2ab662059bb326d13a07bf5700e4f545","Type":"ContainerStarted","Data":"2ad65477b3234a5615722a158f160fb5dd52caa55583da9b6b87b3b65a179454"} Mar 08 00:32:46.730744 master-0 kubenswrapper[23041]: I0308 00:32:46.730684 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-6df5fc69d-thc6n"] Mar 08 00:32:46.841586 master-0 kubenswrapper[23041]: I0308 00:32:46.841011 23041 patch_prober.go:28] interesting pod/console-76c777474b-n9mhf container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.95:8443/health\": dial tcp 10.128.0.95:8443: connect: connection refused" start-of-body= Mar 08 00:32:46.841586 master-0 kubenswrapper[23041]: I0308 00:32:46.841096 23041 prober.go:107] 
"Probe failed" probeType="Startup" pod="openshift-console/console-76c777474b-n9mhf" podUID="136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca" containerName="console" probeResult="failure" output="Get \"https://10.128.0.95:8443/health\": dial tcp 10.128.0.95:8443: connect: connection refused" Mar 08 00:32:47.025862 master-0 kubenswrapper[23041]: I0308 00:32:47.025810 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj" event={"ID":"432f3e5b-122a-44f1-91b9-7c1399927a7a","Type":"ContainerStarted","Data":"39b5ee2e512a40443ca85e8b84308ecd0dcf4a6d9807eb7bd908aaee43723c8d"} Mar 08 00:32:47.025862 master-0 kubenswrapper[23041]: I0308 00:32:47.025866 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj" Mar 08 00:32:47.027907 master-0 kubenswrapper[23041]: I0308 00:32:47.027871 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2ab662059bb326d13a07bf5700e4f545","Type":"ContainerStarted","Data":"fd786e7f06eff90b20c4ed46d3ed52bbe237190f752b877bfff1fedbc785fa36"} Mar 08 00:32:48.026531 master-0 kubenswrapper[23041]: I0308 00:32:48.026422 23041 patch_prober.go:28] interesting pod/oauth-openshift-5b6fc868c6-zc2fj container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.128.0.97:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 00:32:48.027426 master-0 kubenswrapper[23041]: I0308 00:32:48.027315 23041 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj" podUID="432f3e5b-122a-44f1-91b9-7c1399927a7a" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.128.0.97:6443/healthz\": net/http: request canceled while waiting for 
connection (Client.Timeout exceeded while awaiting headers)" Mar 08 00:32:48.199674 master-0 kubenswrapper[23041]: I0308 00:32:48.199611 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-6df5fc69d-thc6n"] Mar 08 00:32:48.826245 master-0 kubenswrapper[23041]: I0308 00:32:48.826099 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f437239-11ff-4333-bdea-8a48b8ac95e8" path="/var/lib/kubelet/pods/1f437239-11ff-4333-bdea-8a48b8ac95e8/volumes" Mar 08 00:32:49.037330 master-0 kubenswrapper[23041]: I0308 00:32:49.037226 23041 patch_prober.go:28] interesting pod/oauth-openshift-5b6fc868c6-zc2fj container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.128.0.97:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 00:32:49.038026 master-0 kubenswrapper[23041]: I0308 00:32:49.037342 23041 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj" podUID="432f3e5b-122a-44f1-91b9-7c1399927a7a" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.128.0.97:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 00:32:50.482935 master-0 kubenswrapper[23041]: I0308 00:32:50.481979 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Mar 08 00:32:51.134241 master-0 kubenswrapper[23041]: I0308 00:32:51.132691 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Mar 08 00:32:51.134241 master-0 kubenswrapper[23041]: I0308 00:32:51.132917 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/installer-3-master-0" 
podUID="8b0395d1-7cb0-4857-891a-68f88a6fd468" containerName="installer" containerID="cri-o://4026150a6114ab872a133f085acdad1899f7a88076a111da9b83e693d0a1a888" gracePeriod=30 Mar 08 00:32:52.341287 master-0 kubenswrapper[23041]: I0308 00:32:52.339040 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Mar 08 00:32:52.467066 master-0 kubenswrapper[23041]: I0308 00:32:52.466990 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-3-master-0" podStartSLOduration=19.466973968 podStartE2EDuration="19.466973968s" podCreationTimestamp="2026-03-08 00:32:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:32:52.416374642 +0000 UTC m=+77.889211226" watchObservedRunningTime="2026-03-08 00:32:52.466973968 +0000 UTC m=+77.939810522" Mar 08 00:32:52.537941 master-0 kubenswrapper[23041]: I0308 00:32:52.537901 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 08 00:32:52.556750 master-0 kubenswrapper[23041]: I0308 00:32:52.548646 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 08 00:32:52.586194 master-0 kubenswrapper[23041]: I0308 00:32:52.584185 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj" podStartSLOduration=29.584161331 podStartE2EDuration="29.584161331s" podCreationTimestamp="2026-03-08 00:32:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:32:52.573918145 +0000 UTC m=+78.046754699" watchObservedRunningTime="2026-03-08 00:32:52.584161331 +0000 UTC m=+78.056997885" Mar 08 00:32:52.684839 master-0 kubenswrapper[23041]: I0308 00:32:52.682520 23041 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-84f57b9877-8g27w" Mar 08 00:32:52.694843 master-0 kubenswrapper[23041]: I0308 00:32:52.693673 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:32:52.694843 master-0 kubenswrapper[23041]: I0308 00:32:52.693715 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:32:52.694843 master-0 kubenswrapper[23041]: I0308 00:32:52.693727 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:32:52.694843 master-0 kubenswrapper[23041]: I0308 00:32:52.693738 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:32:52.699282 master-0 kubenswrapper[23041]: I0308 00:32:52.698851 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:32:52.709350 master-0 kubenswrapper[23041]: I0308 00:32:52.709334 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:32:52.718078 master-0 kubenswrapper[23041]: I0308 00:32:52.718037 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=10.718024801 podStartE2EDuration="10.718024801s" podCreationTimestamp="2026-03-08 00:32:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:32:52.710480002 +0000 UTC m=+78.183316566" 
watchObservedRunningTime="2026-03-08 00:32:52.718024801 +0000 UTC m=+78.190861355" Mar 08 00:32:52.815285 master-0 kubenswrapper[23041]: I0308 00:32:52.815235 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0133db83-1083-4458-86d4-49e431dd4365" path="/var/lib/kubelet/pods/0133db83-1083-4458-86d4-49e431dd4365/volumes" Mar 08 00:32:52.815837 master-0 kubenswrapper[23041]: I0308 00:32:52.815812 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51c724a5-de89-4fde-b596-70157d2d19b6" path="/var/lib/kubelet/pods/51c724a5-de89-4fde-b596-70157d2d19b6/volumes" Mar 08 00:32:53.089231 master-0 kubenswrapper[23041]: I0308 00:32:53.089162 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:32:53.527623 master-0 kubenswrapper[23041]: I0308 00:32:53.527572 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5b6fc868c6-zc2fj" Mar 08 00:32:54.095757 master-0 kubenswrapper[23041]: I0308 00:32:54.095666 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:32:54.654825 master-0 kubenswrapper[23041]: I0308 00:32:54.654738 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Mar 08 00:32:54.655407 master-0 kubenswrapper[23041]: E0308 00:32:54.655254 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48fcdccb-478e-4027-b4b9-9a061439f0e2" containerName="installer" Mar 08 00:32:54.655407 master-0 kubenswrapper[23041]: I0308 00:32:54.655279 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="48fcdccb-478e-4027-b4b9-9a061439f0e2" containerName="installer" Mar 08 00:32:54.655407 master-0 kubenswrapper[23041]: E0308 00:32:54.655324 23041 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="51c724a5-de89-4fde-b596-70157d2d19b6" containerName="config-reloader" Mar 08 00:32:54.655407 master-0 kubenswrapper[23041]: I0308 00:32:54.655336 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="51c724a5-de89-4fde-b596-70157d2d19b6" containerName="config-reloader" Mar 08 00:32:54.655407 master-0 kubenswrapper[23041]: E0308 00:32:54.655405 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51c724a5-de89-4fde-b596-70157d2d19b6" containerName="prom-label-proxy" Mar 08 00:32:54.655569 master-0 kubenswrapper[23041]: I0308 00:32:54.655420 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="51c724a5-de89-4fde-b596-70157d2d19b6" containerName="prom-label-proxy" Mar 08 00:32:54.655569 master-0 kubenswrapper[23041]: E0308 00:32:54.655462 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51c724a5-de89-4fde-b596-70157d2d19b6" containerName="alertmanager" Mar 08 00:32:54.655569 master-0 kubenswrapper[23041]: I0308 00:32:54.655476 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="51c724a5-de89-4fde-b596-70157d2d19b6" containerName="alertmanager" Mar 08 00:32:54.655569 master-0 kubenswrapper[23041]: E0308 00:32:54.655510 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51c724a5-de89-4fde-b596-70157d2d19b6" containerName="kube-rbac-proxy-web" Mar 08 00:32:54.655569 master-0 kubenswrapper[23041]: I0308 00:32:54.655522 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="51c724a5-de89-4fde-b596-70157d2d19b6" containerName="kube-rbac-proxy-web" Mar 08 00:32:54.655569 master-0 kubenswrapper[23041]: E0308 00:32:54.655546 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51c724a5-de89-4fde-b596-70157d2d19b6" containerName="kube-rbac-proxy" Mar 08 00:32:54.655569 master-0 kubenswrapper[23041]: I0308 00:32:54.655558 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="51c724a5-de89-4fde-b596-70157d2d19b6" containerName="kube-rbac-proxy" 
Mar 08 00:32:54.655757 master-0 kubenswrapper[23041]: E0308 00:32:54.655595 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51c724a5-de89-4fde-b596-70157d2d19b6" containerName="init-config-reloader"
Mar 08 00:32:54.655757 master-0 kubenswrapper[23041]: I0308 00:32:54.655610 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="51c724a5-de89-4fde-b596-70157d2d19b6" containerName="init-config-reloader"
Mar 08 00:32:54.655757 master-0 kubenswrapper[23041]: E0308 00:32:54.655635 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51c724a5-de89-4fde-b596-70157d2d19b6" containerName="kube-rbac-proxy-metric"
Mar 08 00:32:54.655757 master-0 kubenswrapper[23041]: I0308 00:32:54.655647 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="51c724a5-de89-4fde-b596-70157d2d19b6" containerName="kube-rbac-proxy-metric"
Mar 08 00:32:54.655757 master-0 kubenswrapper[23041]: E0308 00:32:54.655672 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0133db83-1083-4458-86d4-49e431dd4365" containerName="installer"
Mar 08 00:32:54.655757 master-0 kubenswrapper[23041]: I0308 00:32:54.655684 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="0133db83-1083-4458-86d4-49e431dd4365" containerName="installer"
Mar 08 00:32:54.655976 master-0 kubenswrapper[23041]: I0308 00:32:54.655935 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="51c724a5-de89-4fde-b596-70157d2d19b6" containerName="alertmanager"
Mar 08 00:32:54.656020 master-0 kubenswrapper[23041]: I0308 00:32:54.655975 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="51c724a5-de89-4fde-b596-70157d2d19b6" containerName="kube-rbac-proxy-metric"
Mar 08 00:32:54.656020 master-0 kubenswrapper[23041]: I0308 00:32:54.656008 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="51c724a5-de89-4fde-b596-70157d2d19b6" containerName="config-reloader"
Mar 08 00:32:54.656083 master-0 kubenswrapper[23041]: I0308 00:32:54.656054 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="51c724a5-de89-4fde-b596-70157d2d19b6" containerName="kube-rbac-proxy"
Mar 08 00:32:54.656117 master-0 kubenswrapper[23041]: I0308 00:32:54.656084 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="48fcdccb-478e-4027-b4b9-9a061439f0e2" containerName="installer"
Mar 08 00:32:54.656150 master-0 kubenswrapper[23041]: I0308 00:32:54.656119 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="0133db83-1083-4458-86d4-49e431dd4365" containerName="installer"
Mar 08 00:32:54.656184 master-0 kubenswrapper[23041]: I0308 00:32:54.656147 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="51c724a5-de89-4fde-b596-70157d2d19b6" containerName="kube-rbac-proxy-web"
Mar 08 00:32:54.656184 master-0 kubenswrapper[23041]: I0308 00:32:54.656173 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="51c724a5-de89-4fde-b596-70157d2d19b6" containerName="prom-label-proxy"
Mar 08 00:32:54.661278 master-0 kubenswrapper[23041]: I0308 00:32:54.657279 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0"
Mar 08 00:32:54.668694 master-0 kubenswrapper[23041]: I0308 00:32:54.668619 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/74512190-22e4-4648-8d1e-e487de48a124-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"74512190-22e4-4648-8d1e-e487de48a124\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 08 00:32:54.668919 master-0 kubenswrapper[23041]: I0308 00:32:54.668711 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74512190-22e4-4648-8d1e-e487de48a124-kube-api-access\") pod \"installer-4-master-0\" (UID: \"74512190-22e4-4648-8d1e-e487de48a124\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 08 00:32:54.668919 master-0 kubenswrapper[23041]: I0308 00:32:54.668763 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/74512190-22e4-4648-8d1e-e487de48a124-var-lock\") pod \"installer-4-master-0\" (UID: \"74512190-22e4-4648-8d1e-e487de48a124\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 08 00:32:54.671562 master-0 kubenswrapper[23041]: I0308 00:32:54.671511 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"]
Mar 08 00:32:54.770422 master-0 kubenswrapper[23041]: I0308 00:32:54.770362 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/74512190-22e4-4648-8d1e-e487de48a124-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"74512190-22e4-4648-8d1e-e487de48a124\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 08 00:32:54.770895 master-0 kubenswrapper[23041]: I0308 00:32:54.770408 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74512190-22e4-4648-8d1e-e487de48a124-kube-api-access\") pod \"installer-4-master-0\" (UID: \"74512190-22e4-4648-8d1e-e487de48a124\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 08 00:32:54.770895 master-0 kubenswrapper[23041]: I0308 00:32:54.770483 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/74512190-22e4-4648-8d1e-e487de48a124-var-lock\") pod \"installer-4-master-0\" (UID: \"74512190-22e4-4648-8d1e-e487de48a124\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 08 00:32:54.770895 master-0 kubenswrapper[23041]: I0308 00:32:54.770625 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/74512190-22e4-4648-8d1e-e487de48a124-var-lock\") pod \"installer-4-master-0\" (UID: \"74512190-22e4-4648-8d1e-e487de48a124\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 08 00:32:54.770895 master-0 kubenswrapper[23041]: I0308 00:32:54.770667 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/74512190-22e4-4648-8d1e-e487de48a124-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"74512190-22e4-4648-8d1e-e487de48a124\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 08 00:32:54.791238 master-0 kubenswrapper[23041]: I0308 00:32:54.790871 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74512190-22e4-4648-8d1e-e487de48a124-kube-api-access\") pod \"installer-4-master-0\" (UID: \"74512190-22e4-4648-8d1e-e487de48a124\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 08 00:32:55.001906 master-0 kubenswrapper[23041]: I0308 00:32:55.001854 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0"
Mar 08 00:32:55.433327 master-0 kubenswrapper[23041]: I0308 00:32:55.433185 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"]
Mar 08 00:32:55.439090 master-0 kubenswrapper[23041]: W0308 00:32:55.439042 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod74512190_22e4_4648_8d1e_e487de48a124.slice/crio-01f93899a9cab6aa4c9b39d2d34505f41a3d828df7d28f0bfd223bbff7cde117 WatchSource:0}: Error finding container 01f93899a9cab6aa4c9b39d2d34505f41a3d828df7d28f0bfd223bbff7cde117: Status 404 returned error can't find the container with id 01f93899a9cab6aa4c9b39d2d34505f41a3d828df7d28f0bfd223bbff7cde117
Mar 08 00:32:55.803902 master-0 kubenswrapper[23041]: I0308 00:32:55.803832 23041 patch_prober.go:28] interesting pod/console-5c84b9c874-8xl2l container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body=
Mar 08 00:32:55.804427 master-0 kubenswrapper[23041]: I0308 00:32:55.803899 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-5c84b9c874-8xl2l" podUID="3baca04d-be92-4f02-8ea9-94cc37fc00b4" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused"
Mar 08 00:32:56.106652 master-0 kubenswrapper[23041]: I0308 00:32:56.106532 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"74512190-22e4-4648-8d1e-e487de48a124","Type":"ContainerStarted","Data":"1283d7a0be2d499bd67523064c8053d4801413330027591b6177317990231794"}
Mar 08 00:32:56.106652 master-0 kubenswrapper[23041]: I0308 00:32:56.106583 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"74512190-22e4-4648-8d1e-e487de48a124","Type":"ContainerStarted","Data":"01f93899a9cab6aa4c9b39d2d34505f41a3d828df7d28f0bfd223bbff7cde117"}
Mar 08 00:32:56.849732 master-0 kubenswrapper[23041]: I0308 00:32:56.839005 23041 patch_prober.go:28] interesting pod/console-76c777474b-n9mhf container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.95:8443/health\": dial tcp 10.128.0.95:8443: connect: connection refused" start-of-body=
Mar 08 00:32:56.849732 master-0 kubenswrapper[23041]: I0308 00:32:56.839139 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-76c777474b-n9mhf" podUID="136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca" containerName="console" probeResult="failure" output="Get \"https://10.128.0.95:8443/health\": dial tcp 10.128.0.95:8443: connect: connection refused"
Mar 08 00:32:56.849732 master-0 kubenswrapper[23041]: I0308 00:32:56.840424 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-4-master-0" podStartSLOduration=2.84039674 podStartE2EDuration="2.84039674s" podCreationTimestamp="2026-03-08 00:32:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:32:56.837144129 +0000 UTC m=+82.309980763" watchObservedRunningTime="2026-03-08 00:32:56.84039674 +0000 UTC m=+82.313233314"
Mar 08 00:32:57.216747 master-0 kubenswrapper[23041]: I0308 00:32:57.216688 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-client-ca-bundle\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m"
Mar 08 00:32:57.217049 master-0 kubenswrapper[23041]: E0308 00:32:57.216914 23041 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-ffspe3f0nbfal: secret "metrics-server-ffspe3f0nbfal" not found
Mar 08 00:32:57.217049 master-0 kubenswrapper[23041]: E0308 00:32:57.216994 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-client-ca-bundle podName:0101c4ce-fd58-4ddb-94f7-abb8b2293cdb nodeName:}" failed. No retries permitted until 2026-03-08 00:34:01.216974905 +0000 UTC m=+146.689811459 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "client-ca-bundle" (UniqueName: "kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-client-ca-bundle") pod "metrics-server-6474759988-dnw4m" (UID: "0101c4ce-fd58-4ddb-94f7-abb8b2293cdb") : secret "metrics-server-ffspe3f0nbfal" not found
Mar 08 00:33:04.039793 master-0 kubenswrapper[23041]: I0308 00:33:04.039727 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-ttpzw"]
Mar 08 00:33:04.042214 master-0 kubenswrapper[23041]: I0308 00:33:04.040918 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ttpzw"
Mar 08 00:33:04.042890 master-0 kubenswrapper[23041]: I0308 00:33:04.042853 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-867kk"
Mar 08 00:33:04.044715 master-0 kubenswrapper[23041]: I0308 00:33:04.044682 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 08 00:33:04.087229 master-0 kubenswrapper[23041]: I0308 00:33:04.086075 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 08 00:33:04.094765 master-0 kubenswrapper[23041]: I0308 00:33:04.094574 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="4b9b4180-fc41-4072-9c61-0a35390a7ff3" containerName="prometheus" containerID="cri-o://4a516d7f08e2c00a73c82a6468c6733deb15cfb5496c5dc1c985cc5b2f878c39" gracePeriod=600
Mar 08 00:33:04.094765 master-0 kubenswrapper[23041]: I0308 00:33:04.094752 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="4b9b4180-fc41-4072-9c61-0a35390a7ff3" containerName="kube-rbac-proxy-thanos" containerID="cri-o://341c117fc24206d0e3839ca57ba039836d1114be36d12fb34ba2f5b26e89d32d" gracePeriod=600
Mar 08 00:33:04.095019 master-0 kubenswrapper[23041]: I0308 00:33:04.094787 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="4b9b4180-fc41-4072-9c61-0a35390a7ff3" containerName="kube-rbac-proxy" containerID="cri-o://dd24c4f307672a1f949696856245eca4408f949ed84c121951e5aefd95a43d9c" gracePeriod=600
Mar 08 00:33:04.095019 master-0 kubenswrapper[23041]: I0308 00:33:04.094822 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="4b9b4180-fc41-4072-9c61-0a35390a7ff3" containerName="kube-rbac-proxy-web" containerID="cri-o://4848d19ba0776826a9f40c021f719b52c6a6530dc063a074b5193621f8201b9a" gracePeriod=600
Mar 08 00:33:04.095019 master-0 kubenswrapper[23041]: I0308 00:33:04.094860 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="4b9b4180-fc41-4072-9c61-0a35390a7ff3" containerName="thanos-sidecar" containerID="cri-o://1f43fdb6c0cc7204b6150506553af3176763f0599837371ad3b14f3a6dd6e9c3" gracePeriod=600
Mar 08 00:33:04.095019 master-0 kubenswrapper[23041]: I0308 00:33:04.094887 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="4b9b4180-fc41-4072-9c61-0a35390a7ff3" containerName="config-reloader" containerID="cri-o://8677cf1f170f4b7104c5bee538c75b0cce6cadf668dc8807ee7dd7ead37ad2bc" gracePeriod=600
Mar 08 00:33:04.118966 master-0 kubenswrapper[23041]: I0308 00:33:04.118455 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Mar 08 00:33:04.136294 master-0 kubenswrapper[23041]: I0308 00:33:04.132626 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Mar 08 00:33:04.141716 master-0 kubenswrapper[23041]: I0308 00:33:04.140819 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config"
Mar 08 00:33:04.141716 master-0 kubenswrapper[23041]: I0308 00:33:04.141194 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy"
Mar 08 00:33:04.141716 master-0 kubenswrapper[23041]: I0308 00:33:04.141380 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0"
Mar 08 00:33:04.141716 master-0 kubenswrapper[23041]: I0308 00:33:04.141496 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric"
Mar 08 00:33:04.141716 master-0 kubenswrapper[23041]: I0308 00:33:04.141626 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web"
Mar 08 00:33:04.148788 master-0 kubenswrapper[23041]: I0308 00:33:04.148737 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Mar 08 00:33:04.149895 master-0 kubenswrapper[23041]: I0308 00:33:04.149872 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls"
Mar 08 00:33:04.150156 master-0 kubenswrapper[23041]: I0308 00:33:04.150110 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated"
Mar 08 00:33:04.150827 master-0 kubenswrapper[23041]: I0308 00:33:04.150786 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab1c5c08-f159-4f15-8847-d39477b3c6e0-host\") pod \"node-ca-ttpzw\" (UID: \"ab1c5c08-f159-4f15-8847-d39477b3c6e0\") " pod="openshift-image-registry/node-ca-ttpzw"
Mar 08 00:33:04.150916 master-0 kubenswrapper[23041]: I0308 00:33:04.150897 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4pg6\" (UniqueName: \"kubernetes.io/projected/ab1c5c08-f159-4f15-8847-d39477b3c6e0-kube-api-access-g4pg6\") pod \"node-ca-ttpzw\" (UID: \"ab1c5c08-f159-4f15-8847-d39477b3c6e0\") " pod="openshift-image-registry/node-ca-ttpzw"
Mar 08 00:33:04.158420 master-0 kubenswrapper[23041]: I0308 00:33:04.158374 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ab1c5c08-f159-4f15-8847-d39477b3c6e0-serviceca\") pod \"node-ca-ttpzw\" (UID: \"ab1c5c08-f159-4f15-8847-d39477b3c6e0\") " pod="openshift-image-registry/node-ca-ttpzw"
Mar 08 00:33:04.160481 master-0 kubenswrapper[23041]: I0308 00:33:04.160439 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle"
Mar 08 00:33:04.162732 master-0 kubenswrapper[23041]: I0308 00:33:04.162424 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-fsd5q"
Mar 08 00:33:04.170668 master-0 kubenswrapper[23041]: I0308 00:33:04.170609 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c84b9c874-8xl2l"]
Mar 08 00:33:04.259723 master-0 kubenswrapper[23041]: I0308 00:33:04.259673 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ab1c5c08-f159-4f15-8847-d39477b3c6e0-serviceca\") pod \"node-ca-ttpzw\" (UID: \"ab1c5c08-f159-4f15-8847-d39477b3c6e0\") " pod="openshift-image-registry/node-ca-ttpzw"
Mar 08 00:33:04.259801 master-0 kubenswrapper[23041]: I0308 00:33:04.259773 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab1c5c08-f159-4f15-8847-d39477b3c6e0-host\") pod \"node-ca-ttpzw\" (UID: \"ab1c5c08-f159-4f15-8847-d39477b3c6e0\") " pod="openshift-image-registry/node-ca-ttpzw"
Mar 08 00:33:04.259833 master-0 kubenswrapper[23041]: I0308 00:33:04.259821 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/588470ab-8c2f-4769-a09e-462b07c592fa-tls-assets\") pod \"alertmanager-main-0\" (UID: \"588470ab-8c2f-4769-a09e-462b07c592fa\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 00:33:04.259864 master-0 kubenswrapper[23041]: I0308 00:33:04.259845 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4pg6\" (UniqueName: \"kubernetes.io/projected/ab1c5c08-f159-4f15-8847-d39477b3c6e0-kube-api-access-g4pg6\") pod \"node-ca-ttpzw\" (UID: \"ab1c5c08-f159-4f15-8847-d39477b3c6e0\") " pod="openshift-image-registry/node-ca-ttpzw"
Mar 08 00:33:04.259897 master-0 kubenswrapper[23041]: I0308 00:33:04.259878 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzjz5\" (UniqueName: \"kubernetes.io/projected/588470ab-8c2f-4769-a09e-462b07c592fa-kube-api-access-xzjz5\") pod \"alertmanager-main-0\" (UID: \"588470ab-8c2f-4769-a09e-462b07c592fa\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 00:33:04.259932 master-0 kubenswrapper[23041]: I0308 00:33:04.259905 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/588470ab-8c2f-4769-a09e-462b07c592fa-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"588470ab-8c2f-4769-a09e-462b07c592fa\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 00:33:04.259932 master-0 kubenswrapper[23041]: I0308 00:33:04.259927 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/588470ab-8c2f-4769-a09e-462b07c592fa-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"588470ab-8c2f-4769-a09e-462b07c592fa\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 00:33:04.259991 master-0 kubenswrapper[23041]: I0308 00:33:04.259950 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/588470ab-8c2f-4769-a09e-462b07c592fa-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"588470ab-8c2f-4769-a09e-462b07c592fa\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 00:33:04.259991 master-0 kubenswrapper[23041]: I0308 00:33:04.259968 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/588470ab-8c2f-4769-a09e-462b07c592fa-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"588470ab-8c2f-4769-a09e-462b07c592fa\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 00:33:04.259991 master-0 kubenswrapper[23041]: I0308 00:33:04.259989 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/588470ab-8c2f-4769-a09e-462b07c592fa-config-volume\") pod \"alertmanager-main-0\" (UID: \"588470ab-8c2f-4769-a09e-462b07c592fa\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 00:33:04.260121 master-0 kubenswrapper[23041]: I0308 00:33:04.260003 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/588470ab-8c2f-4769-a09e-462b07c592fa-web-config\") pod \"alertmanager-main-0\" (UID: \"588470ab-8c2f-4769-a09e-462b07c592fa\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 00:33:04.260121 master-0 kubenswrapper[23041]: I0308 00:33:04.260020 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/588470ab-8c2f-4769-a09e-462b07c592fa-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"588470ab-8c2f-4769-a09e-462b07c592fa\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 00:33:04.260121 master-0 kubenswrapper[23041]: I0308 00:33:04.260040 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/588470ab-8c2f-4769-a09e-462b07c592fa-config-out\") pod \"alertmanager-main-0\" (UID: \"588470ab-8c2f-4769-a09e-462b07c592fa\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 00:33:04.260121 master-0 kubenswrapper[23041]: I0308 00:33:04.260058 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/588470ab-8c2f-4769-a09e-462b07c592fa-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"588470ab-8c2f-4769-a09e-462b07c592fa\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 00:33:04.260121 master-0 kubenswrapper[23041]: I0308 00:33:04.260077 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/588470ab-8c2f-4769-a09e-462b07c592fa-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"588470ab-8c2f-4769-a09e-462b07c592fa\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 00:33:04.260665 master-0 kubenswrapper[23041]: I0308 00:33:04.260641 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ab1c5c08-f159-4f15-8847-d39477b3c6e0-serviceca\") pod \"node-ca-ttpzw\" (UID: \"ab1c5c08-f159-4f15-8847-d39477b3c6e0\") " pod="openshift-image-registry/node-ca-ttpzw"
Mar 08 00:33:04.260720 master-0 kubenswrapper[23041]: I0308 00:33:04.260671 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab1c5c08-f159-4f15-8847-d39477b3c6e0-host\") pod \"node-ca-ttpzw\" (UID: \"ab1c5c08-f159-4f15-8847-d39477b3c6e0\") " pod="openshift-image-registry/node-ca-ttpzw"
Mar 08 00:33:04.286041 master-0 kubenswrapper[23041]: I0308 00:33:04.285629 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6787d8db86-xxqwp"]
Mar 08 00:33:04.292292 master-0 kubenswrapper[23041]: I0308 00:33:04.286457 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6787d8db86-xxqwp"
Mar 08 00:33:04.300617 master-0 kubenswrapper[23041]: I0308 00:33:04.299745 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6787d8db86-xxqwp"]
Mar 08 00:33:04.326060 master-0 kubenswrapper[23041]: I0308 00:33:04.324750 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4pg6\" (UniqueName: \"kubernetes.io/projected/ab1c5c08-f159-4f15-8847-d39477b3c6e0-kube-api-access-g4pg6\") pod \"node-ca-ttpzw\" (UID: \"ab1c5c08-f159-4f15-8847-d39477b3c6e0\") " pod="openshift-image-registry/node-ca-ttpzw"
Mar 08 00:33:04.378114 master-0 kubenswrapper[23041]: I0308 00:33:04.361098 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/588470ab-8c2f-4769-a09e-462b07c592fa-tls-assets\") pod \"alertmanager-main-0\" (UID: \"588470ab-8c2f-4769-a09e-462b07c592fa\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 00:33:04.378114 master-0 kubenswrapper[23041]: I0308 00:33:04.361163 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzjz5\" (UniqueName: \"kubernetes.io/projected/588470ab-8c2f-4769-a09e-462b07c592fa-kube-api-access-xzjz5\") pod \"alertmanager-main-0\" (UID: \"588470ab-8c2f-4769-a09e-462b07c592fa\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 00:33:04.378114 master-0 kubenswrapper[23041]: I0308 00:33:04.361187 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/588470ab-8c2f-4769-a09e-462b07c592fa-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"588470ab-8c2f-4769-a09e-462b07c592fa\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 00:33:04.378114 master-0 kubenswrapper[23041]: I0308 00:33:04.361223 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/588470ab-8c2f-4769-a09e-462b07c592fa-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"588470ab-8c2f-4769-a09e-462b07c592fa\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 00:33:04.378114 master-0 kubenswrapper[23041]: I0308 00:33:04.361248 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/588470ab-8c2f-4769-a09e-462b07c592fa-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"588470ab-8c2f-4769-a09e-462b07c592fa\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 00:33:04.378114 master-0 kubenswrapper[23041]: I0308 00:33:04.361267 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/588470ab-8c2f-4769-a09e-462b07c592fa-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"588470ab-8c2f-4769-a09e-462b07c592fa\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 00:33:04.378114 master-0 kubenswrapper[23041]: I0308 00:33:04.361288 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/588470ab-8c2f-4769-a09e-462b07c592fa-config-volume\") pod \"alertmanager-main-0\" (UID: \"588470ab-8c2f-4769-a09e-462b07c592fa\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 00:33:04.378114 master-0 kubenswrapper[23041]: I0308 00:33:04.361304 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/588470ab-8c2f-4769-a09e-462b07c592fa-web-config\") pod \"alertmanager-main-0\" (UID: \"588470ab-8c2f-4769-a09e-462b07c592fa\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 00:33:04.378114 master-0 kubenswrapper[23041]: I0308 00:33:04.361322 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/588470ab-8c2f-4769-a09e-462b07c592fa-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"588470ab-8c2f-4769-a09e-462b07c592fa\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 00:33:04.378114 master-0 kubenswrapper[23041]: I0308 00:33:04.361342 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/588470ab-8c2f-4769-a09e-462b07c592fa-config-out\") pod \"alertmanager-main-0\" (UID: \"588470ab-8c2f-4769-a09e-462b07c592fa\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 00:33:04.378114 master-0 kubenswrapper[23041]: I0308 00:33:04.361359 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/588470ab-8c2f-4769-a09e-462b07c592fa-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"588470ab-8c2f-4769-a09e-462b07c592fa\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 00:33:04.378114 master-0 kubenswrapper[23041]: I0308 00:33:04.361379 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/588470ab-8c2f-4769-a09e-462b07c592fa-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"588470ab-8c2f-4769-a09e-462b07c592fa\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 00:33:04.378114 master-0 kubenswrapper[23041]: I0308 00:33:04.361946 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/588470ab-8c2f-4769-a09e-462b07c592fa-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"588470ab-8c2f-4769-a09e-462b07c592fa\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 00:33:04.381612 master-0 kubenswrapper[23041]: I0308 00:33:04.381570 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/588470ab-8c2f-4769-a09e-462b07c592fa-tls-assets\") pod \"alertmanager-main-0\" (UID: \"588470ab-8c2f-4769-a09e-462b07c592fa\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 00:33:04.382957 master-0 kubenswrapper[23041]: I0308 00:33:04.382918 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/588470ab-8c2f-4769-a09e-462b07c592fa-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"588470ab-8c2f-4769-a09e-462b07c592fa\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 00:33:04.389456 master-0 kubenswrapper[23041]: I0308 00:33:04.387726 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/588470ab-8c2f-4769-a09e-462b07c592fa-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"588470ab-8c2f-4769-a09e-462b07c592fa\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 00:33:04.391892 master-0 kubenswrapper[23041]: I0308 00:33:04.391853 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/588470ab-8c2f-4769-a09e-462b07c592fa-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"588470ab-8c2f-4769-a09e-462b07c592fa\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 00:33:04.392055 master-0 kubenswrapper[23041]: I0308 00:33:04.391929 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/588470ab-8c2f-4769-a09e-462b07c592fa-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"588470ab-8c2f-4769-a09e-462b07c592fa\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 00:33:04.395333 master-0 kubenswrapper[23041]: I0308 00:33:04.394519 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/588470ab-8c2f-4769-a09e-462b07c592fa-config-out\") pod \"alertmanager-main-0\" (UID: \"588470ab-8c2f-4769-a09e-462b07c592fa\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 00:33:04.396240 master-0 kubenswrapper[23041]: I0308 00:33:04.396193 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/588470ab-8c2f-4769-a09e-462b07c592fa-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"588470ab-8c2f-4769-a09e-462b07c592fa\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 00:33:04.408641 master-0 kubenswrapper[23041]: I0308 00:33:04.407421 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ttpzw"
Mar 08 00:33:04.410699 master-0 kubenswrapper[23041]: I0308 00:33:04.410454 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/588470ab-8c2f-4769-a09e-462b07c592fa-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"588470ab-8c2f-4769-a09e-462b07c592fa\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 00:33:04.410796 master-0 kubenswrapper[23041]: I0308 00:33:04.410770 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/588470ab-8c2f-4769-a09e-462b07c592fa-config-volume\") pod \"alertmanager-main-0\" (UID: \"588470ab-8c2f-4769-a09e-462b07c592fa\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 00:33:04.433171 master-0 kubenswrapper[23041]: I0308 00:33:04.423654 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/588470ab-8c2f-4769-a09e-462b07c592fa-web-config\") pod \"alertmanager-main-0\" (UID: \"588470ab-8c2f-4769-a09e-462b07c592fa\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 00:33:04.433171 master-0 kubenswrapper[23041]: I0308 00:33:04.425026 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzjz5\" (UniqueName: \"kubernetes.io/projected/588470ab-8c2f-4769-a09e-462b07c592fa-kube-api-access-xzjz5\") pod \"alertmanager-main-0\" (UID: \"588470ab-8c2f-4769-a09e-462b07c592fa\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 00:33:04.465986 master-0 kubenswrapper[23041]: I0308 00:33:04.465741 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d31841e6-f09b-46b4-ac72-adf67f6a5327-console-serving-cert\") pod \"console-6787d8db86-xxqwp\" (UID: \"d31841e6-f09b-46b4-ac72-adf67f6a5327\") " pod="openshift-console/console-6787d8db86-xxqwp"
Mar 08 00:33:04.465986 master-0 kubenswrapper[23041]: I0308 00:33:04.465794 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d31841e6-f09b-46b4-ac72-adf67f6a5327-oauth-serving-cert\") pod \"console-6787d8db86-xxqwp\" (UID: \"d31841e6-f09b-46b4-ac72-adf67f6a5327\") " pod="openshift-console/console-6787d8db86-xxqwp"
Mar 08 00:33:04.465986 master-0 kubenswrapper[23041]: I0308 00:33:04.465845 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d31841e6-f09b-46b4-ac72-adf67f6a5327-console-config\") pod \"console-6787d8db86-xxqwp\" (UID: \"d31841e6-f09b-46b4-ac72-adf67f6a5327\") " pod="openshift-console/console-6787d8db86-xxqwp"
Mar 08 00:33:04.465986 master-0 kubenswrapper[23041]: I0308 00:33:04.465882 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhwr2\" (UniqueName: \"kubernetes.io/projected/d31841e6-f09b-46b4-ac72-adf67f6a5327-kube-api-access-hhwr2\") pod \"console-6787d8db86-xxqwp\" (UID: \"d31841e6-f09b-46b4-ac72-adf67f6a5327\") " pod="openshift-console/console-6787d8db86-xxqwp"
Mar 08 00:33:04.465986 master-0 kubenswrapper[23041]: I0308 00:33:04.465911 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d31841e6-f09b-46b4-ac72-adf67f6a5327-trusted-ca-bundle\") pod \"console-6787d8db86-xxqwp\" (UID: \"d31841e6-f09b-46b4-ac72-adf67f6a5327\") " pod="openshift-console/console-6787d8db86-xxqwp"
Mar 08 00:33:04.465986 master-0 kubenswrapper[23041]: I0308 00:33:04.465940 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d31841e6-f09b-46b4-ac72-adf67f6a5327-console-oauth-config\") pod \"console-6787d8db86-xxqwp\" (UID: \"d31841e6-f09b-46b4-ac72-adf67f6a5327\") " pod="openshift-console/console-6787d8db86-xxqwp" Mar 08 00:33:04.465986 master-0 kubenswrapper[23041]: I0308 00:33:04.465961 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d31841e6-f09b-46b4-ac72-adf67f6a5327-service-ca\") pod \"console-6787d8db86-xxqwp\" (UID: \"d31841e6-f09b-46b4-ac72-adf67f6a5327\") " pod="openshift-console/console-6787d8db86-xxqwp" Mar 08 00:33:04.580442 master-0 kubenswrapper[23041]: I0308 00:33:04.578354 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d31841e6-f09b-46b4-ac72-adf67f6a5327-oauth-serving-cert\") pod \"console-6787d8db86-xxqwp\" (UID: \"d31841e6-f09b-46b4-ac72-adf67f6a5327\") " pod="openshift-console/console-6787d8db86-xxqwp" Mar 08 00:33:04.580442 master-0 kubenswrapper[23041]: I0308 00:33:04.578534 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d31841e6-f09b-46b4-ac72-adf67f6a5327-console-config\") pod \"console-6787d8db86-xxqwp\" (UID: \"d31841e6-f09b-46b4-ac72-adf67f6a5327\") " pod="openshift-console/console-6787d8db86-xxqwp" Mar 08 00:33:04.580442 master-0 kubenswrapper[23041]: I0308 00:33:04.578651 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhwr2\" (UniqueName: \"kubernetes.io/projected/d31841e6-f09b-46b4-ac72-adf67f6a5327-kube-api-access-hhwr2\") pod \"console-6787d8db86-xxqwp\" (UID: \"d31841e6-f09b-46b4-ac72-adf67f6a5327\") " pod="openshift-console/console-6787d8db86-xxqwp" Mar 08 00:33:04.580442 master-0 kubenswrapper[23041]: I0308 00:33:04.578703 23041 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d31841e6-f09b-46b4-ac72-adf67f6a5327-trusted-ca-bundle\") pod \"console-6787d8db86-xxqwp\" (UID: \"d31841e6-f09b-46b4-ac72-adf67f6a5327\") " pod="openshift-console/console-6787d8db86-xxqwp" Mar 08 00:33:04.580442 master-0 kubenswrapper[23041]: I0308 00:33:04.578756 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d31841e6-f09b-46b4-ac72-adf67f6a5327-console-oauth-config\") pod \"console-6787d8db86-xxqwp\" (UID: \"d31841e6-f09b-46b4-ac72-adf67f6a5327\") " pod="openshift-console/console-6787d8db86-xxqwp" Mar 08 00:33:04.580442 master-0 kubenswrapper[23041]: I0308 00:33:04.578804 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d31841e6-f09b-46b4-ac72-adf67f6a5327-service-ca\") pod \"console-6787d8db86-xxqwp\" (UID: \"d31841e6-f09b-46b4-ac72-adf67f6a5327\") " pod="openshift-console/console-6787d8db86-xxqwp" Mar 08 00:33:04.580442 master-0 kubenswrapper[23041]: I0308 00:33:04.579192 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d31841e6-f09b-46b4-ac72-adf67f6a5327-console-serving-cert\") pod \"console-6787d8db86-xxqwp\" (UID: \"d31841e6-f09b-46b4-ac72-adf67f6a5327\") " pod="openshift-console/console-6787d8db86-xxqwp" Mar 08 00:33:04.580887 master-0 kubenswrapper[23041]: I0308 00:33:04.580741 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d31841e6-f09b-46b4-ac72-adf67f6a5327-oauth-serving-cert\") pod \"console-6787d8db86-xxqwp\" (UID: \"d31841e6-f09b-46b4-ac72-adf67f6a5327\") " pod="openshift-console/console-6787d8db86-xxqwp" Mar 08 00:33:04.583265 master-0 
kubenswrapper[23041]: I0308 00:33:04.582279 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d31841e6-f09b-46b4-ac72-adf67f6a5327-service-ca\") pod \"console-6787d8db86-xxqwp\" (UID: \"d31841e6-f09b-46b4-ac72-adf67f6a5327\") " pod="openshift-console/console-6787d8db86-xxqwp" Mar 08 00:33:04.583662 master-0 kubenswrapper[23041]: I0308 00:33:04.583612 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d31841e6-f09b-46b4-ac72-adf67f6a5327-console-config\") pod \"console-6787d8db86-xxqwp\" (UID: \"d31841e6-f09b-46b4-ac72-adf67f6a5327\") " pod="openshift-console/console-6787d8db86-xxqwp" Mar 08 00:33:04.584630 master-0 kubenswrapper[23041]: I0308 00:33:04.584598 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d31841e6-f09b-46b4-ac72-adf67f6a5327-console-oauth-config\") pod \"console-6787d8db86-xxqwp\" (UID: \"d31841e6-f09b-46b4-ac72-adf67f6a5327\") " pod="openshift-console/console-6787d8db86-xxqwp" Mar 08 00:33:04.586025 master-0 kubenswrapper[23041]: I0308 00:33:04.585998 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d31841e6-f09b-46b4-ac72-adf67f6a5327-console-serving-cert\") pod \"console-6787d8db86-xxqwp\" (UID: \"d31841e6-f09b-46b4-ac72-adf67f6a5327\") " pod="openshift-console/console-6787d8db86-xxqwp" Mar 08 00:33:04.586555 master-0 kubenswrapper[23041]: I0308 00:33:04.586519 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d31841e6-f09b-46b4-ac72-adf67f6a5327-trusted-ca-bundle\") pod \"console-6787d8db86-xxqwp\" (UID: \"d31841e6-f09b-46b4-ac72-adf67f6a5327\") " pod="openshift-console/console-6787d8db86-xxqwp" Mar 08 00:33:04.608228 master-0 
kubenswrapper[23041]: I0308 00:33:04.607399 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhwr2\" (UniqueName: \"kubernetes.io/projected/d31841e6-f09b-46b4-ac72-adf67f6a5327-kube-api-access-hhwr2\") pod \"console-6787d8db86-xxqwp\" (UID: \"d31841e6-f09b-46b4-ac72-adf67f6a5327\") " pod="openshift-console/console-6787d8db86-xxqwp" Mar 08 00:33:04.712416 master-0 kubenswrapper[23041]: I0308 00:33:04.712015 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 08 00:33:04.765363 master-0 kubenswrapper[23041]: I0308 00:33:04.765311 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6787d8db86-xxqwp" Mar 08 00:33:04.792953 master-0 kubenswrapper[23041]: I0308 00:33:04.792903 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:04.883353 master-0 kubenswrapper[23041]: I0308 00:33:04.883174 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-thanos-prometheus-http-client-file\") pod \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " Mar 08 00:33:04.883353 master-0 kubenswrapper[23041]: I0308 00:33:04.883243 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4b9b4180-fc41-4072-9c61-0a35390a7ff3-prometheus-k8s-rulefiles-0\") pod \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " Mar 08 00:33:04.888892 master-0 kubenswrapper[23041]: I0308 00:33:04.887312 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "4b9b4180-fc41-4072-9c61-0a35390a7ff3" (UID: "4b9b4180-fc41-4072-9c61-0a35390a7ff3"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:33:04.902760 master-0 kubenswrapper[23041]: I0308 00:33:04.900433 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/4b9b4180-fc41-4072-9c61-0a35390a7ff3-prometheus-k8s-db\") pod \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " Mar 08 00:33:04.902760 master-0 kubenswrapper[23041]: I0308 00:33:04.900481 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-secret-metrics-client-certs\") pod \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " Mar 08 00:33:04.902760 master-0 kubenswrapper[23041]: I0308 00:33:04.900510 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4b9b4180-fc41-4072-9c61-0a35390a7ff3-configmap-metrics-client-ca\") pod \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " Mar 08 00:33:04.902760 master-0 kubenswrapper[23041]: I0308 00:33:04.900534 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4b9b4180-fc41-4072-9c61-0a35390a7ff3-tls-assets\") pod \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " Mar 08 00:33:04.902760 master-0 kubenswrapper[23041]: I0308 00:33:04.900616 23041 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " Mar 08 00:33:04.902760 master-0 kubenswrapper[23041]: I0308 00:33:04.900655 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b9b4180-fc41-4072-9c61-0a35390a7ff3-configmap-serving-certs-ca-bundle\") pod \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " Mar 08 00:33:04.902760 master-0 kubenswrapper[23041]: I0308 00:33:04.900702 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4b9b4180-fc41-4072-9c61-0a35390a7ff3-config-out\") pod \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " Mar 08 00:33:04.902760 master-0 kubenswrapper[23041]: I0308 00:33:04.900752 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-secret-grpc-tls\") pod \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " Mar 08 00:33:04.902760 master-0 kubenswrapper[23041]: I0308 00:33:04.900790 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " Mar 08 00:33:04.902760 master-0 kubenswrapper[23041]: I0308 00:33:04.900815 23041 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-secret-prometheus-k8s-tls\") pod \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " Mar 08 00:33:04.902760 master-0 kubenswrapper[23041]: I0308 00:33:04.900851 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fl98m\" (UniqueName: \"kubernetes.io/projected/4b9b4180-fc41-4072-9c61-0a35390a7ff3-kube-api-access-fl98m\") pod \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " Mar 08 00:33:04.902760 master-0 kubenswrapper[23041]: I0308 00:33:04.900878 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-web-config\") pod \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " Mar 08 00:33:04.902760 master-0 kubenswrapper[23041]: I0308 00:33:04.900918 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-secret-kube-rbac-proxy\") pod \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " Mar 08 00:33:04.902760 master-0 kubenswrapper[23041]: I0308 00:33:04.900949 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b9b4180-fc41-4072-9c61-0a35390a7ff3-prometheus-trusted-ca-bundle\") pod \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " Mar 08 00:33:04.902760 master-0 kubenswrapper[23041]: I0308 00:33:04.900975 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/4b9b4180-fc41-4072-9c61-0a35390a7ff3-configmap-kubelet-serving-ca-bundle\") pod \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " Mar 08 00:33:04.902760 master-0 kubenswrapper[23041]: I0308 00:33:04.901013 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-config\") pod \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\" (UID: \"4b9b4180-fc41-4072-9c61-0a35390a7ff3\") " Mar 08 00:33:04.902760 master-0 kubenswrapper[23041]: I0308 00:33:04.901689 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b9b4180-fc41-4072-9c61-0a35390a7ff3-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "4b9b4180-fc41-4072-9c61-0a35390a7ff3" (UID: "4b9b4180-fc41-4072-9c61-0a35390a7ff3"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:33:04.902760 master-0 kubenswrapper[23041]: I0308 00:33:04.902809 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b9b4180-fc41-4072-9c61-0a35390a7ff3-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "4b9b4180-fc41-4072-9c61-0a35390a7ff3" (UID: "4b9b4180-fc41-4072-9c61-0a35390a7ff3"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:33:04.914018 master-0 kubenswrapper[23041]: I0308 00:33:04.909077 23041 reconciler_common.go:293] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4b9b4180-fc41-4072-9c61-0a35390a7ff3-prometheus-k8s-rulefiles-0\") on node \"master-0\" DevicePath \"\"" Mar 08 00:33:04.914018 master-0 kubenswrapper[23041]: I0308 00:33:04.909118 23041 reconciler_common.go:293] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4b9b4180-fc41-4072-9c61-0a35390a7ff3-configmap-metrics-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 00:33:04.914018 master-0 kubenswrapper[23041]: I0308 00:33:04.909130 23041 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-thanos-prometheus-http-client-file\") on node \"master-0\" DevicePath \"\"" Mar 08 00:33:04.914018 master-0 kubenswrapper[23041]: I0308 00:33:04.913596 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b9b4180-fc41-4072-9c61-0a35390a7ff3-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "4b9b4180-fc41-4072-9c61-0a35390a7ff3" (UID: "4b9b4180-fc41-4072-9c61-0a35390a7ff3"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:33:04.936857 master-0 kubenswrapper[23041]: I0308 00:33:04.917476 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b9b4180-fc41-4072-9c61-0a35390a7ff3-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "4b9b4180-fc41-4072-9c61-0a35390a7ff3" (UID: "4b9b4180-fc41-4072-9c61-0a35390a7ff3"). InnerVolumeSpecName "prometheus-k8s-db". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:33:04.936857 master-0 kubenswrapper[23041]: I0308 00:33:04.917873 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-config" (OuterVolumeSpecName: "config") pod "4b9b4180-fc41-4072-9c61-0a35390a7ff3" (UID: "4b9b4180-fc41-4072-9c61-0a35390a7ff3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:33:04.936857 master-0 kubenswrapper[23041]: I0308 00:33:04.919443 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "4b9b4180-fc41-4072-9c61-0a35390a7ff3" (UID: "4b9b4180-fc41-4072-9c61-0a35390a7ff3"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:33:04.936857 master-0 kubenswrapper[23041]: I0308 00:33:04.927958 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b9b4180-fc41-4072-9c61-0a35390a7ff3-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "4b9b4180-fc41-4072-9c61-0a35390a7ff3" (UID: "4b9b4180-fc41-4072-9c61-0a35390a7ff3"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:33:04.936857 master-0 kubenswrapper[23041]: I0308 00:33:04.928766 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b9b4180-fc41-4072-9c61-0a35390a7ff3-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "4b9b4180-fc41-4072-9c61-0a35390a7ff3" (UID: "4b9b4180-fc41-4072-9c61-0a35390a7ff3"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:33:04.936857 master-0 kubenswrapper[23041]: I0308 00:33:04.935529 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "4b9b4180-fc41-4072-9c61-0a35390a7ff3" (UID: "4b9b4180-fc41-4072-9c61-0a35390a7ff3"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:33:04.936857 master-0 kubenswrapper[23041]: I0308 00:33:04.935648 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "4b9b4180-fc41-4072-9c61-0a35390a7ff3" (UID: "4b9b4180-fc41-4072-9c61-0a35390a7ff3"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:33:04.944261 master-0 kubenswrapper[23041]: I0308 00:33:04.940949 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b9b4180-fc41-4072-9c61-0a35390a7ff3-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "4b9b4180-fc41-4072-9c61-0a35390a7ff3" (UID: "4b9b4180-fc41-4072-9c61-0a35390a7ff3"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:33:04.944261 master-0 kubenswrapper[23041]: I0308 00:33:04.943177 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "4b9b4180-fc41-4072-9c61-0a35390a7ff3" (UID: "4b9b4180-fc41-4072-9c61-0a35390a7ff3"). InnerVolumeSpecName "secret-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:33:04.944261 master-0 kubenswrapper[23041]: I0308 00:33:04.943414 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b9b4180-fc41-4072-9c61-0a35390a7ff3-config-out" (OuterVolumeSpecName: "config-out") pod "4b9b4180-fc41-4072-9c61-0a35390a7ff3" (UID: "4b9b4180-fc41-4072-9c61-0a35390a7ff3"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:33:04.944261 master-0 kubenswrapper[23041]: I0308 00:33:04.943755 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "4b9b4180-fc41-4072-9c61-0a35390a7ff3" (UID: "4b9b4180-fc41-4072-9c61-0a35390a7ff3"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:33:04.944261 master-0 kubenswrapper[23041]: I0308 00:33:04.943981 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "4b9b4180-fc41-4072-9c61-0a35390a7ff3" (UID: "4b9b4180-fc41-4072-9c61-0a35390a7ff3"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:33:04.950633 master-0 kubenswrapper[23041]: I0308 00:33:04.950416 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b9b4180-fc41-4072-9c61-0a35390a7ff3-kube-api-access-fl98m" (OuterVolumeSpecName: "kube-api-access-fl98m") pod "4b9b4180-fc41-4072-9c61-0a35390a7ff3" (UID: "4b9b4180-fc41-4072-9c61-0a35390a7ff3"). InnerVolumeSpecName "kube-api-access-fl98m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:33:04.982032 master-0 kubenswrapper[23041]: I0308 00:33:04.981947 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-web-config" (OuterVolumeSpecName: "web-config") pod "4b9b4180-fc41-4072-9c61-0a35390a7ff3" (UID: "4b9b4180-fc41-4072-9c61-0a35390a7ff3"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:33:05.015540 master-0 kubenswrapper[23041]: I0308 00:33:05.012294 23041 reconciler_common.go:293] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"master-0\" DevicePath \"\"" Mar 08 00:33:05.015540 master-0 kubenswrapper[23041]: I0308 00:33:05.012346 23041 reconciler_common.go:293] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b9b4180-fc41-4072-9c61-0a35390a7ff3-configmap-serving-certs-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 00:33:05.015540 master-0 kubenswrapper[23041]: I0308 00:33:05.012364 23041 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4b9b4180-fc41-4072-9c61-0a35390a7ff3-config-out\") on node \"master-0\" DevicePath \"\"" Mar 08 00:33:05.015540 master-0 kubenswrapper[23041]: I0308 00:33:05.012375 23041 reconciler_common.go:293] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-secret-grpc-tls\") on node \"master-0\" DevicePath \"\"" Mar 08 00:33:05.015540 master-0 kubenswrapper[23041]: I0308 00:33:05.012389 23041 reconciler_common.go:293] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"master-0\" DevicePath \"\"" Mar 08 00:33:05.015540 master-0 kubenswrapper[23041]: I0308 00:33:05.012403 23041 reconciler_common.go:293] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-secret-prometheus-k8s-tls\") on node \"master-0\" DevicePath \"\"" Mar 08 00:33:05.015540 master-0 kubenswrapper[23041]: I0308 00:33:05.012416 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fl98m\" (UniqueName: \"kubernetes.io/projected/4b9b4180-fc41-4072-9c61-0a35390a7ff3-kube-api-access-fl98m\") on node \"master-0\" DevicePath \"\"" Mar 08 00:33:05.015540 master-0 kubenswrapper[23041]: I0308 00:33:05.012440 23041 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-web-config\") on node \"master-0\" DevicePath \"\"" Mar 08 00:33:05.015540 master-0 kubenswrapper[23041]: I0308 00:33:05.012455 23041 reconciler_common.go:293] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-secret-kube-rbac-proxy\") on node \"master-0\" DevicePath \"\"" Mar 08 00:33:05.015540 master-0 kubenswrapper[23041]: I0308 00:33:05.012468 23041 reconciler_common.go:293] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b9b4180-fc41-4072-9c61-0a35390a7ff3-prometheus-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 00:33:05.015540 master-0 kubenswrapper[23041]: I0308 00:33:05.012479 23041 reconciler_common.go:293] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b9b4180-fc41-4072-9c61-0a35390a7ff3-configmap-kubelet-serving-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 00:33:05.015540 
master-0 kubenswrapper[23041]: I0308 00:33:05.012489 23041 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-config\") on node \"master-0\" DevicePath \"\"" Mar 08 00:33:05.015540 master-0 kubenswrapper[23041]: I0308 00:33:05.012498 23041 reconciler_common.go:293] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4b9b4180-fc41-4072-9c61-0a35390a7ff3-secret-metrics-client-certs\") on node \"master-0\" DevicePath \"\"" Mar 08 00:33:05.015540 master-0 kubenswrapper[23041]: I0308 00:33:05.012507 23041 reconciler_common.go:293] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/4b9b4180-fc41-4072-9c61-0a35390a7ff3-prometheus-k8s-db\") on node \"master-0\" DevicePath \"\"" Mar 08 00:33:05.015540 master-0 kubenswrapper[23041]: I0308 00:33:05.012516 23041 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4b9b4180-fc41-4072-9c61-0a35390a7ff3-tls-assets\") on node \"master-0\" DevicePath \"\"" Mar 08 00:33:05.205232 master-0 kubenswrapper[23041]: I0308 00:33:05.199745 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76c777474b-n9mhf"] Mar 08 00:33:05.215124 master-0 kubenswrapper[23041]: I0308 00:33:05.215060 23041 generic.go:334] "Generic (PLEG): container finished" podID="4b9b4180-fc41-4072-9c61-0a35390a7ff3" containerID="341c117fc24206d0e3839ca57ba039836d1114be36d12fb34ba2f5b26e89d32d" exitCode=0 Mar 08 00:33:05.215124 master-0 kubenswrapper[23041]: I0308 00:33:05.215099 23041 generic.go:334] "Generic (PLEG): container finished" podID="4b9b4180-fc41-4072-9c61-0a35390a7ff3" containerID="dd24c4f307672a1f949696856245eca4408f949ed84c121951e5aefd95a43d9c" exitCode=0 Mar 08 00:33:05.215124 master-0 kubenswrapper[23041]: I0308 00:33:05.215110 23041 generic.go:334] "Generic (PLEG): container finished" 
podID="4b9b4180-fc41-4072-9c61-0a35390a7ff3" containerID="4848d19ba0776826a9f40c021f719b52c6a6530dc063a074b5193621f8201b9a" exitCode=0 Mar 08 00:33:05.215124 master-0 kubenswrapper[23041]: I0308 00:33:05.215118 23041 generic.go:334] "Generic (PLEG): container finished" podID="4b9b4180-fc41-4072-9c61-0a35390a7ff3" containerID="1f43fdb6c0cc7204b6150506553af3176763f0599837371ad3b14f3a6dd6e9c3" exitCode=0 Mar 08 00:33:05.215124 master-0 kubenswrapper[23041]: I0308 00:33:05.215126 23041 generic.go:334] "Generic (PLEG): container finished" podID="4b9b4180-fc41-4072-9c61-0a35390a7ff3" containerID="8677cf1f170f4b7104c5bee538c75b0cce6cadf668dc8807ee7dd7ead37ad2bc" exitCode=0 Mar 08 00:33:05.215124 master-0 kubenswrapper[23041]: I0308 00:33:05.215134 23041 generic.go:334] "Generic (PLEG): container finished" podID="4b9b4180-fc41-4072-9c61-0a35390a7ff3" containerID="4a516d7f08e2c00a73c82a6468c6733deb15cfb5496c5dc1c985cc5b2f878c39" exitCode=0 Mar 08 00:33:05.215609 master-0 kubenswrapper[23041]: I0308 00:33:05.215192 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4b9b4180-fc41-4072-9c61-0a35390a7ff3","Type":"ContainerDied","Data":"341c117fc24206d0e3839ca57ba039836d1114be36d12fb34ba2f5b26e89d32d"} Mar 08 00:33:05.215609 master-0 kubenswrapper[23041]: I0308 00:33:05.215274 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4b9b4180-fc41-4072-9c61-0a35390a7ff3","Type":"ContainerDied","Data":"dd24c4f307672a1f949696856245eca4408f949ed84c121951e5aefd95a43d9c"} Mar 08 00:33:05.215609 master-0 kubenswrapper[23041]: I0308 00:33:05.215291 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4b9b4180-fc41-4072-9c61-0a35390a7ff3","Type":"ContainerDied","Data":"4848d19ba0776826a9f40c021f719b52c6a6530dc063a074b5193621f8201b9a"} Mar 08 00:33:05.215609 master-0 kubenswrapper[23041]: I0308 00:33:05.215304 
23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4b9b4180-fc41-4072-9c61-0a35390a7ff3","Type":"ContainerDied","Data":"1f43fdb6c0cc7204b6150506553af3176763f0599837371ad3b14f3a6dd6e9c3"} Mar 08 00:33:05.215609 master-0 kubenswrapper[23041]: I0308 00:33:05.215316 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4b9b4180-fc41-4072-9c61-0a35390a7ff3","Type":"ContainerDied","Data":"8677cf1f170f4b7104c5bee538c75b0cce6cadf668dc8807ee7dd7ead37ad2bc"} Mar 08 00:33:05.215609 master-0 kubenswrapper[23041]: I0308 00:33:05.215327 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4b9b4180-fc41-4072-9c61-0a35390a7ff3","Type":"ContainerDied","Data":"4a516d7f08e2c00a73c82a6468c6733deb15cfb5496c5dc1c985cc5b2f878c39"} Mar 08 00:33:05.215609 master-0 kubenswrapper[23041]: I0308 00:33:05.215337 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4b9b4180-fc41-4072-9c61-0a35390a7ff3","Type":"ContainerDied","Data":"39df2361ce85bd74018d4b51eb5800f820b46418602e00b8ff5ae478baf2f2d2"} Mar 08 00:33:05.215609 master-0 kubenswrapper[23041]: I0308 00:33:05.215357 23041 scope.go:117] "RemoveContainer" containerID="341c117fc24206d0e3839ca57ba039836d1114be36d12fb34ba2f5b26e89d32d" Mar 08 00:33:05.215609 master-0 kubenswrapper[23041]: I0308 00:33:05.215513 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.226266 master-0 kubenswrapper[23041]: I0308 00:33:05.226094 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ttpzw" event={"ID":"ab1c5c08-f159-4f15-8847-d39477b3c6e0","Type":"ContainerStarted","Data":"6d039993b13eeb95aaa3cf81af152f202def690dfb9d90e34e0f464dcadc212f"} Mar 08 00:33:05.254691 master-0 kubenswrapper[23041]: I0308 00:33:05.254373 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 08 00:33:05.265273 master-0 kubenswrapper[23041]: I0308 00:33:05.264902 23041 scope.go:117] "RemoveContainer" containerID="dd24c4f307672a1f949696856245eca4408f949ed84c121951e5aefd95a43d9c" Mar 08 00:33:05.265377 master-0 kubenswrapper[23041]: I0308 00:33:05.265345 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6dc96f5b89-ctlsc"] Mar 08 00:33:05.284849 master-0 kubenswrapper[23041]: E0308 00:33:05.280594 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b9b4180-fc41-4072-9c61-0a35390a7ff3" containerName="kube-rbac-proxy-web" Mar 08 00:33:05.284849 master-0 kubenswrapper[23041]: I0308 00:33:05.280649 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b9b4180-fc41-4072-9c61-0a35390a7ff3" containerName="kube-rbac-proxy-web" Mar 08 00:33:05.284849 master-0 kubenswrapper[23041]: E0308 00:33:05.280683 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b9b4180-fc41-4072-9c61-0a35390a7ff3" containerName="thanos-sidecar" Mar 08 00:33:05.284849 master-0 kubenswrapper[23041]: I0308 00:33:05.280692 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b9b4180-fc41-4072-9c61-0a35390a7ff3" containerName="thanos-sidecar" Mar 08 00:33:05.284849 master-0 kubenswrapper[23041]: E0308 00:33:05.280707 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b9b4180-fc41-4072-9c61-0a35390a7ff3" 
containerName="prometheus" Mar 08 00:33:05.284849 master-0 kubenswrapper[23041]: I0308 00:33:05.280715 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b9b4180-fc41-4072-9c61-0a35390a7ff3" containerName="prometheus" Mar 08 00:33:05.284849 master-0 kubenswrapper[23041]: E0308 00:33:05.280738 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b9b4180-fc41-4072-9c61-0a35390a7ff3" containerName="init-config-reloader" Mar 08 00:33:05.284849 master-0 kubenswrapper[23041]: I0308 00:33:05.280744 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b9b4180-fc41-4072-9c61-0a35390a7ff3" containerName="init-config-reloader" Mar 08 00:33:05.284849 master-0 kubenswrapper[23041]: E0308 00:33:05.280758 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b9b4180-fc41-4072-9c61-0a35390a7ff3" containerName="config-reloader" Mar 08 00:33:05.284849 master-0 kubenswrapper[23041]: I0308 00:33:05.280765 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b9b4180-fc41-4072-9c61-0a35390a7ff3" containerName="config-reloader" Mar 08 00:33:05.284849 master-0 kubenswrapper[23041]: E0308 00:33:05.280779 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b9b4180-fc41-4072-9c61-0a35390a7ff3" containerName="kube-rbac-proxy" Mar 08 00:33:05.284849 master-0 kubenswrapper[23041]: I0308 00:33:05.280784 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b9b4180-fc41-4072-9c61-0a35390a7ff3" containerName="kube-rbac-proxy" Mar 08 00:33:05.284849 master-0 kubenswrapper[23041]: E0308 00:33:05.280805 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b9b4180-fc41-4072-9c61-0a35390a7ff3" containerName="kube-rbac-proxy-thanos" Mar 08 00:33:05.284849 master-0 kubenswrapper[23041]: I0308 00:33:05.280813 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b9b4180-fc41-4072-9c61-0a35390a7ff3" containerName="kube-rbac-proxy-thanos" Mar 08 00:33:05.284849 master-0 
kubenswrapper[23041]: I0308 00:33:05.281094 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b9b4180-fc41-4072-9c61-0a35390a7ff3" containerName="thanos-sidecar" Mar 08 00:33:05.284849 master-0 kubenswrapper[23041]: I0308 00:33:05.281126 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b9b4180-fc41-4072-9c61-0a35390a7ff3" containerName="kube-rbac-proxy-web" Mar 08 00:33:05.284849 master-0 kubenswrapper[23041]: I0308 00:33:05.281151 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b9b4180-fc41-4072-9c61-0a35390a7ff3" containerName="kube-rbac-proxy-thanos" Mar 08 00:33:05.284849 master-0 kubenswrapper[23041]: I0308 00:33:05.281172 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b9b4180-fc41-4072-9c61-0a35390a7ff3" containerName="kube-rbac-proxy" Mar 08 00:33:05.284849 master-0 kubenswrapper[23041]: I0308 00:33:05.281196 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b9b4180-fc41-4072-9c61-0a35390a7ff3" containerName="prometheus" Mar 08 00:33:05.284849 master-0 kubenswrapper[23041]: I0308 00:33:05.281236 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b9b4180-fc41-4072-9c61-0a35390a7ff3" containerName="config-reloader" Mar 08 00:33:05.284849 master-0 kubenswrapper[23041]: I0308 00:33:05.281953 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6dc96f5b89-ctlsc"] Mar 08 00:33:05.284849 master-0 kubenswrapper[23041]: I0308 00:33:05.281981 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 08 00:33:05.284849 master-0 kubenswrapper[23041]: I0308 00:33:05.282134 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6dc96f5b89-ctlsc" Mar 08 00:33:05.288040 master-0 kubenswrapper[23041]: I0308 00:33:05.285959 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 08 00:33:05.321276 master-0 kubenswrapper[23041]: I0308 00:33:05.320773 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6787d8db86-xxqwp"] Mar 08 00:33:05.322584 master-0 kubenswrapper[23041]: I0308 00:33:05.322547 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/24264c1b-97df-4311-b7af-b205ac879381-console-oauth-config\") pod \"console-6dc96f5b89-ctlsc\" (UID: \"24264c1b-97df-4311-b7af-b205ac879381\") " pod="openshift-console/console-6dc96f5b89-ctlsc" Mar 08 00:33:05.322642 master-0 kubenswrapper[23041]: I0308 00:33:05.322582 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/24264c1b-97df-4311-b7af-b205ac879381-service-ca\") pod \"console-6dc96f5b89-ctlsc\" (UID: \"24264c1b-97df-4311-b7af-b205ac879381\") " pod="openshift-console/console-6dc96f5b89-ctlsc" Mar 08 00:33:05.322642 master-0 kubenswrapper[23041]: I0308 00:33:05.322599 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/24264c1b-97df-4311-b7af-b205ac879381-oauth-serving-cert\") pod \"console-6dc96f5b89-ctlsc\" (UID: \"24264c1b-97df-4311-b7af-b205ac879381\") " pod="openshift-console/console-6dc96f5b89-ctlsc" Mar 08 00:33:05.322642 master-0 kubenswrapper[23041]: I0308 00:33:05.322619 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/24264c1b-97df-4311-b7af-b205ac879381-console-config\") 
pod \"console-6dc96f5b89-ctlsc\" (UID: \"24264c1b-97df-4311-b7af-b205ac879381\") " pod="openshift-console/console-6dc96f5b89-ctlsc" Mar 08 00:33:05.322748 master-0 kubenswrapper[23041]: I0308 00:33:05.322690 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmdwt\" (UniqueName: \"kubernetes.io/projected/24264c1b-97df-4311-b7af-b205ac879381-kube-api-access-wmdwt\") pod \"console-6dc96f5b89-ctlsc\" (UID: \"24264c1b-97df-4311-b7af-b205ac879381\") " pod="openshift-console/console-6dc96f5b89-ctlsc" Mar 08 00:33:05.322748 master-0 kubenswrapper[23041]: I0308 00:33:05.322713 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24264c1b-97df-4311-b7af-b205ac879381-trusted-ca-bundle\") pod \"console-6dc96f5b89-ctlsc\" (UID: \"24264c1b-97df-4311-b7af-b205ac879381\") " pod="openshift-console/console-6dc96f5b89-ctlsc" Mar 08 00:33:05.322748 master-0 kubenswrapper[23041]: I0308 00:33:05.322739 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/24264c1b-97df-4311-b7af-b205ac879381-console-serving-cert\") pod \"console-6dc96f5b89-ctlsc\" (UID: \"24264c1b-97df-4311-b7af-b205ac879381\") " pod="openshift-console/console-6dc96f5b89-ctlsc" Mar 08 00:33:05.339723 master-0 kubenswrapper[23041]: I0308 00:33:05.331130 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 08 00:33:05.345016 master-0 kubenswrapper[23041]: I0308 00:33:05.341476 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.345438 master-0 kubenswrapper[23041]: I0308 00:33:05.345406 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-48mqvdnajl6js" Mar 08 00:33:05.345599 master-0 kubenswrapper[23041]: I0308 00:33:05.345581 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Mar 08 00:33:05.346012 master-0 kubenswrapper[23041]: I0308 00:33:05.345915 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Mar 08 00:33:05.352177 master-0 kubenswrapper[23041]: I0308 00:33:05.351470 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Mar 08 00:33:05.352177 master-0 kubenswrapper[23041]: I0308 00:33:05.351630 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Mar 08 00:33:05.352177 master-0 kubenswrapper[23041]: I0308 00:33:05.351747 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Mar 08 00:33:05.352359 master-0 kubenswrapper[23041]: I0308 00:33:05.352281 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Mar 08 00:33:05.353971 master-0 kubenswrapper[23041]: I0308 00:33:05.353954 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Mar 08 00:33:05.354133 master-0 kubenswrapper[23041]: I0308 00:33:05.354086 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Mar 08 00:33:05.354133 master-0 kubenswrapper[23041]: I0308 00:33:05.354000 23041 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-monitoring"/"prometheus-k8s-dockercfg-68jqh" Mar 08 00:33:05.354251 master-0 kubenswrapper[23041]: I0308 00:33:05.354042 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Mar 08 00:33:05.354914 master-0 kubenswrapper[23041]: I0308 00:33:05.354864 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Mar 08 00:33:05.357405 master-0 kubenswrapper[23041]: I0308 00:33:05.357387 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Mar 08 00:33:05.363261 master-0 kubenswrapper[23041]: I0308 00:33:05.358401 23041 scope.go:117] "RemoveContainer" containerID="4848d19ba0776826a9f40c021f719b52c6a6530dc063a074b5193621f8201b9a" Mar 08 00:33:05.370914 master-0 kubenswrapper[23041]: I0308 00:33:05.370885 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 08 00:33:05.434250 master-0 kubenswrapper[23041]: I0308 00:33:05.432279 23041 scope.go:117] "RemoveContainer" containerID="1f43fdb6c0cc7204b6150506553af3176763f0599837371ad3b14f3a6dd6e9c3" Mar 08 00:33:05.437606 master-0 kubenswrapper[23041]: I0308 00:33:05.437562 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24264c1b-97df-4311-b7af-b205ac879381-trusted-ca-bundle\") pod \"console-6dc96f5b89-ctlsc\" (UID: \"24264c1b-97df-4311-b7af-b205ac879381\") " pod="openshift-console/console-6dc96f5b89-ctlsc" Mar 08 00:33:05.437758 master-0 kubenswrapper[23041]: I0308 00:33:05.437744 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/24264c1b-97df-4311-b7af-b205ac879381-console-serving-cert\") pod \"console-6dc96f5b89-ctlsc\" (UID: 
\"24264c1b-97df-4311-b7af-b205ac879381\") " pod="openshift-console/console-6dc96f5b89-ctlsc" Mar 08 00:33:05.438012 master-0 kubenswrapper[23041]: I0308 00:33:05.437973 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/24264c1b-97df-4311-b7af-b205ac879381-console-oauth-config\") pod \"console-6dc96f5b89-ctlsc\" (UID: \"24264c1b-97df-4311-b7af-b205ac879381\") " pod="openshift-console/console-6dc96f5b89-ctlsc" Mar 08 00:33:05.438409 master-0 kubenswrapper[23041]: I0308 00:33:05.438322 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/24264c1b-97df-4311-b7af-b205ac879381-service-ca\") pod \"console-6dc96f5b89-ctlsc\" (UID: \"24264c1b-97df-4311-b7af-b205ac879381\") " pod="openshift-console/console-6dc96f5b89-ctlsc" Mar 08 00:33:05.438486 master-0 kubenswrapper[23041]: I0308 00:33:05.438447 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/24264c1b-97df-4311-b7af-b205ac879381-oauth-serving-cert\") pod \"console-6dc96f5b89-ctlsc\" (UID: \"24264c1b-97df-4311-b7af-b205ac879381\") " pod="openshift-console/console-6dc96f5b89-ctlsc" Mar 08 00:33:05.438695 master-0 kubenswrapper[23041]: I0308 00:33:05.438612 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/24264c1b-97df-4311-b7af-b205ac879381-console-config\") pod \"console-6dc96f5b89-ctlsc\" (UID: \"24264c1b-97df-4311-b7af-b205ac879381\") " pod="openshift-console/console-6dc96f5b89-ctlsc" Mar 08 00:33:05.439000 master-0 kubenswrapper[23041]: I0308 00:33:05.438968 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmdwt\" (UniqueName: \"kubernetes.io/projected/24264c1b-97df-4311-b7af-b205ac879381-kube-api-access-wmdwt\") 
pod \"console-6dc96f5b89-ctlsc\" (UID: \"24264c1b-97df-4311-b7af-b205ac879381\") " pod="openshift-console/console-6dc96f5b89-ctlsc" Mar 08 00:33:05.441896 master-0 kubenswrapper[23041]: I0308 00:33:05.439549 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24264c1b-97df-4311-b7af-b205ac879381-trusted-ca-bundle\") pod \"console-6dc96f5b89-ctlsc\" (UID: \"24264c1b-97df-4311-b7af-b205ac879381\") " pod="openshift-console/console-6dc96f5b89-ctlsc" Mar 08 00:33:05.441896 master-0 kubenswrapper[23041]: I0308 00:33:05.440071 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/24264c1b-97df-4311-b7af-b205ac879381-oauth-serving-cert\") pod \"console-6dc96f5b89-ctlsc\" (UID: \"24264c1b-97df-4311-b7af-b205ac879381\") " pod="openshift-console/console-6dc96f5b89-ctlsc" Mar 08 00:33:05.441896 master-0 kubenswrapper[23041]: I0308 00:33:05.440257 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/24264c1b-97df-4311-b7af-b205ac879381-console-config\") pod \"console-6dc96f5b89-ctlsc\" (UID: \"24264c1b-97df-4311-b7af-b205ac879381\") " pod="openshift-console/console-6dc96f5b89-ctlsc" Mar 08 00:33:05.447677 master-0 kubenswrapper[23041]: I0308 00:33:05.445227 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/24264c1b-97df-4311-b7af-b205ac879381-service-ca\") pod \"console-6dc96f5b89-ctlsc\" (UID: \"24264c1b-97df-4311-b7af-b205ac879381\") " pod="openshift-console/console-6dc96f5b89-ctlsc" Mar 08 00:33:05.447677 master-0 kubenswrapper[23041]: I0308 00:33:05.445645 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/24264c1b-97df-4311-b7af-b205ac879381-console-oauth-config\") pod 
\"console-6dc96f5b89-ctlsc\" (UID: \"24264c1b-97df-4311-b7af-b205ac879381\") " pod="openshift-console/console-6dc96f5b89-ctlsc" Mar 08 00:33:05.447677 master-0 kubenswrapper[23041]: I0308 00:33:05.446384 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/24264c1b-97df-4311-b7af-b205ac879381-console-serving-cert\") pod \"console-6dc96f5b89-ctlsc\" (UID: \"24264c1b-97df-4311-b7af-b205ac879381\") " pod="openshift-console/console-6dc96f5b89-ctlsc" Mar 08 00:33:05.463815 master-0 kubenswrapper[23041]: I0308 00:33:05.463765 23041 scope.go:117] "RemoveContainer" containerID="8677cf1f170f4b7104c5bee538c75b0cce6cadf668dc8807ee7dd7ead37ad2bc" Mar 08 00:33:05.468939 master-0 kubenswrapper[23041]: I0308 00:33:05.468900 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmdwt\" (UniqueName: \"kubernetes.io/projected/24264c1b-97df-4311-b7af-b205ac879381-kube-api-access-wmdwt\") pod \"console-6dc96f5b89-ctlsc\" (UID: \"24264c1b-97df-4311-b7af-b205ac879381\") " pod="openshift-console/console-6dc96f5b89-ctlsc" Mar 08 00:33:05.495905 master-0 kubenswrapper[23041]: I0308 00:33:05.495844 23041 scope.go:117] "RemoveContainer" containerID="4a516d7f08e2c00a73c82a6468c6733deb15cfb5496c5dc1c985cc5b2f878c39" Mar 08 00:33:05.518138 master-0 kubenswrapper[23041]: I0308 00:33:05.518092 23041 scope.go:117] "RemoveContainer" containerID="e07552f51a5a17213c6ec3773800f0fdc0f57ef49b0e59da2864ad1ee82537b0" Mar 08 00:33:05.542454 master-0 kubenswrapper[23041]: I0308 00:33:05.542335 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/796e5654-14f7-4309-9c9d-2a4f430bb9b4-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 
00:33:05.542699 master-0 kubenswrapper[23041]: I0308 00:33:05.542472 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/796e5654-14f7-4309-9c9d-2a4f430bb9b4-config-out\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.542699 master-0 kubenswrapper[23041]: I0308 00:33:05.542503 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/796e5654-14f7-4309-9c9d-2a4f430bb9b4-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.542699 master-0 kubenswrapper[23041]: I0308 00:33:05.542577 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/796e5654-14f7-4309-9c9d-2a4f430bb9b4-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.542699 master-0 kubenswrapper[23041]: I0308 00:33:05.542651 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/796e5654-14f7-4309-9c9d-2a4f430bb9b4-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.542838 master-0 kubenswrapper[23041]: I0308 00:33:05.542725 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/796e5654-14f7-4309-9c9d-2a4f430bb9b4-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.542838 master-0 kubenswrapper[23041]: I0308 00:33:05.542788 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/796e5654-14f7-4309-9c9d-2a4f430bb9b4-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.542909 master-0 kubenswrapper[23041]: I0308 00:33:05.542820 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/796e5654-14f7-4309-9c9d-2a4f430bb9b4-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.542945 master-0 kubenswrapper[23041]: I0308 00:33:05.542907 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/796e5654-14f7-4309-9c9d-2a4f430bb9b4-web-config\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.542982 master-0 kubenswrapper[23041]: I0308 00:33:05.542947 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/796e5654-14f7-4309-9c9d-2a4f430bb9b4-config\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.542982 master-0 kubenswrapper[23041]: I0308 00:33:05.542966 23041 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/796e5654-14f7-4309-9c9d-2a4f430bb9b4-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.544135 master-0 kubenswrapper[23041]: I0308 00:33:05.544100 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/796e5654-14f7-4309-9c9d-2a4f430bb9b4-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.544135 master-0 kubenswrapper[23041]: I0308 00:33:05.544141 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/796e5654-14f7-4309-9c9d-2a4f430bb9b4-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.544344 master-0 kubenswrapper[23041]: I0308 00:33:05.544272 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/796e5654-14f7-4309-9c9d-2a4f430bb9b4-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.544344 master-0 kubenswrapper[23041]: I0308 00:33:05.544318 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/796e5654-14f7-4309-9c9d-2a4f430bb9b4-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.544414 master-0 kubenswrapper[23041]: I0308 00:33:05.544346 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/796e5654-14f7-4309-9c9d-2a4f430bb9b4-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.549238 master-0 kubenswrapper[23041]: I0308 00:33:05.544437 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/796e5654-14f7-4309-9c9d-2a4f430bb9b4-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.549238 master-0 kubenswrapper[23041]: I0308 00:33:05.548457 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5ncm\" (UniqueName: \"kubernetes.io/projected/796e5654-14f7-4309-9c9d-2a4f430bb9b4-kube-api-access-s5ncm\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.554670 master-0 kubenswrapper[23041]: I0308 00:33:05.554605 23041 scope.go:117] "RemoveContainer" containerID="341c117fc24206d0e3839ca57ba039836d1114be36d12fb34ba2f5b26e89d32d" Mar 08 00:33:05.556504 master-0 kubenswrapper[23041]: E0308 00:33:05.556471 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"341c117fc24206d0e3839ca57ba039836d1114be36d12fb34ba2f5b26e89d32d\": container with ID starting with 341c117fc24206d0e3839ca57ba039836d1114be36d12fb34ba2f5b26e89d32d not found: ID does not exist" 
containerID="341c117fc24206d0e3839ca57ba039836d1114be36d12fb34ba2f5b26e89d32d" Mar 08 00:33:05.556575 master-0 kubenswrapper[23041]: I0308 00:33:05.556515 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"341c117fc24206d0e3839ca57ba039836d1114be36d12fb34ba2f5b26e89d32d"} err="failed to get container status \"341c117fc24206d0e3839ca57ba039836d1114be36d12fb34ba2f5b26e89d32d\": rpc error: code = NotFound desc = could not find container \"341c117fc24206d0e3839ca57ba039836d1114be36d12fb34ba2f5b26e89d32d\": container with ID starting with 341c117fc24206d0e3839ca57ba039836d1114be36d12fb34ba2f5b26e89d32d not found: ID does not exist" Mar 08 00:33:05.556575 master-0 kubenswrapper[23041]: I0308 00:33:05.556546 23041 scope.go:117] "RemoveContainer" containerID="dd24c4f307672a1f949696856245eca4408f949ed84c121951e5aefd95a43d9c" Mar 08 00:33:05.558983 master-0 kubenswrapper[23041]: E0308 00:33:05.558950 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd24c4f307672a1f949696856245eca4408f949ed84c121951e5aefd95a43d9c\": container with ID starting with dd24c4f307672a1f949696856245eca4408f949ed84c121951e5aefd95a43d9c not found: ID does not exist" containerID="dd24c4f307672a1f949696856245eca4408f949ed84c121951e5aefd95a43d9c" Mar 08 00:33:05.559061 master-0 kubenswrapper[23041]: I0308 00:33:05.558990 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd24c4f307672a1f949696856245eca4408f949ed84c121951e5aefd95a43d9c"} err="failed to get container status \"dd24c4f307672a1f949696856245eca4408f949ed84c121951e5aefd95a43d9c\": rpc error: code = NotFound desc = could not find container \"dd24c4f307672a1f949696856245eca4408f949ed84c121951e5aefd95a43d9c\": container with ID starting with dd24c4f307672a1f949696856245eca4408f949ed84c121951e5aefd95a43d9c not found: ID does not exist" Mar 08 00:33:05.559061 master-0 
kubenswrapper[23041]: I0308 00:33:05.559015 23041 scope.go:117] "RemoveContainer" containerID="4848d19ba0776826a9f40c021f719b52c6a6530dc063a074b5193621f8201b9a" Mar 08 00:33:05.559561 master-0 kubenswrapper[23041]: E0308 00:33:05.559524 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4848d19ba0776826a9f40c021f719b52c6a6530dc063a074b5193621f8201b9a\": container with ID starting with 4848d19ba0776826a9f40c021f719b52c6a6530dc063a074b5193621f8201b9a not found: ID does not exist" containerID="4848d19ba0776826a9f40c021f719b52c6a6530dc063a074b5193621f8201b9a" Mar 08 00:33:05.559641 master-0 kubenswrapper[23041]: I0308 00:33:05.559572 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4848d19ba0776826a9f40c021f719b52c6a6530dc063a074b5193621f8201b9a"} err="failed to get container status \"4848d19ba0776826a9f40c021f719b52c6a6530dc063a074b5193621f8201b9a\": rpc error: code = NotFound desc = could not find container \"4848d19ba0776826a9f40c021f719b52c6a6530dc063a074b5193621f8201b9a\": container with ID starting with 4848d19ba0776826a9f40c021f719b52c6a6530dc063a074b5193621f8201b9a not found: ID does not exist" Mar 08 00:33:05.559641 master-0 kubenswrapper[23041]: I0308 00:33:05.559608 23041 scope.go:117] "RemoveContainer" containerID="1f43fdb6c0cc7204b6150506553af3176763f0599837371ad3b14f3a6dd6e9c3" Mar 08 00:33:05.561337 master-0 kubenswrapper[23041]: E0308 00:33:05.561303 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f43fdb6c0cc7204b6150506553af3176763f0599837371ad3b14f3a6dd6e9c3\": container with ID starting with 1f43fdb6c0cc7204b6150506553af3176763f0599837371ad3b14f3a6dd6e9c3 not found: ID does not exist" containerID="1f43fdb6c0cc7204b6150506553af3176763f0599837371ad3b14f3a6dd6e9c3" Mar 08 00:33:05.561419 master-0 kubenswrapper[23041]: I0308 00:33:05.561336 
23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f43fdb6c0cc7204b6150506553af3176763f0599837371ad3b14f3a6dd6e9c3"} err="failed to get container status \"1f43fdb6c0cc7204b6150506553af3176763f0599837371ad3b14f3a6dd6e9c3\": rpc error: code = NotFound desc = could not find container \"1f43fdb6c0cc7204b6150506553af3176763f0599837371ad3b14f3a6dd6e9c3\": container with ID starting with 1f43fdb6c0cc7204b6150506553af3176763f0599837371ad3b14f3a6dd6e9c3 not found: ID does not exist" Mar 08 00:33:05.561419 master-0 kubenswrapper[23041]: I0308 00:33:05.561355 23041 scope.go:117] "RemoveContainer" containerID="8677cf1f170f4b7104c5bee538c75b0cce6cadf668dc8807ee7dd7ead37ad2bc" Mar 08 00:33:05.564478 master-0 kubenswrapper[23041]: E0308 00:33:05.564435 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8677cf1f170f4b7104c5bee538c75b0cce6cadf668dc8807ee7dd7ead37ad2bc\": container with ID starting with 8677cf1f170f4b7104c5bee538c75b0cce6cadf668dc8807ee7dd7ead37ad2bc not found: ID does not exist" containerID="8677cf1f170f4b7104c5bee538c75b0cce6cadf668dc8807ee7dd7ead37ad2bc" Mar 08 00:33:05.564478 master-0 kubenswrapper[23041]: I0308 00:33:05.564462 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8677cf1f170f4b7104c5bee538c75b0cce6cadf668dc8807ee7dd7ead37ad2bc"} err="failed to get container status \"8677cf1f170f4b7104c5bee538c75b0cce6cadf668dc8807ee7dd7ead37ad2bc\": rpc error: code = NotFound desc = could not find container \"8677cf1f170f4b7104c5bee538c75b0cce6cadf668dc8807ee7dd7ead37ad2bc\": container with ID starting with 8677cf1f170f4b7104c5bee538c75b0cce6cadf668dc8807ee7dd7ead37ad2bc not found: ID does not exist" Mar 08 00:33:05.564478 master-0 kubenswrapper[23041]: I0308 00:33:05.564477 23041 scope.go:117] "RemoveContainer" 
containerID="4a516d7f08e2c00a73c82a6468c6733deb15cfb5496c5dc1c985cc5b2f878c39" Mar 08 00:33:05.564941 master-0 kubenswrapper[23041]: E0308 00:33:05.564875 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a516d7f08e2c00a73c82a6468c6733deb15cfb5496c5dc1c985cc5b2f878c39\": container with ID starting with 4a516d7f08e2c00a73c82a6468c6733deb15cfb5496c5dc1c985cc5b2f878c39 not found: ID does not exist" containerID="4a516d7f08e2c00a73c82a6468c6733deb15cfb5496c5dc1c985cc5b2f878c39" Mar 08 00:33:05.564998 master-0 kubenswrapper[23041]: I0308 00:33:05.564933 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a516d7f08e2c00a73c82a6468c6733deb15cfb5496c5dc1c985cc5b2f878c39"} err="failed to get container status \"4a516d7f08e2c00a73c82a6468c6733deb15cfb5496c5dc1c985cc5b2f878c39\": rpc error: code = NotFound desc = could not find container \"4a516d7f08e2c00a73c82a6468c6733deb15cfb5496c5dc1c985cc5b2f878c39\": container with ID starting with 4a516d7f08e2c00a73c82a6468c6733deb15cfb5496c5dc1c985cc5b2f878c39 not found: ID does not exist" Mar 08 00:33:05.564998 master-0 kubenswrapper[23041]: I0308 00:33:05.564960 23041 scope.go:117] "RemoveContainer" containerID="e07552f51a5a17213c6ec3773800f0fdc0f57ef49b0e59da2864ad1ee82537b0" Mar 08 00:33:05.565319 master-0 kubenswrapper[23041]: E0308 00:33:05.565284 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e07552f51a5a17213c6ec3773800f0fdc0f57ef49b0e59da2864ad1ee82537b0\": container with ID starting with e07552f51a5a17213c6ec3773800f0fdc0f57ef49b0e59da2864ad1ee82537b0 not found: ID does not exist" containerID="e07552f51a5a17213c6ec3773800f0fdc0f57ef49b0e59da2864ad1ee82537b0" Mar 08 00:33:05.565319 master-0 kubenswrapper[23041]: I0308 00:33:05.565310 23041 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e07552f51a5a17213c6ec3773800f0fdc0f57ef49b0e59da2864ad1ee82537b0"} err="failed to get container status \"e07552f51a5a17213c6ec3773800f0fdc0f57ef49b0e59da2864ad1ee82537b0\": rpc error: code = NotFound desc = could not find container \"e07552f51a5a17213c6ec3773800f0fdc0f57ef49b0e59da2864ad1ee82537b0\": container with ID starting with e07552f51a5a17213c6ec3773800f0fdc0f57ef49b0e59da2864ad1ee82537b0 not found: ID does not exist" Mar 08 00:33:05.565427 master-0 kubenswrapper[23041]: I0308 00:33:05.565324 23041 scope.go:117] "RemoveContainer" containerID="341c117fc24206d0e3839ca57ba039836d1114be36d12fb34ba2f5b26e89d32d" Mar 08 00:33:05.566326 master-0 kubenswrapper[23041]: I0308 00:33:05.565634 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"341c117fc24206d0e3839ca57ba039836d1114be36d12fb34ba2f5b26e89d32d"} err="failed to get container status \"341c117fc24206d0e3839ca57ba039836d1114be36d12fb34ba2f5b26e89d32d\": rpc error: code = NotFound desc = could not find container \"341c117fc24206d0e3839ca57ba039836d1114be36d12fb34ba2f5b26e89d32d\": container with ID starting with 341c117fc24206d0e3839ca57ba039836d1114be36d12fb34ba2f5b26e89d32d not found: ID does not exist" Mar 08 00:33:05.566326 master-0 kubenswrapper[23041]: I0308 00:33:05.565664 23041 scope.go:117] "RemoveContainer" containerID="dd24c4f307672a1f949696856245eca4408f949ed84c121951e5aefd95a43d9c" Mar 08 00:33:05.567035 master-0 kubenswrapper[23041]: I0308 00:33:05.567001 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd24c4f307672a1f949696856245eca4408f949ed84c121951e5aefd95a43d9c"} err="failed to get container status \"dd24c4f307672a1f949696856245eca4408f949ed84c121951e5aefd95a43d9c\": rpc error: code = NotFound desc = could not find container \"dd24c4f307672a1f949696856245eca4408f949ed84c121951e5aefd95a43d9c\": container with ID starting with 
dd24c4f307672a1f949696856245eca4408f949ed84c121951e5aefd95a43d9c not found: ID does not exist" Mar 08 00:33:05.567035 master-0 kubenswrapper[23041]: I0308 00:33:05.567027 23041 scope.go:117] "RemoveContainer" containerID="4848d19ba0776826a9f40c021f719b52c6a6530dc063a074b5193621f8201b9a" Mar 08 00:33:05.567865 master-0 kubenswrapper[23041]: I0308 00:33:05.567836 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4848d19ba0776826a9f40c021f719b52c6a6530dc063a074b5193621f8201b9a"} err="failed to get container status \"4848d19ba0776826a9f40c021f719b52c6a6530dc063a074b5193621f8201b9a\": rpc error: code = NotFound desc = could not find container \"4848d19ba0776826a9f40c021f719b52c6a6530dc063a074b5193621f8201b9a\": container with ID starting with 4848d19ba0776826a9f40c021f719b52c6a6530dc063a074b5193621f8201b9a not found: ID does not exist" Mar 08 00:33:05.567865 master-0 kubenswrapper[23041]: I0308 00:33:05.567864 23041 scope.go:117] "RemoveContainer" containerID="1f43fdb6c0cc7204b6150506553af3176763f0599837371ad3b14f3a6dd6e9c3" Mar 08 00:33:05.568257 master-0 kubenswrapper[23041]: I0308 00:33:05.568228 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f43fdb6c0cc7204b6150506553af3176763f0599837371ad3b14f3a6dd6e9c3"} err="failed to get container status \"1f43fdb6c0cc7204b6150506553af3176763f0599837371ad3b14f3a6dd6e9c3\": rpc error: code = NotFound desc = could not find container \"1f43fdb6c0cc7204b6150506553af3176763f0599837371ad3b14f3a6dd6e9c3\": container with ID starting with 1f43fdb6c0cc7204b6150506553af3176763f0599837371ad3b14f3a6dd6e9c3 not found: ID does not exist" Mar 08 00:33:05.568257 master-0 kubenswrapper[23041]: I0308 00:33:05.568254 23041 scope.go:117] "RemoveContainer" containerID="8677cf1f170f4b7104c5bee538c75b0cce6cadf668dc8807ee7dd7ead37ad2bc" Mar 08 00:33:05.569704 master-0 kubenswrapper[23041]: I0308 00:33:05.569659 23041 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8677cf1f170f4b7104c5bee538c75b0cce6cadf668dc8807ee7dd7ead37ad2bc"} err="failed to get container status \"8677cf1f170f4b7104c5bee538c75b0cce6cadf668dc8807ee7dd7ead37ad2bc\": rpc error: code = NotFound desc = could not find container \"8677cf1f170f4b7104c5bee538c75b0cce6cadf668dc8807ee7dd7ead37ad2bc\": container with ID starting with 8677cf1f170f4b7104c5bee538c75b0cce6cadf668dc8807ee7dd7ead37ad2bc not found: ID does not exist" Mar 08 00:33:05.569790 master-0 kubenswrapper[23041]: I0308 00:33:05.569706 23041 scope.go:117] "RemoveContainer" containerID="4a516d7f08e2c00a73c82a6468c6733deb15cfb5496c5dc1c985cc5b2f878c39" Mar 08 00:33:05.571872 master-0 kubenswrapper[23041]: I0308 00:33:05.571816 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a516d7f08e2c00a73c82a6468c6733deb15cfb5496c5dc1c985cc5b2f878c39"} err="failed to get container status \"4a516d7f08e2c00a73c82a6468c6733deb15cfb5496c5dc1c985cc5b2f878c39\": rpc error: code = NotFound desc = could not find container \"4a516d7f08e2c00a73c82a6468c6733deb15cfb5496c5dc1c985cc5b2f878c39\": container with ID starting with 4a516d7f08e2c00a73c82a6468c6733deb15cfb5496c5dc1c985cc5b2f878c39 not found: ID does not exist" Mar 08 00:33:05.571872 master-0 kubenswrapper[23041]: I0308 00:33:05.571845 23041 scope.go:117] "RemoveContainer" containerID="e07552f51a5a17213c6ec3773800f0fdc0f57ef49b0e59da2864ad1ee82537b0" Mar 08 00:33:05.573266 master-0 kubenswrapper[23041]: I0308 00:33:05.573176 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e07552f51a5a17213c6ec3773800f0fdc0f57ef49b0e59da2864ad1ee82537b0"} err="failed to get container status \"e07552f51a5a17213c6ec3773800f0fdc0f57ef49b0e59da2864ad1ee82537b0\": rpc error: code = NotFound desc = could not find container \"e07552f51a5a17213c6ec3773800f0fdc0f57ef49b0e59da2864ad1ee82537b0\": container with ID starting 
with e07552f51a5a17213c6ec3773800f0fdc0f57ef49b0e59da2864ad1ee82537b0 not found: ID does not exist" Mar 08 00:33:05.573266 master-0 kubenswrapper[23041]: I0308 00:33:05.573260 23041 scope.go:117] "RemoveContainer" containerID="341c117fc24206d0e3839ca57ba039836d1114be36d12fb34ba2f5b26e89d32d" Mar 08 00:33:05.573660 master-0 kubenswrapper[23041]: I0308 00:33:05.573634 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"341c117fc24206d0e3839ca57ba039836d1114be36d12fb34ba2f5b26e89d32d"} err="failed to get container status \"341c117fc24206d0e3839ca57ba039836d1114be36d12fb34ba2f5b26e89d32d\": rpc error: code = NotFound desc = could not find container \"341c117fc24206d0e3839ca57ba039836d1114be36d12fb34ba2f5b26e89d32d\": container with ID starting with 341c117fc24206d0e3839ca57ba039836d1114be36d12fb34ba2f5b26e89d32d not found: ID does not exist" Mar 08 00:33:05.573736 master-0 kubenswrapper[23041]: I0308 00:33:05.573667 23041 scope.go:117] "RemoveContainer" containerID="dd24c4f307672a1f949696856245eca4408f949ed84c121951e5aefd95a43d9c" Mar 08 00:33:05.574068 master-0 kubenswrapper[23041]: I0308 00:33:05.574044 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd24c4f307672a1f949696856245eca4408f949ed84c121951e5aefd95a43d9c"} err="failed to get container status \"dd24c4f307672a1f949696856245eca4408f949ed84c121951e5aefd95a43d9c\": rpc error: code = NotFound desc = could not find container \"dd24c4f307672a1f949696856245eca4408f949ed84c121951e5aefd95a43d9c\": container with ID starting with dd24c4f307672a1f949696856245eca4408f949ed84c121951e5aefd95a43d9c not found: ID does not exist" Mar 08 00:33:05.574068 master-0 kubenswrapper[23041]: I0308 00:33:05.574064 23041 scope.go:117] "RemoveContainer" containerID="4848d19ba0776826a9f40c021f719b52c6a6530dc063a074b5193621f8201b9a" Mar 08 00:33:05.575257 master-0 kubenswrapper[23041]: I0308 00:33:05.575221 23041 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4848d19ba0776826a9f40c021f719b52c6a6530dc063a074b5193621f8201b9a"} err="failed to get container status \"4848d19ba0776826a9f40c021f719b52c6a6530dc063a074b5193621f8201b9a\": rpc error: code = NotFound desc = could not find container \"4848d19ba0776826a9f40c021f719b52c6a6530dc063a074b5193621f8201b9a\": container with ID starting with 4848d19ba0776826a9f40c021f719b52c6a6530dc063a074b5193621f8201b9a not found: ID does not exist" Mar 08 00:33:05.575257 master-0 kubenswrapper[23041]: I0308 00:33:05.575256 23041 scope.go:117] "RemoveContainer" containerID="1f43fdb6c0cc7204b6150506553af3176763f0599837371ad3b14f3a6dd6e9c3" Mar 08 00:33:05.575575 master-0 kubenswrapper[23041]: I0308 00:33:05.575548 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f43fdb6c0cc7204b6150506553af3176763f0599837371ad3b14f3a6dd6e9c3"} err="failed to get container status \"1f43fdb6c0cc7204b6150506553af3176763f0599837371ad3b14f3a6dd6e9c3\": rpc error: code = NotFound desc = could not find container \"1f43fdb6c0cc7204b6150506553af3176763f0599837371ad3b14f3a6dd6e9c3\": container with ID starting with 1f43fdb6c0cc7204b6150506553af3176763f0599837371ad3b14f3a6dd6e9c3 not found: ID does not exist" Mar 08 00:33:05.575575 master-0 kubenswrapper[23041]: I0308 00:33:05.575569 23041 scope.go:117] "RemoveContainer" containerID="8677cf1f170f4b7104c5bee538c75b0cce6cadf668dc8807ee7dd7ead37ad2bc" Mar 08 00:33:05.575967 master-0 kubenswrapper[23041]: I0308 00:33:05.575944 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8677cf1f170f4b7104c5bee538c75b0cce6cadf668dc8807ee7dd7ead37ad2bc"} err="failed to get container status \"8677cf1f170f4b7104c5bee538c75b0cce6cadf668dc8807ee7dd7ead37ad2bc\": rpc error: code = NotFound desc = could not find container \"8677cf1f170f4b7104c5bee538c75b0cce6cadf668dc8807ee7dd7ead37ad2bc\": 
container with ID starting with 8677cf1f170f4b7104c5bee538c75b0cce6cadf668dc8807ee7dd7ead37ad2bc not found: ID does not exist" Mar 08 00:33:05.575967 master-0 kubenswrapper[23041]: I0308 00:33:05.575962 23041 scope.go:117] "RemoveContainer" containerID="4a516d7f08e2c00a73c82a6468c6733deb15cfb5496c5dc1c985cc5b2f878c39" Mar 08 00:33:05.576234 master-0 kubenswrapper[23041]: I0308 00:33:05.576192 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a516d7f08e2c00a73c82a6468c6733deb15cfb5496c5dc1c985cc5b2f878c39"} err="failed to get container status \"4a516d7f08e2c00a73c82a6468c6733deb15cfb5496c5dc1c985cc5b2f878c39\": rpc error: code = NotFound desc = could not find container \"4a516d7f08e2c00a73c82a6468c6733deb15cfb5496c5dc1c985cc5b2f878c39\": container with ID starting with 4a516d7f08e2c00a73c82a6468c6733deb15cfb5496c5dc1c985cc5b2f878c39 not found: ID does not exist" Mar 08 00:33:05.576234 master-0 kubenswrapper[23041]: I0308 00:33:05.576230 23041 scope.go:117] "RemoveContainer" containerID="e07552f51a5a17213c6ec3773800f0fdc0f57ef49b0e59da2864ad1ee82537b0" Mar 08 00:33:05.577060 master-0 kubenswrapper[23041]: I0308 00:33:05.577034 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e07552f51a5a17213c6ec3773800f0fdc0f57ef49b0e59da2864ad1ee82537b0"} err="failed to get container status \"e07552f51a5a17213c6ec3773800f0fdc0f57ef49b0e59da2864ad1ee82537b0\": rpc error: code = NotFound desc = could not find container \"e07552f51a5a17213c6ec3773800f0fdc0f57ef49b0e59da2864ad1ee82537b0\": container with ID starting with e07552f51a5a17213c6ec3773800f0fdc0f57ef49b0e59da2864ad1ee82537b0 not found: ID does not exist" Mar 08 00:33:05.577060 master-0 kubenswrapper[23041]: I0308 00:33:05.577057 23041 scope.go:117] "RemoveContainer" containerID="341c117fc24206d0e3839ca57ba039836d1114be36d12fb34ba2f5b26e89d32d" Mar 08 00:33:05.577352 master-0 kubenswrapper[23041]: I0308 00:33:05.577325 
23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"341c117fc24206d0e3839ca57ba039836d1114be36d12fb34ba2f5b26e89d32d"} err="failed to get container status \"341c117fc24206d0e3839ca57ba039836d1114be36d12fb34ba2f5b26e89d32d\": rpc error: code = NotFound desc = could not find container \"341c117fc24206d0e3839ca57ba039836d1114be36d12fb34ba2f5b26e89d32d\": container with ID starting with 341c117fc24206d0e3839ca57ba039836d1114be36d12fb34ba2f5b26e89d32d not found: ID does not exist" Mar 08 00:33:05.577352 master-0 kubenswrapper[23041]: I0308 00:33:05.577349 23041 scope.go:117] "RemoveContainer" containerID="dd24c4f307672a1f949696856245eca4408f949ed84c121951e5aefd95a43d9c" Mar 08 00:33:05.577609 master-0 kubenswrapper[23041]: I0308 00:33:05.577579 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd24c4f307672a1f949696856245eca4408f949ed84c121951e5aefd95a43d9c"} err="failed to get container status \"dd24c4f307672a1f949696856245eca4408f949ed84c121951e5aefd95a43d9c\": rpc error: code = NotFound desc = could not find container \"dd24c4f307672a1f949696856245eca4408f949ed84c121951e5aefd95a43d9c\": container with ID starting with dd24c4f307672a1f949696856245eca4408f949ed84c121951e5aefd95a43d9c not found: ID does not exist" Mar 08 00:33:05.577609 master-0 kubenswrapper[23041]: I0308 00:33:05.577599 23041 scope.go:117] "RemoveContainer" containerID="4848d19ba0776826a9f40c021f719b52c6a6530dc063a074b5193621f8201b9a" Mar 08 00:33:05.579486 master-0 kubenswrapper[23041]: I0308 00:33:05.578051 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4848d19ba0776826a9f40c021f719b52c6a6530dc063a074b5193621f8201b9a"} err="failed to get container status \"4848d19ba0776826a9f40c021f719b52c6a6530dc063a074b5193621f8201b9a\": rpc error: code = NotFound desc = could not find container 
\"4848d19ba0776826a9f40c021f719b52c6a6530dc063a074b5193621f8201b9a\": container with ID starting with 4848d19ba0776826a9f40c021f719b52c6a6530dc063a074b5193621f8201b9a not found: ID does not exist" Mar 08 00:33:05.579486 master-0 kubenswrapper[23041]: I0308 00:33:05.578068 23041 scope.go:117] "RemoveContainer" containerID="1f43fdb6c0cc7204b6150506553af3176763f0599837371ad3b14f3a6dd6e9c3" Mar 08 00:33:05.579486 master-0 kubenswrapper[23041]: I0308 00:33:05.578314 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f43fdb6c0cc7204b6150506553af3176763f0599837371ad3b14f3a6dd6e9c3"} err="failed to get container status \"1f43fdb6c0cc7204b6150506553af3176763f0599837371ad3b14f3a6dd6e9c3\": rpc error: code = NotFound desc = could not find container \"1f43fdb6c0cc7204b6150506553af3176763f0599837371ad3b14f3a6dd6e9c3\": container with ID starting with 1f43fdb6c0cc7204b6150506553af3176763f0599837371ad3b14f3a6dd6e9c3 not found: ID does not exist" Mar 08 00:33:05.579486 master-0 kubenswrapper[23041]: I0308 00:33:05.578328 23041 scope.go:117] "RemoveContainer" containerID="8677cf1f170f4b7104c5bee538c75b0cce6cadf668dc8807ee7dd7ead37ad2bc" Mar 08 00:33:05.582830 master-0 kubenswrapper[23041]: I0308 00:33:05.580139 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8677cf1f170f4b7104c5bee538c75b0cce6cadf668dc8807ee7dd7ead37ad2bc"} err="failed to get container status \"8677cf1f170f4b7104c5bee538c75b0cce6cadf668dc8807ee7dd7ead37ad2bc\": rpc error: code = NotFound desc = could not find container \"8677cf1f170f4b7104c5bee538c75b0cce6cadf668dc8807ee7dd7ead37ad2bc\": container with ID starting with 8677cf1f170f4b7104c5bee538c75b0cce6cadf668dc8807ee7dd7ead37ad2bc not found: ID does not exist" Mar 08 00:33:05.582830 master-0 kubenswrapper[23041]: I0308 00:33:05.580164 23041 scope.go:117] "RemoveContainer" containerID="4a516d7f08e2c00a73c82a6468c6733deb15cfb5496c5dc1c985cc5b2f878c39" Mar 08 
00:33:05.582830 master-0 kubenswrapper[23041]: I0308 00:33:05.580447 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a516d7f08e2c00a73c82a6468c6733deb15cfb5496c5dc1c985cc5b2f878c39"} err="failed to get container status \"4a516d7f08e2c00a73c82a6468c6733deb15cfb5496c5dc1c985cc5b2f878c39\": rpc error: code = NotFound desc = could not find container \"4a516d7f08e2c00a73c82a6468c6733deb15cfb5496c5dc1c985cc5b2f878c39\": container with ID starting with 4a516d7f08e2c00a73c82a6468c6733deb15cfb5496c5dc1c985cc5b2f878c39 not found: ID does not exist" Mar 08 00:33:05.582830 master-0 kubenswrapper[23041]: I0308 00:33:05.580465 23041 scope.go:117] "RemoveContainer" containerID="e07552f51a5a17213c6ec3773800f0fdc0f57ef49b0e59da2864ad1ee82537b0" Mar 08 00:33:05.583890 master-0 kubenswrapper[23041]: I0308 00:33:05.583374 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e07552f51a5a17213c6ec3773800f0fdc0f57ef49b0e59da2864ad1ee82537b0"} err="failed to get container status \"e07552f51a5a17213c6ec3773800f0fdc0f57ef49b0e59da2864ad1ee82537b0\": rpc error: code = NotFound desc = could not find container \"e07552f51a5a17213c6ec3773800f0fdc0f57ef49b0e59da2864ad1ee82537b0\": container with ID starting with e07552f51a5a17213c6ec3773800f0fdc0f57ef49b0e59da2864ad1ee82537b0 not found: ID does not exist" Mar 08 00:33:05.583890 master-0 kubenswrapper[23041]: I0308 00:33:05.583400 23041 scope.go:117] "RemoveContainer" containerID="341c117fc24206d0e3839ca57ba039836d1114be36d12fb34ba2f5b26e89d32d" Mar 08 00:33:05.583890 master-0 kubenswrapper[23041]: I0308 00:33:05.583634 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"341c117fc24206d0e3839ca57ba039836d1114be36d12fb34ba2f5b26e89d32d"} err="failed to get container status \"341c117fc24206d0e3839ca57ba039836d1114be36d12fb34ba2f5b26e89d32d\": rpc error: code = NotFound desc = could not find 
container \"341c117fc24206d0e3839ca57ba039836d1114be36d12fb34ba2f5b26e89d32d\": container with ID starting with 341c117fc24206d0e3839ca57ba039836d1114be36d12fb34ba2f5b26e89d32d not found: ID does not exist" Mar 08 00:33:05.583890 master-0 kubenswrapper[23041]: I0308 00:33:05.583650 23041 scope.go:117] "RemoveContainer" containerID="dd24c4f307672a1f949696856245eca4408f949ed84c121951e5aefd95a43d9c" Mar 08 00:33:05.584050 master-0 kubenswrapper[23041]: I0308 00:33:05.584020 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd24c4f307672a1f949696856245eca4408f949ed84c121951e5aefd95a43d9c"} err="failed to get container status \"dd24c4f307672a1f949696856245eca4408f949ed84c121951e5aefd95a43d9c\": rpc error: code = NotFound desc = could not find container \"dd24c4f307672a1f949696856245eca4408f949ed84c121951e5aefd95a43d9c\": container with ID starting with dd24c4f307672a1f949696856245eca4408f949ed84c121951e5aefd95a43d9c not found: ID does not exist" Mar 08 00:33:05.584050 master-0 kubenswrapper[23041]: I0308 00:33:05.584037 23041 scope.go:117] "RemoveContainer" containerID="4848d19ba0776826a9f40c021f719b52c6a6530dc063a074b5193621f8201b9a" Mar 08 00:33:05.584738 master-0 kubenswrapper[23041]: I0308 00:33:05.584327 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4848d19ba0776826a9f40c021f719b52c6a6530dc063a074b5193621f8201b9a"} err="failed to get container status \"4848d19ba0776826a9f40c021f719b52c6a6530dc063a074b5193621f8201b9a\": rpc error: code = NotFound desc = could not find container \"4848d19ba0776826a9f40c021f719b52c6a6530dc063a074b5193621f8201b9a\": container with ID starting with 4848d19ba0776826a9f40c021f719b52c6a6530dc063a074b5193621f8201b9a not found: ID does not exist" Mar 08 00:33:05.584738 master-0 kubenswrapper[23041]: I0308 00:33:05.584375 23041 scope.go:117] "RemoveContainer" containerID="1f43fdb6c0cc7204b6150506553af3176763f0599837371ad3b14f3a6dd6e9c3" 
Mar 08 00:33:05.585710 master-0 kubenswrapper[23041]: I0308 00:33:05.585616 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f43fdb6c0cc7204b6150506553af3176763f0599837371ad3b14f3a6dd6e9c3"} err="failed to get container status \"1f43fdb6c0cc7204b6150506553af3176763f0599837371ad3b14f3a6dd6e9c3\": rpc error: code = NotFound desc = could not find container \"1f43fdb6c0cc7204b6150506553af3176763f0599837371ad3b14f3a6dd6e9c3\": container with ID starting with 1f43fdb6c0cc7204b6150506553af3176763f0599837371ad3b14f3a6dd6e9c3 not found: ID does not exist" Mar 08 00:33:05.585710 master-0 kubenswrapper[23041]: I0308 00:33:05.585642 23041 scope.go:117] "RemoveContainer" containerID="8677cf1f170f4b7104c5bee538c75b0cce6cadf668dc8807ee7dd7ead37ad2bc" Mar 08 00:33:05.586635 master-0 kubenswrapper[23041]: I0308 00:33:05.586252 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8677cf1f170f4b7104c5bee538c75b0cce6cadf668dc8807ee7dd7ead37ad2bc"} err="failed to get container status \"8677cf1f170f4b7104c5bee538c75b0cce6cadf668dc8807ee7dd7ead37ad2bc\": rpc error: code = NotFound desc = could not find container \"8677cf1f170f4b7104c5bee538c75b0cce6cadf668dc8807ee7dd7ead37ad2bc\": container with ID starting with 8677cf1f170f4b7104c5bee538c75b0cce6cadf668dc8807ee7dd7ead37ad2bc not found: ID does not exist" Mar 08 00:33:05.586635 master-0 kubenswrapper[23041]: I0308 00:33:05.586283 23041 scope.go:117] "RemoveContainer" containerID="4a516d7f08e2c00a73c82a6468c6733deb15cfb5496c5dc1c985cc5b2f878c39" Mar 08 00:33:05.587164 master-0 kubenswrapper[23041]: I0308 00:33:05.587142 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a516d7f08e2c00a73c82a6468c6733deb15cfb5496c5dc1c985cc5b2f878c39"} err="failed to get container status \"4a516d7f08e2c00a73c82a6468c6733deb15cfb5496c5dc1c985cc5b2f878c39\": rpc error: code = NotFound desc = could not find 
container \"4a516d7f08e2c00a73c82a6468c6733deb15cfb5496c5dc1c985cc5b2f878c39\": container with ID starting with 4a516d7f08e2c00a73c82a6468c6733deb15cfb5496c5dc1c985cc5b2f878c39 not found: ID does not exist" Mar 08 00:33:05.587164 master-0 kubenswrapper[23041]: I0308 00:33:05.587161 23041 scope.go:117] "RemoveContainer" containerID="e07552f51a5a17213c6ec3773800f0fdc0f57ef49b0e59da2864ad1ee82537b0" Mar 08 00:33:05.587431 master-0 kubenswrapper[23041]: I0308 00:33:05.587399 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e07552f51a5a17213c6ec3773800f0fdc0f57ef49b0e59da2864ad1ee82537b0"} err="failed to get container status \"e07552f51a5a17213c6ec3773800f0fdc0f57ef49b0e59da2864ad1ee82537b0\": rpc error: code = NotFound desc = could not find container \"e07552f51a5a17213c6ec3773800f0fdc0f57ef49b0e59da2864ad1ee82537b0\": container with ID starting with e07552f51a5a17213c6ec3773800f0fdc0f57ef49b0e59da2864ad1ee82537b0 not found: ID does not exist" Mar 08 00:33:05.587431 master-0 kubenswrapper[23041]: I0308 00:33:05.587419 23041 scope.go:117] "RemoveContainer" containerID="341c117fc24206d0e3839ca57ba039836d1114be36d12fb34ba2f5b26e89d32d" Mar 08 00:33:05.591091 master-0 kubenswrapper[23041]: I0308 00:33:05.591051 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"341c117fc24206d0e3839ca57ba039836d1114be36d12fb34ba2f5b26e89d32d"} err="failed to get container status \"341c117fc24206d0e3839ca57ba039836d1114be36d12fb34ba2f5b26e89d32d\": rpc error: code = NotFound desc = could not find container \"341c117fc24206d0e3839ca57ba039836d1114be36d12fb34ba2f5b26e89d32d\": container with ID starting with 341c117fc24206d0e3839ca57ba039836d1114be36d12fb34ba2f5b26e89d32d not found: ID does not exist" Mar 08 00:33:05.591091 master-0 kubenswrapper[23041]: I0308 00:33:05.591076 23041 scope.go:117] "RemoveContainer" containerID="dd24c4f307672a1f949696856245eca4408f949ed84c121951e5aefd95a43d9c" 
Mar 08 00:33:05.591380 master-0 kubenswrapper[23041]: I0308 00:33:05.591340 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd24c4f307672a1f949696856245eca4408f949ed84c121951e5aefd95a43d9c"} err="failed to get container status \"dd24c4f307672a1f949696856245eca4408f949ed84c121951e5aefd95a43d9c\": rpc error: code = NotFound desc = could not find container \"dd24c4f307672a1f949696856245eca4408f949ed84c121951e5aefd95a43d9c\": container with ID starting with dd24c4f307672a1f949696856245eca4408f949ed84c121951e5aefd95a43d9c not found: ID does not exist" Mar 08 00:33:05.591380 master-0 kubenswrapper[23041]: I0308 00:33:05.591361 23041 scope.go:117] "RemoveContainer" containerID="4848d19ba0776826a9f40c021f719b52c6a6530dc063a074b5193621f8201b9a" Mar 08 00:33:05.591629 master-0 kubenswrapper[23041]: I0308 00:33:05.591594 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4848d19ba0776826a9f40c021f719b52c6a6530dc063a074b5193621f8201b9a"} err="failed to get container status \"4848d19ba0776826a9f40c021f719b52c6a6530dc063a074b5193621f8201b9a\": rpc error: code = NotFound desc = could not find container \"4848d19ba0776826a9f40c021f719b52c6a6530dc063a074b5193621f8201b9a\": container with ID starting with 4848d19ba0776826a9f40c021f719b52c6a6530dc063a074b5193621f8201b9a not found: ID does not exist" Mar 08 00:33:05.591629 master-0 kubenswrapper[23041]: I0308 00:33:05.591617 23041 scope.go:117] "RemoveContainer" containerID="1f43fdb6c0cc7204b6150506553af3176763f0599837371ad3b14f3a6dd6e9c3" Mar 08 00:33:05.591818 master-0 kubenswrapper[23041]: I0308 00:33:05.591794 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f43fdb6c0cc7204b6150506553af3176763f0599837371ad3b14f3a6dd6e9c3"} err="failed to get container status \"1f43fdb6c0cc7204b6150506553af3176763f0599837371ad3b14f3a6dd6e9c3\": rpc error: code = NotFound desc = could not find 
container \"1f43fdb6c0cc7204b6150506553af3176763f0599837371ad3b14f3a6dd6e9c3\": container with ID starting with 1f43fdb6c0cc7204b6150506553af3176763f0599837371ad3b14f3a6dd6e9c3 not found: ID does not exist" Mar 08 00:33:05.591818 master-0 kubenswrapper[23041]: I0308 00:33:05.591814 23041 scope.go:117] "RemoveContainer" containerID="8677cf1f170f4b7104c5bee538c75b0cce6cadf668dc8807ee7dd7ead37ad2bc" Mar 08 00:33:05.592030 master-0 kubenswrapper[23041]: I0308 00:33:05.591992 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8677cf1f170f4b7104c5bee538c75b0cce6cadf668dc8807ee7dd7ead37ad2bc"} err="failed to get container status \"8677cf1f170f4b7104c5bee538c75b0cce6cadf668dc8807ee7dd7ead37ad2bc\": rpc error: code = NotFound desc = could not find container \"8677cf1f170f4b7104c5bee538c75b0cce6cadf668dc8807ee7dd7ead37ad2bc\": container with ID starting with 8677cf1f170f4b7104c5bee538c75b0cce6cadf668dc8807ee7dd7ead37ad2bc not found: ID does not exist" Mar 08 00:33:05.592030 master-0 kubenswrapper[23041]: I0308 00:33:05.592012 23041 scope.go:117] "RemoveContainer" containerID="4a516d7f08e2c00a73c82a6468c6733deb15cfb5496c5dc1c985cc5b2f878c39" Mar 08 00:33:05.592304 master-0 kubenswrapper[23041]: I0308 00:33:05.592230 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a516d7f08e2c00a73c82a6468c6733deb15cfb5496c5dc1c985cc5b2f878c39"} err="failed to get container status \"4a516d7f08e2c00a73c82a6468c6733deb15cfb5496c5dc1c985cc5b2f878c39\": rpc error: code = NotFound desc = could not find container \"4a516d7f08e2c00a73c82a6468c6733deb15cfb5496c5dc1c985cc5b2f878c39\": container with ID starting with 4a516d7f08e2c00a73c82a6468c6733deb15cfb5496c5dc1c985cc5b2f878c39 not found: ID does not exist" Mar 08 00:33:05.592304 master-0 kubenswrapper[23041]: I0308 00:33:05.592251 23041 scope.go:117] "RemoveContainer" containerID="e07552f51a5a17213c6ec3773800f0fdc0f57ef49b0e59da2864ad1ee82537b0" 
Mar 08 00:33:05.592477 master-0 kubenswrapper[23041]: I0308 00:33:05.592452 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e07552f51a5a17213c6ec3773800f0fdc0f57ef49b0e59da2864ad1ee82537b0"} err="failed to get container status \"e07552f51a5a17213c6ec3773800f0fdc0f57ef49b0e59da2864ad1ee82537b0\": rpc error: code = NotFound desc = could not find container \"e07552f51a5a17213c6ec3773800f0fdc0f57ef49b0e59da2864ad1ee82537b0\": container with ID starting with e07552f51a5a17213c6ec3773800f0fdc0f57ef49b0e59da2864ad1ee82537b0 not found: ID does not exist" Mar 08 00:33:05.651895 master-0 kubenswrapper[23041]: I0308 00:33:05.651819 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/796e5654-14f7-4309-9c9d-2a4f430bb9b4-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.652237 master-0 kubenswrapper[23041]: I0308 00:33:05.651923 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/796e5654-14f7-4309-9c9d-2a4f430bb9b4-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.652237 master-0 kubenswrapper[23041]: I0308 00:33:05.651964 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5ncm\" (UniqueName: \"kubernetes.io/projected/796e5654-14f7-4309-9c9d-2a4f430bb9b4-kube-api-access-s5ncm\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.652237 master-0 kubenswrapper[23041]: I0308 00:33:05.651992 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/796e5654-14f7-4309-9c9d-2a4f430bb9b4-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.652237 master-0 kubenswrapper[23041]: I0308 00:33:05.652023 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/796e5654-14f7-4309-9c9d-2a4f430bb9b4-config-out\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.652237 master-0 kubenswrapper[23041]: I0308 00:33:05.652054 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/796e5654-14f7-4309-9c9d-2a4f430bb9b4-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.652237 master-0 kubenswrapper[23041]: I0308 00:33:05.652099 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/796e5654-14f7-4309-9c9d-2a4f430bb9b4-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.652237 master-0 kubenswrapper[23041]: I0308 00:33:05.652128 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/796e5654-14f7-4309-9c9d-2a4f430bb9b4-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.652237 master-0 kubenswrapper[23041]: I0308 00:33:05.652176 23041 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/796e5654-14f7-4309-9c9d-2a4f430bb9b4-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.652237 master-0 kubenswrapper[23041]: I0308 00:33:05.652220 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/796e5654-14f7-4309-9c9d-2a4f430bb9b4-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.652237 master-0 kubenswrapper[23041]: I0308 00:33:05.652246 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/796e5654-14f7-4309-9c9d-2a4f430bb9b4-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.652582 master-0 kubenswrapper[23041]: I0308 00:33:05.652283 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/796e5654-14f7-4309-9c9d-2a4f430bb9b4-web-config\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.652582 master-0 kubenswrapper[23041]: I0308 00:33:05.652312 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/796e5654-14f7-4309-9c9d-2a4f430bb9b4-config\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.652582 master-0 kubenswrapper[23041]: I0308 00:33:05.652338 23041 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/796e5654-14f7-4309-9c9d-2a4f430bb9b4-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.652582 master-0 kubenswrapper[23041]: I0308 00:33:05.652367 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/796e5654-14f7-4309-9c9d-2a4f430bb9b4-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.652582 master-0 kubenswrapper[23041]: I0308 00:33:05.652396 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/796e5654-14f7-4309-9c9d-2a4f430bb9b4-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.652582 master-0 kubenswrapper[23041]: I0308 00:33:05.652433 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/796e5654-14f7-4309-9c9d-2a4f430bb9b4-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.652582 master-0 kubenswrapper[23041]: I0308 00:33:05.652462 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/796e5654-14f7-4309-9c9d-2a4f430bb9b4-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.654562 master-0 
kubenswrapper[23041]: I0308 00:33:05.654316 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/796e5654-14f7-4309-9c9d-2a4f430bb9b4-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.655304 master-0 kubenswrapper[23041]: I0308 00:33:05.655223 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/796e5654-14f7-4309-9c9d-2a4f430bb9b4-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.656086 master-0 kubenswrapper[23041]: I0308 00:33:05.655617 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/796e5654-14f7-4309-9c9d-2a4f430bb9b4-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.656381 master-0 kubenswrapper[23041]: I0308 00:33:05.656313 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/796e5654-14f7-4309-9c9d-2a4f430bb9b4-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.659145 master-0 kubenswrapper[23041]: I0308 00:33:05.656930 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/796e5654-14f7-4309-9c9d-2a4f430bb9b4-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.659145 master-0 kubenswrapper[23041]: I0308 00:33:05.657919 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/796e5654-14f7-4309-9c9d-2a4f430bb9b4-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.659145 master-0 kubenswrapper[23041]: I0308 00:33:05.659055 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/796e5654-14f7-4309-9c9d-2a4f430bb9b4-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.659588 master-0 kubenswrapper[23041]: I0308 00:33:05.659496 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/796e5654-14f7-4309-9c9d-2a4f430bb9b4-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.660538 master-0 kubenswrapper[23041]: I0308 00:33:05.660503 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/796e5654-14f7-4309-9c9d-2a4f430bb9b4-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.661045 master-0 kubenswrapper[23041]: I0308 00:33:05.660831 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/796e5654-14f7-4309-9c9d-2a4f430bb9b4-config-out\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.661474 master-0 kubenswrapper[23041]: I0308 00:33:05.661447 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/796e5654-14f7-4309-9c9d-2a4f430bb9b4-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.661535 master-0 kubenswrapper[23041]: I0308 00:33:05.661470 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/796e5654-14f7-4309-9c9d-2a4f430bb9b4-config\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.662077 master-0 kubenswrapper[23041]: I0308 00:33:05.662053 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/796e5654-14f7-4309-9c9d-2a4f430bb9b4-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.662367 master-0 kubenswrapper[23041]: I0308 00:33:05.662322 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/796e5654-14f7-4309-9c9d-2a4f430bb9b4-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.662731 master-0 kubenswrapper[23041]: I0308 00:33:05.662702 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/796e5654-14f7-4309-9c9d-2a4f430bb9b4-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.664054 master-0 kubenswrapper[23041]: I0308 00:33:05.664008 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/796e5654-14f7-4309-9c9d-2a4f430bb9b4-web-config\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.665973 master-0 kubenswrapper[23041]: I0308 00:33:05.665955 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/796e5654-14f7-4309-9c9d-2a4f430bb9b4-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.668980 master-0 kubenswrapper[23041]: I0308 00:33:05.668840 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6dc96f5b89-ctlsc" Mar 08 00:33:05.673741 master-0 kubenswrapper[23041]: I0308 00:33:05.673657 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5ncm\" (UniqueName: \"kubernetes.io/projected/796e5654-14f7-4309-9c9d-2a4f430bb9b4-kube-api-access-s5ncm\") pod \"prometheus-k8s-0\" (UID: \"796e5654-14f7-4309-9c9d-2a4f430bb9b4\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:05.711748 master-0 kubenswrapper[23041]: I0308 00:33:05.711697 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:06.104688 master-0 kubenswrapper[23041]: I0308 00:33:06.104619 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6dc96f5b89-ctlsc"] Mar 08 00:33:06.108961 master-0 kubenswrapper[23041]: W0308 00:33:06.108885 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24264c1b_97df_4311_b7af_b205ac879381.slice/crio-7733ee7e1853723b50e6187da3137dfb190fcd44e2f676f7946fc9cd120c68b0 WatchSource:0}: Error finding container 7733ee7e1853723b50e6187da3137dfb190fcd44e2f676f7946fc9cd120c68b0: Status 404 returned error can't find the container with id 7733ee7e1853723b50e6187da3137dfb190fcd44e2f676f7946fc9cd120c68b0 Mar 08 00:33:06.239506 master-0 kubenswrapper[23041]: I0308 00:33:06.239454 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dc96f5b89-ctlsc" event={"ID":"24264c1b-97df-4311-b7af-b205ac879381","Type":"ContainerStarted","Data":"7733ee7e1853723b50e6187da3137dfb190fcd44e2f676f7946fc9cd120c68b0"} Mar 08 00:33:06.241088 master-0 kubenswrapper[23041]: I0308 00:33:06.241040 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6787d8db86-xxqwp" event={"ID":"d31841e6-f09b-46b4-ac72-adf67f6a5327","Type":"ContainerStarted","Data":"1d0f00ce9a3921bbc8a2035897caa7360c33b6abb03b11c332493409919679d1"} Mar 08 00:33:06.241168 master-0 kubenswrapper[23041]: I0308 00:33:06.241102 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6787d8db86-xxqwp" event={"ID":"d31841e6-f09b-46b4-ac72-adf67f6a5327","Type":"ContainerStarted","Data":"a575d244c9acd5de4c26e975ba96ca89f30e4ed51c7be4043f93a0207f87ee94"} Mar 08 00:33:06.242332 master-0 kubenswrapper[23041]: I0308 00:33:06.242296 23041 generic.go:334] "Generic (PLEG): container finished" podID="588470ab-8c2f-4769-a09e-462b07c592fa" 
containerID="d822d296aeb37e9e7346cb98b3422e6f0893a2412a254c5f58eb25f81b5e1c71" exitCode=0 Mar 08 00:33:06.242454 master-0 kubenswrapper[23041]: I0308 00:33:06.242335 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"588470ab-8c2f-4769-a09e-462b07c592fa","Type":"ContainerDied","Data":"d822d296aeb37e9e7346cb98b3422e6f0893a2412a254c5f58eb25f81b5e1c71"} Mar 08 00:33:06.242454 master-0 kubenswrapper[23041]: I0308 00:33:06.242357 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"588470ab-8c2f-4769-a09e-462b07c592fa","Type":"ContainerStarted","Data":"494b3f60e6293acfe875a3edc933d5a0b10df1012e19995feaff519153de6932"} Mar 08 00:33:06.258258 master-0 kubenswrapper[23041]: I0308 00:33:06.257316 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 08 00:33:06.278589 master-0 kubenswrapper[23041]: I0308 00:33:06.277158 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6787d8db86-xxqwp" podStartSLOduration=2.277136309 podStartE2EDuration="2.277136309s" podCreationTimestamp="2026-03-08 00:33:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:33:06.275953859 +0000 UTC m=+91.748790423" watchObservedRunningTime="2026-03-08 00:33:06.277136309 +0000 UTC m=+91.749972883" Mar 08 00:33:06.776515 master-0 kubenswrapper[23041]: I0308 00:33:06.776447 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6787d8db86-xxqwp"] Mar 08 00:33:06.830900 master-0 kubenswrapper[23041]: I0308 00:33:06.830088 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b9b4180-fc41-4072-9c61-0a35390a7ff3" path="/var/lib/kubelet/pods/4b9b4180-fc41-4072-9c61-0a35390a7ff3/volumes" Mar 08 00:33:06.831096 master-0 
kubenswrapper[23041]: I0308 00:33:06.831064 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6479f6d896-j6kqz"] Mar 08 00:33:06.832269 master-0 kubenswrapper[23041]: I0308 00:33:06.832073 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6479f6d896-j6kqz"] Mar 08 00:33:06.832269 master-0 kubenswrapper[23041]: I0308 00:33:06.832183 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6479f6d896-j6kqz" Mar 08 00:33:06.888176 master-0 kubenswrapper[23041]: I0308 00:33:06.884657 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-console-config\") pod \"console-6479f6d896-j6kqz\" (UID: \"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b\") " pod="openshift-console/console-6479f6d896-j6kqz" Mar 08 00:33:06.891482 master-0 kubenswrapper[23041]: I0308 00:33:06.890556 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztglc\" (UniqueName: \"kubernetes.io/projected/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-kube-api-access-ztglc\") pod \"console-6479f6d896-j6kqz\" (UID: \"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b\") " pod="openshift-console/console-6479f6d896-j6kqz" Mar 08 00:33:06.891482 master-0 kubenswrapper[23041]: I0308 00:33:06.890679 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-console-oauth-config\") pod \"console-6479f6d896-j6kqz\" (UID: \"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b\") " pod="openshift-console/console-6479f6d896-j6kqz" Mar 08 00:33:06.891482 master-0 kubenswrapper[23041]: I0308 00:33:06.890847 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-console-serving-cert\") pod \"console-6479f6d896-j6kqz\" (UID: \"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b\") " pod="openshift-console/console-6479f6d896-j6kqz" Mar 08 00:33:06.891482 master-0 kubenswrapper[23041]: I0308 00:33:06.890968 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-oauth-serving-cert\") pod \"console-6479f6d896-j6kqz\" (UID: \"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b\") " pod="openshift-console/console-6479f6d896-j6kqz" Mar 08 00:33:06.891482 master-0 kubenswrapper[23041]: I0308 00:33:06.891142 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-service-ca\") pod \"console-6479f6d896-j6kqz\" (UID: \"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b\") " pod="openshift-console/console-6479f6d896-j6kqz" Mar 08 00:33:06.892980 master-0 kubenswrapper[23041]: I0308 00:33:06.892439 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-trusted-ca-bundle\") pod \"console-6479f6d896-j6kqz\" (UID: \"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b\") " pod="openshift-console/console-6479f6d896-j6kqz" Mar 08 00:33:06.995093 master-0 kubenswrapper[23041]: I0308 00:33:06.994909 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-console-serving-cert\") pod \"console-6479f6d896-j6kqz\" (UID: \"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b\") " pod="openshift-console/console-6479f6d896-j6kqz" Mar 08 00:33:06.995093 master-0 kubenswrapper[23041]: I0308 
00:33:06.994972 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-oauth-serving-cert\") pod \"console-6479f6d896-j6kqz\" (UID: \"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b\") " pod="openshift-console/console-6479f6d896-j6kqz" Mar 08 00:33:06.995093 master-0 kubenswrapper[23041]: I0308 00:33:06.995016 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-service-ca\") pod \"console-6479f6d896-j6kqz\" (UID: \"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b\") " pod="openshift-console/console-6479f6d896-j6kqz" Mar 08 00:33:06.995093 master-0 kubenswrapper[23041]: I0308 00:33:06.995047 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-trusted-ca-bundle\") pod \"console-6479f6d896-j6kqz\" (UID: \"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b\") " pod="openshift-console/console-6479f6d896-j6kqz" Mar 08 00:33:06.995538 master-0 kubenswrapper[23041]: I0308 00:33:06.995507 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-console-config\") pod \"console-6479f6d896-j6kqz\" (UID: \"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b\") " pod="openshift-console/console-6479f6d896-j6kqz" Mar 08 00:33:06.995696 master-0 kubenswrapper[23041]: I0308 00:33:06.995645 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztglc\" (UniqueName: \"kubernetes.io/projected/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-kube-api-access-ztglc\") pod \"console-6479f6d896-j6kqz\" (UID: \"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b\") " pod="openshift-console/console-6479f6d896-j6kqz" Mar 08 00:33:06.995696 
master-0 kubenswrapper[23041]: I0308 00:33:06.995670 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-console-oauth-config\") pod \"console-6479f6d896-j6kqz\" (UID: \"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b\") " pod="openshift-console/console-6479f6d896-j6kqz" Mar 08 00:33:06.997116 master-0 kubenswrapper[23041]: I0308 00:33:06.997075 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-service-ca\") pod \"console-6479f6d896-j6kqz\" (UID: \"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b\") " pod="openshift-console/console-6479f6d896-j6kqz" Mar 08 00:33:06.997620 master-0 kubenswrapper[23041]: I0308 00:33:06.997585 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-oauth-serving-cert\") pod \"console-6479f6d896-j6kqz\" (UID: \"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b\") " pod="openshift-console/console-6479f6d896-j6kqz" Mar 08 00:33:06.997670 master-0 kubenswrapper[23041]: I0308 00:33:06.997585 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-trusted-ca-bundle\") pod \"console-6479f6d896-j6kqz\" (UID: \"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b\") " pod="openshift-console/console-6479f6d896-j6kqz" Mar 08 00:33:06.997670 master-0 kubenswrapper[23041]: I0308 00:33:06.997659 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-console-config\") pod \"console-6479f6d896-j6kqz\" (UID: \"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b\") " pod="openshift-console/console-6479f6d896-j6kqz" Mar 08 00:33:07.004121 
master-0 kubenswrapper[23041]: I0308 00:33:07.004056 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-console-serving-cert\") pod \"console-6479f6d896-j6kqz\" (UID: \"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b\") " pod="openshift-console/console-6479f6d896-j6kqz" Mar 08 00:33:07.004582 master-0 kubenswrapper[23041]: I0308 00:33:07.004552 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-console-oauth-config\") pod \"console-6479f6d896-j6kqz\" (UID: \"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b\") " pod="openshift-console/console-6479f6d896-j6kqz" Mar 08 00:33:07.016398 master-0 kubenswrapper[23041]: I0308 00:33:07.016189 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztglc\" (UniqueName: \"kubernetes.io/projected/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-kube-api-access-ztglc\") pod \"console-6479f6d896-j6kqz\" (UID: \"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b\") " pod="openshift-console/console-6479f6d896-j6kqz" Mar 08 00:33:07.171393 master-0 kubenswrapper[23041]: I0308 00:33:07.171292 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6479f6d896-j6kqz" Mar 08 00:33:07.257935 master-0 kubenswrapper[23041]: I0308 00:33:07.257850 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dc96f5b89-ctlsc" event={"ID":"24264c1b-97df-4311-b7af-b205ac879381","Type":"ContainerStarted","Data":"c0fa536b5011c760aff3b8c2d8aa8ffc6ebb03ceddd34f83695314b46a416ae6"} Mar 08 00:33:07.261265 master-0 kubenswrapper[23041]: I0308 00:33:07.260944 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"588470ab-8c2f-4769-a09e-462b07c592fa","Type":"ContainerStarted","Data":"e15d9b87e72ccab832798f51af32ddabd5175008750715629fc430248c89ff4d"} Mar 08 00:33:07.262813 master-0 kubenswrapper[23041]: I0308 00:33:07.262760 23041 generic.go:334] "Generic (PLEG): container finished" podID="796e5654-14f7-4309-9c9d-2a4f430bb9b4" containerID="f69b5177b9aefff4a08d586f686800cdbea46bf9cc57b936ddfd2c48d5f7d279" exitCode=0 Mar 08 00:33:07.262890 master-0 kubenswrapper[23041]: I0308 00:33:07.262834 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"796e5654-14f7-4309-9c9d-2a4f430bb9b4","Type":"ContainerDied","Data":"f69b5177b9aefff4a08d586f686800cdbea46bf9cc57b936ddfd2c48d5f7d279"} Mar 08 00:33:07.262890 master-0 kubenswrapper[23041]: I0308 00:33:07.262872 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"796e5654-14f7-4309-9c9d-2a4f430bb9b4","Type":"ContainerStarted","Data":"23e2bcce9908a87e63c4883617fc6abe8caf0d728e5445cbace0d3bb4070623e"} Mar 08 00:33:07.283959 master-0 kubenswrapper[23041]: I0308 00:33:07.282486 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6dc96f5b89-ctlsc" podStartSLOduration=2.2824619090000002 podStartE2EDuration="2.282461909s" podCreationTimestamp="2026-03-08 00:33:05 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:33:07.281574207 +0000 UTC m=+92.754410781" watchObservedRunningTime="2026-03-08 00:33:07.282461909 +0000 UTC m=+92.755298463" Mar 08 00:33:07.727225 master-0 kubenswrapper[23041]: I0308 00:33:07.727159 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6479f6d896-j6kqz"] Mar 08 00:33:08.274413 master-0 kubenswrapper[23041]: I0308 00:33:08.274351 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"588470ab-8c2f-4769-a09e-462b07c592fa","Type":"ContainerStarted","Data":"a845112fd9428157cb928ade7613275493c6574205e4ed2b1d8b19f73b668f69"} Mar 08 00:33:08.275029 master-0 kubenswrapper[23041]: I0308 00:33:08.275011 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"588470ab-8c2f-4769-a09e-462b07c592fa","Type":"ContainerStarted","Data":"e3fa7e38521f8a8519dfe925266f011635261ca439b41dcf3cc401641fe4e448"} Mar 08 00:33:08.275185 master-0 kubenswrapper[23041]: I0308 00:33:08.275157 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"588470ab-8c2f-4769-a09e-462b07c592fa","Type":"ContainerStarted","Data":"32af9d9942f0279e7f4864b34f84938d19c78c0bb8033ac658598454180da253"} Mar 08 00:33:08.275302 master-0 kubenswrapper[23041]: I0308 00:33:08.275285 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"588470ab-8c2f-4769-a09e-462b07c592fa","Type":"ContainerStarted","Data":"f8082d694e723b55b35e65e9f0462d7b6720cbbe8ee99b8bdbb079405d396d0f"} Mar 08 00:33:08.275382 master-0 kubenswrapper[23041]: I0308 00:33:08.275370 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"588470ab-8c2f-4769-a09e-462b07c592fa","Type":"ContainerStarted","Data":"6d44048e684499c0f9fbc5df336ec2f3ab90bb3cf3543486b4e8433e826784bc"} Mar 08 00:33:08.276557 master-0 kubenswrapper[23041]: I0308 00:33:08.276519 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ttpzw" event={"ID":"ab1c5c08-f159-4f15-8847-d39477b3c6e0","Type":"ContainerStarted","Data":"5f888c13bdb5be8ced4f55accfc7751c533542e30ce6dc2f86a804fe43070fbf"} Mar 08 00:33:08.278301 master-0 kubenswrapper[23041]: I0308 00:33:08.278270 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6479f6d896-j6kqz" event={"ID":"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b","Type":"ContainerStarted","Data":"42d1b0d9a17b6b2ff8f7fdf2871fc4fcb4d92831ee2c4371c0b51fde6a93a0cf"} Mar 08 00:33:08.278301 master-0 kubenswrapper[23041]: I0308 00:33:08.278297 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6479f6d896-j6kqz" event={"ID":"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b","Type":"ContainerStarted","Data":"e794297247665d49796affaaf41c36b9c7d953b2c1882b909e7bef0eadebff8a"} Mar 08 00:33:08.282284 master-0 kubenswrapper[23041]: I0308 00:33:08.281465 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"796e5654-14f7-4309-9c9d-2a4f430bb9b4","Type":"ContainerStarted","Data":"66c16129d4486b527aeaf6e524034377b6a168602b4beb5f108250d7e33bc250"} Mar 08 00:33:08.282284 master-0 kubenswrapper[23041]: I0308 00:33:08.281499 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"796e5654-14f7-4309-9c9d-2a4f430bb9b4","Type":"ContainerStarted","Data":"12b10fb81f654a423cbc1c367fb58518f86f042bd83f433b9f844ffc50584dc2"} Mar 08 00:33:08.282284 master-0 kubenswrapper[23041]: I0308 00:33:08.281515 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"796e5654-14f7-4309-9c9d-2a4f430bb9b4","Type":"ContainerStarted","Data":"b222f543e61d347a5d51762d567771454bc01dc3068e2983e45cfdb4d35bd9c9"} Mar 08 00:33:08.282284 master-0 kubenswrapper[23041]: I0308 00:33:08.281528 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"796e5654-14f7-4309-9c9d-2a4f430bb9b4","Type":"ContainerStarted","Data":"d65a524600d9dae1656cf9568ee42808f9f9b25ed4cecec3c3ecc88a4461739b"} Mar 08 00:33:08.282284 master-0 kubenswrapper[23041]: I0308 00:33:08.281540 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"796e5654-14f7-4309-9c9d-2a4f430bb9b4","Type":"ContainerStarted","Data":"0bc2ec2b8a0411a85a567161aba78010a44109f763949e68dc66a8eab87c1903"} Mar 08 00:33:08.303380 master-0 kubenswrapper[23041]: I0308 00:33:08.303247 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-2-master-0"] Mar 08 00:33:08.310098 master-0 kubenswrapper[23041]: I0308 00:33:08.310014 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=4.3099898549999995 podStartE2EDuration="4.309989855s" podCreationTimestamp="2026-03-08 00:33:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:33:08.30459787 +0000 UTC m=+93.777434454" watchObservedRunningTime="2026-03-08 00:33:08.309989855 +0000 UTC m=+93.782826429" Mar 08 00:33:08.313279 master-0 kubenswrapper[23041]: I0308 00:33:08.312369 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-2-master-0" Mar 08 00:33:08.318653 master-0 kubenswrapper[23041]: I0308 00:33:08.318610 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd"/"kube-root-ca.crt" Mar 08 00:33:08.318838 master-0 kubenswrapper[23041]: I0308 00:33:08.318657 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd"/"installer-sa-dockercfg-bc67b" Mar 08 00:33:08.334350 master-0 kubenswrapper[23041]: I0308 00:33:08.332838 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-2-master-0"] Mar 08 00:33:08.349906 master-0 kubenswrapper[23041]: I0308 00:33:08.346600 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6479f6d896-j6kqz" podStartSLOduration=2.346581151 podStartE2EDuration="2.346581151s" podCreationTimestamp="2026-03-08 00:33:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:33:08.345797711 +0000 UTC m=+93.818634255" watchObservedRunningTime="2026-03-08 00:33:08.346581151 +0000 UTC m=+93.819417705" Mar 08 00:33:08.368403 master-0 kubenswrapper[23041]: I0308 00:33:08.368182 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-ttpzw" podStartSLOduration=2.490448741 podStartE2EDuration="5.368165571s" podCreationTimestamp="2026-03-08 00:33:03 +0000 UTC" firstStartedPulling="2026-03-08 00:33:04.47069091 +0000 UTC m=+89.943527464" lastFinishedPulling="2026-03-08 00:33:07.34840774 +0000 UTC m=+92.821244294" observedRunningTime="2026-03-08 00:33:08.366989622 +0000 UTC m=+93.839826186" watchObservedRunningTime="2026-03-08 00:33:08.368165571 +0000 UTC m=+93.841002125" Mar 08 00:33:08.438719 master-0 kubenswrapper[23041]: I0308 00:33:08.438674 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e9ee6f7-24ed-44b3-be57-a07a13e9e73b-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"8e9ee6f7-24ed-44b3-be57-a07a13e9e73b\") " pod="openshift-etcd/installer-2-master-0" Mar 08 00:33:08.439062 master-0 kubenswrapper[23041]: I0308 00:33:08.439044 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e9ee6f7-24ed-44b3-be57-a07a13e9e73b-var-lock\") pod \"installer-2-master-0\" (UID: \"8e9ee6f7-24ed-44b3-be57-a07a13e9e73b\") " pod="openshift-etcd/installer-2-master-0" Mar 08 00:33:08.439231 master-0 kubenswrapper[23041]: I0308 00:33:08.439216 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e9ee6f7-24ed-44b3-be57-a07a13e9e73b-kube-api-access\") pod \"installer-2-master-0\" (UID: \"8e9ee6f7-24ed-44b3-be57-a07a13e9e73b\") " pod="openshift-etcd/installer-2-master-0" Mar 08 00:33:08.541433 master-0 kubenswrapper[23041]: I0308 00:33:08.541353 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e9ee6f7-24ed-44b3-be57-a07a13e9e73b-var-lock\") pod \"installer-2-master-0\" (UID: \"8e9ee6f7-24ed-44b3-be57-a07a13e9e73b\") " pod="openshift-etcd/installer-2-master-0" Mar 08 00:33:08.541626 master-0 kubenswrapper[23041]: I0308 00:33:08.541475 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e9ee6f7-24ed-44b3-be57-a07a13e9e73b-kube-api-access\") pod \"installer-2-master-0\" (UID: \"8e9ee6f7-24ed-44b3-be57-a07a13e9e73b\") " pod="openshift-etcd/installer-2-master-0" Mar 08 00:33:08.541626 master-0 kubenswrapper[23041]: I0308 00:33:08.541526 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" 
(UniqueName: \"kubernetes.io/host-path/8e9ee6f7-24ed-44b3-be57-a07a13e9e73b-var-lock\") pod \"installer-2-master-0\" (UID: \"8e9ee6f7-24ed-44b3-be57-a07a13e9e73b\") " pod="openshift-etcd/installer-2-master-0" Mar 08 00:33:08.541626 master-0 kubenswrapper[23041]: I0308 00:33:08.541622 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e9ee6f7-24ed-44b3-be57-a07a13e9e73b-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"8e9ee6f7-24ed-44b3-be57-a07a13e9e73b\") " pod="openshift-etcd/installer-2-master-0" Mar 08 00:33:08.541744 master-0 kubenswrapper[23041]: I0308 00:33:08.541551 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e9ee6f7-24ed-44b3-be57-a07a13e9e73b-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"8e9ee6f7-24ed-44b3-be57-a07a13e9e73b\") " pod="openshift-etcd/installer-2-master-0" Mar 08 00:33:08.561292 master-0 kubenswrapper[23041]: I0308 00:33:08.559898 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e9ee6f7-24ed-44b3-be57-a07a13e9e73b-kube-api-access\") pod \"installer-2-master-0\" (UID: \"8e9ee6f7-24ed-44b3-be57-a07a13e9e73b\") " pod="openshift-etcd/installer-2-master-0" Mar 08 00:33:08.662600 master-0 kubenswrapper[23041]: I0308 00:33:08.662526 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-2-master-0" Mar 08 00:33:09.142997 master-0 kubenswrapper[23041]: I0308 00:33:09.142905 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-2-master-0"] Mar 08 00:33:09.146773 master-0 kubenswrapper[23041]: W0308 00:33:09.146631 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8e9ee6f7_24ed_44b3_be57_a07a13e9e73b.slice/crio-28341bd2c657921e0a08286179b30ecba623e10a2e08f78b8ef006c2176ea44c WatchSource:0}: Error finding container 28341bd2c657921e0a08286179b30ecba623e10a2e08f78b8ef006c2176ea44c: Status 404 returned error can't find the container with id 28341bd2c657921e0a08286179b30ecba623e10a2e08f78b8ef006c2176ea44c Mar 08 00:33:09.297067 master-0 kubenswrapper[23041]: I0308 00:33:09.297009 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"796e5654-14f7-4309-9c9d-2a4f430bb9b4","Type":"ContainerStarted","Data":"4827eae9929529c4f2e67de076fc7eda96e279ed521ec89c4bca106956a63044"} Mar 08 00:33:09.299506 master-0 kubenswrapper[23041]: I0308 00:33:09.299461 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"8e9ee6f7-24ed-44b3-be57-a07a13e9e73b","Type":"ContainerStarted","Data":"28341bd2c657921e0a08286179b30ecba623e10a2e08f78b8ef006c2176ea44c"} Mar 08 00:33:09.670928 master-0 kubenswrapper[23041]: I0308 00:33:09.670837 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=4.670819822 podStartE2EDuration="4.670819822s" podCreationTimestamp="2026-03-08 00:33:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:33:09.352997817 +0000 UTC m=+94.825834441" watchObservedRunningTime="2026-03-08 00:33:09.670819822 +0000 UTC m=+95.143656376" Mar 08 
00:33:09.672847 master-0 kubenswrapper[23041]: I0308 00:33:09.672771 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-6-master-0"] Mar 08 00:33:09.673859 master-0 kubenswrapper[23041]: I0308 00:33:09.673815 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-6-master-0" Mar 08 00:33:09.676517 master-0 kubenswrapper[23041]: I0308 00:33:09.676479 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-q6gf6" Mar 08 00:33:09.676650 master-0 kubenswrapper[23041]: I0308 00:33:09.676592 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Mar 08 00:33:09.696155 master-0 kubenswrapper[23041]: I0308 00:33:09.696073 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-6-master-0"] Mar 08 00:33:09.781896 master-0 kubenswrapper[23041]: I0308 00:33:09.781704 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0af76e72-367d-4d11-8c55-8758aa5003dd-var-lock\") pod \"installer-6-master-0\" (UID: \"0af76e72-367d-4d11-8c55-8758aa5003dd\") " pod="openshift-kube-scheduler/installer-6-master-0" Mar 08 00:33:09.781896 master-0 kubenswrapper[23041]: I0308 00:33:09.781882 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0af76e72-367d-4d11-8c55-8758aa5003dd-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"0af76e72-367d-4d11-8c55-8758aa5003dd\") " pod="openshift-kube-scheduler/installer-6-master-0" Mar 08 00:33:09.781896 master-0 kubenswrapper[23041]: I0308 00:33:09.781928 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0af76e72-367d-4d11-8c55-8758aa5003dd-kube-api-access\") pod \"installer-6-master-0\" (UID: \"0af76e72-367d-4d11-8c55-8758aa5003dd\") " pod="openshift-kube-scheduler/installer-6-master-0" Mar 08 00:33:09.883852 master-0 kubenswrapper[23041]: I0308 00:33:09.883715 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0af76e72-367d-4d11-8c55-8758aa5003dd-var-lock\") pod \"installer-6-master-0\" (UID: \"0af76e72-367d-4d11-8c55-8758aa5003dd\") " pod="openshift-kube-scheduler/installer-6-master-0" Mar 08 00:33:09.883852 master-0 kubenswrapper[23041]: I0308 00:33:09.883818 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0af76e72-367d-4d11-8c55-8758aa5003dd-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"0af76e72-367d-4d11-8c55-8758aa5003dd\") " pod="openshift-kube-scheduler/installer-6-master-0" Mar 08 00:33:09.884097 master-0 kubenswrapper[23041]: I0308 00:33:09.883871 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0af76e72-367d-4d11-8c55-8758aa5003dd-kube-api-access\") pod \"installer-6-master-0\" (UID: \"0af76e72-367d-4d11-8c55-8758aa5003dd\") " pod="openshift-kube-scheduler/installer-6-master-0" Mar 08 00:33:09.885873 master-0 kubenswrapper[23041]: I0308 00:33:09.885812 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0af76e72-367d-4d11-8c55-8758aa5003dd-var-lock\") pod \"installer-6-master-0\" (UID: \"0af76e72-367d-4d11-8c55-8758aa5003dd\") " pod="openshift-kube-scheduler/installer-6-master-0" Mar 08 00:33:09.886371 master-0 kubenswrapper[23041]: I0308 00:33:09.886314 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/0af76e72-367d-4d11-8c55-8758aa5003dd-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"0af76e72-367d-4d11-8c55-8758aa5003dd\") " pod="openshift-kube-scheduler/installer-6-master-0" Mar 08 00:33:09.903932 master-0 kubenswrapper[23041]: I0308 00:33:09.903859 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0af76e72-367d-4d11-8c55-8758aa5003dd-kube-api-access\") pod \"installer-6-master-0\" (UID: \"0af76e72-367d-4d11-8c55-8758aa5003dd\") " pod="openshift-kube-scheduler/installer-6-master-0" Mar 08 00:33:10.022595 master-0 kubenswrapper[23041]: I0308 00:33:10.022545 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-6-master-0" Mar 08 00:33:10.310593 master-0 kubenswrapper[23041]: I0308 00:33:10.310517 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"8e9ee6f7-24ed-44b3-be57-a07a13e9e73b","Type":"ContainerStarted","Data":"6814cf059b67547841e8687e4684d5c2fadca0471bd82f7879f4d5d53180372c"} Mar 08 00:33:10.329517 master-0 kubenswrapper[23041]: I0308 00:33:10.329423 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/installer-2-master-0" podStartSLOduration=2.3294028239999998 podStartE2EDuration="2.329402824s" podCreationTimestamp="2026-03-08 00:33:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:33:10.329316522 +0000 UTC m=+95.802153096" watchObservedRunningTime="2026-03-08 00:33:10.329402824 +0000 UTC m=+95.802239378" Mar 08 00:33:10.555503 master-0 kubenswrapper[23041]: I0308 00:33:10.555394 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-6-master-0"] Mar 08 00:33:10.561015 master-0 kubenswrapper[23041]: W0308 00:33:10.560982 23041 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0af76e72_367d_4d11_8c55_8758aa5003dd.slice/crio-da2c9f6dd60842bc9377058f105ce0e3313e3bda92138b2bc0e885da2aabb6f4 WatchSource:0}: Error finding container da2c9f6dd60842bc9377058f105ce0e3313e3bda92138b2bc0e885da2aabb6f4: Status 404 returned error can't find the container with id da2c9f6dd60842bc9377058f105ce0e3313e3bda92138b2bc0e885da2aabb6f4 Mar 08 00:33:10.712489 master-0 kubenswrapper[23041]: I0308 00:33:10.712420 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:33:11.317856 master-0 kubenswrapper[23041]: I0308 00:33:11.317779 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-6-master-0" event={"ID":"0af76e72-367d-4d11-8c55-8758aa5003dd","Type":"ContainerStarted","Data":"15306c11a0ba862de4c40e0fb25308bcbe28c92520dc173ad31eac204c8ec074"} Mar 08 00:33:11.318453 master-0 kubenswrapper[23041]: I0308 00:33:11.317900 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-6-master-0" event={"ID":"0af76e72-367d-4d11-8c55-8758aa5003dd","Type":"ContainerStarted","Data":"da2c9f6dd60842bc9377058f105ce0e3313e3bda92138b2bc0e885da2aabb6f4"} Mar 08 00:33:11.344137 master-0 kubenswrapper[23041]: I0308 00:33:11.344057 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-6-master-0" podStartSLOduration=2.344035527 podStartE2EDuration="2.344035527s" podCreationTimestamp="2026-03-08 00:33:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:33:11.343535324 +0000 UTC m=+96.816371918" watchObservedRunningTime="2026-03-08 00:33:11.344035527 +0000 UTC m=+96.816872081" Mar 08 00:33:14.211174 master-0 kubenswrapper[23041]: I0308 00:33:14.211104 23041 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Mar 08 00:33:14.212303 master-0 kubenswrapper[23041]: I0308 00:33:14.212279 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Mar 08 00:33:14.216439 master-0 kubenswrapper[23041]: I0308 00:33:14.216368 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 08 00:33:14.216827 master-0 kubenswrapper[23041]: I0308 00:33:14.216782 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-7rml7" Mar 08 00:33:14.233891 master-0 kubenswrapper[23041]: I0308 00:33:14.233858 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Mar 08 00:33:14.294285 master-0 kubenswrapper[23041]: I0308 00:33:14.292481 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/861ba34f-5174-4835-a9b9-dbc5eacd2963-var-lock\") pod \"installer-4-master-0\" (UID: \"861ba34f-5174-4835-a9b9-dbc5eacd2963\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 08 00:33:14.294285 master-0 kubenswrapper[23041]: I0308 00:33:14.292615 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/861ba34f-5174-4835-a9b9-dbc5eacd2963-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"861ba34f-5174-4835-a9b9-dbc5eacd2963\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 08 00:33:14.294285 master-0 kubenswrapper[23041]: I0308 00:33:14.292670 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/861ba34f-5174-4835-a9b9-dbc5eacd2963-kube-api-access\") pod \"installer-4-master-0\" (UID: \"861ba34f-5174-4835-a9b9-dbc5eacd2963\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 08 00:33:14.396236 master-0 kubenswrapper[23041]: I0308 00:33:14.394281 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/861ba34f-5174-4835-a9b9-dbc5eacd2963-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"861ba34f-5174-4835-a9b9-dbc5eacd2963\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 08 00:33:14.396236 master-0 kubenswrapper[23041]: I0308 00:33:14.394878 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/861ba34f-5174-4835-a9b9-dbc5eacd2963-kube-api-access\") pod \"installer-4-master-0\" (UID: \"861ba34f-5174-4835-a9b9-dbc5eacd2963\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 08 00:33:14.396236 master-0 kubenswrapper[23041]: I0308 00:33:14.395007 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/861ba34f-5174-4835-a9b9-dbc5eacd2963-var-lock\") pod \"installer-4-master-0\" (UID: \"861ba34f-5174-4835-a9b9-dbc5eacd2963\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 08 00:33:14.398541 master-0 kubenswrapper[23041]: I0308 00:33:14.397935 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/861ba34f-5174-4835-a9b9-dbc5eacd2963-var-lock\") pod \"installer-4-master-0\" (UID: \"861ba34f-5174-4835-a9b9-dbc5eacd2963\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 08 00:33:14.411805 master-0 kubenswrapper[23041]: I0308 00:33:14.411721 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/861ba34f-5174-4835-a9b9-dbc5eacd2963-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"861ba34f-5174-4835-a9b9-dbc5eacd2963\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 08 00:33:14.432328 master-0 kubenswrapper[23041]: I0308 00:33:14.432261 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/861ba34f-5174-4835-a9b9-dbc5eacd2963-kube-api-access\") pod \"installer-4-master-0\" (UID: \"861ba34f-5174-4835-a9b9-dbc5eacd2963\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 08 00:33:14.565744 master-0 kubenswrapper[23041]: I0308 00:33:14.565631 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Mar 08 00:33:14.765594 master-0 kubenswrapper[23041]: I0308 00:33:14.765496 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6787d8db86-xxqwp" Mar 08 00:33:15.002834 master-0 kubenswrapper[23041]: I0308 00:33:15.002772 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Mar 08 00:33:15.010548 master-0 kubenswrapper[23041]: W0308 00:33:15.010416 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod861ba34f_5174_4835_a9b9_dbc5eacd2963.slice/crio-63b07cea28b6768a9651b5f58b996dd7f6e9fc810bdd80305db2f53213887741 WatchSource:0}: Error finding container 63b07cea28b6768a9651b5f58b996dd7f6e9fc810bdd80305db2f53213887741: Status 404 returned error can't find the container with id 63b07cea28b6768a9651b5f58b996dd7f6e9fc810bdd80305db2f53213887741 Mar 08 00:33:15.355744 master-0 kubenswrapper[23041]: I0308 00:33:15.355666 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" 
event={"ID":"861ba34f-5174-4835-a9b9-dbc5eacd2963","Type":"ContainerStarted","Data":"63b07cea28b6768a9651b5f58b996dd7f6e9fc810bdd80305db2f53213887741"} Mar 08 00:33:15.669872 master-0 kubenswrapper[23041]: I0308 00:33:15.669770 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6dc96f5b89-ctlsc" Mar 08 00:33:15.670322 master-0 kubenswrapper[23041]: I0308 00:33:15.669896 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6dc96f5b89-ctlsc" Mar 08 00:33:15.671426 master-0 kubenswrapper[23041]: I0308 00:33:15.671345 23041 patch_prober.go:28] interesting pod/console-6dc96f5b89-ctlsc container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.101:8443/health\": dial tcp 10.128.0.101:8443: connect: connection refused" start-of-body= Mar 08 00:33:15.671577 master-0 kubenswrapper[23041]: I0308 00:33:15.671456 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6dc96f5b89-ctlsc" podUID="24264c1b-97df-4311-b7af-b205ac879381" containerName="console" probeResult="failure" output="Get \"https://10.128.0.101:8443/health\": dial tcp 10.128.0.101:8443: connect: connection refused" Mar 08 00:33:15.972756 master-0 kubenswrapper[23041]: I0308 00:33:15.972629 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-3-master-0_8b0395d1-7cb0-4857-891a-68f88a6fd468/installer/0.log" Mar 08 00:33:15.972756 master-0 kubenswrapper[23041]: I0308 00:33:15.972697 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 08 00:33:16.156994 master-0 kubenswrapper[23041]: I0308 00:33:16.156386 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b0395d1-7cb0-4857-891a-68f88a6fd468-kube-api-access\") pod \"8b0395d1-7cb0-4857-891a-68f88a6fd468\" (UID: \"8b0395d1-7cb0-4857-891a-68f88a6fd468\") " Mar 08 00:33:16.156994 master-0 kubenswrapper[23041]: I0308 00:33:16.156437 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8b0395d1-7cb0-4857-891a-68f88a6fd468-var-lock\") pod \"8b0395d1-7cb0-4857-891a-68f88a6fd468\" (UID: \"8b0395d1-7cb0-4857-891a-68f88a6fd468\") " Mar 08 00:33:16.156994 master-0 kubenswrapper[23041]: I0308 00:33:16.156618 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b0395d1-7cb0-4857-891a-68f88a6fd468-var-lock" (OuterVolumeSpecName: "var-lock") pod "8b0395d1-7cb0-4857-891a-68f88a6fd468" (UID: "8b0395d1-7cb0-4857-891a-68f88a6fd468"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:33:16.156994 master-0 kubenswrapper[23041]: I0308 00:33:16.156643 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b0395d1-7cb0-4857-891a-68f88a6fd468-kubelet-dir\") pod \"8b0395d1-7cb0-4857-891a-68f88a6fd468\" (UID: \"8b0395d1-7cb0-4857-891a-68f88a6fd468\") " Mar 08 00:33:16.156994 master-0 kubenswrapper[23041]: I0308 00:33:16.156730 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b0395d1-7cb0-4857-891a-68f88a6fd468-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8b0395d1-7cb0-4857-891a-68f88a6fd468" (UID: "8b0395d1-7cb0-4857-891a-68f88a6fd468"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:33:16.157346 master-0 kubenswrapper[23041]: I0308 00:33:16.157044 23041 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b0395d1-7cb0-4857-891a-68f88a6fd468-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 00:33:16.157346 master-0 kubenswrapper[23041]: I0308 00:33:16.157060 23041 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8b0395d1-7cb0-4857-891a-68f88a6fd468-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 00:33:16.159999 master-0 kubenswrapper[23041]: I0308 00:33:16.159963 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b0395d1-7cb0-4857-891a-68f88a6fd468-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8b0395d1-7cb0-4857-891a-68f88a6fd468" (UID: "8b0395d1-7cb0-4857-891a-68f88a6fd468"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:33:16.259622 master-0 kubenswrapper[23041]: I0308 00:33:16.259545 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b0395d1-7cb0-4857-891a-68f88a6fd468-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 00:33:16.366156 master-0 kubenswrapper[23041]: I0308 00:33:16.365989 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"861ba34f-5174-4835-a9b9-dbc5eacd2963","Type":"ContainerStarted","Data":"ad59f0ee4ace09dae79cfc40c750720203b39cdfecc33e32dfaa1834966aad3c"} Mar 08 00:33:16.368367 master-0 kubenswrapper[23041]: I0308 00:33:16.368309 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-3-master-0_8b0395d1-7cb0-4857-891a-68f88a6fd468/installer/0.log" Mar 08 00:33:16.368476 master-0 kubenswrapper[23041]: I0308 00:33:16.368390 23041 generic.go:334] "Generic (PLEG): container finished" podID="8b0395d1-7cb0-4857-891a-68f88a6fd468" containerID="4026150a6114ab872a133f085acdad1899f7a88076a111da9b83e693d0a1a888" exitCode=1 Mar 08 00:33:16.368699 master-0 kubenswrapper[23041]: I0308 00:33:16.368511 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"8b0395d1-7cb0-4857-891a-68f88a6fd468","Type":"ContainerDied","Data":"4026150a6114ab872a133f085acdad1899f7a88076a111da9b83e693d0a1a888"} Mar 08 00:33:16.368828 master-0 kubenswrapper[23041]: I0308 00:33:16.368786 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"8b0395d1-7cb0-4857-891a-68f88a6fd468","Type":"ContainerDied","Data":"12650fbaafb1ba4d82ecfbd4886d512f14868b88e1328c1261518868d687ae5b"} Mar 08 00:33:16.369025 master-0 kubenswrapper[23041]: I0308 00:33:16.368802 23041 scope.go:117] "RemoveContainer" 
containerID="4026150a6114ab872a133f085acdad1899f7a88076a111da9b83e693d0a1a888" Mar 08 00:33:16.369553 master-0 kubenswrapper[23041]: I0308 00:33:16.369493 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 08 00:33:16.392041 master-0 kubenswrapper[23041]: I0308 00:33:16.391995 23041 scope.go:117] "RemoveContainer" containerID="4026150a6114ab872a133f085acdad1899f7a88076a111da9b83e693d0a1a888" Mar 08 00:33:16.392584 master-0 kubenswrapper[23041]: E0308 00:33:16.392540 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4026150a6114ab872a133f085acdad1899f7a88076a111da9b83e693d0a1a888\": container with ID starting with 4026150a6114ab872a133f085acdad1899f7a88076a111da9b83e693d0a1a888 not found: ID does not exist" containerID="4026150a6114ab872a133f085acdad1899f7a88076a111da9b83e693d0a1a888" Mar 08 00:33:16.392669 master-0 kubenswrapper[23041]: I0308 00:33:16.392596 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4026150a6114ab872a133f085acdad1899f7a88076a111da9b83e693d0a1a888"} err="failed to get container status \"4026150a6114ab872a133f085acdad1899f7a88076a111da9b83e693d0a1a888\": rpc error: code = NotFound desc = could not find container \"4026150a6114ab872a133f085acdad1899f7a88076a111da9b83e693d0a1a888\": container with ID starting with 4026150a6114ab872a133f085acdad1899f7a88076a111da9b83e693d0a1a888 not found: ID does not exist" Mar 08 00:33:16.393647 master-0 kubenswrapper[23041]: I0308 00:33:16.393561 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-4-master-0" podStartSLOduration=2.393535638 podStartE2EDuration="2.393535638s" podCreationTimestamp="2026-03-08 00:33:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-03-08 00:33:16.386460861 +0000 UTC m=+101.859297405" watchObservedRunningTime="2026-03-08 00:33:16.393535638 +0000 UTC m=+101.866372202" Mar 08 00:33:16.419412 master-0 kubenswrapper[23041]: I0308 00:33:16.419307 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Mar 08 00:33:16.427584 master-0 kubenswrapper[23041]: I0308 00:33:16.427480 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Mar 08 00:33:16.823973 master-0 kubenswrapper[23041]: I0308 00:33:16.823897 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b0395d1-7cb0-4857-891a-68f88a6fd468" path="/var/lib/kubelet/pods/8b0395d1-7cb0-4857-891a-68f88a6fd468/volumes" Mar 08 00:33:17.172418 master-0 kubenswrapper[23041]: I0308 00:33:17.172281 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6479f6d896-j6kqz" Mar 08 00:33:17.172418 master-0 kubenswrapper[23041]: I0308 00:33:17.172350 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6479f6d896-j6kqz" Mar 08 00:33:17.174455 master-0 kubenswrapper[23041]: I0308 00:33:17.174400 23041 patch_prober.go:28] interesting pod/console-6479f6d896-j6kqz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body= Mar 08 00:33:17.174541 master-0 kubenswrapper[23041]: I0308 00:33:17.174487 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Mar 08 00:33:23.796213 master-0 kubenswrapper[23041]: I0308 00:33:23.796115 
23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Mar 08 00:33:23.797342 master-0 kubenswrapper[23041]: I0308 00:33:23.796405 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/installer-4-master-0" podUID="74512190-22e4-4648-8d1e-e487de48a124" containerName="installer" containerID="cri-o://1283d7a0be2d499bd67523064c8053d4801413330027591b6177317990231794" gracePeriod=30 Mar 08 00:33:25.670515 master-0 kubenswrapper[23041]: I0308 00:33:25.670452 23041 patch_prober.go:28] interesting pod/console-6dc96f5b89-ctlsc container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.101:8443/health\": dial tcp 10.128.0.101:8443: connect: connection refused" start-of-body= Mar 08 00:33:25.672300 master-0 kubenswrapper[23041]: I0308 00:33:25.670522 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6dc96f5b89-ctlsc" podUID="24264c1b-97df-4311-b7af-b205ac879381" containerName="console" probeResult="failure" output="Get \"https://10.128.0.101:8443/health\": dial tcp 10.128.0.101:8443: connect: connection refused" Mar 08 00:33:27.000854 master-0 kubenswrapper[23041]: I0308 00:33:27.000787 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Mar 08 00:33:27.001637 master-0 kubenswrapper[23041]: E0308 00:33:27.001163 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b0395d1-7cb0-4857-891a-68f88a6fd468" containerName="installer" Mar 08 00:33:27.001637 master-0 kubenswrapper[23041]: I0308 00:33:27.001179 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b0395d1-7cb0-4857-891a-68f88a6fd468" containerName="installer" Mar 08 00:33:27.001637 master-0 kubenswrapper[23041]: I0308 00:33:27.001442 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b0395d1-7cb0-4857-891a-68f88a6fd468" 
containerName="installer" Mar 08 00:33:27.002009 master-0 kubenswrapper[23041]: I0308 00:33:27.001976 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Mar 08 00:33:27.017765 master-0 kubenswrapper[23041]: I0308 00:33:27.017702 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Mar 08 00:33:27.085398 master-0 kubenswrapper[23041]: I0308 00:33:27.085310 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/343c30a5-7bf7-49ef-a224-c39ca46a63f1-kube-api-access\") pod \"installer-5-master-0\" (UID: \"343c30a5-7bf7-49ef-a224-c39ca46a63f1\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 08 00:33:27.085398 master-0 kubenswrapper[23041]: I0308 00:33:27.085376 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/343c30a5-7bf7-49ef-a224-c39ca46a63f1-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"343c30a5-7bf7-49ef-a224-c39ca46a63f1\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 08 00:33:27.085681 master-0 kubenswrapper[23041]: I0308 00:33:27.085632 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/343c30a5-7bf7-49ef-a224-c39ca46a63f1-var-lock\") pod \"installer-5-master-0\" (UID: \"343c30a5-7bf7-49ef-a224-c39ca46a63f1\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 08 00:33:27.172675 master-0 kubenswrapper[23041]: I0308 00:33:27.172612 23041 patch_prober.go:28] interesting pod/console-6479f6d896-j6kqz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" 
start-of-body= Mar 08 00:33:27.172933 master-0 kubenswrapper[23041]: I0308 00:33:27.172686 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Mar 08 00:33:27.187004 master-0 kubenswrapper[23041]: I0308 00:33:27.186952 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/343c30a5-7bf7-49ef-a224-c39ca46a63f1-kube-api-access\") pod \"installer-5-master-0\" (UID: \"343c30a5-7bf7-49ef-a224-c39ca46a63f1\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 08 00:33:27.187118 master-0 kubenswrapper[23041]: I0308 00:33:27.187018 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/343c30a5-7bf7-49ef-a224-c39ca46a63f1-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"343c30a5-7bf7-49ef-a224-c39ca46a63f1\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 08 00:33:27.187320 master-0 kubenswrapper[23041]: I0308 00:33:27.187291 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/343c30a5-7bf7-49ef-a224-c39ca46a63f1-var-lock\") pod \"installer-5-master-0\" (UID: \"343c30a5-7bf7-49ef-a224-c39ca46a63f1\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 08 00:33:27.187413 master-0 kubenswrapper[23041]: I0308 00:33:27.187387 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/343c30a5-7bf7-49ef-a224-c39ca46a63f1-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"343c30a5-7bf7-49ef-a224-c39ca46a63f1\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 08 00:33:27.187455 
master-0 kubenswrapper[23041]: I0308 00:33:27.187407 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/343c30a5-7bf7-49ef-a224-c39ca46a63f1-var-lock\") pod \"installer-5-master-0\" (UID: \"343c30a5-7bf7-49ef-a224-c39ca46a63f1\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 08 00:33:27.203519 master-0 kubenswrapper[23041]: I0308 00:33:27.203496 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/343c30a5-7bf7-49ef-a224-c39ca46a63f1-kube-api-access\") pod \"installer-5-master-0\" (UID: \"343c30a5-7bf7-49ef-a224-c39ca46a63f1\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 08 00:33:27.345646 master-0 kubenswrapper[23041]: I0308 00:33:27.345513 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Mar 08 00:33:27.800764 master-0 kubenswrapper[23041]: I0308 00:33:27.800707 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Mar 08 00:33:27.805634 master-0 kubenswrapper[23041]: W0308 00:33:27.805595 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod343c30a5_7bf7_49ef_a224_c39ca46a63f1.slice/crio-b39372ca5af916b898b90c0ac5bf26e4d523079274d09ad839e405b5d3212ca6 WatchSource:0}: Error finding container b39372ca5af916b898b90c0ac5bf26e4d523079274d09ad839e405b5d3212ca6: Status 404 returned error can't find the container with id b39372ca5af916b898b90c0ac5bf26e4d523079274d09ad839e405b5d3212ca6 Mar 08 00:33:28.495988 master-0 kubenswrapper[23041]: I0308 00:33:28.495889 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"343c30a5-7bf7-49ef-a224-c39ca46a63f1","Type":"ContainerStarted","Data":"42e56510331a27f30c67dafb2ca2fefb858a01f489046e8ecd0c02cc5211b70c"} 
Mar 08 00:33:28.496544 master-0 kubenswrapper[23041]: I0308 00:33:28.496003 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"343c30a5-7bf7-49ef-a224-c39ca46a63f1","Type":"ContainerStarted","Data":"b39372ca5af916b898b90c0ac5bf26e4d523079274d09ad839e405b5d3212ca6"} Mar 08 00:33:28.515971 master-0 kubenswrapper[23041]: I0308 00:33:28.515890 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-5-master-0" podStartSLOduration=2.515867022 podStartE2EDuration="2.515867022s" podCreationTimestamp="2026-03-08 00:33:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:33:28.512793325 +0000 UTC m=+113.985629909" watchObservedRunningTime="2026-03-08 00:33:28.515867022 +0000 UTC m=+113.988703616" Mar 08 00:33:28.602730 master-0 kubenswrapper[23041]: I0308 00:33:28.602668 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cbd49d755-69bg7"] Mar 08 00:33:28.604272 master-0 kubenswrapper[23041]: I0308 00:33:28.604206 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cbd49d755-69bg7" Mar 08 00:33:28.607526 master-0 kubenswrapper[23041]: I0308 00:33:28.607486 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 08 00:33:28.607629 master-0 kubenswrapper[23041]: I0308 00:33:28.607589 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 08 00:33:28.618640 master-0 kubenswrapper[23041]: I0308 00:33:28.618542 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cbd49d755-69bg7"] Mar 08 00:33:28.715103 master-0 kubenswrapper[23041]: I0308 00:33:28.714741 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/eb8b3618-a301-40d4-b617-f9b57afa555c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cbd49d755-69bg7\" (UID: \"eb8b3618-a301-40d4-b617-f9b57afa555c\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-69bg7" Mar 08 00:33:28.715103 master-0 kubenswrapper[23041]: I0308 00:33:28.714820 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/eb8b3618-a301-40d4-b617-f9b57afa555c-nginx-conf\") pod \"networking-console-plugin-5cbd49d755-69bg7\" (UID: \"eb8b3618-a301-40d4-b617-f9b57afa555c\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-69bg7" Mar 08 00:33:28.816295 master-0 kubenswrapper[23041]: I0308 00:33:28.816176 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/eb8b3618-a301-40d4-b617-f9b57afa555c-networking-console-plugin-cert\") pod 
\"networking-console-plugin-5cbd49d755-69bg7\" (UID: \"eb8b3618-a301-40d4-b617-f9b57afa555c\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-69bg7" Mar 08 00:33:28.816295 master-0 kubenswrapper[23041]: I0308 00:33:28.816257 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/eb8b3618-a301-40d4-b617-f9b57afa555c-nginx-conf\") pod \"networking-console-plugin-5cbd49d755-69bg7\" (UID: \"eb8b3618-a301-40d4-b617-f9b57afa555c\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-69bg7" Mar 08 00:33:28.816502 master-0 kubenswrapper[23041]: E0308 00:33:28.816466 23041 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Mar 08 00:33:28.816592 master-0 kubenswrapper[23041]: E0308 00:33:28.816561 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb8b3618-a301-40d4-b617-f9b57afa555c-networking-console-plugin-cert podName:eb8b3618-a301-40d4-b617-f9b57afa555c nodeName:}" failed. No retries permitted until 2026-03-08 00:33:29.316533092 +0000 UTC m=+114.789369686 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/eb8b3618-a301-40d4-b617-f9b57afa555c-networking-console-plugin-cert") pod "networking-console-plugin-5cbd49d755-69bg7" (UID: "eb8b3618-a301-40d4-b617-f9b57afa555c") : secret "networking-console-plugin-cert" not found Mar 08 00:33:28.817080 master-0 kubenswrapper[23041]: I0308 00:33:28.817050 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/eb8b3618-a301-40d4-b617-f9b57afa555c-nginx-conf\") pod \"networking-console-plugin-5cbd49d755-69bg7\" (UID: \"eb8b3618-a301-40d4-b617-f9b57afa555c\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-69bg7" Mar 08 00:33:29.051277 master-0 kubenswrapper[23041]: I0308 00:33:29.051186 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6dc96f5b89-ctlsc"] Mar 08 00:33:29.095388 master-0 kubenswrapper[23041]: I0308 00:33:29.095160 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-c45bf598-vngbg"] Mar 08 00:33:29.096873 master-0 kubenswrapper[23041]: I0308 00:33:29.096850 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c45bf598-vngbg" Mar 08 00:33:29.113889 master-0 kubenswrapper[23041]: I0308 00:33:29.113814 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c45bf598-vngbg"] Mar 08 00:33:29.124990 master-0 kubenswrapper[23041]: I0308 00:33:29.121584 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-trusted-ca-bundle\") pod \"console-c45bf598-vngbg\" (UID: \"4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac\") " pod="openshift-console/console-c45bf598-vngbg" Mar 08 00:33:29.125936 master-0 kubenswrapper[23041]: I0308 00:33:29.125811 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-service-ca\") pod \"console-c45bf598-vngbg\" (UID: \"4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac\") " pod="openshift-console/console-c45bf598-vngbg" Mar 08 00:33:29.125936 master-0 kubenswrapper[23041]: I0308 00:33:29.125908 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-oauth-serving-cert\") pod \"console-c45bf598-vngbg\" (UID: \"4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac\") " pod="openshift-console/console-c45bf598-vngbg" Mar 08 00:33:29.126096 master-0 kubenswrapper[23041]: I0308 00:33:29.126056 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-console-config\") pod \"console-c45bf598-vngbg\" (UID: \"4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac\") " pod="openshift-console/console-c45bf598-vngbg" Mar 08 00:33:29.126152 master-0 kubenswrapper[23041]: I0308 
00:33:29.126123 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9gm8\" (UniqueName: \"kubernetes.io/projected/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-kube-api-access-c9gm8\") pod \"console-c45bf598-vngbg\" (UID: \"4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac\") " pod="openshift-console/console-c45bf598-vngbg" Mar 08 00:33:29.126283 master-0 kubenswrapper[23041]: I0308 00:33:29.126236 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-console-serving-cert\") pod \"console-c45bf598-vngbg\" (UID: \"4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac\") " pod="openshift-console/console-c45bf598-vngbg" Mar 08 00:33:29.126362 master-0 kubenswrapper[23041]: I0308 00:33:29.126343 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-console-oauth-config\") pod \"console-c45bf598-vngbg\" (UID: \"4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac\") " pod="openshift-console/console-c45bf598-vngbg" Mar 08 00:33:29.228498 master-0 kubenswrapper[23041]: I0308 00:33:29.228450 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-oauth-serving-cert\") pod \"console-c45bf598-vngbg\" (UID: \"4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac\") " pod="openshift-console/console-c45bf598-vngbg" Mar 08 00:33:29.229573 master-0 kubenswrapper[23041]: I0308 00:33:29.229557 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-console-config\") pod \"console-c45bf598-vngbg\" (UID: \"4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac\") " 
pod="openshift-console/console-c45bf598-vngbg" Mar 08 00:33:29.230312 master-0 kubenswrapper[23041]: I0308 00:33:29.230281 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9gm8\" (UniqueName: \"kubernetes.io/projected/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-kube-api-access-c9gm8\") pod \"console-c45bf598-vngbg\" (UID: \"4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac\") " pod="openshift-console/console-c45bf598-vngbg" Mar 08 00:33:29.230712 master-0 kubenswrapper[23041]: I0308 00:33:29.230697 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-console-serving-cert\") pod \"console-c45bf598-vngbg\" (UID: \"4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac\") " pod="openshift-console/console-c45bf598-vngbg" Mar 08 00:33:29.231163 master-0 kubenswrapper[23041]: I0308 00:33:29.231148 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-console-oauth-config\") pod \"console-c45bf598-vngbg\" (UID: \"4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac\") " pod="openshift-console/console-c45bf598-vngbg" Mar 08 00:33:29.231321 master-0 kubenswrapper[23041]: I0308 00:33:29.231304 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-trusted-ca-bundle\") pod \"console-c45bf598-vngbg\" (UID: \"4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac\") " pod="openshift-console/console-c45bf598-vngbg" Mar 08 00:33:29.231501 master-0 kubenswrapper[23041]: I0308 00:33:29.231487 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-service-ca\") pod \"console-c45bf598-vngbg\" (UID: 
\"4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac\") " pod="openshift-console/console-c45bf598-vngbg" Mar 08 00:33:29.231820 master-0 kubenswrapper[23041]: I0308 00:33:29.230244 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-console-config\") pod \"console-c45bf598-vngbg\" (UID: \"4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac\") " pod="openshift-console/console-c45bf598-vngbg" Mar 08 00:33:29.231820 master-0 kubenswrapper[23041]: I0308 00:33:29.229506 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-oauth-serving-cert\") pod \"console-c45bf598-vngbg\" (UID: \"4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac\") " pod="openshift-console/console-c45bf598-vngbg" Mar 08 00:33:29.232396 master-0 kubenswrapper[23041]: I0308 00:33:29.232380 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-service-ca\") pod \"console-c45bf598-vngbg\" (UID: \"4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac\") " pod="openshift-console/console-c45bf598-vngbg" Mar 08 00:33:29.232996 master-0 kubenswrapper[23041]: I0308 00:33:29.232941 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-trusted-ca-bundle\") pod \"console-c45bf598-vngbg\" (UID: \"4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac\") " pod="openshift-console/console-c45bf598-vngbg" Mar 08 00:33:29.233588 master-0 kubenswrapper[23041]: I0308 00:33:29.233572 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-console-serving-cert\") pod \"console-c45bf598-vngbg\" (UID: 
\"4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac\") " pod="openshift-console/console-c45bf598-vngbg" Mar 08 00:33:29.234628 master-0 kubenswrapper[23041]: I0308 00:33:29.234586 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-console-oauth-config\") pod \"console-c45bf598-vngbg\" (UID: \"4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac\") " pod="openshift-console/console-c45bf598-vngbg" Mar 08 00:33:29.251979 master-0 kubenswrapper[23041]: I0308 00:33:29.251930 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9gm8\" (UniqueName: \"kubernetes.io/projected/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-kube-api-access-c9gm8\") pod \"console-c45bf598-vngbg\" (UID: \"4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac\") " pod="openshift-console/console-c45bf598-vngbg" Mar 08 00:33:29.280350 master-0 kubenswrapper[23041]: I0308 00:33:29.280263 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-5c84b9c874-8xl2l" podUID="3baca04d-be92-4f02-8ea9-94cc37fc00b4" containerName="console" containerID="cri-o://0a7b7a4bad54508072eb15ff1d869aaa4adb172a39c8eccea3f65180f4f8b0c7" gracePeriod=15 Mar 08 00:33:29.333422 master-0 kubenswrapper[23041]: I0308 00:33:29.333354 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/eb8b3618-a301-40d4-b617-f9b57afa555c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cbd49d755-69bg7\" (UID: \"eb8b3618-a301-40d4-b617-f9b57afa555c\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-69bg7" Mar 08 00:33:29.337675 master-0 kubenswrapper[23041]: I0308 00:33:29.337600 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/eb8b3618-a301-40d4-b617-f9b57afa555c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cbd49d755-69bg7\" (UID: \"eb8b3618-a301-40d4-b617-f9b57afa555c\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-69bg7" Mar 08 00:33:29.433181 master-0 kubenswrapper[23041]: I0308 00:33:29.433033 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c45bf598-vngbg" Mar 08 00:33:29.507404 master-0 kubenswrapper[23041]: I0308 00:33:29.507327 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c84b9c874-8xl2l_3baca04d-be92-4f02-8ea9-94cc37fc00b4/console/0.log" Mar 08 00:33:29.508073 master-0 kubenswrapper[23041]: I0308 00:33:29.507414 23041 generic.go:334] "Generic (PLEG): container finished" podID="3baca04d-be92-4f02-8ea9-94cc37fc00b4" containerID="0a7b7a4bad54508072eb15ff1d869aaa4adb172a39c8eccea3f65180f4f8b0c7" exitCode=2 Mar 08 00:33:29.508898 master-0 kubenswrapper[23041]: I0308 00:33:29.508855 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c84b9c874-8xl2l" event={"ID":"3baca04d-be92-4f02-8ea9-94cc37fc00b4","Type":"ContainerDied","Data":"0a7b7a4bad54508072eb15ff1d869aaa4adb172a39c8eccea3f65180f4f8b0c7"} Mar 08 00:33:29.542762 master-0 kubenswrapper[23041]: I0308 00:33:29.540187 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cbd49d755-69bg7" Mar 08 00:33:29.772201 master-0 kubenswrapper[23041]: I0308 00:33:29.772137 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c84b9c874-8xl2l_3baca04d-be92-4f02-8ea9-94cc37fc00b4/console/0.log" Mar 08 00:33:29.772420 master-0 kubenswrapper[23041]: I0308 00:33:29.772252 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c84b9c874-8xl2l" Mar 08 00:33:29.850827 master-0 kubenswrapper[23041]: I0308 00:33:29.849796 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3baca04d-be92-4f02-8ea9-94cc37fc00b4-console-oauth-config\") pod \"3baca04d-be92-4f02-8ea9-94cc37fc00b4\" (UID: \"3baca04d-be92-4f02-8ea9-94cc37fc00b4\") " Mar 08 00:33:29.850827 master-0 kubenswrapper[23041]: I0308 00:33:29.849846 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3baca04d-be92-4f02-8ea9-94cc37fc00b4-oauth-serving-cert\") pod \"3baca04d-be92-4f02-8ea9-94cc37fc00b4\" (UID: \"3baca04d-be92-4f02-8ea9-94cc37fc00b4\") " Mar 08 00:33:29.850827 master-0 kubenswrapper[23041]: I0308 00:33:29.849874 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xk28b\" (UniqueName: \"kubernetes.io/projected/3baca04d-be92-4f02-8ea9-94cc37fc00b4-kube-api-access-xk28b\") pod \"3baca04d-be92-4f02-8ea9-94cc37fc00b4\" (UID: \"3baca04d-be92-4f02-8ea9-94cc37fc00b4\") " Mar 08 00:33:29.850827 master-0 kubenswrapper[23041]: I0308 00:33:29.849901 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3baca04d-be92-4f02-8ea9-94cc37fc00b4-console-config\") pod \"3baca04d-be92-4f02-8ea9-94cc37fc00b4\" (UID: \"3baca04d-be92-4f02-8ea9-94cc37fc00b4\") " Mar 08 00:33:29.850827 master-0 kubenswrapper[23041]: I0308 00:33:29.850489 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3baca04d-be92-4f02-8ea9-94cc37fc00b4-console-config" (OuterVolumeSpecName: "console-config") pod "3baca04d-be92-4f02-8ea9-94cc37fc00b4" (UID: "3baca04d-be92-4f02-8ea9-94cc37fc00b4"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:33:29.850827 master-0 kubenswrapper[23041]: I0308 00:33:29.850531 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3baca04d-be92-4f02-8ea9-94cc37fc00b4-console-serving-cert\") pod \"3baca04d-be92-4f02-8ea9-94cc37fc00b4\" (UID: \"3baca04d-be92-4f02-8ea9-94cc37fc00b4\") "
Mar 08 00:33:29.850827 master-0 kubenswrapper[23041]: I0308 00:33:29.850584 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3baca04d-be92-4f02-8ea9-94cc37fc00b4-service-ca\") pod \"3baca04d-be92-4f02-8ea9-94cc37fc00b4\" (UID: \"3baca04d-be92-4f02-8ea9-94cc37fc00b4\") "
Mar 08 00:33:29.850827 master-0 kubenswrapper[23041]: I0308 00:33:29.850818 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3baca04d-be92-4f02-8ea9-94cc37fc00b4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "3baca04d-be92-4f02-8ea9-94cc37fc00b4" (UID: "3baca04d-be92-4f02-8ea9-94cc37fc00b4"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:33:29.851272 master-0 kubenswrapper[23041]: I0308 00:33:29.851046 23041 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3baca04d-be92-4f02-8ea9-94cc37fc00b4-oauth-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 08 00:33:29.851272 master-0 kubenswrapper[23041]: I0308 00:33:29.851062 23041 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3baca04d-be92-4f02-8ea9-94cc37fc00b4-console-config\") on node \"master-0\" DevicePath \"\""
Mar 08 00:33:29.851416 master-0 kubenswrapper[23041]: I0308 00:33:29.851392 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3baca04d-be92-4f02-8ea9-94cc37fc00b4-service-ca" (OuterVolumeSpecName: "service-ca") pod "3baca04d-be92-4f02-8ea9-94cc37fc00b4" (UID: "3baca04d-be92-4f02-8ea9-94cc37fc00b4"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:33:29.853932 master-0 kubenswrapper[23041]: I0308 00:33:29.853888 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3baca04d-be92-4f02-8ea9-94cc37fc00b4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "3baca04d-be92-4f02-8ea9-94cc37fc00b4" (UID: "3baca04d-be92-4f02-8ea9-94cc37fc00b4"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:33:29.854282 master-0 kubenswrapper[23041]: I0308 00:33:29.854179 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3baca04d-be92-4f02-8ea9-94cc37fc00b4-kube-api-access-xk28b" (OuterVolumeSpecName: "kube-api-access-xk28b") pod "3baca04d-be92-4f02-8ea9-94cc37fc00b4" (UID: "3baca04d-be92-4f02-8ea9-94cc37fc00b4"). InnerVolumeSpecName "kube-api-access-xk28b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:33:29.854282 master-0 kubenswrapper[23041]: I0308 00:33:29.854172 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3baca04d-be92-4f02-8ea9-94cc37fc00b4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "3baca04d-be92-4f02-8ea9-94cc37fc00b4" (UID: "3baca04d-be92-4f02-8ea9-94cc37fc00b4"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:33:29.925280 master-0 kubenswrapper[23041]: W0308 00:33:29.925239 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c3dba51_1f0c_4cd0_8280_58b1a50bb0ac.slice/crio-ece44a4b47794be7785e88f5603d60806f3a9b959a0c0021450bc2009700cb87 WatchSource:0}: Error finding container ece44a4b47794be7785e88f5603d60806f3a9b959a0c0021450bc2009700cb87: Status 404 returned error can't find the container with id ece44a4b47794be7785e88f5603d60806f3a9b959a0c0021450bc2009700cb87
Mar 08 00:33:29.927883 master-0 kubenswrapper[23041]: I0308 00:33:29.927834 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c45bf598-vngbg"]
Mar 08 00:33:29.952545 master-0 kubenswrapper[23041]: I0308 00:33:29.952498 23041 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3baca04d-be92-4f02-8ea9-94cc37fc00b4-console-oauth-config\") on node \"master-0\" DevicePath \"\""
Mar 08 00:33:29.952545 master-0 kubenswrapper[23041]: I0308 00:33:29.952535 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xk28b\" (UniqueName: \"kubernetes.io/projected/3baca04d-be92-4f02-8ea9-94cc37fc00b4-kube-api-access-xk28b\") on node \"master-0\" DevicePath \"\""
Mar 08 00:33:29.952545 master-0 kubenswrapper[23041]: I0308 00:33:29.952544 23041 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3baca04d-be92-4f02-8ea9-94cc37fc00b4-console-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 08 00:33:29.952703 master-0 kubenswrapper[23041]: I0308 00:33:29.952555 23041 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3baca04d-be92-4f02-8ea9-94cc37fc00b4-service-ca\") on node \"master-0\" DevicePath \"\""
Mar 08 00:33:30.004501 master-0 kubenswrapper[23041]: I0308 00:33:30.004449 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cbd49d755-69bg7"]
Mar 08 00:33:30.256549 master-0 kubenswrapper[23041]: I0308 00:33:30.256462 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-76c777474b-n9mhf" podUID="136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca" containerName="console" containerID="cri-o://fb16edc75fe138e856a4f392208e9dde4e0eff1ea9fd011ed9da97c48fdc468f" gracePeriod=15
Mar 08 00:33:30.514688 master-0 kubenswrapper[23041]: I0308 00:33:30.514564 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cbd49d755-69bg7" event={"ID":"eb8b3618-a301-40d4-b617-f9b57afa555c","Type":"ContainerStarted","Data":"d709549f205961e65e86963a3bd90b93be995666b1e35da106cbfa364ab744be"}
Mar 08 00:33:30.520224 master-0 kubenswrapper[23041]: I0308 00:33:30.515705 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c84b9c874-8xl2l_3baca04d-be92-4f02-8ea9-94cc37fc00b4/console/0.log"
Mar 08 00:33:30.520224 master-0 kubenswrapper[23041]: I0308 00:33:30.515832 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c84b9c874-8xl2l"
Mar 08 00:33:30.520224 master-0 kubenswrapper[23041]: I0308 00:33:30.516320 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c84b9c874-8xl2l" event={"ID":"3baca04d-be92-4f02-8ea9-94cc37fc00b4","Type":"ContainerDied","Data":"7dedc3a693168997bbdf5b41d54540ad18fe948792d8dc4a9093bd38445a352b"}
Mar 08 00:33:30.520224 master-0 kubenswrapper[23041]: I0308 00:33:30.516374 23041 scope.go:117] "RemoveContainer" containerID="0a7b7a4bad54508072eb15ff1d869aaa4adb172a39c8eccea3f65180f4f8b0c7"
Mar 08 00:33:30.520224 master-0 kubenswrapper[23041]: I0308 00:33:30.517884 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76c777474b-n9mhf_136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca/console/0.log"
Mar 08 00:33:30.520224 master-0 kubenswrapper[23041]: I0308 00:33:30.517917 23041 generic.go:334] "Generic (PLEG): container finished" podID="136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca" containerID="fb16edc75fe138e856a4f392208e9dde4e0eff1ea9fd011ed9da97c48fdc468f" exitCode=2
Mar 08 00:33:30.520224 master-0 kubenswrapper[23041]: I0308 00:33:30.517936 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76c777474b-n9mhf" event={"ID":"136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca","Type":"ContainerDied","Data":"fb16edc75fe138e856a4f392208e9dde4e0eff1ea9fd011ed9da97c48fdc468f"}
Mar 08 00:33:30.520224 master-0 kubenswrapper[23041]: I0308 00:33:30.519789 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c45bf598-vngbg" event={"ID":"4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac","Type":"ContainerStarted","Data":"54cfef26a9a74f2e4d1e1e3bc7b1f428fedbd1ac36e2015bd2fca2afb1817c24"}
Mar 08 00:33:30.520224 master-0 kubenswrapper[23041]: I0308 00:33:30.519818 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c45bf598-vngbg" event={"ID":"4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac","Type":"ContainerStarted","Data":"ece44a4b47794be7785e88f5603d60806f3a9b959a0c0021450bc2009700cb87"}
Mar 08 00:33:30.551031 master-0 kubenswrapper[23041]: I0308 00:33:30.550889 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-c45bf598-vngbg" podStartSLOduration=1.550862196 podStartE2EDuration="1.550862196s" podCreationTimestamp="2026-03-08 00:33:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:33:30.540059164 +0000 UTC m=+116.012895718" watchObservedRunningTime="2026-03-08 00:33:30.550862196 +0000 UTC m=+116.023698750"
Mar 08 00:33:30.559335 master-0 kubenswrapper[23041]: I0308 00:33:30.557225 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c84b9c874-8xl2l"]
Mar 08 00:33:30.566995 master-0 kubenswrapper[23041]: I0308 00:33:30.566908 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5c84b9c874-8xl2l"]
Mar 08 00:33:30.770470 master-0 kubenswrapper[23041]: I0308 00:33:30.770309 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76c777474b-n9mhf_136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca/console/0.log"
Mar 08 00:33:30.771431 master-0 kubenswrapper[23041]: I0308 00:33:30.771409 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76c777474b-n9mhf"
Mar 08 00:33:30.819683 master-0 kubenswrapper[23041]: I0308 00:33:30.819603 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3baca04d-be92-4f02-8ea9-94cc37fc00b4" path="/var/lib/kubelet/pods/3baca04d-be92-4f02-8ea9-94cc37fc00b4/volumes"
Mar 08 00:33:30.880787 master-0 kubenswrapper[23041]: I0308 00:33:30.880719 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-console-oauth-config\") pod \"136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca\" (UID: \"136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca\") "
Mar 08 00:33:30.880787 master-0 kubenswrapper[23041]: I0308 00:33:30.880796 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-trusted-ca-bundle\") pod \"136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca\" (UID: \"136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca\") "
Mar 08 00:33:30.881161 master-0 kubenswrapper[23041]: I0308 00:33:30.880868 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-service-ca\") pod \"136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca\" (UID: \"136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca\") "
Mar 08 00:33:30.881161 master-0 kubenswrapper[23041]: I0308 00:33:30.880889 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-console-serving-cert\") pod \"136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca\" (UID: \"136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca\") "
Mar 08 00:33:30.881161 master-0 kubenswrapper[23041]: I0308 00:33:30.880950 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxzhf\" (UniqueName: \"kubernetes.io/projected/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-kube-api-access-hxzhf\") pod \"136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca\" (UID: \"136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca\") "
Mar 08 00:33:30.881161 master-0 kubenswrapper[23041]: I0308 00:33:30.881042 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-console-config\") pod \"136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca\" (UID: \"136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca\") "
Mar 08 00:33:30.881161 master-0 kubenswrapper[23041]: I0308 00:33:30.881115 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-oauth-serving-cert\") pod \"136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca\" (UID: \"136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca\") "
Mar 08 00:33:30.882483 master-0 kubenswrapper[23041]: I0308 00:33:30.882446 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-service-ca" (OuterVolumeSpecName: "service-ca") pod "136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca" (UID: "136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:33:30.882637 master-0 kubenswrapper[23041]: I0308 00:33:30.882568 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-console-config" (OuterVolumeSpecName: "console-config") pod "136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca" (UID: "136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:33:30.882764 master-0 kubenswrapper[23041]: I0308 00:33:30.882730 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca" (UID: "136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:33:30.883496 master-0 kubenswrapper[23041]: I0308 00:33:30.883445 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca" (UID: "136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:33:30.884845 master-0 kubenswrapper[23041]: I0308 00:33:30.884812 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-kube-api-access-hxzhf" (OuterVolumeSpecName: "kube-api-access-hxzhf") pod "136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca" (UID: "136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca"). InnerVolumeSpecName "kube-api-access-hxzhf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:33:30.885280 master-0 kubenswrapper[23041]: I0308 00:33:30.885227 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca" (UID: "136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:33:30.885625 master-0 kubenswrapper[23041]: I0308 00:33:30.885589 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca" (UID: "136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:33:30.983647 master-0 kubenswrapper[23041]: I0308 00:33:30.983497 23041 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-console-oauth-config\") on node \"master-0\" DevicePath \"\""
Mar 08 00:33:30.983647 master-0 kubenswrapper[23041]: I0308 00:33:30.983664 23041 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 00:33:30.983979 master-0 kubenswrapper[23041]: I0308 00:33:30.983677 23041 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-service-ca\") on node \"master-0\" DevicePath \"\""
Mar 08 00:33:30.983979 master-0 kubenswrapper[23041]: I0308 00:33:30.983689 23041 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-console-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 08 00:33:30.983979 master-0 kubenswrapper[23041]: I0308 00:33:30.983699 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxzhf\" (UniqueName: \"kubernetes.io/projected/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-kube-api-access-hxzhf\") on node \"master-0\" DevicePath \"\""
Mar 08 00:33:30.983979 master-0 kubenswrapper[23041]: I0308 00:33:30.983708 23041 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-console-config\") on node \"master-0\" DevicePath \"\""
Mar 08 00:33:30.983979 master-0 kubenswrapper[23041]: I0308 00:33:30.983720 23041 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca-oauth-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 08 00:33:31.532943 master-0 kubenswrapper[23041]: I0308 00:33:31.532006 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76c777474b-n9mhf_136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca/console/0.log"
Mar 08 00:33:31.532943 master-0 kubenswrapper[23041]: I0308 00:33:31.532125 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76c777474b-n9mhf"
Mar 08 00:33:31.532943 master-0 kubenswrapper[23041]: I0308 00:33:31.532159 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76c777474b-n9mhf" event={"ID":"136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca","Type":"ContainerDied","Data":"408aa627a4a08d14d63ec8aea7bfc7777355f299cda7fec6881e53324a2338e6"}
Mar 08 00:33:31.532943 master-0 kubenswrapper[23041]: I0308 00:33:31.532192 23041 scope.go:117] "RemoveContainer" containerID="fb16edc75fe138e856a4f392208e9dde4e0eff1ea9fd011ed9da97c48fdc468f"
Mar 08 00:33:31.590397 master-0 kubenswrapper[23041]: I0308 00:33:31.590323 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76c777474b-n9mhf"]
Mar 08 00:33:31.592811 master-0 kubenswrapper[23041]: I0308 00:33:31.592731 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-76c777474b-n9mhf"]
Mar 08 00:33:32.557581 master-0 kubenswrapper[23041]: I0308 00:33:32.557475 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cbd49d755-69bg7" event={"ID":"eb8b3618-a301-40d4-b617-f9b57afa555c","Type":"ContainerStarted","Data":"f8fd8fe8a52578b5b1dfcf6b049dc8efbac90eeb53b1da299eb07c04f49a4817"}
Mar 08 00:33:32.580636 master-0 kubenswrapper[23041]: I0308 00:33:32.578319 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-5cbd49d755-69bg7" podStartSLOduration=3.153606792 podStartE2EDuration="4.57829699s" podCreationTimestamp="2026-03-08 00:33:28 +0000 UTC" firstStartedPulling="2026-03-08 00:33:30.028123586 +0000 UTC m=+115.500960150" lastFinishedPulling="2026-03-08 00:33:31.452813794 +0000 UTC m=+116.925650348" observedRunningTime="2026-03-08 00:33:32.575865829 +0000 UTC m=+118.048702473" watchObservedRunningTime="2026-03-08 00:33:32.57829699 +0000 UTC m=+118.051133544"
Mar 08 00:33:32.819433 master-0 kubenswrapper[23041]: I0308 00:33:32.819284 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca" path="/var/lib/kubelet/pods/136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca/volumes"
Mar 08 00:33:33.319681 master-0 kubenswrapper[23041]: I0308 00:33:33.319613 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-6787d8db86-xxqwp" podUID="d31841e6-f09b-46b4-ac72-adf67f6a5327" containerName="console" containerID="cri-o://1d0f00ce9a3921bbc8a2035897caa7360c33b6abb03b11c332493409919679d1" gracePeriod=15
Mar 08 00:33:33.582647 master-0 kubenswrapper[23041]: I0308 00:33:33.582489 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6787d8db86-xxqwp_d31841e6-f09b-46b4-ac72-adf67f6a5327/console/0.log"
Mar 08 00:33:33.582647 master-0 kubenswrapper[23041]: I0308 00:33:33.582564 23041 generic.go:334] "Generic (PLEG): container finished" podID="d31841e6-f09b-46b4-ac72-adf67f6a5327" containerID="1d0f00ce9a3921bbc8a2035897caa7360c33b6abb03b11c332493409919679d1" exitCode=2
Mar 08 00:33:33.583945 master-0 kubenswrapper[23041]: I0308 00:33:33.582674 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6787d8db86-xxqwp" event={"ID":"d31841e6-f09b-46b4-ac72-adf67f6a5327","Type":"ContainerDied","Data":"1d0f00ce9a3921bbc8a2035897caa7360c33b6abb03b11c332493409919679d1"}
Mar 08 00:33:33.813076 master-0 kubenswrapper[23041]: I0308 00:33:33.813015 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6787d8db86-xxqwp_d31841e6-f09b-46b4-ac72-adf67f6a5327/console/0.log"
Mar 08 00:33:33.813298 master-0 kubenswrapper[23041]: I0308 00:33:33.813088 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6787d8db86-xxqwp"
Mar 08 00:33:33.948513 master-0 kubenswrapper[23041]: I0308 00:33:33.948321 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d31841e6-f09b-46b4-ac72-adf67f6a5327-console-oauth-config\") pod \"d31841e6-f09b-46b4-ac72-adf67f6a5327\" (UID: \"d31841e6-f09b-46b4-ac72-adf67f6a5327\") "
Mar 08 00:33:33.948513 master-0 kubenswrapper[23041]: I0308 00:33:33.948412 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d31841e6-f09b-46b4-ac72-adf67f6a5327-console-config\") pod \"d31841e6-f09b-46b4-ac72-adf67f6a5327\" (UID: \"d31841e6-f09b-46b4-ac72-adf67f6a5327\") "
Mar 08 00:33:33.948513 master-0 kubenswrapper[23041]: I0308 00:33:33.948482 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhwr2\" (UniqueName: \"kubernetes.io/projected/d31841e6-f09b-46b4-ac72-adf67f6a5327-kube-api-access-hhwr2\") pod \"d31841e6-f09b-46b4-ac72-adf67f6a5327\" (UID: \"d31841e6-f09b-46b4-ac72-adf67f6a5327\") "
Mar 08 00:33:33.948513 master-0 kubenswrapper[23041]: I0308 00:33:33.948511 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d31841e6-f09b-46b4-ac72-adf67f6a5327-oauth-serving-cert\") pod \"d31841e6-f09b-46b4-ac72-adf67f6a5327\" (UID: \"d31841e6-f09b-46b4-ac72-adf67f6a5327\") "
Mar 08 00:33:33.949138 master-0 kubenswrapper[23041]: I0308 00:33:33.948564 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d31841e6-f09b-46b4-ac72-adf67f6a5327-trusted-ca-bundle\") pod \"d31841e6-f09b-46b4-ac72-adf67f6a5327\" (UID: \"d31841e6-f09b-46b4-ac72-adf67f6a5327\") "
Mar 08 00:33:33.949138 master-0 kubenswrapper[23041]: I0308 00:33:33.948970 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d31841e6-f09b-46b4-ac72-adf67f6a5327-service-ca\") pod \"d31841e6-f09b-46b4-ac72-adf67f6a5327\" (UID: \"d31841e6-f09b-46b4-ac72-adf67f6a5327\") "
Mar 08 00:33:33.949138 master-0 kubenswrapper[23041]: I0308 00:33:33.949130 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d31841e6-f09b-46b4-ac72-adf67f6a5327-console-config" (OuterVolumeSpecName: "console-config") pod "d31841e6-f09b-46b4-ac72-adf67f6a5327" (UID: "d31841e6-f09b-46b4-ac72-adf67f6a5327"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:33:33.949376 master-0 kubenswrapper[23041]: I0308 00:33:33.949189 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d31841e6-f09b-46b4-ac72-adf67f6a5327-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d31841e6-f09b-46b4-ac72-adf67f6a5327" (UID: "d31841e6-f09b-46b4-ac72-adf67f6a5327"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:33:33.949376 master-0 kubenswrapper[23041]: I0308 00:33:33.949271 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d31841e6-f09b-46b4-ac72-adf67f6a5327-console-serving-cert\") pod \"d31841e6-f09b-46b4-ac72-adf67f6a5327\" (UID: \"d31841e6-f09b-46b4-ac72-adf67f6a5327\") "
Mar 08 00:33:33.949639 master-0 kubenswrapper[23041]: I0308 00:33:33.949608 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d31841e6-f09b-46b4-ac72-adf67f6a5327-service-ca" (OuterVolumeSpecName: "service-ca") pod "d31841e6-f09b-46b4-ac72-adf67f6a5327" (UID: "d31841e6-f09b-46b4-ac72-adf67f6a5327"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:33:33.949720 master-0 kubenswrapper[23041]: I0308 00:33:33.949626 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d31841e6-f09b-46b4-ac72-adf67f6a5327-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d31841e6-f09b-46b4-ac72-adf67f6a5327" (UID: "d31841e6-f09b-46b4-ac72-adf67f6a5327"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:33:33.949851 master-0 kubenswrapper[23041]: I0308 00:33:33.949828 23041 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d31841e6-f09b-46b4-ac72-adf67f6a5327-service-ca\") on node \"master-0\" DevicePath \"\""
Mar 08 00:33:33.949851 master-0 kubenswrapper[23041]: I0308 00:33:33.949846 23041 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d31841e6-f09b-46b4-ac72-adf67f6a5327-console-config\") on node \"master-0\" DevicePath \"\""
Mar 08 00:33:33.949982 master-0 kubenswrapper[23041]: I0308 00:33:33.949859 23041 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d31841e6-f09b-46b4-ac72-adf67f6a5327-oauth-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 08 00:33:33.949982 master-0 kubenswrapper[23041]: I0308 00:33:33.949872 23041 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d31841e6-f09b-46b4-ac72-adf67f6a5327-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 00:33:33.951553 master-0 kubenswrapper[23041]: I0308 00:33:33.951484 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d31841e6-f09b-46b4-ac72-adf67f6a5327-kube-api-access-hhwr2" (OuterVolumeSpecName: "kube-api-access-hhwr2") pod "d31841e6-f09b-46b4-ac72-adf67f6a5327" (UID: "d31841e6-f09b-46b4-ac72-adf67f6a5327"). InnerVolumeSpecName "kube-api-access-hhwr2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:33:33.952299 master-0 kubenswrapper[23041]: I0308 00:33:33.952196 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d31841e6-f09b-46b4-ac72-adf67f6a5327-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d31841e6-f09b-46b4-ac72-adf67f6a5327" (UID: "d31841e6-f09b-46b4-ac72-adf67f6a5327"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:33:33.953038 master-0 kubenswrapper[23041]: I0308 00:33:33.952984 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d31841e6-f09b-46b4-ac72-adf67f6a5327-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d31841e6-f09b-46b4-ac72-adf67f6a5327" (UID: "d31841e6-f09b-46b4-ac72-adf67f6a5327"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:33:34.051854 master-0 kubenswrapper[23041]: I0308 00:33:34.051758 23041 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d31841e6-f09b-46b4-ac72-adf67f6a5327-console-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 08 00:33:34.051854 master-0 kubenswrapper[23041]: I0308 00:33:34.051801 23041 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d31841e6-f09b-46b4-ac72-adf67f6a5327-console-oauth-config\") on node \"master-0\" DevicePath \"\""
Mar 08 00:33:34.051854 master-0 kubenswrapper[23041]: I0308 00:33:34.051811 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhwr2\" (UniqueName: \"kubernetes.io/projected/d31841e6-f09b-46b4-ac72-adf67f6a5327-kube-api-access-hhwr2\") on node \"master-0\" DevicePath \"\""
Mar 08 00:33:34.594948 master-0 kubenswrapper[23041]: I0308 00:33:34.594851 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6787d8db86-xxqwp_d31841e6-f09b-46b4-ac72-adf67f6a5327/console/0.log"
Mar 08 00:33:34.595959 master-0 kubenswrapper[23041]: I0308 00:33:34.594966 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6787d8db86-xxqwp" event={"ID":"d31841e6-f09b-46b4-ac72-adf67f6a5327","Type":"ContainerDied","Data":"a575d244c9acd5de4c26e975ba96ca89f30e4ed51c7be4043f93a0207f87ee94"}
Mar 08 00:33:34.595959 master-0 kubenswrapper[23041]: I0308 00:33:34.595047 23041 scope.go:117] "RemoveContainer" containerID="1d0f00ce9a3921bbc8a2035897caa7360c33b6abb03b11c332493409919679d1"
Mar 08 00:33:34.595959 master-0 kubenswrapper[23041]: I0308 00:33:34.595084 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6787d8db86-xxqwp"
Mar 08 00:33:34.647526 master-0 kubenswrapper[23041]: I0308 00:33:34.647438 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6787d8db86-xxqwp"]
Mar 08 00:33:34.655738 master-0 kubenswrapper[23041]: I0308 00:33:34.655672 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6787d8db86-xxqwp"]
Mar 08 00:33:34.832393 master-0 kubenswrapper[23041]: I0308 00:33:34.832269 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d31841e6-f09b-46b4-ac72-adf67f6a5327" path="/var/lib/kubelet/pods/d31841e6-f09b-46b4-ac72-adf67f6a5327/volumes"
Mar 08 00:33:37.172605 master-0 kubenswrapper[23041]: I0308 00:33:37.172514 23041 patch_prober.go:28] interesting pod/console-6479f6d896-j6kqz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body=
Mar 08 00:33:37.173421 master-0 kubenswrapper[23041]: I0308 00:33:37.172612 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused"
Mar 08 00:33:39.434742 master-0 kubenswrapper[23041]: I0308 00:33:39.434643 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-c45bf598-vngbg"
Mar 08 00:33:39.434742 master-0 kubenswrapper[23041]: I0308 00:33:39.434706 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-c45bf598-vngbg"
Mar 08 00:33:39.436140 master-0 kubenswrapper[23041]: I0308 00:33:39.436084 23041 patch_prober.go:28] interesting pod/console-c45bf598-vngbg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body=
Mar 08 00:33:39.436301 master-0 kubenswrapper[23041]: I0308 00:33:39.436147 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused"
Mar 08 00:33:40.729286 master-0 kubenswrapper[23041]: I0308 00:33:40.729216 23041 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-etcd/etcd-master-0"]
Mar 08 00:33:40.729788 master-0 kubenswrapper[23041]: I0308 00:33:40.729569 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcdctl" containerID="cri-o://d8889d6936248c826e33628006d790b900bbbcacc9529b4c35a79aa987893d39" gracePeriod=30
Mar 08 00:33:40.729788 master-0 kubenswrapper[23041]: I0308 00:33:40.729592 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-rev" containerID="cri-o://787fa634ee36f327997b592447e9aadba40183c4e7e4d25f5519ae9957121e6e" gracePeriod=30
Mar 08 00:33:40.729788 master-0 kubenswrapper[23041]: I0308 00:33:40.729619 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-metrics" containerID="cri-o://0e06c006df1e1e63e0f6188a23b5e393fde4aa4984ad610de00e8c675da914c7" gracePeriod=30
Mar 08 00:33:40.729894 master-0 kubenswrapper[23041]: I0308 00:33:40.729733 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd" containerID="cri-o://ea5ec65ba12dfaaa4f58b3b64547a3d98d2937c3aa58a7bc6dc14040003a38a9" gracePeriod=30
Mar 08 00:33:40.729956 master-0 kubenswrapper[23041]: I0308 00:33:40.729651 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-readyz" containerID="cri-o://4262f462df3c892c070c1769f302b6c7878bc5f82d5342928245d488b3431f6d" gracePeriod=30
Mar 08 00:33:40.732120 master-0 kubenswrapper[23041]: I0308 00:33:40.732073 23041 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-master-0"]
Mar 08 00:33:40.732580 master-0 kubenswrapper[23041]: E0308 00:33:40.732544 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-resources-copy"
Mar 08 00:33:40.732580 master-0 kubenswrapper[23041]: I0308 00:33:40.732574 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-resources-copy"
Mar 08 00:33:40.732671 master-0 kubenswrapper[23041]: E0308 00:33:40.732603 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-readyz"
Mar 08 00:33:40.732671 master-0 kubenswrapper[23041]: I0308 00:33:40.732616 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-readyz"
Mar 08 00:33:40.732671 master-0 kubenswrapper[23041]: E0308 00:33:40.732635 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-metrics"
Mar 08 00:33:40.732671 master-0 kubenswrapper[23041]: I0308 00:33:40.732648 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-metrics"
Mar 08 00:33:40.732671 master-0 kubenswrapper[23041]: E0308 00:33:40.732671 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd"
Mar 08 00:33:40.732843 master-0 kubenswrapper[23041]: I0308 00:33:40.732686 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd"
Mar 08 00:33:40.732843 master-0 kubenswrapper[23041]: E0308 00:33:40.732713 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcdctl"
Mar 08 00:33:40.732843 master-0 kubenswrapper[23041]: I0308 00:33:40.732725 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcdctl"
Mar 08 00:33:40.732843 master-0 kubenswrapper[23041]: E0308 00:33:40.732743 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-ensure-env-vars"
Mar 08 00:33:40.732843 master-0 kubenswrapper[23041]: I0308 00:33:40.732755 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-ensure-env-vars"
Mar 08 00:33:40.732843 master-0 kubenswrapper[23041]: E0308 00:33:40.732779 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca" containerName="console"
Mar 08 00:33:40.732843 master-0 kubenswrapper[23041]: I0308 00:33:40.732791 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca" containerName="console"
Mar 08 00:33:40.732843 master-0 kubenswrapper[23041]: E0308 00:33:40.732810 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="setup"
Mar 08 00:33:40.732843 master-0 kubenswrapper[23041]: I0308 00:33:40.732824 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="setup"
Mar 08 00:33:40.732843 master-0 kubenswrapper[23041]: E0308 00:33:40.732848 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-rev"
Mar 08 00:33:40.733124 master-0 kubenswrapper[23041]: I0308 00:33:40.732861 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-rev"
Mar 08 00:33:40.733124 master-0 kubenswrapper[23041]: E0308 00:33:40.732879 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d31841e6-f09b-46b4-ac72-adf67f6a5327" containerName="console"
Mar 08 00:33:40.733124 master-0 kubenswrapper[23041]: I0308 00:33:40.732892 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="d31841e6-f09b-46b4-ac72-adf67f6a5327" containerName="console"
Mar 08 00:33:40.733124 master-0 kubenswrapper[23041]: E0308 00:33:40.732914 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3baca04d-be92-4f02-8ea9-94cc37fc00b4" containerName="console"
Mar 08 00:33:40.733124 master-0 kubenswrapper[23041]: I0308 00:33:40.732926 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="3baca04d-be92-4f02-8ea9-94cc37fc00b4" containerName="console"
Mar 
08 00:33:40.733278 master-0 kubenswrapper[23041]: I0308 00:33:40.733162 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-metrics" Mar 08 00:33:40.733278 master-0 kubenswrapper[23041]: I0308 00:33:40.733243 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="136f707b-d0a2-41cd-a0d9-bc5ecdbac4ca" containerName="console" Mar 08 00:33:40.733338 master-0 kubenswrapper[23041]: I0308 00:33:40.733295 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="3baca04d-be92-4f02-8ea9-94cc37fc00b4" containerName="console" Mar 08 00:33:40.733338 master-0 kubenswrapper[23041]: I0308 00:33:40.733321 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-ensure-env-vars" Mar 08 00:33:40.733398 master-0 kubenswrapper[23041]: I0308 00:33:40.733344 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="setup" Mar 08 00:33:40.733398 master-0 kubenswrapper[23041]: I0308 00:33:40.733360 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="d31841e6-f09b-46b4-ac72-adf67f6a5327" containerName="console" Mar 08 00:33:40.733398 master-0 kubenswrapper[23041]: I0308 00:33:40.733380 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcdctl" Mar 08 00:33:40.733481 master-0 kubenswrapper[23041]: I0308 00:33:40.733397 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd" Mar 08 00:33:40.733481 master-0 kubenswrapper[23041]: I0308 00:33:40.733417 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-rev" Mar 08 00:33:40.733481 master-0 kubenswrapper[23041]: I0308 00:33:40.733433 23041 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-readyz" Mar 08 00:33:40.733481 master-0 kubenswrapper[23041]: I0308 00:33:40.733455 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-resources-copy" Mar 08 00:33:40.876578 master-0 kubenswrapper[23041]: I0308 00:33:40.876473 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-static-pod-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:33:40.876578 master-0 kubenswrapper[23041]: I0308 00:33:40.876528 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-resource-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:33:40.876578 master-0 kubenswrapper[23041]: I0308 00:33:40.876574 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-cert-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:33:40.876829 master-0 kubenswrapper[23041]: I0308 00:33:40.876637 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-usr-local-bin\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:33:40.876829 master-0 kubenswrapper[23041]: I0308 00:33:40.876657 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-log-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:33:40.877069 master-0 kubenswrapper[23041]: I0308 00:33:40.876974 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-data-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:33:40.978712 master-0 kubenswrapper[23041]: I0308 00:33:40.978590 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-usr-local-bin\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:33:40.978712 master-0 kubenswrapper[23041]: I0308 00:33:40.978711 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-log-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:33:40.979071 master-0 kubenswrapper[23041]: I0308 00:33:40.978727 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-usr-local-bin\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:33:40.979071 master-0 kubenswrapper[23041]: I0308 00:33:40.978818 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-log-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " 
pod="openshift-etcd/etcd-master-0" Mar 08 00:33:40.979071 master-0 kubenswrapper[23041]: I0308 00:33:40.978901 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-data-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:33:40.979071 master-0 kubenswrapper[23041]: I0308 00:33:40.978928 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-data-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:33:40.979071 master-0 kubenswrapper[23041]: I0308 00:33:40.979011 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-static-pod-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:33:40.979071 master-0 kubenswrapper[23041]: I0308 00:33:40.979036 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-static-pod-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:33:40.979557 master-0 kubenswrapper[23041]: I0308 00:33:40.979148 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-resource-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:33:40.979557 master-0 kubenswrapper[23041]: I0308 00:33:40.979346 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cert-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-cert-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:33:40.979557 master-0 kubenswrapper[23041]: I0308 00:33:40.979230 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-resource-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:33:40.979557 master-0 kubenswrapper[23041]: I0308 00:33:40.979472 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-cert-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 00:33:41.653067 master-0 kubenswrapper[23041]: I0308 00:33:41.652948 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-rev/0.log" Mar 08 00:33:41.654602 master-0 kubenswrapper[23041]: I0308 00:33:41.654556 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-metrics/0.log" Mar 08 00:33:41.656443 master-0 kubenswrapper[23041]: I0308 00:33:41.656392 23041 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="787fa634ee36f327997b592447e9aadba40183c4e7e4d25f5519ae9957121e6e" exitCode=2 Mar 08 00:33:41.656443 master-0 kubenswrapper[23041]: I0308 00:33:41.656435 23041 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="4262f462df3c892c070c1769f302b6c7878bc5f82d5342928245d488b3431f6d" exitCode=0 Mar 08 00:33:41.656576 master-0 kubenswrapper[23041]: I0308 00:33:41.656450 23041 generic.go:334] "Generic (PLEG): 
container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="0e06c006df1e1e63e0f6188a23b5e393fde4aa4984ad610de00e8c675da914c7" exitCode=2 Mar 08 00:33:47.173317 master-0 kubenswrapper[23041]: I0308 00:33:47.173057 23041 patch_prober.go:28] interesting pod/console-6479f6d896-j6kqz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body= Mar 08 00:33:47.173317 master-0 kubenswrapper[23041]: I0308 00:33:47.173182 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Mar 08 00:33:47.233068 master-0 kubenswrapper[23041]: I0308 00:33:47.232992 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-4-master-0_74512190-22e4-4648-8d1e-e487de48a124/installer/0.log" Mar 08 00:33:47.233357 master-0 kubenswrapper[23041]: I0308 00:33:47.233107 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Mar 08 00:33:47.286418 master-0 kubenswrapper[23041]: I0308 00:33:47.286352 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/74512190-22e4-4648-8d1e-e487de48a124-kubelet-dir\") pod \"74512190-22e4-4648-8d1e-e487de48a124\" (UID: \"74512190-22e4-4648-8d1e-e487de48a124\") " Mar 08 00:33:47.286736 master-0 kubenswrapper[23041]: I0308 00:33:47.286487 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/74512190-22e4-4648-8d1e-e487de48a124-var-lock\") pod \"74512190-22e4-4648-8d1e-e487de48a124\" (UID: \"74512190-22e4-4648-8d1e-e487de48a124\") " Mar 08 00:33:47.286736 master-0 kubenswrapper[23041]: I0308 00:33:47.286513 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74512190-22e4-4648-8d1e-e487de48a124-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "74512190-22e4-4648-8d1e-e487de48a124" (UID: "74512190-22e4-4648-8d1e-e487de48a124"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:33:47.286736 master-0 kubenswrapper[23041]: I0308 00:33:47.286562 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74512190-22e4-4648-8d1e-e487de48a124-kube-api-access\") pod \"74512190-22e4-4648-8d1e-e487de48a124\" (UID: \"74512190-22e4-4648-8d1e-e487de48a124\") " Mar 08 00:33:47.286736 master-0 kubenswrapper[23041]: I0308 00:33:47.286641 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74512190-22e4-4648-8d1e-e487de48a124-var-lock" (OuterVolumeSpecName: "var-lock") pod "74512190-22e4-4648-8d1e-e487de48a124" (UID: "74512190-22e4-4648-8d1e-e487de48a124"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:33:47.286971 master-0 kubenswrapper[23041]: I0308 00:33:47.286907 23041 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/74512190-22e4-4648-8d1e-e487de48a124-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 00:33:47.286971 master-0 kubenswrapper[23041]: I0308 00:33:47.286922 23041 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/74512190-22e4-4648-8d1e-e487de48a124-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 00:33:47.289415 master-0 kubenswrapper[23041]: I0308 00:33:47.289362 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74512190-22e4-4648-8d1e-e487de48a124-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "74512190-22e4-4648-8d1e-e487de48a124" (UID: "74512190-22e4-4648-8d1e-e487de48a124"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:33:47.388238 master-0 kubenswrapper[23041]: I0308 00:33:47.388137 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74512190-22e4-4648-8d1e-e487de48a124-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 00:33:47.703870 master-0 kubenswrapper[23041]: I0308 00:33:47.703831 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-4-master-0_74512190-22e4-4648-8d1e-e487de48a124/installer/0.log" Mar 08 00:33:47.704089 master-0 kubenswrapper[23041]: I0308 00:33:47.703887 23041 generic.go:334] "Generic (PLEG): container finished" podID="74512190-22e4-4648-8d1e-e487de48a124" containerID="1283d7a0be2d499bd67523064c8053d4801413330027591b6177317990231794" exitCode=1 Mar 08 00:33:47.704089 master-0 kubenswrapper[23041]: I0308 00:33:47.703919 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"74512190-22e4-4648-8d1e-e487de48a124","Type":"ContainerDied","Data":"1283d7a0be2d499bd67523064c8053d4801413330027591b6177317990231794"} Mar 08 00:33:47.704089 master-0 kubenswrapper[23041]: I0308 00:33:47.703948 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"74512190-22e4-4648-8d1e-e487de48a124","Type":"ContainerDied","Data":"01f93899a9cab6aa4c9b39d2d34505f41a3d828df7d28f0bfd223bbff7cde117"} Mar 08 00:33:47.704089 master-0 kubenswrapper[23041]: I0308 00:33:47.703965 23041 scope.go:117] "RemoveContainer" containerID="1283d7a0be2d499bd67523064c8053d4801413330027591b6177317990231794" Mar 08 00:33:47.704089 master-0 kubenswrapper[23041]: I0308 00:33:47.704029 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Mar 08 00:33:47.721127 master-0 kubenswrapper[23041]: I0308 00:33:47.721090 23041 scope.go:117] "RemoveContainer" containerID="1283d7a0be2d499bd67523064c8053d4801413330027591b6177317990231794" Mar 08 00:33:47.721510 master-0 kubenswrapper[23041]: E0308 00:33:47.721472 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1283d7a0be2d499bd67523064c8053d4801413330027591b6177317990231794\": container with ID starting with 1283d7a0be2d499bd67523064c8053d4801413330027591b6177317990231794 not found: ID does not exist" containerID="1283d7a0be2d499bd67523064c8053d4801413330027591b6177317990231794" Mar 08 00:33:47.721562 master-0 kubenswrapper[23041]: I0308 00:33:47.721512 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1283d7a0be2d499bd67523064c8053d4801413330027591b6177317990231794"} err="failed to get container status \"1283d7a0be2d499bd67523064c8053d4801413330027591b6177317990231794\": rpc error: code = NotFound desc = could not find container \"1283d7a0be2d499bd67523064c8053d4801413330027591b6177317990231794\": container with ID starting with 1283d7a0be2d499bd67523064c8053d4801413330027591b6177317990231794 not found: ID does not exist" Mar 08 00:33:49.434114 master-0 kubenswrapper[23041]: I0308 00:33:49.434042 23041 patch_prober.go:28] interesting pod/console-c45bf598-vngbg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body= Mar 08 00:33:49.434685 master-0 kubenswrapper[23041]: I0308 00:33:49.434126 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" probeResult="failure" output="Get 
\"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" Mar 08 00:33:54.096270 master-0 kubenswrapper[23041]: I0308 00:33:54.096054 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-6dc96f5b89-ctlsc" podUID="24264c1b-97df-4311-b7af-b205ac879381" containerName="console" containerID="cri-o://c0fa536b5011c760aff3b8c2d8aa8ffc6ebb03ceddd34f83695314b46a416ae6" gracePeriod=15 Mar 08 00:33:54.315043 master-0 kubenswrapper[23041]: E0308 00:33:54.314881 23041 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:33:44Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:33:44Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:33:44Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:33:44Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 00:33:54.760545 master-0 kubenswrapper[23041]: I0308 00:33:54.760480 23041 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-6dc96f5b89-ctlsc_24264c1b-97df-4311-b7af-b205ac879381/console/0.log" Mar 08 00:33:54.760545 master-0 kubenswrapper[23041]: I0308 00:33:54.760560 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6dc96f5b89-ctlsc" Mar 08 00:33:54.775326 master-0 kubenswrapper[23041]: I0308 00:33:54.775239 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6dc96f5b89-ctlsc_24264c1b-97df-4311-b7af-b205ac879381/console/0.log" Mar 08 00:33:54.775326 master-0 kubenswrapper[23041]: I0308 00:33:54.775312 23041 generic.go:334] "Generic (PLEG): container finished" podID="24264c1b-97df-4311-b7af-b205ac879381" containerID="c0fa536b5011c760aff3b8c2d8aa8ffc6ebb03ceddd34f83695314b46a416ae6" exitCode=2 Mar 08 00:33:54.775719 master-0 kubenswrapper[23041]: I0308 00:33:54.775357 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dc96f5b89-ctlsc" event={"ID":"24264c1b-97df-4311-b7af-b205ac879381","Type":"ContainerDied","Data":"c0fa536b5011c760aff3b8c2d8aa8ffc6ebb03ceddd34f83695314b46a416ae6"} Mar 08 00:33:54.775719 master-0 kubenswrapper[23041]: I0308 00:33:54.775394 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dc96f5b89-ctlsc" event={"ID":"24264c1b-97df-4311-b7af-b205ac879381","Type":"ContainerDied","Data":"7733ee7e1853723b50e6187da3137dfb190fcd44e2f676f7946fc9cd120c68b0"} Mar 08 00:33:54.775719 master-0 kubenswrapper[23041]: I0308 00:33:54.775432 23041 scope.go:117] "RemoveContainer" containerID="c0fa536b5011c760aff3b8c2d8aa8ffc6ebb03ceddd34f83695314b46a416ae6" Mar 08 00:33:54.775719 master-0 kubenswrapper[23041]: I0308 00:33:54.775593 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6dc96f5b89-ctlsc" Mar 08 00:33:54.801795 master-0 kubenswrapper[23041]: I0308 00:33:54.801652 23041 scope.go:117] "RemoveContainer" containerID="c0fa536b5011c760aff3b8c2d8aa8ffc6ebb03ceddd34f83695314b46a416ae6" Mar 08 00:33:54.803375 master-0 kubenswrapper[23041]: E0308 00:33:54.802789 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0fa536b5011c760aff3b8c2d8aa8ffc6ebb03ceddd34f83695314b46a416ae6\": container with ID starting with c0fa536b5011c760aff3b8c2d8aa8ffc6ebb03ceddd34f83695314b46a416ae6 not found: ID does not exist" containerID="c0fa536b5011c760aff3b8c2d8aa8ffc6ebb03ceddd34f83695314b46a416ae6" Mar 08 00:33:54.803375 master-0 kubenswrapper[23041]: I0308 00:33:54.802894 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0fa536b5011c760aff3b8c2d8aa8ffc6ebb03ceddd34f83695314b46a416ae6"} err="failed to get container status \"c0fa536b5011c760aff3b8c2d8aa8ffc6ebb03ceddd34f83695314b46a416ae6\": rpc error: code = NotFound desc = could not find container \"c0fa536b5011c760aff3b8c2d8aa8ffc6ebb03ceddd34f83695314b46a416ae6\": container with ID starting with c0fa536b5011c760aff3b8c2d8aa8ffc6ebb03ceddd34f83695314b46a416ae6 not found: ID does not exist" Mar 08 00:33:54.877544 master-0 kubenswrapper[23041]: I0308 00:33:54.877469 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/24264c1b-97df-4311-b7af-b205ac879381-oauth-serving-cert\") pod \"24264c1b-97df-4311-b7af-b205ac879381\" (UID: \"24264c1b-97df-4311-b7af-b205ac879381\") " Mar 08 00:33:54.877757 master-0 kubenswrapper[23041]: I0308 00:33:54.877641 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmdwt\" (UniqueName: 
\"kubernetes.io/projected/24264c1b-97df-4311-b7af-b205ac879381-kube-api-access-wmdwt\") pod \"24264c1b-97df-4311-b7af-b205ac879381\" (UID: \"24264c1b-97df-4311-b7af-b205ac879381\") " Mar 08 00:33:54.877930 master-0 kubenswrapper[23041]: I0308 00:33:54.877884 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24264c1b-97df-4311-b7af-b205ac879381-trusted-ca-bundle\") pod \"24264c1b-97df-4311-b7af-b205ac879381\" (UID: \"24264c1b-97df-4311-b7af-b205ac879381\") " Mar 08 00:33:54.878033 master-0 kubenswrapper[23041]: I0308 00:33:54.877939 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/24264c1b-97df-4311-b7af-b205ac879381-console-config\") pod \"24264c1b-97df-4311-b7af-b205ac879381\" (UID: \"24264c1b-97df-4311-b7af-b205ac879381\") " Mar 08 00:33:54.878788 master-0 kubenswrapper[23041]: I0308 00:33:54.878495 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24264c1b-97df-4311-b7af-b205ac879381-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "24264c1b-97df-4311-b7af-b205ac879381" (UID: "24264c1b-97df-4311-b7af-b205ac879381"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:33:54.879054 master-0 kubenswrapper[23041]: I0308 00:33:54.878966 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24264c1b-97df-4311-b7af-b205ac879381-console-config" (OuterVolumeSpecName: "console-config") pod "24264c1b-97df-4311-b7af-b205ac879381" (UID: "24264c1b-97df-4311-b7af-b205ac879381"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:33:54.879054 master-0 kubenswrapper[23041]: I0308 00:33:54.878996 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24264c1b-97df-4311-b7af-b205ac879381-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "24264c1b-97df-4311-b7af-b205ac879381" (UID: "24264c1b-97df-4311-b7af-b205ac879381"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:33:54.879419 master-0 kubenswrapper[23041]: I0308 00:33:54.879128 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/24264c1b-97df-4311-b7af-b205ac879381-service-ca\") pod \"24264c1b-97df-4311-b7af-b205ac879381\" (UID: \"24264c1b-97df-4311-b7af-b205ac879381\") " Mar 08 00:33:54.879419 master-0 kubenswrapper[23041]: I0308 00:33:54.879236 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/24264c1b-97df-4311-b7af-b205ac879381-console-serving-cert\") pod \"24264c1b-97df-4311-b7af-b205ac879381\" (UID: \"24264c1b-97df-4311-b7af-b205ac879381\") " Mar 08 00:33:54.880092 master-0 kubenswrapper[23041]: I0308 00:33:54.880025 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24264c1b-97df-4311-b7af-b205ac879381-service-ca" (OuterVolumeSpecName: "service-ca") pod "24264c1b-97df-4311-b7af-b205ac879381" (UID: "24264c1b-97df-4311-b7af-b205ac879381"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:33:54.880424 master-0 kubenswrapper[23041]: I0308 00:33:54.880359 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/24264c1b-97df-4311-b7af-b205ac879381-console-oauth-config\") pod \"24264c1b-97df-4311-b7af-b205ac879381\" (UID: \"24264c1b-97df-4311-b7af-b205ac879381\") " Mar 08 00:33:54.881655 master-0 kubenswrapper[23041]: I0308 00:33:54.881589 23041 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/24264c1b-97df-4311-b7af-b205ac879381-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 00:33:54.881655 master-0 kubenswrapper[23041]: I0308 00:33:54.881631 23041 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24264c1b-97df-4311-b7af-b205ac879381-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 00:33:54.881655 master-0 kubenswrapper[23041]: I0308 00:33:54.881654 23041 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/24264c1b-97df-4311-b7af-b205ac879381-console-config\") on node \"master-0\" DevicePath \"\"" Mar 08 00:33:54.881969 master-0 kubenswrapper[23041]: I0308 00:33:54.881674 23041 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/24264c1b-97df-4311-b7af-b205ac879381-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 00:33:54.883409 master-0 kubenswrapper[23041]: I0308 00:33:54.883329 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24264c1b-97df-4311-b7af-b205ac879381-kube-api-access-wmdwt" (OuterVolumeSpecName: "kube-api-access-wmdwt") pod "24264c1b-97df-4311-b7af-b205ac879381" (UID: "24264c1b-97df-4311-b7af-b205ac879381"). 
InnerVolumeSpecName "kube-api-access-wmdwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:33:54.884527 master-0 kubenswrapper[23041]: I0308 00:33:54.884442 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24264c1b-97df-4311-b7af-b205ac879381-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "24264c1b-97df-4311-b7af-b205ac879381" (UID: "24264c1b-97df-4311-b7af-b205ac879381"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:33:54.887727 master-0 kubenswrapper[23041]: I0308 00:33:54.887664 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24264c1b-97df-4311-b7af-b205ac879381-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "24264c1b-97df-4311-b7af-b205ac879381" (UID: "24264c1b-97df-4311-b7af-b205ac879381"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:33:54.984817 master-0 kubenswrapper[23041]: I0308 00:33:54.984606 23041 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/24264c1b-97df-4311-b7af-b205ac879381-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 00:33:54.984817 master-0 kubenswrapper[23041]: I0308 00:33:54.984683 23041 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/24264c1b-97df-4311-b7af-b205ac879381-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Mar 08 00:33:54.984817 master-0 kubenswrapper[23041]: I0308 00:33:54.984713 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmdwt\" (UniqueName: \"kubernetes.io/projected/24264c1b-97df-4311-b7af-b205ac879381-kube-api-access-wmdwt\") on node \"master-0\" DevicePath \"\"" Mar 08 00:33:55.785978 master-0 kubenswrapper[23041]: I0308 
00:33:55.785924 23041 generic.go:334] "Generic (PLEG): container finished" podID="8e9ee6f7-24ed-44b3-be57-a07a13e9e73b" containerID="6814cf059b67547841e8687e4684d5c2fadca0471bd82f7879f4d5d53180372c" exitCode=0 Mar 08 00:33:55.786570 master-0 kubenswrapper[23041]: I0308 00:33:55.786008 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"8e9ee6f7-24ed-44b3-be57-a07a13e9e73b","Type":"ContainerDied","Data":"6814cf059b67547841e8687e4684d5c2fadca0471bd82f7879f4d5d53180372c"} Mar 08 00:33:55.791286 master-0 kubenswrapper[23041]: I0308 00:33:55.791163 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2ab662059bb326d13a07bf5700e4f545/kube-controller-manager/0.log" Mar 08 00:33:55.791286 master-0 kubenswrapper[23041]: I0308 00:33:55.791274 23041 generic.go:334] "Generic (PLEG): container finished" podID="2ab662059bb326d13a07bf5700e4f545" containerID="098d17f749452ad5be8665c35010c152a45df97f56bb2ecbb202549c56ee2e8a" exitCode=1 Mar 08 00:33:55.791454 master-0 kubenswrapper[23041]: I0308 00:33:55.791304 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2ab662059bb326d13a07bf5700e4f545","Type":"ContainerDied","Data":"098d17f749452ad5be8665c35010c152a45df97f56bb2ecbb202549c56ee2e8a"} Mar 08 00:33:55.791905 master-0 kubenswrapper[23041]: I0308 00:33:55.791873 23041 scope.go:117] "RemoveContainer" containerID="098d17f749452ad5be8665c35010c152a45df97f56bb2ecbb202549c56ee2e8a" Mar 08 00:33:56.802569 master-0 kubenswrapper[23041]: I0308 00:33:56.802513 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2ab662059bb326d13a07bf5700e4f545/kube-controller-manager/0.log" Mar 08 00:33:56.803123 master-0 kubenswrapper[23041]: I0308 00:33:56.802627 23041 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2ab662059bb326d13a07bf5700e4f545","Type":"ContainerStarted","Data":"9c34514feba62dbc424465f89255c0d11d4ab193add728281e8a22d2de8c1410"} Mar 08 00:33:56.805170 master-0 kubenswrapper[23041]: I0308 00:33:56.805136 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-6-master-0_0af76e72-367d-4d11-8c55-8758aa5003dd/installer/0.log" Mar 08 00:33:56.805291 master-0 kubenswrapper[23041]: I0308 00:33:56.805184 23041 generic.go:334] "Generic (PLEG): container finished" podID="0af76e72-367d-4d11-8c55-8758aa5003dd" containerID="15306c11a0ba862de4c40e0fb25308bcbe28c92520dc173ad31eac204c8ec074" exitCode=1 Mar 08 00:33:56.805353 master-0 kubenswrapper[23041]: I0308 00:33:56.805292 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-6-master-0" event={"ID":"0af76e72-367d-4d11-8c55-8758aa5003dd","Type":"ContainerDied","Data":"15306c11a0ba862de4c40e0fb25308bcbe28c92520dc173ad31eac204c8ec074"} Mar 08 00:33:57.129919 master-0 kubenswrapper[23041]: I0308 00:33:57.129870 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-2-master-0" Mar 08 00:33:57.173240 master-0 kubenswrapper[23041]: I0308 00:33:57.173087 23041 patch_prober.go:28] interesting pod/console-6479f6d896-j6kqz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body= Mar 08 00:33:57.173451 master-0 kubenswrapper[23041]: I0308 00:33:57.173383 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Mar 08 00:33:57.328454 master-0 kubenswrapper[23041]: I0308 00:33:57.328262 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e9ee6f7-24ed-44b3-be57-a07a13e9e73b-kube-api-access\") pod \"8e9ee6f7-24ed-44b3-be57-a07a13e9e73b\" (UID: \"8e9ee6f7-24ed-44b3-be57-a07a13e9e73b\") " Mar 08 00:33:57.328739 master-0 kubenswrapper[23041]: I0308 00:33:57.328498 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e9ee6f7-24ed-44b3-be57-a07a13e9e73b-kubelet-dir\") pod \"8e9ee6f7-24ed-44b3-be57-a07a13e9e73b\" (UID: \"8e9ee6f7-24ed-44b3-be57-a07a13e9e73b\") " Mar 08 00:33:57.328739 master-0 kubenswrapper[23041]: I0308 00:33:57.328547 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e9ee6f7-24ed-44b3-be57-a07a13e9e73b-var-lock\") pod \"8e9ee6f7-24ed-44b3-be57-a07a13e9e73b\" (UID: \"8e9ee6f7-24ed-44b3-be57-a07a13e9e73b\") " Mar 08 00:33:57.328739 master-0 kubenswrapper[23041]: I0308 00:33:57.328614 23041 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e9ee6f7-24ed-44b3-be57-a07a13e9e73b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8e9ee6f7-24ed-44b3-be57-a07a13e9e73b" (UID: "8e9ee6f7-24ed-44b3-be57-a07a13e9e73b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:33:57.328838 master-0 kubenswrapper[23041]: I0308 00:33:57.328741 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e9ee6f7-24ed-44b3-be57-a07a13e9e73b-var-lock" (OuterVolumeSpecName: "var-lock") pod "8e9ee6f7-24ed-44b3-be57-a07a13e9e73b" (UID: "8e9ee6f7-24ed-44b3-be57-a07a13e9e73b"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:33:57.329455 master-0 kubenswrapper[23041]: I0308 00:33:57.329424 23041 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e9ee6f7-24ed-44b3-be57-a07a13e9e73b-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 00:33:57.329531 master-0 kubenswrapper[23041]: I0308 00:33:57.329467 23041 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e9ee6f7-24ed-44b3-be57-a07a13e9e73b-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 00:33:57.331962 master-0 kubenswrapper[23041]: I0308 00:33:57.331901 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e9ee6f7-24ed-44b3-be57-a07a13e9e73b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8e9ee6f7-24ed-44b3-be57-a07a13e9e73b" (UID: "8e9ee6f7-24ed-44b3-be57-a07a13e9e73b"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:33:57.432027 master-0 kubenswrapper[23041]: I0308 00:33:57.431933 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e9ee6f7-24ed-44b3-be57-a07a13e9e73b-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 00:33:57.815835 master-0 kubenswrapper[23041]: I0308 00:33:57.815789 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0" Mar 08 00:33:57.816362 master-0 kubenswrapper[23041]: I0308 00:33:57.815851 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"8e9ee6f7-24ed-44b3-be57-a07a13e9e73b","Type":"ContainerDied","Data":"28341bd2c657921e0a08286179b30ecba623e10a2e08f78b8ef006c2176ea44c"} Mar 08 00:33:57.816362 master-0 kubenswrapper[23041]: I0308 00:33:57.815885 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28341bd2c657921e0a08286179b30ecba623e10a2e08f78b8ef006c2176ea44c" Mar 08 00:33:58.254184 master-0 kubenswrapper[23041]: I0308 00:33:58.254101 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-6-master-0_0af76e72-367d-4d11-8c55-8758aa5003dd/installer/0.log" Mar 08 00:33:58.254479 master-0 kubenswrapper[23041]: I0308 00:33:58.254269 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-6-master-0" Mar 08 00:33:58.356596 master-0 kubenswrapper[23041]: I0308 00:33:58.356543 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0af76e72-367d-4d11-8c55-8758aa5003dd-kubelet-dir\") pod \"0af76e72-367d-4d11-8c55-8758aa5003dd\" (UID: \"0af76e72-367d-4d11-8c55-8758aa5003dd\") " Mar 08 00:33:58.356802 master-0 kubenswrapper[23041]: I0308 00:33:58.356765 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0af76e72-367d-4d11-8c55-8758aa5003dd-kube-api-access\") pod \"0af76e72-367d-4d11-8c55-8758aa5003dd\" (UID: \"0af76e72-367d-4d11-8c55-8758aa5003dd\") " Mar 08 00:33:58.357074 master-0 kubenswrapper[23041]: I0308 00:33:58.356884 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0af76e72-367d-4d11-8c55-8758aa5003dd-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0af76e72-367d-4d11-8c55-8758aa5003dd" (UID: "0af76e72-367d-4d11-8c55-8758aa5003dd"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:33:58.357074 master-0 kubenswrapper[23041]: I0308 00:33:58.356956 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0af76e72-367d-4d11-8c55-8758aa5003dd-var-lock" (OuterVolumeSpecName: "var-lock") pod "0af76e72-367d-4d11-8c55-8758aa5003dd" (UID: "0af76e72-367d-4d11-8c55-8758aa5003dd"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:33:58.357074 master-0 kubenswrapper[23041]: I0308 00:33:58.356921 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0af76e72-367d-4d11-8c55-8758aa5003dd-var-lock\") pod \"0af76e72-367d-4d11-8c55-8758aa5003dd\" (UID: \"0af76e72-367d-4d11-8c55-8758aa5003dd\") " Mar 08 00:33:58.358379 master-0 kubenswrapper[23041]: I0308 00:33:58.358321 23041 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0af76e72-367d-4d11-8c55-8758aa5003dd-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 00:33:58.358379 master-0 kubenswrapper[23041]: I0308 00:33:58.358371 23041 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0af76e72-367d-4d11-8c55-8758aa5003dd-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 00:33:58.362850 master-0 kubenswrapper[23041]: I0308 00:33:58.362722 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0af76e72-367d-4d11-8c55-8758aa5003dd-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0af76e72-367d-4d11-8c55-8758aa5003dd" (UID: "0af76e72-367d-4d11-8c55-8758aa5003dd"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:33:58.460709 master-0 kubenswrapper[23041]: I0308 00:33:58.460530 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0af76e72-367d-4d11-8c55-8758aa5003dd-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 00:33:58.839759 master-0 kubenswrapper[23041]: I0308 00:33:58.839664 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-6-master-0_0af76e72-367d-4d11-8c55-8758aa5003dd/installer/0.log" Mar 08 00:33:58.840680 master-0 kubenswrapper[23041]: I0308 00:33:58.839820 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-6-master-0" event={"ID":"0af76e72-367d-4d11-8c55-8758aa5003dd","Type":"ContainerDied","Data":"da2c9f6dd60842bc9377058f105ce0e3313e3bda92138b2bc0e885da2aabb6f4"} Mar 08 00:33:58.840680 master-0 kubenswrapper[23041]: I0308 00:33:58.839867 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da2c9f6dd60842bc9377058f105ce0e3313e3bda92138b2bc0e885da2aabb6f4" Mar 08 00:33:58.840680 master-0 kubenswrapper[23041]: I0308 00:33:58.839969 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-6-master-0" Mar 08 00:33:59.434965 master-0 kubenswrapper[23041]: I0308 00:33:59.434873 23041 patch_prober.go:28] interesting pod/console-c45bf598-vngbg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body= Mar 08 00:33:59.435217 master-0 kubenswrapper[23041]: I0308 00:33:59.435061 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" Mar 08 00:33:59.870580 master-0 kubenswrapper[23041]: E0308 00:33:59.870463 23041 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 00:34:00.861588 master-0 kubenswrapper[23041]: I0308 00:34:00.861525 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-4-master-0_861ba34f-5174-4835-a9b9-dbc5eacd2963/installer/0.log" Mar 08 00:34:00.861588 master-0 kubenswrapper[23041]: I0308 00:34:00.861587 23041 generic.go:334] "Generic (PLEG): container finished" podID="861ba34f-5174-4835-a9b9-dbc5eacd2963" containerID="ad59f0ee4ace09dae79cfc40c750720203b39cdfecc33e32dfaa1834966aad3c" exitCode=1 Mar 08 00:34:00.861955 master-0 kubenswrapper[23041]: I0308 00:34:00.861630 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"861ba34f-5174-4835-a9b9-dbc5eacd2963","Type":"ContainerDied","Data":"ad59f0ee4ace09dae79cfc40c750720203b39cdfecc33e32dfaa1834966aad3c"} Mar 
08 00:34:01.309591 master-0 kubenswrapper[23041]: I0308 00:34:01.309504 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-client-ca-bundle\") pod \"metrics-server-6474759988-dnw4m\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " pod="openshift-monitoring/metrics-server-6474759988-dnw4m" Mar 08 00:34:01.310235 master-0 kubenswrapper[23041]: E0308 00:34:01.309778 23041 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-ffspe3f0nbfal: secret "metrics-server-ffspe3f0nbfal" not found Mar 08 00:34:01.310235 master-0 kubenswrapper[23041]: E0308 00:34:01.309899 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-client-ca-bundle podName:0101c4ce-fd58-4ddb-94f7-abb8b2293cdb nodeName:}" failed. No retries permitted until 2026-03-08 00:36:03.309869274 +0000 UTC m=+268.782705848 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "client-ca-bundle" (UniqueName: "kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-client-ca-bundle") pod "metrics-server-6474759988-dnw4m" (UID: "0101c4ce-fd58-4ddb-94f7-abb8b2293cdb") : secret "metrics-server-ffspe3f0nbfal" not found Mar 08 00:34:02.343934 master-0 kubenswrapper[23041]: I0308 00:34:02.343884 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-4-master-0_861ba34f-5174-4835-a9b9-dbc5eacd2963/installer/0.log" Mar 08 00:34:02.344625 master-0 kubenswrapper[23041]: I0308 00:34:02.343957 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Mar 08 00:34:02.440226 master-0 kubenswrapper[23041]: I0308 00:34:02.440115 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/861ba34f-5174-4835-a9b9-dbc5eacd2963-kubelet-dir\") pod \"861ba34f-5174-4835-a9b9-dbc5eacd2963\" (UID: \"861ba34f-5174-4835-a9b9-dbc5eacd2963\") " Mar 08 00:34:02.440458 master-0 kubenswrapper[23041]: I0308 00:34:02.440342 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/861ba34f-5174-4835-a9b9-dbc5eacd2963-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "861ba34f-5174-4835-a9b9-dbc5eacd2963" (UID: "861ba34f-5174-4835-a9b9-dbc5eacd2963"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:34:02.440458 master-0 kubenswrapper[23041]: I0308 00:34:02.440380 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/861ba34f-5174-4835-a9b9-dbc5eacd2963-var-lock\") pod \"861ba34f-5174-4835-a9b9-dbc5eacd2963\" (UID: \"861ba34f-5174-4835-a9b9-dbc5eacd2963\") " Mar 08 00:34:02.440564 master-0 kubenswrapper[23041]: I0308 00:34:02.440449 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/861ba34f-5174-4835-a9b9-dbc5eacd2963-var-lock" (OuterVolumeSpecName: "var-lock") pod "861ba34f-5174-4835-a9b9-dbc5eacd2963" (UID: "861ba34f-5174-4835-a9b9-dbc5eacd2963"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:34:02.440630 master-0 kubenswrapper[23041]: I0308 00:34:02.440551 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/861ba34f-5174-4835-a9b9-dbc5eacd2963-kube-api-access\") pod \"861ba34f-5174-4835-a9b9-dbc5eacd2963\" (UID: \"861ba34f-5174-4835-a9b9-dbc5eacd2963\") " Mar 08 00:34:02.441923 master-0 kubenswrapper[23041]: I0308 00:34:02.441863 23041 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/861ba34f-5174-4835-a9b9-dbc5eacd2963-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 00:34:02.441991 master-0 kubenswrapper[23041]: I0308 00:34:02.441923 23041 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/861ba34f-5174-4835-a9b9-dbc5eacd2963-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 00:34:02.443295 master-0 kubenswrapper[23041]: I0308 00:34:02.443243 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/861ba34f-5174-4835-a9b9-dbc5eacd2963-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "861ba34f-5174-4835-a9b9-dbc5eacd2963" (UID: "861ba34f-5174-4835-a9b9-dbc5eacd2963"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:34:02.544225 master-0 kubenswrapper[23041]: I0308 00:34:02.544050 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/861ba34f-5174-4835-a9b9-dbc5eacd2963-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 00:34:02.692596 master-0 kubenswrapper[23041]: I0308 00:34:02.692524 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:34:02.692978 master-0 kubenswrapper[23041]: I0308 00:34:02.692914 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:34:02.693095 master-0 kubenswrapper[23041]: I0308 00:34:02.693058 23041 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body= Mar 08 00:34:02.693257 master-0 kubenswrapper[23041]: I0308 00:34:02.693106 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Mar 08 00:34:02.885988 master-0 kubenswrapper[23041]: I0308 00:34:02.885871 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-4-master-0_861ba34f-5174-4835-a9b9-dbc5eacd2963/installer/0.log" Mar 08 00:34:02.886422 master-0 kubenswrapper[23041]: I0308 00:34:02.886371 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"861ba34f-5174-4835-a9b9-dbc5eacd2963","Type":"ContainerDied","Data":"63b07cea28b6768a9651b5f58b996dd7f6e9fc810bdd80305db2f53213887741"} Mar 08 00:34:02.886604 master-0 kubenswrapper[23041]: I0308 00:34:02.886575 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63b07cea28b6768a9651b5f58b996dd7f6e9fc810bdd80305db2f53213887741" Mar 08 00:34:02.886808 master-0 kubenswrapper[23041]: I0308 00:34:02.886591 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Mar 08 00:34:04.315967 master-0 kubenswrapper[23041]: E0308 00:34:04.315885 23041 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 00:34:05.712627 master-0 kubenswrapper[23041]: I0308 00:34:05.712541 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:34:05.763036 master-0 kubenswrapper[23041]: I0308 00:34:05.762952 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:34:05.956046 master-0 kubenswrapper[23041]: I0308 00:34:05.955984 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 08 00:34:07.172779 master-0 kubenswrapper[23041]: I0308 00:34:07.172705 23041 patch_prober.go:28] interesting pod/console-6479f6d896-j6kqz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body= Mar 08 00:34:07.173635 master-0 kubenswrapper[23041]: I0308 
00:34:07.172795 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Mar 08 00:34:09.435125 master-0 kubenswrapper[23041]: I0308 00:34:09.435060 23041 patch_prober.go:28] interesting pod/console-c45bf598-vngbg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body= Mar 08 00:34:09.435644 master-0 kubenswrapper[23041]: I0308 00:34:09.435127 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" Mar 08 00:34:09.871754 master-0 kubenswrapper[23041]: E0308 00:34:09.871652 23041 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 00:34:10.959380 master-0 kubenswrapper[23041]: I0308 00:34:10.959327 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-rev/0.log" Mar 08 00:34:10.960317 master-0 kubenswrapper[23041]: I0308 00:34:10.960288 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-metrics/0.log" Mar 08 00:34:10.960933 master-0 kubenswrapper[23041]: I0308 00:34:10.960888 23041 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd/0.log" Mar 08 00:34:10.961494 master-0 kubenswrapper[23041]: I0308 00:34:10.961467 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcdctl/0.log" Mar 08 00:34:10.962567 master-0 kubenswrapper[23041]: I0308 00:34:10.962529 23041 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="ea5ec65ba12dfaaa4f58b3b64547a3d98d2937c3aa58a7bc6dc14040003a38a9" exitCode=137 Mar 08 00:34:10.962567 master-0 kubenswrapper[23041]: I0308 00:34:10.962559 23041 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="d8889d6936248c826e33628006d790b900bbbcacc9529b4c35a79aa987893d39" exitCode=137 Mar 08 00:34:11.349342 master-0 kubenswrapper[23041]: I0308 00:34:11.349290 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-rev/0.log" Mar 08 00:34:11.350352 master-0 kubenswrapper[23041]: I0308 00:34:11.350301 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-metrics/0.log" Mar 08 00:34:11.350883 master-0 kubenswrapper[23041]: I0308 00:34:11.350858 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd/0.log" Mar 08 00:34:11.351296 master-0 kubenswrapper[23041]: I0308 00:34:11.351265 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcdctl/0.log" Mar 08 00:34:11.353337 master-0 kubenswrapper[23041]: I0308 00:34:11.352715 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 08 00:34:11.429506 master-0 kubenswrapper[23041]: I0308 00:34:11.429420 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"8e52bef89f4b50e4590a1719bcc5d7e5\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " Mar 08 00:34:11.429506 master-0 kubenswrapper[23041]: I0308 00:34:11.429472 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"8e52bef89f4b50e4590a1719bcc5d7e5\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " Mar 08 00:34:11.429506 master-0 kubenswrapper[23041]: I0308 00:34:11.429512 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"8e52bef89f4b50e4590a1719bcc5d7e5\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " Mar 08 00:34:11.429872 master-0 kubenswrapper[23041]: I0308 00:34:11.429554 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir" (OuterVolumeSpecName: "data-dir") pod "8e52bef89f4b50e4590a1719bcc5d7e5" (UID: "8e52bef89f4b50e4590a1719bcc5d7e5"). InnerVolumeSpecName "data-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:34:11.429872 master-0 kubenswrapper[23041]: I0308 00:34:11.429606 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"8e52bef89f4b50e4590a1719bcc5d7e5\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " Mar 08 00:34:11.429872 master-0 kubenswrapper[23041]: I0308 00:34:11.429650 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin" (OuterVolumeSpecName: "usr-local-bin") pod "8e52bef89f4b50e4590a1719bcc5d7e5" (UID: "8e52bef89f4b50e4590a1719bcc5d7e5"). InnerVolumeSpecName "usr-local-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:34:11.429872 master-0 kubenswrapper[23041]: I0308 00:34:11.429692 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "8e52bef89f4b50e4590a1719bcc5d7e5" (UID: "8e52bef89f4b50e4590a1719bcc5d7e5"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:34:11.429872 master-0 kubenswrapper[23041]: I0308 00:34:11.429714 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "8e52bef89f4b50e4590a1719bcc5d7e5" (UID: "8e52bef89f4b50e4590a1719bcc5d7e5"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:34:11.429872 master-0 kubenswrapper[23041]: I0308 00:34:11.429692 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"8e52bef89f4b50e4590a1719bcc5d7e5\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " Mar 08 00:34:11.429872 master-0 kubenswrapper[23041]: I0308 00:34:11.429745 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir" (OuterVolumeSpecName: "log-dir") pod "8e52bef89f4b50e4590a1719bcc5d7e5" (UID: "8e52bef89f4b50e4590a1719bcc5d7e5"). InnerVolumeSpecName "log-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:34:11.429872 master-0 kubenswrapper[23041]: I0308 00:34:11.429843 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"8e52bef89f4b50e4590a1719bcc5d7e5\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " Mar 08 00:34:11.430144 master-0 kubenswrapper[23041]: I0308 00:34:11.429920 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir" (OuterVolumeSpecName: "static-pod-dir") pod "8e52bef89f4b50e4590a1719bcc5d7e5" (UID: "8e52bef89f4b50e4590a1719bcc5d7e5"). InnerVolumeSpecName "static-pod-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:34:11.430783 master-0 kubenswrapper[23041]: I0308 00:34:11.430741 23041 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 00:34:11.430783 master-0 kubenswrapper[23041]: I0308 00:34:11.430779 23041 reconciler_common.go:293] "Volume detached for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 00:34:11.430894 master-0 kubenswrapper[23041]: I0308 00:34:11.430799 23041 reconciler_common.go:293] "Volume detached for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 00:34:11.430894 master-0 kubenswrapper[23041]: I0308 00:34:11.430828 23041 reconciler_common.go:293] "Volume detached for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") on node \"master-0\" DevicePath \"\"" Mar 08 00:34:11.430894 master-0 kubenswrapper[23041]: I0308 00:34:11.430848 23041 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 00:34:11.430894 master-0 kubenswrapper[23041]: I0308 00:34:11.430865 23041 reconciler_common.go:293] "Volume detached for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 00:34:11.979483 master-0 kubenswrapper[23041]: I0308 00:34:11.979358 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-rev/0.log" Mar 08 00:34:11.980762 master-0 kubenswrapper[23041]: I0308 
00:34:11.980723 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-metrics/0.log" Mar 08 00:34:11.981488 master-0 kubenswrapper[23041]: I0308 00:34:11.981448 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd/0.log" Mar 08 00:34:11.981917 master-0 kubenswrapper[23041]: I0308 00:34:11.981888 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcdctl/0.log" Mar 08 00:34:11.983070 master-0 kubenswrapper[23041]: I0308 00:34:11.983043 23041 scope.go:117] "RemoveContainer" containerID="787fa634ee36f327997b592447e9aadba40183c4e7e4d25f5519ae9957121e6e" Mar 08 00:34:11.983215 master-0 kubenswrapper[23041]: I0308 00:34:11.983176 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 08 00:34:12.003120 master-0 kubenswrapper[23041]: I0308 00:34:12.003069 23041 scope.go:117] "RemoveContainer" containerID="4262f462df3c892c070c1769f302b6c7878bc5f82d5342928245d488b3431f6d" Mar 08 00:34:12.027712 master-0 kubenswrapper[23041]: I0308 00:34:12.027688 23041 scope.go:117] "RemoveContainer" containerID="0e06c006df1e1e63e0f6188a23b5e393fde4aa4984ad610de00e8c675da914c7" Mar 08 00:34:12.056794 master-0 kubenswrapper[23041]: I0308 00:34:12.056732 23041 scope.go:117] "RemoveContainer" containerID="ea5ec65ba12dfaaa4f58b3b64547a3d98d2937c3aa58a7bc6dc14040003a38a9" Mar 08 00:34:12.085815 master-0 kubenswrapper[23041]: I0308 00:34:12.085776 23041 scope.go:117] "RemoveContainer" containerID="d8889d6936248c826e33628006d790b900bbbcacc9529b4c35a79aa987893d39" Mar 08 00:34:12.107762 master-0 kubenswrapper[23041]: I0308 00:34:12.107724 23041 scope.go:117] "RemoveContainer" containerID="620aae0686e0d0747f86c66dccb5f833f425852d851da5976e803bb0ce3011ba" Mar 08 00:34:12.136362 master-0 
kubenswrapper[23041]: I0308 00:34:12.136312 23041 scope.go:117] "RemoveContainer" containerID="c8de3ced39581b8ad5acd40157b9e893206291d5fd34e7516c2c1b0358ea17a6" Mar 08 00:34:12.161347 master-0 kubenswrapper[23041]: I0308 00:34:12.161287 23041 scope.go:117] "RemoveContainer" containerID="182e67e6b82b83c4d47d4c01d3dcbdede2056c9bcdcf8367c8a6959d0eeac8ea" Mar 08 00:34:12.693072 master-0 kubenswrapper[23041]: I0308 00:34:12.692953 23041 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body= Mar 08 00:34:12.693410 master-0 kubenswrapper[23041]: I0308 00:34:12.693066 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Mar 08 00:34:12.820367 master-0 kubenswrapper[23041]: I0308 00:34:12.820285 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" path="/var/lib/kubelet/pods/8e52bef89f4b50e4590a1719bcc5d7e5/volumes" Mar 08 00:34:14.316701 master-0 kubenswrapper[23041]: E0308 00:34:14.316606 23041 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 00:34:14.754130 master-0 kubenswrapper[23041]: E0308 00:34:14.753949 23041 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" 
event="&Event{ObjectMeta:{etcd-master-0.189ab675e1363665 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:8e52bef89f4b50e4590a1719bcc5d7e5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Killing,Message:Stopping container etcd-rev,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:33:40.729566821 +0000 UTC m=+126.202403395,LastTimestamp:2026-03-08 00:33:40.729566821 +0000 UTC m=+126.202403395,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:34:17.172668 master-0 kubenswrapper[23041]: I0308 00:34:17.172539 23041 patch_prober.go:28] interesting pod/console-6479f6d896-j6kqz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body= Mar 08 00:34:17.173810 master-0 kubenswrapper[23041]: I0308 00:34:17.172683 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Mar 08 00:34:19.443235 master-0 kubenswrapper[23041]: I0308 00:34:19.439479 23041 patch_prober.go:28] interesting pod/console-c45bf598-vngbg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body= Mar 08 00:34:19.443235 master-0 kubenswrapper[23041]: I0308 00:34:19.439576 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" 
containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" Mar 08 00:34:19.808426 master-0 kubenswrapper[23041]: I0308 00:34:19.808331 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 08 00:34:19.826385 master-0 kubenswrapper[23041]: I0308 00:34:19.826331 23041 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="3263f927-5b7c-41e5-98b8-533a08784cb3" Mar 08 00:34:19.826385 master-0 kubenswrapper[23041]: I0308 00:34:19.826380 23041 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="3263f927-5b7c-41e5-98b8-533a08784cb3" Mar 08 00:34:19.872231 master-0 kubenswrapper[23041]: E0308 00:34:19.872134 23041 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 00:34:22.645673 master-0 kubenswrapper[23041]: I0308 00:34:22.645625 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-6474759988-dnw4m" Mar 08 00:34:22.693564 master-0 kubenswrapper[23041]: I0308 00:34:22.693467 23041 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body= Mar 08 00:34:22.693564 master-0 kubenswrapper[23041]: I0308 00:34:22.693545 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Mar 08 00:34:22.693880 master-0 kubenswrapper[23041]: I0308 00:34:22.693604 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:34:22.694503 master-0 kubenswrapper[23041]: I0308 00:34:22.694458 23041 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"9c34514feba62dbc424465f89255c0d11d4ab193add728281e8a22d2de8c1410"} pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Mar 08 00:34:22.694625 master-0 kubenswrapper[23041]: I0308 00:34:22.694588 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="kube-controller-manager" containerID="cri-o://9c34514feba62dbc424465f89255c0d11d4ab193add728281e8a22d2de8c1410" gracePeriod=30 Mar 08 
00:34:22.756146 master-0 kubenswrapper[23041]: I0308 00:34:22.756080 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b66xq\" (UniqueName: \"kubernetes.io/projected/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-kube-api-access-b66xq\") pod \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " Mar 08 00:34:22.756762 master-0 kubenswrapper[23041]: I0308 00:34:22.756664 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-configmap-kubelet-serving-ca-bundle\") pod \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " Mar 08 00:34:22.756885 master-0 kubenswrapper[23041]: I0308 00:34:22.756641 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "0101c4ce-fd58-4ddb-94f7-abb8b2293cdb" (UID: "0101c4ce-fd58-4ddb-94f7-abb8b2293cdb"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:34:22.756993 master-0 kubenswrapper[23041]: I0308 00:34:22.756956 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-client-ca-bundle\") pod \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " Mar 08 00:34:22.757100 master-0 kubenswrapper[23041]: I0308 00:34:22.757061 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-metrics-server-audit-profiles\") pod \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " Mar 08 00:34:22.757360 master-0 kubenswrapper[23041]: I0308 00:34:22.757323 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-audit-log\") pod \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " Mar 08 00:34:22.757419 master-0 kubenswrapper[23041]: I0308 00:34:22.757398 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-secret-metrics-server-tls\") pod \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " Mar 08 00:34:22.757506 master-0 kubenswrapper[23041]: I0308 00:34:22.757480 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-secret-metrics-client-certs\") pod \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\" (UID: \"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb\") " Mar 08 00:34:22.757884 master-0 
kubenswrapper[23041]: I0308 00:34:22.757623 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-audit-log" (OuterVolumeSpecName: "audit-log") pod "0101c4ce-fd58-4ddb-94f7-abb8b2293cdb" (UID: "0101c4ce-fd58-4ddb-94f7-abb8b2293cdb"). InnerVolumeSpecName "audit-log". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:34:22.757944 master-0 kubenswrapper[23041]: I0308 00:34:22.757891 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-metrics-server-audit-profiles" (OuterVolumeSpecName: "metrics-server-audit-profiles") pod "0101c4ce-fd58-4ddb-94f7-abb8b2293cdb" (UID: "0101c4ce-fd58-4ddb-94f7-abb8b2293cdb"). InnerVolumeSpecName "metrics-server-audit-profiles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:34:22.758293 master-0 kubenswrapper[23041]: I0308 00:34:22.758263 23041 reconciler_common.go:293] "Volume detached for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-metrics-server-audit-profiles\") on node \"master-0\" DevicePath \"\"" Mar 08 00:34:22.758364 master-0 kubenswrapper[23041]: I0308 00:34:22.758300 23041 reconciler_common.go:293] "Volume detached for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-audit-log\") on node \"master-0\" DevicePath \"\"" Mar 08 00:34:22.758364 master-0 kubenswrapper[23041]: I0308 00:34:22.758323 23041 reconciler_common.go:293] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-configmap-kubelet-serving-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 00:34:22.760608 master-0 kubenswrapper[23041]: I0308 00:34:22.760576 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-secret-metrics-server-tls" (OuterVolumeSpecName: "secret-metrics-server-tls") pod "0101c4ce-fd58-4ddb-94f7-abb8b2293cdb" (UID: "0101c4ce-fd58-4ddb-94f7-abb8b2293cdb"). InnerVolumeSpecName "secret-metrics-server-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:34:22.760713 master-0 kubenswrapper[23041]: I0308 00:34:22.760676 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-kube-api-access-b66xq" (OuterVolumeSpecName: "kube-api-access-b66xq") pod "0101c4ce-fd58-4ddb-94f7-abb8b2293cdb" (UID: "0101c4ce-fd58-4ddb-94f7-abb8b2293cdb"). InnerVolumeSpecName "kube-api-access-b66xq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:34:22.761048 master-0 kubenswrapper[23041]: I0308 00:34:22.761016 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-client-ca-bundle" (OuterVolumeSpecName: "client-ca-bundle") pod "0101c4ce-fd58-4ddb-94f7-abb8b2293cdb" (UID: "0101c4ce-fd58-4ddb-94f7-abb8b2293cdb"). InnerVolumeSpecName "client-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:34:22.761393 master-0 kubenswrapper[23041]: I0308 00:34:22.761337 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "0101c4ce-fd58-4ddb-94f7-abb8b2293cdb" (UID: "0101c4ce-fd58-4ddb-94f7-abb8b2293cdb"). InnerVolumeSpecName "secret-metrics-client-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:34:22.861056 master-0 kubenswrapper[23041]: I0308 00:34:22.860914 23041 reconciler_common.go:293] "Volume detached for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-client-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 00:34:22.861056 master-0 kubenswrapper[23041]: I0308 00:34:22.860996 23041 reconciler_common.go:293] "Volume detached for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-secret-metrics-server-tls\") on node \"master-0\" DevicePath \"\"" Mar 08 00:34:22.861056 master-0 kubenswrapper[23041]: I0308 00:34:22.861018 23041 reconciler_common.go:293] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-secret-metrics-client-certs\") on node \"master-0\" DevicePath \"\"" Mar 08 00:34:22.861056 master-0 kubenswrapper[23041]: I0308 00:34:22.861032 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b66xq\" (UniqueName: \"kubernetes.io/projected/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb-kube-api-access-b66xq\") on node \"master-0\" DevicePath \"\"" Mar 08 00:34:23.065450 master-0 kubenswrapper[23041]: I0308 00:34:23.065373 23041 generic.go:334] "Generic (PLEG): container finished" podID="0101c4ce-fd58-4ddb-94f7-abb8b2293cdb" containerID="d10ba8d248cc13e58fc18237bf3fc8704307376acdb97eeeff019b2173aa233c" exitCode=0 Mar 08 00:34:23.065450 master-0 kubenswrapper[23041]: I0308 00:34:23.065431 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6474759988-dnw4m" event={"ID":"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb","Type":"ContainerDied","Data":"d10ba8d248cc13e58fc18237bf3fc8704307376acdb97eeeff019b2173aa233c"} Mar 08 00:34:23.065739 master-0 kubenswrapper[23041]: I0308 00:34:23.065480 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/metrics-server-6474759988-dnw4m" event={"ID":"0101c4ce-fd58-4ddb-94f7-abb8b2293cdb","Type":"ContainerDied","Data":"e690a192a3d0aa0e87e9cbde66640402b6c73d23b93fc09f09a46f66f560f7c6"} Mar 08 00:34:23.065739 master-0 kubenswrapper[23041]: I0308 00:34:23.065491 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6474759988-dnw4m" Mar 08 00:34:23.065739 master-0 kubenswrapper[23041]: I0308 00:34:23.065522 23041 scope.go:117] "RemoveContainer" containerID="d10ba8d248cc13e58fc18237bf3fc8704307376acdb97eeeff019b2173aa233c" Mar 08 00:34:23.092217 master-0 kubenswrapper[23041]: I0308 00:34:23.092169 23041 scope.go:117] "RemoveContainer" containerID="d10ba8d248cc13e58fc18237bf3fc8704307376acdb97eeeff019b2173aa233c" Mar 08 00:34:23.092743 master-0 kubenswrapper[23041]: E0308 00:34:23.092696 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d10ba8d248cc13e58fc18237bf3fc8704307376acdb97eeeff019b2173aa233c\": container with ID starting with d10ba8d248cc13e58fc18237bf3fc8704307376acdb97eeeff019b2173aa233c not found: ID does not exist" containerID="d10ba8d248cc13e58fc18237bf3fc8704307376acdb97eeeff019b2173aa233c" Mar 08 00:34:23.092801 master-0 kubenswrapper[23041]: I0308 00:34:23.092748 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d10ba8d248cc13e58fc18237bf3fc8704307376acdb97eeeff019b2173aa233c"} err="failed to get container status \"d10ba8d248cc13e58fc18237bf3fc8704307376acdb97eeeff019b2173aa233c\": rpc error: code = NotFound desc = could not find container \"d10ba8d248cc13e58fc18237bf3fc8704307376acdb97eeeff019b2173aa233c\": container with ID starting with d10ba8d248cc13e58fc18237bf3fc8704307376acdb97eeeff019b2173aa233c not found: ID does not exist" Mar 08 00:34:24.317000 master-0 kubenswrapper[23041]: E0308 00:34:24.316922 23041 
kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 00:34:27.172682 master-0 kubenswrapper[23041]: I0308 00:34:27.172578 23041 patch_prober.go:28] interesting pod/console-6479f6d896-j6kqz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body= Mar 08 00:34:27.173522 master-0 kubenswrapper[23041]: I0308 00:34:27.172703 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Mar 08 00:34:29.435022 master-0 kubenswrapper[23041]: I0308 00:34:29.434919 23041 patch_prober.go:28] interesting pod/console-c45bf598-vngbg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body= Mar 08 00:34:29.436106 master-0 kubenswrapper[23041]: I0308 00:34:29.435043 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" Mar 08 00:34:29.872950 master-0 kubenswrapper[23041]: E0308 00:34:29.872819 23041 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded" Mar 08 
00:34:34.318987 master-0 kubenswrapper[23041]: E0308 00:34:34.318852 23041 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 00:34:34.318987 master-0 kubenswrapper[23041]: E0308 00:34:34.318978 23041 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 00:34:37.174156 master-0 kubenswrapper[23041]: I0308 00:34:37.173983 23041 patch_prober.go:28] interesting pod/console-6479f6d896-j6kqz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body= Mar 08 00:34:37.175293 master-0 kubenswrapper[23041]: I0308 00:34:37.174165 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Mar 08 00:34:39.434802 master-0 kubenswrapper[23041]: I0308 00:34:39.434689 23041 patch_prober.go:28] interesting pod/console-c45bf598-vngbg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body= Mar 08 00:34:39.434802 master-0 kubenswrapper[23041]: I0308 00:34:39.434788 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" Mar 08 00:34:39.874019 master-0 
kubenswrapper[23041]: E0308 00:34:39.873941 23041 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 00:34:39.874019 master-0 kubenswrapper[23041]: I0308 00:34:39.874013 23041 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 08 00:34:44.270683 master-0 kubenswrapper[23041]: I0308 00:34:44.270630 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-m7549_af391724-079a-4bac-a89e-978ffd471763/approver/1.log" Mar 08 00:34:44.271845 master-0 kubenswrapper[23041]: I0308 00:34:44.271640 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-m7549_af391724-079a-4bac-a89e-978ffd471763/approver/0.log" Mar 08 00:34:44.272261 master-0 kubenswrapper[23041]: I0308 00:34:44.272219 23041 generic.go:334] "Generic (PLEG): container finished" podID="af391724-079a-4bac-a89e-978ffd471763" containerID="171aa9f17bab1693340df88dc9687b17839bec3452bff1e75aeedd920e40b060" exitCode=1 Mar 08 00:34:44.272373 master-0 kubenswrapper[23041]: I0308 00:34:44.272272 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-m7549" event={"ID":"af391724-079a-4bac-a89e-978ffd471763","Type":"ContainerDied","Data":"171aa9f17bab1693340df88dc9687b17839bec3452bff1e75aeedd920e40b060"} Mar 08 00:34:44.272373 master-0 kubenswrapper[23041]: I0308 00:34:44.272362 23041 scope.go:117] "RemoveContainer" containerID="c9e6fa5d3ccf4015c27e14ffdb2578ad6435947b5bdd16e602ffdf86284246dc" Mar 08 00:34:44.273450 master-0 kubenswrapper[23041]: I0308 00:34:44.273416 23041 scope.go:117] "RemoveContainer" 
containerID="171aa9f17bab1693340df88dc9687b17839bec3452bff1e75aeedd920e40b060" Mar 08 00:34:45.286411 master-0 kubenswrapper[23041]: I0308 00:34:45.286302 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-m7549_af391724-079a-4bac-a89e-978ffd471763/approver/1.log" Mar 08 00:34:45.287644 master-0 kubenswrapper[23041]: I0308 00:34:45.287572 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-m7549" event={"ID":"af391724-079a-4bac-a89e-978ffd471763","Type":"ContainerStarted","Data":"f55af6ca99cfe1adb5584eeb0fe053544ddd9e14447603507e8f9aa028df5eda"} Mar 08 00:34:47.172920 master-0 kubenswrapper[23041]: I0308 00:34:47.172845 23041 patch_prober.go:28] interesting pod/console-6479f6d896-j6kqz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body= Mar 08 00:34:47.173419 master-0 kubenswrapper[23041]: I0308 00:34:47.172940 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Mar 08 00:34:47.235308 master-0 kubenswrapper[23041]: I0308 00:34:47.235154 23041 status_manager.go:851] "Failed to get status for pod" podUID="74512190-22e4-4648-8d1e-e487de48a124" pod="openshift-kube-apiserver/installer-4-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods installer-4-master-0)" Mar 08 00:34:48.756951 master-0 kubenswrapper[23041]: E0308 00:34:48.756776 23041 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context 
deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0.189ab675e136d2ce openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:8e52bef89f4b50e4590a1719bcc5d7e5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Killing,Message:Stopping container etcd-metrics,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:33:40.729606862 +0000 UTC m=+126.202443416,LastTimestamp:2026-03-08 00:33:40.729606862 +0000 UTC m=+126.202443416,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:34:49.435113 master-0 kubenswrapper[23041]: I0308 00:34:49.435035 23041 patch_prober.go:28] interesting pod/console-c45bf598-vngbg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body= Mar 08 00:34:49.435113 master-0 kubenswrapper[23041]: I0308 00:34:49.435106 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" Mar 08 00:34:49.874754 master-0 kubenswrapper[23041]: E0308 00:34:49.874453 23041 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="200ms" Mar 08 00:34:53.350350 master-0 kubenswrapper[23041]: I0308 00:34:53.350252 23041 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2ab662059bb326d13a07bf5700e4f545/kube-controller-manager/1.log" Mar 08 00:34:53.352237 master-0 kubenswrapper[23041]: I0308 00:34:53.352190 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2ab662059bb326d13a07bf5700e4f545/kube-controller-manager/0.log" Mar 08 00:34:53.352316 master-0 kubenswrapper[23041]: I0308 00:34:53.352262 23041 generic.go:334] "Generic (PLEG): container finished" podID="2ab662059bb326d13a07bf5700e4f545" containerID="9c34514feba62dbc424465f89255c0d11d4ab193add728281e8a22d2de8c1410" exitCode=137 Mar 08 00:34:53.352316 master-0 kubenswrapper[23041]: I0308 00:34:53.352306 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2ab662059bb326d13a07bf5700e4f545","Type":"ContainerDied","Data":"9c34514feba62dbc424465f89255c0d11d4ab193add728281e8a22d2de8c1410"} Mar 08 00:34:53.352403 master-0 kubenswrapper[23041]: I0308 00:34:53.352348 23041 scope.go:117] "RemoveContainer" containerID="098d17f749452ad5be8665c35010c152a45df97f56bb2ecbb202549c56ee2e8a" Mar 08 00:34:53.829376 master-0 kubenswrapper[23041]: E0308 00:34:53.829280 23041 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 08 00:34:53.830037 master-0 kubenswrapper[23041]: I0308 00:34:53.829992 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 08 00:34:53.847423 master-0 kubenswrapper[23041]: W0308 00:34:53.847240 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29c709c82970b529e7b9b895aa92ef05.slice/crio-814fee410e02db1c6f790cdfa122ca4cb208b42be7c6a93153bf9ad51d0e0858 WatchSource:0}: Error finding container 814fee410e02db1c6f790cdfa122ca4cb208b42be7c6a93153bf9ad51d0e0858: Status 404 returned error can't find the container with id 814fee410e02db1c6f790cdfa122ca4cb208b42be7c6a93153bf9ad51d0e0858 Mar 08 00:34:54.361103 master-0 kubenswrapper[23041]: I0308 00:34:54.360957 23041 generic.go:334] "Generic (PLEG): container finished" podID="29c709c82970b529e7b9b895aa92ef05" containerID="3eb03ddb5a023eb7ff60db78bad504fcf235bebe50e93a4860c7db00ad80c243" exitCode=0 Mar 08 00:34:54.361103 master-0 kubenswrapper[23041]: I0308 00:34:54.361060 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerDied","Data":"3eb03ddb5a023eb7ff60db78bad504fcf235bebe50e93a4860c7db00ad80c243"} Mar 08 00:34:54.361643 master-0 kubenswrapper[23041]: I0308 00:34:54.361128 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"814fee410e02db1c6f790cdfa122ca4cb208b42be7c6a93153bf9ad51d0e0858"} Mar 08 00:34:54.361643 master-0 kubenswrapper[23041]: I0308 00:34:54.361496 23041 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="3263f927-5b7c-41e5-98b8-533a08784cb3" Mar 08 00:34:54.361643 master-0 kubenswrapper[23041]: I0308 00:34:54.361518 23041 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="3263f927-5b7c-41e5-98b8-533a08784cb3" Mar 08 00:34:54.362816 master-0 kubenswrapper[23041]: I0308 00:34:54.362800 23041 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2ab662059bb326d13a07bf5700e4f545/kube-controller-manager/1.log" Mar 08 00:34:54.363693 master-0 kubenswrapper[23041]: I0308 00:34:54.363666 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2ab662059bb326d13a07bf5700e4f545","Type":"ContainerStarted","Data":"2bd10a2ea7be92083a6fa078c362bedafbefb6666cce4a0e91ffc2ad0aeb3a3b"} Mar 08 00:34:54.521475 master-0 kubenswrapper[23041]: E0308 00:34:54.521401 23041 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:34:44Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:34:44Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:34:44Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:34:44Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 00:34:57.173591 master-0 kubenswrapper[23041]: I0308 00:34:57.173420 23041 patch_prober.go:28] interesting pod/console-6479f6d896-j6kqz 
container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body= Mar 08 00:34:57.173591 master-0 kubenswrapper[23041]: I0308 00:34:57.173485 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Mar 08 00:34:57.399652 master-0 kubenswrapper[23041]: I0308 00:34:57.399597 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-5-master-0_343c30a5-7bf7-49ef-a224-c39ca46a63f1/installer/0.log" Mar 08 00:34:57.399824 master-0 kubenswrapper[23041]: I0308 00:34:57.399679 23041 generic.go:334] "Generic (PLEG): container finished" podID="343c30a5-7bf7-49ef-a224-c39ca46a63f1" containerID="42e56510331a27f30c67dafb2ca2fefb858a01f489046e8ecd0c02cc5211b70c" exitCode=1 Mar 08 00:34:57.399824 master-0 kubenswrapper[23041]: I0308 00:34:57.399733 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"343c30a5-7bf7-49ef-a224-c39ca46a63f1","Type":"ContainerDied","Data":"42e56510331a27f30c67dafb2ca2fefb858a01f489046e8ecd0c02cc5211b70c"} Mar 08 00:34:58.714424 master-0 kubenswrapper[23041]: I0308 00:34:58.714400 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-5-master-0_343c30a5-7bf7-49ef-a224-c39ca46a63f1/installer/0.log" Mar 08 00:34:58.714842 master-0 kubenswrapper[23041]: I0308 00:34:58.714827 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Mar 08 00:34:58.823567 master-0 kubenswrapper[23041]: I0308 00:34:58.823483 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/343c30a5-7bf7-49ef-a224-c39ca46a63f1-kubelet-dir\") pod \"343c30a5-7bf7-49ef-a224-c39ca46a63f1\" (UID: \"343c30a5-7bf7-49ef-a224-c39ca46a63f1\") " Mar 08 00:34:58.823787 master-0 kubenswrapper[23041]: I0308 00:34:58.823586 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/343c30a5-7bf7-49ef-a224-c39ca46a63f1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "343c30a5-7bf7-49ef-a224-c39ca46a63f1" (UID: "343c30a5-7bf7-49ef-a224-c39ca46a63f1"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:34:58.823787 master-0 kubenswrapper[23041]: I0308 00:34:58.823654 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/343c30a5-7bf7-49ef-a224-c39ca46a63f1-var-lock\") pod \"343c30a5-7bf7-49ef-a224-c39ca46a63f1\" (UID: \"343c30a5-7bf7-49ef-a224-c39ca46a63f1\") " Mar 08 00:34:58.823787 master-0 kubenswrapper[23041]: I0308 00:34:58.823755 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/343c30a5-7bf7-49ef-a224-c39ca46a63f1-kube-api-access\") pod \"343c30a5-7bf7-49ef-a224-c39ca46a63f1\" (UID: \"343c30a5-7bf7-49ef-a224-c39ca46a63f1\") " Mar 08 00:34:58.823940 master-0 kubenswrapper[23041]: I0308 00:34:58.823921 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/343c30a5-7bf7-49ef-a224-c39ca46a63f1-var-lock" (OuterVolumeSpecName: "var-lock") pod "343c30a5-7bf7-49ef-a224-c39ca46a63f1" (UID: "343c30a5-7bf7-49ef-a224-c39ca46a63f1"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:34:58.824501 master-0 kubenswrapper[23041]: I0308 00:34:58.824456 23041 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/343c30a5-7bf7-49ef-a224-c39ca46a63f1-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 00:34:58.824582 master-0 kubenswrapper[23041]: I0308 00:34:58.824498 23041 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/343c30a5-7bf7-49ef-a224-c39ca46a63f1-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 00:34:58.826617 master-0 kubenswrapper[23041]: I0308 00:34:58.826582 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/343c30a5-7bf7-49ef-a224-c39ca46a63f1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "343c30a5-7bf7-49ef-a224-c39ca46a63f1" (UID: "343c30a5-7bf7-49ef-a224-c39ca46a63f1"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:34:58.925497 master-0 kubenswrapper[23041]: I0308 00:34:58.925353 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/343c30a5-7bf7-49ef-a224-c39ca46a63f1-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 00:34:59.418740 master-0 kubenswrapper[23041]: I0308 00:34:59.418402 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-5-master-0_343c30a5-7bf7-49ef-a224-c39ca46a63f1/installer/0.log" Mar 08 00:34:59.418740 master-0 kubenswrapper[23041]: I0308 00:34:59.418490 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"343c30a5-7bf7-49ef-a224-c39ca46a63f1","Type":"ContainerDied","Data":"b39372ca5af916b898b90c0ac5bf26e4d523079274d09ad839e405b5d3212ca6"} Mar 08 00:34:59.418740 master-0 kubenswrapper[23041]: I0308 00:34:59.418526 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b39372ca5af916b898b90c0ac5bf26e4d523079274d09ad839e405b5d3212ca6" Mar 08 00:34:59.418740 master-0 kubenswrapper[23041]: I0308 00:34:59.418619 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Mar 08 00:34:59.434671 master-0 kubenswrapper[23041]: I0308 00:34:59.434573 23041 patch_prober.go:28] interesting pod/console-c45bf598-vngbg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body= Mar 08 00:34:59.434671 master-0 kubenswrapper[23041]: I0308 00:34:59.434667 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" Mar 08 00:35:00.076180 master-0 kubenswrapper[23041]: E0308 00:35:00.075926 23041 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="400ms" Mar 08 00:35:02.693660 master-0 kubenswrapper[23041]: I0308 00:35:02.693554 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:35:02.694219 master-0 kubenswrapper[23041]: I0308 00:35:02.693716 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:35:02.702348 master-0 kubenswrapper[23041]: I0308 00:35:02.702301 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:35:03.464829 master-0 kubenswrapper[23041]: I0308 00:35:03.464766 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:35:04.522557 master-0 kubenswrapper[23041]: E0308 00:35:04.522329 23041 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 00:35:07.172997 master-0 kubenswrapper[23041]: I0308 00:35:07.172928 23041 patch_prober.go:28] interesting pod/console-6479f6d896-j6kqz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body= Mar 08 00:35:07.174067 master-0 kubenswrapper[23041]: I0308 00:35:07.174007 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Mar 08 00:35:09.434845 master-0 kubenswrapper[23041]: I0308 00:35:09.434737 23041 patch_prober.go:28] interesting pod/console-c45bf598-vngbg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body= Mar 08 00:35:09.436190 master-0 kubenswrapper[23041]: I0308 00:35:09.434847 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" Mar 08 00:35:10.477036 master-0 kubenswrapper[23041]: E0308 00:35:10.476923 23041 controller.go:145] "Failed to ensure lease exists, will 
retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="800ms" Mar 08 00:35:14.523725 master-0 kubenswrapper[23041]: E0308 00:35:14.523541 23041 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 00:35:17.172933 master-0 kubenswrapper[23041]: I0308 00:35:17.172836 23041 patch_prober.go:28] interesting pod/console-6479f6d896-j6kqz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body= Mar 08 00:35:17.172933 master-0 kubenswrapper[23041]: I0308 00:35:17.172906 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Mar 08 00:35:18.583159 master-0 kubenswrapper[23041]: I0308 00:35:18.583054 23041 generic.go:334] "Generic (PLEG): container finished" podID="5cf5a2ef-2498-40a0-a189-0753076fd3b6" containerID="9640b5a39ba1c8d22970de560d1644963302e95dae8ebd4e31dc3deaa2d4d495" exitCode=0 Mar 08 00:35:18.583665 master-0 kubenswrapper[23041]: I0308 00:35:18.583161 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" event={"ID":"5cf5a2ef-2498-40a0-a189-0753076fd3b6","Type":"ContainerDied","Data":"9640b5a39ba1c8d22970de560d1644963302e95dae8ebd4e31dc3deaa2d4d495"} Mar 08 00:35:18.585086 master-0 kubenswrapper[23041]: I0308 00:35:18.585027 
23041 scope.go:117] "RemoveContainer" containerID="9640b5a39ba1c8d22970de560d1644963302e95dae8ebd4e31dc3deaa2d4d495" Mar 08 00:35:18.587445 master-0 kubenswrapper[23041]: I0308 00:35:18.587395 23041 scope.go:117] "RemoveContainer" containerID="04817105ab63ed3d02352e545fc19277b913254d7947d42a71d84846748fcfc3" Mar 08 00:35:19.435579 master-0 kubenswrapper[23041]: I0308 00:35:19.435513 23041 patch_prober.go:28] interesting pod/console-c45bf598-vngbg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body= Mar 08 00:35:19.436064 master-0 kubenswrapper[23041]: I0308 00:35:19.435984 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" Mar 08 00:35:19.597694 master-0 kubenswrapper[23041]: I0308 00:35:19.597584 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" event={"ID":"5cf5a2ef-2498-40a0-a189-0753076fd3b6","Type":"ContainerStarted","Data":"93eb8cac6bce4de014f702bbc389c7a4efeb26811e1ff4d166864777456a345a"} Mar 08 00:35:19.598748 master-0 kubenswrapper[23041]: I0308 00:35:19.598700 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" Mar 08 00:35:19.600453 master-0 kubenswrapper[23041]: I0308 00:35:19.600429 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-64bf9778cb-mgb5v" Mar 08 00:35:21.279914 master-0 kubenswrapper[23041]: E0308 00:35:21.279451 23041 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="1.6s" Mar 08 00:35:22.759129 master-0 kubenswrapper[23041]: E0308 00:35:22.758985 23041 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0.189ab675e1376b59 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:8e52bef89f4b50e4590a1719bcc5d7e5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Killing,Message:Stopping container etcd-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:33:40.729645913 +0000 UTC m=+126.202482467,LastTimestamp:2026-03-08 00:33:40.729645913 +0000 UTC m=+126.202482467,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:35:24.524786 master-0 kubenswrapper[23041]: E0308 00:35:24.524658 23041 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 00:35:27.172616 master-0 kubenswrapper[23041]: I0308 00:35:27.172545 23041 patch_prober.go:28] interesting pod/console-6479f6d896-j6kqz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body= Mar 08 00:35:27.173533 master-0 kubenswrapper[23041]: I0308 00:35:27.173422 23041 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Mar 08 00:35:28.364797 master-0 kubenswrapper[23041]: E0308 00:35:28.364742 23041 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 08 00:35:29.435166 master-0 kubenswrapper[23041]: I0308 00:35:29.435078 23041 patch_prober.go:28] interesting pod/console-c45bf598-vngbg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body= Mar 08 00:35:29.435913 master-0 kubenswrapper[23041]: I0308 00:35:29.435482 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" Mar 08 00:35:29.676851 master-0 kubenswrapper[23041]: I0308 00:35:29.676762 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-vd52m_e97435ee-522e-427d-9efc-40bc3d2b0d02/snapshot-controller/2.log" Mar 08 00:35:29.677599 master-0 kubenswrapper[23041]: I0308 00:35:29.677575 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-vd52m_e97435ee-522e-427d-9efc-40bc3d2b0d02/snapshot-controller/1.log" Mar 08 00:35:29.677742 master-0 kubenswrapper[23041]: I0308 00:35:29.677622 23041 generic.go:334] "Generic (PLEG): container finished" podID="e97435ee-522e-427d-9efc-40bc3d2b0d02" 
containerID="12285832d9ae011d03a37f69d825d599f3efa2810a8db6a158e7e5aac2654198" exitCode=1 Mar 08 00:35:29.677742 master-0 kubenswrapper[23041]: I0308 00:35:29.677696 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-vd52m" event={"ID":"e97435ee-522e-427d-9efc-40bc3d2b0d02","Type":"ContainerDied","Data":"12285832d9ae011d03a37f69d825d599f3efa2810a8db6a158e7e5aac2654198"} Mar 08 00:35:29.677742 master-0 kubenswrapper[23041]: I0308 00:35:29.677738 23041 scope.go:117] "RemoveContainer" containerID="f8579510b3d4eb37fa166a47f1175d9203069f85aea52cc88554ccc7a9077266" Mar 08 00:35:29.678893 master-0 kubenswrapper[23041]: I0308 00:35:29.678701 23041 scope.go:117] "RemoveContainer" containerID="12285832d9ae011d03a37f69d825d599f3efa2810a8db6a158e7e5aac2654198" Mar 08 00:35:29.681909 master-0 kubenswrapper[23041]: I0308 00:35:29.681870 23041 generic.go:334] "Generic (PLEG): container finished" podID="29c709c82970b529e7b9b895aa92ef05" containerID="37f62b58f940c78c4dcc399cde41a0ed9c917211b8d4a3889a78626918547185" exitCode=0 Mar 08 00:35:29.682002 master-0 kubenswrapper[23041]: I0308 00:35:29.681923 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerDied","Data":"37f62b58f940c78c4dcc399cde41a0ed9c917211b8d4a3889a78626918547185"} Mar 08 00:35:29.682491 master-0 kubenswrapper[23041]: I0308 00:35:29.682453 23041 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="3263f927-5b7c-41e5-98b8-533a08784cb3" Mar 08 00:35:29.682491 master-0 kubenswrapper[23041]: I0308 00:35:29.682480 23041 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="3263f927-5b7c-41e5-98b8-533a08784cb3" Mar 08 00:35:30.694856 master-0 kubenswrapper[23041]: I0308 00:35:30.694790 23041 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-vd52m_e97435ee-522e-427d-9efc-40bc3d2b0d02/snapshot-controller/2.log" Mar 08 00:35:30.694856 master-0 kubenswrapper[23041]: I0308 00:35:30.694860 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-vd52m" event={"ID":"e97435ee-522e-427d-9efc-40bc3d2b0d02","Type":"ContainerStarted","Data":"80566af5865f46048bb53d6f98fcbe8ba40094b34956912b1870f5b960f85114"} Mar 08 00:35:32.881101 master-0 kubenswrapper[23041]: E0308 00:35:32.880969 23041 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Mar 08 00:35:33.732459 master-0 kubenswrapper[23041]: I0308 00:35:33.732388 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-nwttq_3b4f8517-1e54-4b41-ba6b-6c56fe66831a/config-sync-controllers/0.log" Mar 08 00:35:33.733225 master-0 kubenswrapper[23041]: I0308 00:35:33.733133 23041 generic.go:334] "Generic (PLEG): container finished" podID="3b4f8517-1e54-4b41-ba6b-6c56fe66831a" containerID="6703d449ef58e82f6711f4fb4077c407ce4e8f1fc186664220b3722e268d3aa7" exitCode=1 Mar 08 00:35:33.733309 master-0 kubenswrapper[23041]: I0308 00:35:33.733269 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq" event={"ID":"3b4f8517-1e54-4b41-ba6b-6c56fe66831a","Type":"ContainerDied","Data":"6703d449ef58e82f6711f4fb4077c407ce4e8f1fc186664220b3722e268d3aa7"} Mar 08 00:35:33.734583 master-0 kubenswrapper[23041]: I0308 00:35:33.734545 23041 scope.go:117] "RemoveContainer" 
containerID="6703d449ef58e82f6711f4fb4077c407ce4e8f1fc186664220b3722e268d3aa7" Mar 08 00:35:34.526235 master-0 kubenswrapper[23041]: E0308 00:35:34.526129 23041 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 00:35:34.527185 master-0 kubenswrapper[23041]: E0308 00:35:34.527150 23041 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 00:35:34.751527 master-0 kubenswrapper[23041]: I0308 00:35:34.751458 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-nwttq_3b4f8517-1e54-4b41-ba6b-6c56fe66831a/config-sync-controllers/0.log" Mar 08 00:35:34.752098 master-0 kubenswrapper[23041]: I0308 00:35:34.752054 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq" event={"ID":"3b4f8517-1e54-4b41-ba6b-6c56fe66831a","Type":"ContainerStarted","Data":"7b52440df031400684165c964ab52b42111d216c816f88a45c5f3ed3841c4d5e"} Mar 08 00:35:37.173436 master-0 kubenswrapper[23041]: I0308 00:35:37.173332 23041 patch_prober.go:28] interesting pod/console-6479f6d896-j6kqz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body= Mar 08 00:35:37.174414 master-0 kubenswrapper[23041]: I0308 00:35:37.173465 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 
10.128.0.103:8443: connect: connection refused" Mar 08 00:35:39.434892 master-0 kubenswrapper[23041]: I0308 00:35:39.434797 23041 patch_prober.go:28] interesting pod/console-c45bf598-vngbg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body= Mar 08 00:35:39.434892 master-0 kubenswrapper[23041]: I0308 00:35:39.434875 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" Mar 08 00:35:41.839519 master-0 kubenswrapper[23041]: I0308 00:35:41.839442 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7f8b8b6f4c-w2q2q_d01c21a1-6c2c-49a7-9d85-254662851838/manager/1.log" Mar 08 00:35:41.840661 master-0 kubenswrapper[23041]: I0308 00:35:41.840585 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7f8b8b6f4c-w2q2q_d01c21a1-6c2c-49a7-9d85-254662851838/manager/0.log" Mar 08 00:35:41.842281 master-0 kubenswrapper[23041]: I0308 00:35:41.841620 23041 generic.go:334] "Generic (PLEG): container finished" podID="d01c21a1-6c2c-49a7-9d85-254662851838" containerID="dc254aaf3bd5aa2a3c6e69f8abd5a98d092e318f7ea622432462747a16cce142" exitCode=1 Mar 08 00:35:41.842281 master-0 kubenswrapper[23041]: I0308 00:35:41.841728 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" event={"ID":"d01c21a1-6c2c-49a7-9d85-254662851838","Type":"ContainerDied","Data":"dc254aaf3bd5aa2a3c6e69f8abd5a98d092e318f7ea622432462747a16cce142"} Mar 08 00:35:41.842281 master-0 kubenswrapper[23041]: I0308 00:35:41.841824 23041 scope.go:117] 
"RemoveContainer" containerID="f272f0c8300d99d74de3b6533eb08fc6f13727844131b874ef0ec089cec086c7" Mar 08 00:35:41.844115 master-0 kubenswrapper[23041]: I0308 00:35:41.843974 23041 scope.go:117] "RemoveContainer" containerID="dc254aaf3bd5aa2a3c6e69f8abd5a98d092e318f7ea622432462747a16cce142" Mar 08 00:35:42.854003 master-0 kubenswrapper[23041]: I0308 00:35:42.853912 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7f8b8b6f4c-w2q2q_d01c21a1-6c2c-49a7-9d85-254662851838/manager/1.log" Mar 08 00:35:42.855029 master-0 kubenswrapper[23041]: I0308 00:35:42.854613 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" event={"ID":"d01c21a1-6c2c-49a7-9d85-254662851838","Type":"ContainerStarted","Data":"e748608116b3a1034a9e9042790bf54d8fab6e248f3bff4dc3fd56f8ffcc5630"} Mar 08 00:35:42.855469 master-0 kubenswrapper[23041]: I0308 00:35:42.855384 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" Mar 08 00:35:45.887172 master-0 kubenswrapper[23041]: I0308 00:35:45.886953 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-nwttq_3b4f8517-1e54-4b41-ba6b-6c56fe66831a/config-sync-controllers/0.log" Mar 08 00:35:45.888582 master-0 kubenswrapper[23041]: I0308 00:35:45.888030 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-nwttq_3b4f8517-1e54-4b41-ba6b-6c56fe66831a/cluster-cloud-controller-manager/0.log" Mar 08 00:35:45.888582 master-0 kubenswrapper[23041]: I0308 00:35:45.888182 23041 generic.go:334] "Generic (PLEG): container finished" podID="3b4f8517-1e54-4b41-ba6b-6c56fe66831a" 
containerID="6b085935f4ebb70afc5a958163f7053b9a42b89c690b039c32d56dcc51668fae" exitCode=1 Mar 08 00:35:45.888582 master-0 kubenswrapper[23041]: I0308 00:35:45.888299 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq" event={"ID":"3b4f8517-1e54-4b41-ba6b-6c56fe66831a","Type":"ContainerDied","Data":"6b085935f4ebb70afc5a958163f7053b9a42b89c690b039c32d56dcc51668fae"} Mar 08 00:35:45.889604 master-0 kubenswrapper[23041]: I0308 00:35:45.889545 23041 scope.go:117] "RemoveContainer" containerID="6b085935f4ebb70afc5a958163f7053b9a42b89c690b039c32d56dcc51668fae" Mar 08 00:35:46.083627 master-0 kubenswrapper[23041]: E0308 00:35:46.083514 23041 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="6.4s" Mar 08 00:35:46.899911 master-0 kubenswrapper[23041]: I0308 00:35:46.899823 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-6598bfb6c4-7nhvs_1bb8fea7-71ca-43a3-839d-9c1459bf8dfa/manager/1.log" Mar 08 00:35:46.901618 master-0 kubenswrapper[23041]: I0308 00:35:46.901553 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-6598bfb6c4-7nhvs_1bb8fea7-71ca-43a3-839d-9c1459bf8dfa/manager/0.log" Mar 08 00:35:46.901736 master-0 kubenswrapper[23041]: I0308 00:35:46.901643 23041 generic.go:334] "Generic (PLEG): container finished" podID="1bb8fea7-71ca-43a3-839d-9c1459bf8dfa" containerID="62a90dd1c822377c4aa48689f26940e9273c8eaf2e5b09cbf6dadaba768ab7d5" exitCode=1 Mar 08 00:35:46.901819 master-0 kubenswrapper[23041]: I0308 00:35:46.901740 23041 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" event={"ID":"1bb8fea7-71ca-43a3-839d-9c1459bf8dfa","Type":"ContainerDied","Data":"62a90dd1c822377c4aa48689f26940e9273c8eaf2e5b09cbf6dadaba768ab7d5"} Mar 08 00:35:46.901889 master-0 kubenswrapper[23041]: I0308 00:35:46.901835 23041 scope.go:117] "RemoveContainer" containerID="1a894ff93f34b75d7c364cee700320b9938207036c1164fc914fd25a46ac6869" Mar 08 00:35:46.902966 master-0 kubenswrapper[23041]: I0308 00:35:46.902894 23041 scope.go:117] "RemoveContainer" containerID="62a90dd1c822377c4aa48689f26940e9273c8eaf2e5b09cbf6dadaba768ab7d5" Mar 08 00:35:46.909475 master-0 kubenswrapper[23041]: I0308 00:35:46.909415 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-nwttq_3b4f8517-1e54-4b41-ba6b-6c56fe66831a/config-sync-controllers/0.log" Mar 08 00:35:46.910164 master-0 kubenswrapper[23041]: I0308 00:35:46.910114 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-nwttq_3b4f8517-1e54-4b41-ba6b-6c56fe66831a/cluster-cloud-controller-manager/0.log" Mar 08 00:35:46.910325 master-0 kubenswrapper[23041]: I0308 00:35:46.910293 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-nwttq" event={"ID":"3b4f8517-1e54-4b41-ba6b-6c56fe66831a","Type":"ContainerStarted","Data":"1701212ef949c5c6ff54369b5c2da6f91b3999e3d529f343f4617655c3349fa7"} Mar 08 00:35:47.173638 master-0 kubenswrapper[23041]: I0308 00:35:47.173553 23041 patch_prober.go:28] interesting pod/console-6479f6d896-j6kqz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" 
start-of-body= Mar 08 00:35:47.173837 master-0 kubenswrapper[23041]: I0308 00:35:47.173678 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Mar 08 00:35:47.251000 master-0 kubenswrapper[23041]: I0308 00:35:47.250891 23041 status_manager.go:851] "Failed to get status for pod" podUID="861ba34f-5174-4835-a9b9-dbc5eacd2963" pod="openshift-kube-controller-manager/installer-4-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods installer-4-master-0)" Mar 08 00:35:47.923044 master-0 kubenswrapper[23041]: I0308 00:35:47.922959 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-6598bfb6c4-7nhvs_1bb8fea7-71ca-43a3-839d-9c1459bf8dfa/manager/1.log" Mar 08 00:35:47.923902 master-0 kubenswrapper[23041]: I0308 00:35:47.923543 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" event={"ID":"1bb8fea7-71ca-43a3-839d-9c1459bf8dfa","Type":"ContainerStarted","Data":"4ba808730fd21485874a8dc61e63fd4c79e31f8774908008030f91800fcc316a"} Mar 08 00:35:47.924119 master-0 kubenswrapper[23041]: I0308 00:35:47.924040 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" Mar 08 00:35:49.434672 master-0 kubenswrapper[23041]: I0308 00:35:49.434599 23041 patch_prober.go:28] interesting pod/console-c45bf598-vngbg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" 
start-of-body= Mar 08 00:35:49.435477 master-0 kubenswrapper[23041]: I0308 00:35:49.434701 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" Mar 08 00:35:55.950675 master-0 kubenswrapper[23041]: I0308 00:35:55.950559 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-7nhvs" Mar 08 00:35:55.957026 master-0 kubenswrapper[23041]: I0308 00:35:55.956954 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-w2q2q" Mar 08 00:35:56.761510 master-0 kubenswrapper[23041]: E0308 00:35:56.761273 23041 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0.189ab675e13818de openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:8e52bef89f4b50e4590a1719bcc5d7e5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Killing,Message:Stopping container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:33:40.729690334 +0000 UTC m=+126.202526928,LastTimestamp:2026-03-08 00:33:40.729690334 +0000 UTC m=+126.202526928,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:35:57.174156 master-0 kubenswrapper[23041]: I0308 00:35:57.173893 23041 patch_prober.go:28] interesting pod/console-6479f6d896-j6kqz container/console namespace/openshift-console: Startup probe 
status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body= Mar 08 00:35:57.174156 master-0 kubenswrapper[23041]: I0308 00:35:57.174019 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Mar 08 00:35:59.434922 master-0 kubenswrapper[23041]: I0308 00:35:59.434845 23041 patch_prober.go:28] interesting pod/console-c45bf598-vngbg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body= Mar 08 00:35:59.436185 master-0 kubenswrapper[23041]: I0308 00:35:59.436129 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" Mar 08 00:36:00.048510 master-0 kubenswrapper[23041]: I0308 00:36:00.048430 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-vd52m_e97435ee-522e-427d-9efc-40bc3d2b0d02/snapshot-controller/3.log" Mar 08 00:36:00.049144 master-0 kubenswrapper[23041]: I0308 00:36:00.049102 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-vd52m_e97435ee-522e-427d-9efc-40bc3d2b0d02/snapshot-controller/2.log" Mar 08 00:36:00.049245 master-0 kubenswrapper[23041]: I0308 00:36:00.049171 23041 generic.go:334] "Generic (PLEG): container finished" podID="e97435ee-522e-427d-9efc-40bc3d2b0d02" 
containerID="80566af5865f46048bb53d6f98fcbe8ba40094b34956912b1870f5b960f85114" exitCode=1 Mar 08 00:36:00.049313 master-0 kubenswrapper[23041]: I0308 00:36:00.049248 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-vd52m" event={"ID":"e97435ee-522e-427d-9efc-40bc3d2b0d02","Type":"ContainerDied","Data":"80566af5865f46048bb53d6f98fcbe8ba40094b34956912b1870f5b960f85114"} Mar 08 00:36:00.049362 master-0 kubenswrapper[23041]: I0308 00:36:00.049315 23041 scope.go:117] "RemoveContainer" containerID="12285832d9ae011d03a37f69d825d599f3efa2810a8db6a158e7e5aac2654198" Mar 08 00:36:00.050497 master-0 kubenswrapper[23041]: I0308 00:36:00.050369 23041 scope.go:117] "RemoveContainer" containerID="80566af5865f46048bb53d6f98fcbe8ba40094b34956912b1870f5b960f85114" Mar 08 00:36:00.050765 master-0 kubenswrapper[23041]: E0308 00:36:00.050683 23041 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=snapshot-controller pod=csi-snapshot-controller-7577d6f48-vd52m_openshift-cluster-storage-operator(e97435ee-522e-427d-9efc-40bc3d2b0d02)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-vd52m" podUID="e97435ee-522e-427d-9efc-40bc3d2b0d02" Mar 08 00:36:01.057847 master-0 kubenswrapper[23041]: I0308 00:36:01.057779 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-vd52m_e97435ee-522e-427d-9efc-40bc3d2b0d02/snapshot-controller/3.log" Mar 08 00:36:02.074791 master-0 kubenswrapper[23041]: I0308 00:36:02.074704 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2ab662059bb326d13a07bf5700e4f545/kube-controller-manager/1.log" Mar 08 00:36:02.076943 master-0 kubenswrapper[23041]: I0308 00:36:02.076859 23041 
generic.go:334] "Generic (PLEG): container finished" podID="2ab662059bb326d13a07bf5700e4f545" containerID="68d2d556561da5978f104c45f8ecd8a09a79c04161083561a003cabefd7b6ac9" exitCode=0 Mar 08 00:36:02.076943 master-0 kubenswrapper[23041]: I0308 00:36:02.076922 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2ab662059bb326d13a07bf5700e4f545","Type":"ContainerDied","Data":"68d2d556561da5978f104c45f8ecd8a09a79c04161083561a003cabefd7b6ac9"} Mar 08 00:36:02.077677 master-0 kubenswrapper[23041]: I0308 00:36:02.077642 23041 scope.go:117] "RemoveContainer" containerID="68d2d556561da5978f104c45f8ecd8a09a79c04161083561a003cabefd7b6ac9" Mar 08 00:36:02.484230 master-0 kubenswrapper[23041]: E0308 00:36:02.484117 23041 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded" interval="7s" Mar 08 00:36:02.693381 master-0 kubenswrapper[23041]: I0308 00:36:02.693057 23041 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:36:02.693381 master-0 kubenswrapper[23041]: I0308 00:36:02.693213 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:36:02.693381 master-0 kubenswrapper[23041]: I0308 00:36:02.693340 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:36:03.091030 master-0 kubenswrapper[23041]: I0308 00:36:03.090920 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2ab662059bb326d13a07bf5700e4f545/kube-controller-manager/1.log" Mar 08 
00:36:03.093257 master-0 kubenswrapper[23041]: I0308 00:36:03.093165 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2ab662059bb326d13a07bf5700e4f545","Type":"ContainerStarted","Data":"f77b387d0621dd0b1c2cc2e9182105197b90e6cce992b464d767da923bbaf487"} Mar 08 00:36:03.686541 master-0 kubenswrapper[23041]: E0308 00:36:03.686406 23041 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 08 00:36:04.104761 master-0 kubenswrapper[23041]: I0308 00:36:04.104700 23041 generic.go:334] "Generic (PLEG): container finished" podID="29c709c82970b529e7b9b895aa92ef05" containerID="1176b19f1aa30383a0b89f6918e5548d6d2a2c69fab7b7d936e218c18e4346c4" exitCode=0 Mar 08 00:36:04.104761 master-0 kubenswrapper[23041]: I0308 00:36:04.104744 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerDied","Data":"1176b19f1aa30383a0b89f6918e5548d6d2a2c69fab7b7d936e218c18e4346c4"} Mar 08 00:36:04.105492 master-0 kubenswrapper[23041]: I0308 00:36:04.105031 23041 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="3263f927-5b7c-41e5-98b8-533a08784cb3" Mar 08 00:36:04.105492 master-0 kubenswrapper[23041]: I0308 00:36:04.105046 23041 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="3263f927-5b7c-41e5-98b8-533a08784cb3" Mar 08 00:36:05.134224 master-0 kubenswrapper[23041]: I0308 00:36:05.134161 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_1453f6461bf5d599ad65a4656343ee91/kube-scheduler/0.log" Mar 08 00:36:05.135269 master-0 kubenswrapper[23041]: I0308 00:36:05.135196 23041 generic.go:334] "Generic (PLEG): container finished" 
podID="1453f6461bf5d599ad65a4656343ee91" containerID="2bda97f02cc22c73814013d78c2e90a28eb3ed0437db127445efbed0e90aa23d" exitCode=1 Mar 08 00:36:05.135406 master-0 kubenswrapper[23041]: I0308 00:36:05.135255 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1453f6461bf5d599ad65a4656343ee91","Type":"ContainerDied","Data":"2bda97f02cc22c73814013d78c2e90a28eb3ed0437db127445efbed0e90aa23d"} Mar 08 00:36:05.136354 master-0 kubenswrapper[23041]: I0308 00:36:05.136329 23041 scope.go:117] "RemoveContainer" containerID="2bda97f02cc22c73814013d78c2e90a28eb3ed0437db127445efbed0e90aa23d" Mar 08 00:36:05.171630 master-0 kubenswrapper[23041]: E0308 00:36:05.171554 23041 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:35:55Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:35:55Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:35:55Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:35:55Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 00:36:06.146754 
master-0 kubenswrapper[23041]: I0308 00:36:06.146659 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_1453f6461bf5d599ad65a4656343ee91/kube-scheduler/0.log" Mar 08 00:36:06.147853 master-0 kubenswrapper[23041]: I0308 00:36:06.147120 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1453f6461bf5d599ad65a4656343ee91","Type":"ContainerStarted","Data":"667ebe7e35ed617b3ed91aceb47b77ab1ed599b695eddc56d9628db5ec3e0c66"} Mar 08 00:36:06.147853 master-0 kubenswrapper[23041]: I0308 00:36:06.147496 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 00:36:07.173637 master-0 kubenswrapper[23041]: I0308 00:36:07.173541 23041 patch_prober.go:28] interesting pod/console-6479f6d896-j6kqz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body= Mar 08 00:36:07.174634 master-0 kubenswrapper[23041]: I0308 00:36:07.173630 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Mar 08 00:36:09.177010 master-0 kubenswrapper[23041]: I0308 00:36:09.176952 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-qldx6_84522c03-fd7b-4be7-9413-84e510b9dc5a/cluster-baremetal-operator/0.log" Mar 08 00:36:09.177010 master-0 kubenswrapper[23041]: I0308 00:36:09.177022 23041 generic.go:334] "Generic (PLEG): container finished" podID="84522c03-fd7b-4be7-9413-84e510b9dc5a" 
containerID="8db7391cc36022b8c4fa21dd3d33b8e00c7e53dfad0cc53ffef3d1fff055fc5c" exitCode=1 Mar 08 00:36:09.178060 master-0 kubenswrapper[23041]: I0308 00:36:09.177090 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6" event={"ID":"84522c03-fd7b-4be7-9413-84e510b9dc5a","Type":"ContainerDied","Data":"8db7391cc36022b8c4fa21dd3d33b8e00c7e53dfad0cc53ffef3d1fff055fc5c"} Mar 08 00:36:09.178060 master-0 kubenswrapper[23041]: I0308 00:36:09.177787 23041 scope.go:117] "RemoveContainer" containerID="8db7391cc36022b8c4fa21dd3d33b8e00c7e53dfad0cc53ffef3d1fff055fc5c" Mar 08 00:36:09.180301 master-0 kubenswrapper[23041]: I0308 00:36:09.180266 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6686554ddc-8krst_460f09d8-a143-48d2-9db0-be247386984a/control-plane-machine-set-operator/0.log" Mar 08 00:36:09.180433 master-0 kubenswrapper[23041]: I0308 00:36:09.180325 23041 generic.go:334] "Generic (PLEG): container finished" podID="460f09d8-a143-48d2-9db0-be247386984a" containerID="da7f059bc7425c70bc4a951221ce223000707cc405db21efd57cd77b538e3498" exitCode=1 Mar 08 00:36:09.180433 master-0 kubenswrapper[23041]: I0308 00:36:09.180355 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-8krst" event={"ID":"460f09d8-a143-48d2-9db0-be247386984a","Type":"ContainerDied","Data":"da7f059bc7425c70bc4a951221ce223000707cc405db21efd57cd77b538e3498"} Mar 08 00:36:09.180753 master-0 kubenswrapper[23041]: I0308 00:36:09.180725 23041 scope.go:117] "RemoveContainer" containerID="da7f059bc7425c70bc4a951221ce223000707cc405db21efd57cd77b538e3498" Mar 08 00:36:09.435289 master-0 kubenswrapper[23041]: I0308 00:36:09.435243 23041 patch_prober.go:28] interesting pod/console-c45bf598-vngbg container/console namespace/openshift-console: Startup probe status=failure output="Get 
\"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body= Mar 08 00:36:09.435587 master-0 kubenswrapper[23041]: I0308 00:36:09.435534 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" Mar 08 00:36:10.193745 master-0 kubenswrapper[23041]: I0308 00:36:10.193668 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-qldx6_84522c03-fd7b-4be7-9413-84e510b9dc5a/cluster-baremetal-operator/0.log" Mar 08 00:36:10.194353 master-0 kubenswrapper[23041]: I0308 00:36:10.193808 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6" event={"ID":"84522c03-fd7b-4be7-9413-84e510b9dc5a","Type":"ContainerStarted","Data":"002b35d9f13fd9b961ace368548088f2a157c38a2a4d1df7e9d9f528e36132e5"} Mar 08 00:36:10.196960 master-0 kubenswrapper[23041]: I0308 00:36:10.196923 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6686554ddc-8krst_460f09d8-a143-48d2-9db0-be247386984a/control-plane-machine-set-operator/0.log" Mar 08 00:36:10.197044 master-0 kubenswrapper[23041]: I0308 00:36:10.196975 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-8krst" event={"ID":"460f09d8-a143-48d2-9db0-be247386984a","Type":"ContainerStarted","Data":"6741b38e940f20380d166a7b1b7f083e9b1886437080f466afbb2dd0c29fd2dc"} Mar 08 00:36:12.692671 master-0 kubenswrapper[23041]: I0308 00:36:12.692576 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:36:12.692671 master-0 kubenswrapper[23041]: I0308 00:36:12.692653 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:36:13.808689 master-0 kubenswrapper[23041]: I0308 00:36:13.808608 23041 scope.go:117] "RemoveContainer" containerID="80566af5865f46048bb53d6f98fcbe8ba40094b34956912b1870f5b960f85114" Mar 08 00:36:14.233598 master-0 kubenswrapper[23041]: I0308 00:36:14.233404 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-vd52m_e97435ee-522e-427d-9efc-40bc3d2b0d02/snapshot-controller/3.log" Mar 08 00:36:14.233598 master-0 kubenswrapper[23041]: I0308 00:36:14.233505 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-vd52m" event={"ID":"e97435ee-522e-427d-9efc-40bc3d2b0d02","Type":"ContainerStarted","Data":"997e59f9ca748e13c95369b2b430cd81418e5f12cea9c358b310a332fb868350"} Mar 08 00:36:15.171979 master-0 kubenswrapper[23041]: E0308 00:36:15.171874 23041 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 00:36:15.694353 master-0 kubenswrapper[23041]: I0308 00:36:15.693556 23041 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 00:36:15.694353 master-0 kubenswrapper[23041]: I0308 00:36:15.693667 23041 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 00:36:17.173359 master-0 kubenswrapper[23041]: I0308 00:36:17.173254 23041 patch_prober.go:28] interesting pod/console-6479f6d896-j6kqz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body= Mar 08 00:36:17.173359 master-0 kubenswrapper[23041]: I0308 00:36:17.173364 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Mar 08 00:36:18.270544 master-0 kubenswrapper[23041]: I0308 00:36:18.270475 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-754bdc9f9d-xpl2b_7317ceda-df6f-4826-aa1a-15304c2b0fcd/machine-approver-controller/0.log" Mar 08 00:36:18.271174 master-0 kubenswrapper[23041]: I0308 00:36:18.271139 23041 generic.go:334] "Generic (PLEG): container finished" podID="7317ceda-df6f-4826-aa1a-15304c2b0fcd" containerID="4bf845493478fab338d4b9ab87cadf5b607d6c9eebb501f29c76a34495978f4a" exitCode=255 Mar 08 00:36:18.271309 master-0 kubenswrapper[23041]: I0308 00:36:18.271184 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-xpl2b" 
event={"ID":"7317ceda-df6f-4826-aa1a-15304c2b0fcd","Type":"ContainerDied","Data":"4bf845493478fab338d4b9ab87cadf5b607d6c9eebb501f29c76a34495978f4a"} Mar 08 00:36:18.271899 master-0 kubenswrapper[23041]: I0308 00:36:18.271862 23041 scope.go:117] "RemoveContainer" containerID="4bf845493478fab338d4b9ab87cadf5b607d6c9eebb501f29c76a34495978f4a" Mar 08 00:36:19.285036 master-0 kubenswrapper[23041]: I0308 00:36:19.284934 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-754bdc9f9d-xpl2b_7317ceda-df6f-4826-aa1a-15304c2b0fcd/machine-approver-controller/0.log" Mar 08 00:36:19.286057 master-0 kubenswrapper[23041]: I0308 00:36:19.285616 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-xpl2b" event={"ID":"7317ceda-df6f-4826-aa1a-15304c2b0fcd","Type":"ContainerStarted","Data":"21157871c48968cc85068688236eb6df4ce64ab18bb15aa9a7fa3b3803554c78"} Mar 08 00:36:19.434342 master-0 kubenswrapper[23041]: I0308 00:36:19.434262 23041 patch_prober.go:28] interesting pod/console-c45bf598-vngbg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body= Mar 08 00:36:19.434342 master-0 kubenswrapper[23041]: I0308 00:36:19.434332 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" Mar 08 00:36:19.485307 master-0 kubenswrapper[23041]: E0308 00:36:19.485244 23041 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request 
canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 08 00:36:20.297987 master-0 kubenswrapper[23041]: I0308 00:36:20.297890 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_1453f6461bf5d599ad65a4656343ee91/kube-scheduler-cert-syncer/0.log" Mar 08 00:36:20.299099 master-0 kubenswrapper[23041]: I0308 00:36:20.298953 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_1453f6461bf5d599ad65a4656343ee91/kube-scheduler/0.log" Mar 08 00:36:20.299595 master-0 kubenswrapper[23041]: I0308 00:36:20.299525 23041 generic.go:334] "Generic (PLEG): container finished" podID="1453f6461bf5d599ad65a4656343ee91" containerID="4a9781cd54b6849919a2e1ded759e631816b24203f18a3cce8ca11053a994a64" exitCode=1 Mar 08 00:36:20.299714 master-0 kubenswrapper[23041]: I0308 00:36:20.299591 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1453f6461bf5d599ad65a4656343ee91","Type":"ContainerDied","Data":"4a9781cd54b6849919a2e1ded759e631816b24203f18a3cce8ca11053a994a64"} Mar 08 00:36:20.300580 master-0 kubenswrapper[23041]: I0308 00:36:20.300531 23041 scope.go:117] "RemoveContainer" containerID="4a9781cd54b6849919a2e1ded759e631816b24203f18a3cce8ca11053a994a64" Mar 08 00:36:21.311099 master-0 kubenswrapper[23041]: I0308 00:36:21.311013 23041 generic.go:334] "Generic (PLEG): container finished" podID="3fee96d7-75a7-46e4-9707-7bd292f10b84" containerID="c756595c785c16416805ae901384336bd79f4ee2a5921d1dafe30a90cfdb5b66" exitCode=0 Mar 08 00:36:21.311923 master-0 kubenswrapper[23041]: I0308 00:36:21.311084 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2" 
event={"ID":"3fee96d7-75a7-46e4-9707-7bd292f10b84","Type":"ContainerDied","Data":"c756595c785c16416805ae901384336bd79f4ee2a5921d1dafe30a90cfdb5b66"} Mar 08 00:36:21.311923 master-0 kubenswrapper[23041]: I0308 00:36:21.311158 23041 scope.go:117] "RemoveContainer" containerID="52998e126ba781dde5afc9f3fdb3cf64a817b4497f29c74abbb0c4aa09aa4379" Mar 08 00:36:21.311923 master-0 kubenswrapper[23041]: I0308 00:36:21.311835 23041 scope.go:117] "RemoveContainer" containerID="c756595c785c16416805ae901384336bd79f4ee2a5921d1dafe30a90cfdb5b66" Mar 08 00:36:21.315301 master-0 kubenswrapper[23041]: I0308 00:36:21.315268 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_1453f6461bf5d599ad65a4656343ee91/kube-scheduler-cert-syncer/0.log" Mar 08 00:36:21.316014 master-0 kubenswrapper[23041]: I0308 00:36:21.315964 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_1453f6461bf5d599ad65a4656343ee91/kube-scheduler/0.log" Mar 08 00:36:21.316492 master-0 kubenswrapper[23041]: I0308 00:36:21.316455 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1453f6461bf5d599ad65a4656343ee91","Type":"ContainerStarted","Data":"21854ab4700806c891145dd8952eabc13b1e6e2a7204f749ddb9d22649d43a2f"} Mar 08 00:36:22.325696 master-0 kubenswrapper[23041]: I0308 00:36:22.325619 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-m77x2" event={"ID":"3fee96d7-75a7-46e4-9707-7bd292f10b84","Type":"ContainerStarted","Data":"502f543d19517b48f57908aebf069b6618d049fc7b434141c2966a773df48ad7"} Mar 08 00:36:25.173157 master-0 kubenswrapper[23041]: E0308 00:36:25.173052 23041 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 00:36:25.693471 master-0 kubenswrapper[23041]: I0308 00:36:25.693408 23041 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 00:36:25.693932 master-0 kubenswrapper[23041]: I0308 00:36:25.693893 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 00:36:27.173368 master-0 kubenswrapper[23041]: I0308 00:36:27.173197 23041 patch_prober.go:28] interesting pod/console-6479f6d896-j6kqz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body= Mar 08 00:36:27.173368 master-0 kubenswrapper[23041]: I0308 00:36:27.173368 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Mar 08 00:36:27.386444 master-0 kubenswrapper[23041]: I0308 00:36:27.386342 23041 generic.go:334] "Generic (PLEG): container finished" podID="2ef25237-ab1c-41a6-a0a7-07642094de26" 
containerID="f6365e505366ec41e1d8493468c3de2f623d6298fe0f596459357802845842ee" exitCode=0 Mar 08 00:36:27.386444 master-0 kubenswrapper[23041]: I0308 00:36:27.386405 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5ddc94864c-7nwdc" event={"ID":"2ef25237-ab1c-41a6-a0a7-07642094de26","Type":"ContainerDied","Data":"f6365e505366ec41e1d8493468c3de2f623d6298fe0f596459357802845842ee"} Mar 08 00:36:27.387073 master-0 kubenswrapper[23041]: I0308 00:36:27.387032 23041 scope.go:117] "RemoveContainer" containerID="f6365e505366ec41e1d8493468c3de2f623d6298fe0f596459357802845842ee" Mar 08 00:36:28.398825 master-0 kubenswrapper[23041]: I0308 00:36:28.398716 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5ddc94864c-7nwdc" event={"ID":"2ef25237-ab1c-41a6-a0a7-07642094de26","Type":"ContainerStarted","Data":"0c0d68464c27e5ba353a92f1cd19f1e864598785f0cc2ef250fbbb0d2e2dd9ba"} Mar 08 00:36:28.399814 master-0 kubenswrapper[23041]: I0308 00:36:28.399124 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5ddc94864c-7nwdc" Mar 08 00:36:28.403377 master-0 kubenswrapper[23041]: I0308 00:36:28.403326 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5ddc94864c-7nwdc" Mar 08 00:36:29.434754 master-0 kubenswrapper[23041]: I0308 00:36:29.434681 23041 patch_prober.go:28] interesting pod/console-c45bf598-vngbg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body= Mar 08 00:36:29.435766 master-0 kubenswrapper[23041]: I0308 00:36:29.434771 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" 
containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" Mar 08 00:36:30.764625 master-0 kubenswrapper[23041]: E0308 00:36:30.764321 23041 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event=< Mar 08 00:36:30.764625 master-0 kubenswrapper[23041]: &Event{ObjectMeta:{console-6479f6d896-j6kqz.189ab6706537d268 openshift-console 14978 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-console,Name:console-6479f6d896-j6kqz,UID:67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b,APIVersion:v1,ResourceVersion:14612,FieldPath:spec.containers{console},},Reason:ProbeError,Message:Startup probe error: Get "https://10.128.0.103:8443/health": dial tcp 10.128.0.103:8443: connect: connection refused Mar 08 00:36:30.764625 master-0 kubenswrapper[23041]: body: Mar 08 00:36:30.764625 master-0 kubenswrapper[23041]: ,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:33:17 +0000 UTC,LastTimestamp:2026-03-08 00:33:47.17314611 +0000 UTC m=+132.645982714,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,} Mar 08 00:36:30.764625 master-0 kubenswrapper[23041]: > Mar 08 00:36:32.693146 master-0 kubenswrapper[23041]: I0308 00:36:32.693077 23041 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused" start-of-body= Mar 08 00:36:32.693838 master-0 kubenswrapper[23041]: I0308 00:36:32.693158 23041 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused" Mar 08 00:36:32.693838 master-0 kubenswrapper[23041]: I0308 00:36:32.693255 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:36:32.694937 master-0 kubenswrapper[23041]: I0308 00:36:32.694887 23041 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"f77b387d0621dd0b1c2cc2e9182105197b90e6cce992b464d767da923bbaf487"} pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 08 00:36:32.695031 master-0 kubenswrapper[23041]: I0308 00:36:32.694991 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="cluster-policy-controller" containerID="cri-o://f77b387d0621dd0b1c2cc2e9182105197b90e6cce992b464d767da923bbaf487" gracePeriod=30 Mar 08 00:36:33.455767 master-0 kubenswrapper[23041]: I0308 00:36:33.455718 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2ab662059bb326d13a07bf5700e4f545/cluster-policy-controller/1.log" Mar 08 00:36:33.456611 master-0 kubenswrapper[23041]: I0308 00:36:33.456586 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2ab662059bb326d13a07bf5700e4f545/kube-controller-manager/1.log" Mar 08 00:36:33.457748 master-0 kubenswrapper[23041]: I0308 00:36:33.457699 
23041 generic.go:334] "Generic (PLEG): container finished" podID="2ab662059bb326d13a07bf5700e4f545" containerID="f77b387d0621dd0b1c2cc2e9182105197b90e6cce992b464d767da923bbaf487" exitCode=255 Mar 08 00:36:33.457835 master-0 kubenswrapper[23041]: I0308 00:36:33.457795 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2ab662059bb326d13a07bf5700e4f545","Type":"ContainerDied","Data":"f77b387d0621dd0b1c2cc2e9182105197b90e6cce992b464d767da923bbaf487"} Mar 08 00:36:33.457895 master-0 kubenswrapper[23041]: I0308 00:36:33.457840 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2ab662059bb326d13a07bf5700e4f545","Type":"ContainerStarted","Data":"760a26fed2f93cc2a4282a5d936c275cd755f53d5897c4bf5d65fda0551ee6d9"} Mar 08 00:36:33.457895 master-0 kubenswrapper[23041]: I0308 00:36:33.457862 23041 scope.go:117] "RemoveContainer" containerID="68d2d556561da5978f104c45f8ecd8a09a79c04161083561a003cabefd7b6ac9" Mar 08 00:36:34.468480 master-0 kubenswrapper[23041]: I0308 00:36:34.468415 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2ab662059bb326d13a07bf5700e4f545/cluster-policy-controller/1.log" Mar 08 00:36:34.469829 master-0 kubenswrapper[23041]: I0308 00:36:34.469795 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2ab662059bb326d13a07bf5700e4f545/kube-controller-manager/1.log" Mar 08 00:36:34.814945 master-0 kubenswrapper[23041]: I0308 00:36:34.814886 23041 kubelet.go:1505] "Image garbage collection succeeded" Mar 08 00:36:35.174346 master-0 kubenswrapper[23041]: E0308 00:36:35.174154 23041 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 00:36:36.486514 master-0 kubenswrapper[23041]: E0308 00:36:36.486443 23041 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 08 00:36:37.173643 master-0 kubenswrapper[23041]: I0308 00:36:37.173532 23041 patch_prober.go:28] interesting pod/console-6479f6d896-j6kqz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body= Mar 08 00:36:37.174053 master-0 kubenswrapper[23041]: I0308 00:36:37.173638 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Mar 08 00:36:38.107609 master-0 kubenswrapper[23041]: E0308 00:36:38.107479 23041 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 08 00:36:38.510551 master-0 kubenswrapper[23041]: I0308 00:36:38.510501 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"196dbd68be02322b0914410604ba4b5adb7eb87c7e033a8aabc14cb4aea21ef5"} Mar 08 00:36:39.435194 master-0 kubenswrapper[23041]: I0308 00:36:39.435063 23041 patch_prober.go:28] interesting pod/console-c45bf598-vngbg container/console 
namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body= Mar 08 00:36:39.435194 master-0 kubenswrapper[23041]: I0308 00:36:39.435140 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" Mar 08 00:36:39.525097 master-0 kubenswrapper[23041]: I0308 00:36:39.525028 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"7204360a0a1c9267b2da7fc33e39e2f234f58c3eacffbacad8ce71eb1c441e57"} Mar 08 00:36:39.525305 master-0 kubenswrapper[23041]: I0308 00:36:39.525101 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"dfc117efa2cc6d3c52b60b1f7db8e1c6e94ca9eb6388dd8eae8d975677f20f3f"} Mar 08 00:36:39.525305 master-0 kubenswrapper[23041]: I0308 00:36:39.525123 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"20b05503bab56e55ea4f8fe0f3edd2998190abe86cc2d71d0ad6bb13e59e4740"} Mar 08 00:36:40.548649 master-0 kubenswrapper[23041]: I0308 00:36:40.548574 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"4d32aa85b8d75a34350c349cd32fee8e6fb5463c0e2b26e2d7c06ba96141dc36"} Mar 08 00:36:40.550142 master-0 kubenswrapper[23041]: I0308 00:36:40.549162 23041 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" 
podUID="3263f927-5b7c-41e5-98b8-533a08784cb3" Mar 08 00:36:40.550142 master-0 kubenswrapper[23041]: I0308 00:36:40.549489 23041 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="3263f927-5b7c-41e5-98b8-533a08784cb3" Mar 08 00:36:42.692809 master-0 kubenswrapper[23041]: I0308 00:36:42.692754 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:36:42.692809 master-0 kubenswrapper[23041]: I0308 00:36:42.692813 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:36:43.831097 master-0 kubenswrapper[23041]: I0308 00:36:43.831011 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0" Mar 08 00:36:43.831097 master-0 kubenswrapper[23041]: I0308 00:36:43.831112 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Mar 08 00:36:44.582351 master-0 kubenswrapper[23041]: I0308 00:36:44.582268 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-vd52m_e97435ee-522e-427d-9efc-40bc3d2b0d02/snapshot-controller/4.log" Mar 08 00:36:44.583343 master-0 kubenswrapper[23041]: I0308 00:36:44.583278 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-vd52m_e97435ee-522e-427d-9efc-40bc3d2b0d02/snapshot-controller/3.log" Mar 08 00:36:44.583469 master-0 kubenswrapper[23041]: I0308 00:36:44.583380 23041 generic.go:334] "Generic (PLEG): container finished" podID="e97435ee-522e-427d-9efc-40bc3d2b0d02" containerID="997e59f9ca748e13c95369b2b430cd81418e5f12cea9c358b310a332fb868350" exitCode=1 Mar 08 00:36:44.583469 master-0 kubenswrapper[23041]: I0308 00:36:44.583433 23041 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-vd52m" event={"ID":"e97435ee-522e-427d-9efc-40bc3d2b0d02","Type":"ContainerDied","Data":"997e59f9ca748e13c95369b2b430cd81418e5f12cea9c358b310a332fb868350"} Mar 08 00:36:44.583605 master-0 kubenswrapper[23041]: I0308 00:36:44.583498 23041 scope.go:117] "RemoveContainer" containerID="80566af5865f46048bb53d6f98fcbe8ba40094b34956912b1870f5b960f85114" Mar 08 00:36:44.584619 master-0 kubenswrapper[23041]: I0308 00:36:44.584567 23041 scope.go:117] "RemoveContainer" containerID="997e59f9ca748e13c95369b2b430cd81418e5f12cea9c358b310a332fb868350" Mar 08 00:36:44.585038 master-0 kubenswrapper[23041]: E0308 00:36:44.584968 23041 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-7577d6f48-vd52m_openshift-cluster-storage-operator(e97435ee-522e-427d-9efc-40bc3d2b0d02)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-vd52m" podUID="e97435ee-522e-427d-9efc-40bc3d2b0d02" Mar 08 00:36:45.175098 master-0 kubenswrapper[23041]: E0308 00:36:45.175001 23041 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 00:36:45.175098 master-0 kubenswrapper[23041]: E0308 00:36:45.175061 23041 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 00:36:45.596360 master-0 kubenswrapper[23041]: I0308 00:36:45.596196 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-vd52m_e97435ee-522e-427d-9efc-40bc3d2b0d02/snapshot-controller/4.log" Mar 08 
00:36:45.693903 master-0 kubenswrapper[23041]: I0308 00:36:45.693822 23041 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": context deadline exceeded" start-of-body= Mar 08 00:36:45.694158 master-0 kubenswrapper[23041]: I0308 00:36:45.693935 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": context deadline exceeded" Mar 08 00:36:47.173503 master-0 kubenswrapper[23041]: I0308 00:36:47.173393 23041 patch_prober.go:28] interesting pod/console-6479f6d896-j6kqz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body= Mar 08 00:36:47.174816 master-0 kubenswrapper[23041]: I0308 00:36:47.173519 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Mar 08 00:36:47.252888 master-0 kubenswrapper[23041]: I0308 00:36:47.252785 23041 status_manager.go:851] "Failed to get status for pod" podUID="29c709c82970b529e7b9b895aa92ef05" pod="openshift-etcd/etcd-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods etcd-master-0)" Mar 08 00:36:49.435018 master-0 kubenswrapper[23041]: I0308 00:36:49.434957 23041 patch_prober.go:28] interesting pod/console-c45bf598-vngbg container/console namespace/openshift-console: Startup 
probe status=failure output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body= Mar 08 00:36:49.435018 master-0 kubenswrapper[23041]: I0308 00:36:49.435019 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" Mar 08 00:36:52.536295 master-0 kubenswrapper[23041]: I0308 00:36:52.536245 23041 patch_prober.go:28] interesting pod/openshift-kube-scheduler-master-0 container/kube-scheduler namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get \"https://192.168.32.10:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 00:36:52.536888 master-0 kubenswrapper[23041]: I0308 00:36:52.536321 23041 patch_prober.go:28] interesting pod/openshift-kube-scheduler-master-0 container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.32.10:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 00:36:52.536990 master-0 kubenswrapper[23041]: I0308 00:36:52.536944 23041 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="1453f6461bf5d599ad65a4656343ee91" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.32.10:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 00:36:52.537042 master-0 kubenswrapper[23041]: I0308 00:36:52.536853 23041 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="1453f6461bf5d599ad65a4656343ee91" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.32.10:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 00:36:53.489127 master-0 kubenswrapper[23041]: E0308 00:36:53.488930 23041 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 08 00:36:53.867613 master-0 kubenswrapper[23041]: I0308 00:36:53.867480 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0" Mar 08 00:36:55.694151 master-0 kubenswrapper[23041]: I0308 00:36:55.694076 23041 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 00:36:55.694893 master-0 kubenswrapper[23041]: I0308 00:36:55.694153 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 00:36:57.173196 master-0 kubenswrapper[23041]: I0308 00:36:57.173110 23041 patch_prober.go:28] interesting pod/console-6479f6d896-j6kqz container/console namespace/openshift-console: Startup probe status=failure 
output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body= Mar 08 00:36:57.173837 master-0 kubenswrapper[23041]: I0308 00:36:57.173249 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Mar 08 00:36:57.808433 master-0 kubenswrapper[23041]: I0308 00:36:57.808388 23041 scope.go:117] "RemoveContainer" containerID="997e59f9ca748e13c95369b2b430cd81418e5f12cea9c358b310a332fb868350" Mar 08 00:36:57.808990 master-0 kubenswrapper[23041]: E0308 00:36:57.808961 23041 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-7577d6f48-vd52m_openshift-cluster-storage-operator(e97435ee-522e-427d-9efc-40bc3d2b0d02)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-vd52m" podUID="e97435ee-522e-427d-9efc-40bc3d2b0d02" Mar 08 00:36:58.854777 master-0 kubenswrapper[23041]: I0308 00:36:58.854660 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0" Mar 08 00:36:59.434896 master-0 kubenswrapper[23041]: I0308 00:36:59.434816 23041 patch_prober.go:28] interesting pod/console-c45bf598-vngbg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body= Mar 08 00:36:59.434896 master-0 kubenswrapper[23041]: I0308 00:36:59.434884 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" 
probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" Mar 08 00:37:01.540356 master-0 kubenswrapper[23041]: I0308 00:37:01.540271 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 00:37:03.604327 master-0 kubenswrapper[23041]: I0308 00:37:03.604138 23041 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": read tcp 127.0.0.1:46284->127.0.0.1:10357: read: connection reset by peer" start-of-body= Mar 08 00:37:03.604327 master-0 kubenswrapper[23041]: I0308 00:37:03.604230 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": read tcp 127.0.0.1:46284->127.0.0.1:10357: read: connection reset by peer" Mar 08 00:37:03.604327 master-0 kubenswrapper[23041]: I0308 00:37:03.604285 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:37:03.605501 master-0 kubenswrapper[23041]: I0308 00:37:03.605122 23041 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"760a26fed2f93cc2a4282a5d936c275cd755f53d5897c4bf5d65fda0551ee6d9"} pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 08 00:37:03.605501 master-0 kubenswrapper[23041]: I0308 00:37:03.605224 23041 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="cluster-policy-controller" containerID="cri-o://760a26fed2f93cc2a4282a5d936c275cd755f53d5897c4bf5d65fda0551ee6d9" gracePeriod=30 Mar 08 00:37:03.773412 master-0 kubenswrapper[23041]: I0308 00:37:03.773340 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2ab662059bb326d13a07bf5700e4f545/cluster-policy-controller/2.log" Mar 08 00:37:03.775756 master-0 kubenswrapper[23041]: I0308 00:37:03.774315 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2ab662059bb326d13a07bf5700e4f545/cluster-policy-controller/1.log" Mar 08 00:37:03.775949 master-0 kubenswrapper[23041]: I0308 00:37:03.775901 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2ab662059bb326d13a07bf5700e4f545/kube-controller-manager/1.log" Mar 08 00:37:03.777057 master-0 kubenswrapper[23041]: I0308 00:37:03.777008 23041 generic.go:334] "Generic (PLEG): container finished" podID="2ab662059bb326d13a07bf5700e4f545" containerID="760a26fed2f93cc2a4282a5d936c275cd755f53d5897c4bf5d65fda0551ee6d9" exitCode=255 Mar 08 00:37:03.777116 master-0 kubenswrapper[23041]: I0308 00:37:03.777068 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2ab662059bb326d13a07bf5700e4f545","Type":"ContainerDied","Data":"760a26fed2f93cc2a4282a5d936c275cd755f53d5897c4bf5d65fda0551ee6d9"} Mar 08 00:37:03.777116 master-0 kubenswrapper[23041]: I0308 00:37:03.777110 23041 scope.go:117] "RemoveContainer" containerID="f77b387d0621dd0b1c2cc2e9182105197b90e6cce992b464d767da923bbaf487" Mar 08 00:37:04.788732 master-0 kubenswrapper[23041]: I0308 00:37:04.788650 23041 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2ab662059bb326d13a07bf5700e4f545/cluster-policy-controller/2.log" Mar 08 00:37:04.790738 master-0 kubenswrapper[23041]: I0308 00:37:04.790693 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2ab662059bb326d13a07bf5700e4f545/kube-controller-manager/1.log" Mar 08 00:37:04.791719 master-0 kubenswrapper[23041]: I0308 00:37:04.791659 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2ab662059bb326d13a07bf5700e4f545","Type":"ContainerStarted","Data":"3c8e1ec133c07a7bbcbdb1ac76d720856c26fe6e0b9e0de82b700d0e47ceeb09"} Mar 08 00:37:07.172846 master-0 kubenswrapper[23041]: I0308 00:37:07.172763 23041 patch_prober.go:28] interesting pod/console-6479f6d896-j6kqz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body= Mar 08 00:37:07.173612 master-0 kubenswrapper[23041]: I0308 00:37:07.172858 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Mar 08 00:37:08.811244 master-0 kubenswrapper[23041]: I0308 00:37:08.811101 23041 scope.go:117] "RemoveContainer" containerID="997e59f9ca748e13c95369b2b430cd81418e5f12cea9c358b310a332fb868350" Mar 08 00:37:09.435381 master-0 kubenswrapper[23041]: I0308 00:37:09.435314 23041 patch_prober.go:28] interesting pod/console-c45bf598-vngbg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.109:8443/health\": 
dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body= Mar 08 00:37:09.435648 master-0 kubenswrapper[23041]: I0308 00:37:09.435406 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" Mar 08 00:37:09.836014 master-0 kubenswrapper[23041]: I0308 00:37:09.835924 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-vd52m_e97435ee-522e-427d-9efc-40bc3d2b0d02/snapshot-controller/4.log" Mar 08 00:37:09.836787 master-0 kubenswrapper[23041]: I0308 00:37:09.836025 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-vd52m" event={"ID":"e97435ee-522e-427d-9efc-40bc3d2b0d02","Type":"ContainerStarted","Data":"aa574e8fc689488c900665c4a9000b468c84e69a2d16cdf3a5460eeb89b349a1"} Mar 08 00:37:10.490919 master-0 kubenswrapper[23041]: E0308 00:37:10.490810 23041 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 08 00:37:12.693240 master-0 kubenswrapper[23041]: I0308 00:37:12.693160 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:37:12.694113 master-0 kubenswrapper[23041]: I0308 00:37:12.693324 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:37:14.552064 master-0 kubenswrapper[23041]: E0308 00:37:14.551972 23041 
mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 08 00:37:14.874089 master-0 kubenswrapper[23041]: I0308 00:37:14.873763 23041 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="3263f927-5b7c-41e5-98b8-533a08784cb3" Mar 08 00:37:14.874089 master-0 kubenswrapper[23041]: I0308 00:37:14.873983 23041 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="3263f927-5b7c-41e5-98b8-533a08784cb3" Mar 08 00:37:15.472582 master-0 kubenswrapper[23041]: E0308 00:37:15.472407 23041 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:37:05Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:37:05Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:37:05Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:37:05Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 00:37:15.694294 master-0 kubenswrapper[23041]: I0308 00:37:15.693026 23041 
patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 00:37:15.694294 master-0 kubenswrapper[23041]: I0308 00:37:15.693114 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 00:37:17.173294 master-0 kubenswrapper[23041]: I0308 00:37:17.173140 23041 patch_prober.go:28] interesting pod/console-6479f6d896-j6kqz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body= Mar 08 00:37:17.173294 master-0 kubenswrapper[23041]: I0308 00:37:17.173277 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Mar 08 00:37:19.435440 master-0 kubenswrapper[23041]: I0308 00:37:19.435359 23041 patch_prober.go:28] interesting pod/console-c45bf598-vngbg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body= Mar 08 00:37:19.437105 master-0 kubenswrapper[23041]: I0308 00:37:19.435447 23041 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" Mar 08 00:37:25.473263 master-0 kubenswrapper[23041]: E0308 00:37:25.473133 23041 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 00:37:25.693714 master-0 kubenswrapper[23041]: I0308 00:37:25.693636 23041 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 00:37:25.693913 master-0 kubenswrapper[23041]: I0308 00:37:25.693711 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 00:37:27.173582 master-0 kubenswrapper[23041]: I0308 00:37:27.173481 23041 patch_prober.go:28] interesting pod/console-6479f6d896-j6kqz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body= Mar 08 00:37:27.174403 master-0 kubenswrapper[23041]: I0308 00:37:27.173587 23041 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Mar 08 00:37:27.492135 master-0 kubenswrapper[23041]: E0308 00:37:27.491913 23041 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 08 00:37:29.434786 master-0 kubenswrapper[23041]: I0308 00:37:29.434717 23041 patch_prober.go:28] interesting pod/console-c45bf598-vngbg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body= Mar 08 00:37:29.435870 master-0 kubenswrapper[23041]: I0308 00:37:29.435481 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" Mar 08 00:37:34.964158 master-0 kubenswrapper[23041]: I0308 00:37:34.964079 23041 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": read tcp 127.0.0.1:43960->127.0.0.1:10357: read: connection reset by peer" start-of-body= Mar 08 00:37:34.965417 master-0 kubenswrapper[23041]: I0308 00:37:34.964168 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2ab662059bb326d13a07bf5700e4f545" 
containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": read tcp 127.0.0.1:43960->127.0.0.1:10357: read: connection reset by peer" Mar 08 00:37:34.965417 master-0 kubenswrapper[23041]: I0308 00:37:34.964260 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:37:34.965417 master-0 kubenswrapper[23041]: I0308 00:37:34.965276 23041 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"3c8e1ec133c07a7bbcbdb1ac76d720856c26fe6e0b9e0de82b700d0e47ceeb09"} pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 08 00:37:34.967755 master-0 kubenswrapper[23041]: I0308 00:37:34.965454 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="cluster-policy-controller" containerID="cri-o://3c8e1ec133c07a7bbcbdb1ac76d720856c26fe6e0b9e0de82b700d0e47ceeb09" gracePeriod=30 Mar 08 00:37:35.033380 master-0 kubenswrapper[23041]: I0308 00:37:35.033316 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2ab662059bb326d13a07bf5700e4f545/cluster-policy-controller/3.log" Mar 08 00:37:35.034024 master-0 kubenswrapper[23041]: I0308 00:37:35.033981 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2ab662059bb326d13a07bf5700e4f545/cluster-policy-controller/2.log" Mar 08 00:37:35.035076 master-0 kubenswrapper[23041]: I0308 00:37:35.035043 23041 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2ab662059bb326d13a07bf5700e4f545/kube-controller-manager/1.log" Mar 08 00:37:35.035953 master-0 kubenswrapper[23041]: I0308 00:37:35.035897 23041 generic.go:334] "Generic (PLEG): container finished" podID="2ab662059bb326d13a07bf5700e4f545" containerID="3c8e1ec133c07a7bbcbdb1ac76d720856c26fe6e0b9e0de82b700d0e47ceeb09" exitCode=255 Mar 08 00:37:35.036030 master-0 kubenswrapper[23041]: I0308 00:37:35.035955 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2ab662059bb326d13a07bf5700e4f545","Type":"ContainerDied","Data":"3c8e1ec133c07a7bbcbdb1ac76d720856c26fe6e0b9e0de82b700d0e47ceeb09"} Mar 08 00:37:35.036172 master-0 kubenswrapper[23041]: I0308 00:37:35.036031 23041 scope.go:117] "RemoveContainer" containerID="760a26fed2f93cc2a4282a5d936c275cd755f53d5897c4bf5d65fda0551ee6d9" Mar 08 00:37:35.473704 master-0 kubenswrapper[23041]: E0308 00:37:35.473566 23041 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 00:37:35.495760 master-0 kubenswrapper[23041]: E0308 00:37:35.495710 23041 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(2ab662059bb326d13a07bf5700e4f545)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2ab662059bb326d13a07bf5700e4f545" Mar 08 00:37:36.046683 master-0 kubenswrapper[23041]: I0308 00:37:36.046624 23041 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2ab662059bb326d13a07bf5700e4f545/cluster-policy-controller/3.log" Mar 08 00:37:36.047888 master-0 kubenswrapper[23041]: I0308 00:37:36.047851 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2ab662059bb326d13a07bf5700e4f545/kube-controller-manager/1.log" Mar 08 00:37:36.049712 master-0 kubenswrapper[23041]: I0308 00:37:36.049670 23041 scope.go:117] "RemoveContainer" containerID="3c8e1ec133c07a7bbcbdb1ac76d720856c26fe6e0b9e0de82b700d0e47ceeb09" Mar 08 00:37:36.050024 master-0 kubenswrapper[23041]: E0308 00:37:36.049989 23041 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(2ab662059bb326d13a07bf5700e4f545)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2ab662059bb326d13a07bf5700e4f545" Mar 08 00:37:37.172753 master-0 kubenswrapper[23041]: I0308 00:37:37.172661 23041 patch_prober.go:28] interesting pod/console-6479f6d896-j6kqz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body= Mar 08 00:37:37.173885 master-0 kubenswrapper[23041]: I0308 00:37:37.172750 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Mar 08 00:37:39.435420 master-0 kubenswrapper[23041]: I0308 00:37:39.435358 23041 patch_prober.go:28] interesting pod/console-c45bf598-vngbg 
container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body= Mar 08 00:37:39.435957 master-0 kubenswrapper[23041]: I0308 00:37:39.435456 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" Mar 08 00:37:42.249644 master-0 kubenswrapper[23041]: I0308 00:37:42.249554 23041 scope.go:117] "RemoveContainer" containerID="d9e68f104ff64d94c7bc0d96bb172cf910cbd61300635334957f518556f38bfc" Mar 08 00:37:42.695565 master-0 kubenswrapper[23041]: I0308 00:37:42.692526 23041 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:37:42.695565 master-0 kubenswrapper[23041]: I0308 00:37:42.694535 23041 scope.go:117] "RemoveContainer" containerID="3c8e1ec133c07a7bbcbdb1ac76d720856c26fe6e0b9e0de82b700d0e47ceeb09" Mar 08 00:37:42.695565 master-0 kubenswrapper[23041]: E0308 00:37:42.695132 23041 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(2ab662059bb326d13a07bf5700e4f545)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2ab662059bb326d13a07bf5700e4f545" Mar 08 00:37:44.493442 master-0 kubenswrapper[23041]: E0308 00:37:44.493368 23041 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: 
request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 08 00:37:45.474890 master-0 kubenswrapper[23041]: E0308 00:37:45.474587 23041 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 00:37:47.173144 master-0 kubenswrapper[23041]: I0308 00:37:47.172987 23041 patch_prober.go:28] interesting pod/console-6479f6d896-j6kqz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body= Mar 08 00:37:47.173984 master-0 kubenswrapper[23041]: I0308 00:37:47.173165 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Mar 08 00:37:47.254997 master-0 kubenswrapper[23041]: I0308 00:37:47.254871 23041 status_manager.go:851] "Failed to get status for pod" podUID="0af76e72-367d-4d11-8c55-8758aa5003dd" pod="openshift-kube-scheduler/installer-6-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods installer-6-master-0)" Mar 08 00:37:48.877313 master-0 kubenswrapper[23041]: E0308 00:37:48.877225 23041 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 08 00:37:49.435628 master-0 kubenswrapper[23041]: I0308 00:37:49.435531 23041 patch_prober.go:28] interesting pod/console-c45bf598-vngbg container/console namespace/openshift-console: Startup probe 
status=failure output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body= Mar 08 00:37:49.435882 master-0 kubenswrapper[23041]: I0308 00:37:49.435637 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" Mar 08 00:37:52.184345 master-0 kubenswrapper[23041]: I0308 00:37:52.184277 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-qldx6_84522c03-fd7b-4be7-9413-84e510b9dc5a/cluster-baremetal-operator/1.log" Mar 08 00:37:52.185056 master-0 kubenswrapper[23041]: I0308 00:37:52.185044 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-qldx6_84522c03-fd7b-4be7-9413-84e510b9dc5a/cluster-baremetal-operator/0.log" Mar 08 00:37:52.185118 master-0 kubenswrapper[23041]: I0308 00:37:52.185085 23041 generic.go:334] "Generic (PLEG): container finished" podID="84522c03-fd7b-4be7-9413-84e510b9dc5a" containerID="002b35d9f13fd9b961ace368548088f2a157c38a2a4d1df7e9d9f528e36132e5" exitCode=1 Mar 08 00:37:52.185168 master-0 kubenswrapper[23041]: I0308 00:37:52.185131 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6" event={"ID":"84522c03-fd7b-4be7-9413-84e510b9dc5a","Type":"ContainerDied","Data":"002b35d9f13fd9b961ace368548088f2a157c38a2a4d1df7e9d9f528e36132e5"} Mar 08 00:37:52.185247 master-0 kubenswrapper[23041]: I0308 00:37:52.185227 23041 scope.go:117] "RemoveContainer" containerID="8db7391cc36022b8c4fa21dd3d33b8e00c7e53dfad0cc53ffef3d1fff055fc5c" Mar 08 00:37:52.185718 master-0 kubenswrapper[23041]: I0308 00:37:52.185682 23041 scope.go:117] 
"RemoveContainer" containerID="002b35d9f13fd9b961ace368548088f2a157c38a2a4d1df7e9d9f528e36132e5" Mar 08 00:37:52.185949 master-0 kubenswrapper[23041]: E0308 00:37:52.185915 23041 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-5cdb4c5598-qldx6_openshift-machine-api(84522c03-fd7b-4be7-9413-84e510b9dc5a)\"" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6" podUID="84522c03-fd7b-4be7-9413-84e510b9dc5a" Mar 08 00:37:53.193674 master-0 kubenswrapper[23041]: I0308 00:37:53.193606 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-qldx6_84522c03-fd7b-4be7-9413-84e510b9dc5a/cluster-baremetal-operator/1.log" Mar 08 00:37:54.808805 master-0 kubenswrapper[23041]: I0308 00:37:54.808726 23041 scope.go:117] "RemoveContainer" containerID="3c8e1ec133c07a7bbcbdb1ac76d720856c26fe6e0b9e0de82b700d0e47ceeb09" Mar 08 00:37:54.809927 master-0 kubenswrapper[23041]: E0308 00:37:54.808958 23041 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(2ab662059bb326d13a07bf5700e4f545)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2ab662059bb326d13a07bf5700e4f545" Mar 08 00:37:55.475760 master-0 kubenswrapper[23041]: E0308 00:37:55.475521 23041 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 00:37:55.475760 master-0 
kubenswrapper[23041]: E0308 00:37:55.475747 23041 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 00:37:57.173061 master-0 kubenswrapper[23041]: I0308 00:37:57.172992 23041 patch_prober.go:28] interesting pod/console-6479f6d896-j6kqz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body= Mar 08 00:37:57.173618 master-0 kubenswrapper[23041]: I0308 00:37:57.173087 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Mar 08 00:37:59.434906 master-0 kubenswrapper[23041]: I0308 00:37:59.434834 23041 patch_prober.go:28] interesting pod/console-c45bf598-vngbg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body= Mar 08 00:37:59.435560 master-0 kubenswrapper[23041]: I0308 00:37:59.434919 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" Mar 08 00:38:02.270249 master-0 kubenswrapper[23041]: I0308 00:38:02.270136 23041 generic.go:334] "Generic (PLEG): container finished" podID="0f496486-70d5-4c5c-b4f3-6cc19427762f" containerID="f74d256abcdb5398186b869309f30f30a8ba6d7a0454838bd1b4e98ad498b4cd" exitCode=0 Mar 08 00:38:02.271177 master-0 kubenswrapper[23041]: I0308 00:38:02.270269 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-sdsks" event={"ID":"0f496486-70d5-4c5c-b4f3-6cc19427762f","Type":"ContainerDied","Data":"f74d256abcdb5398186b869309f30f30a8ba6d7a0454838bd1b4e98ad498b4cd"} Mar 08 00:38:02.271177 master-0 kubenswrapper[23041]: I0308 00:38:02.270870 23041 scope.go:117] "RemoveContainer" containerID="f74d256abcdb5398186b869309f30f30a8ba6d7a0454838bd1b4e98ad498b4cd" Mar 08 00:38:02.273849 master-0 kubenswrapper[23041]: I0308 00:38:02.273802 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5884b9cd56-27phk_2fbed2b8-f4c5-4f52-b29c-1907a2034f6f/etcd-operator/1.log" Mar 08 00:38:02.273849 master-0 kubenswrapper[23041]: I0308 00:38:02.273851 23041 generic.go:334] "Generic (PLEG): container finished" podID="2fbed2b8-f4c5-4f52-b29c-1907a2034f6f" containerID="94f6cbcf36ce22a8ad98b49d60bec50375421ad5c3b08a57f781b8f9d633b332" exitCode=0 Mar 08 00:38:02.274334 master-0 kubenswrapper[23041]: I0308 00:38:02.273939 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk" event={"ID":"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f","Type":"ContainerDied","Data":"94f6cbcf36ce22a8ad98b49d60bec50375421ad5c3b08a57f781b8f9d633b332"} Mar 08 00:38:02.274334 master-0 kubenswrapper[23041]: I0308 00:38:02.273996 23041 scope.go:117] "RemoveContainer" containerID="d2e8edf542df46c295f392d43d676bb039cfcddee9661264a6bee3005ba21922" Mar 08 00:38:02.275165 master-0 kubenswrapper[23041]: I0308 00:38:02.275091 23041 scope.go:117] "RemoveContainer" containerID="94f6cbcf36ce22a8ad98b49d60bec50375421ad5c3b08a57f781b8f9d633b332" Mar 08 00:38:02.277252 master-0 kubenswrapper[23041]: I0308 00:38:02.277166 23041 generic.go:334] "Generic (PLEG): container finished" podID="9d810f7f-258a-47ce-9f99-7b1d93388aee" containerID="4ade0408e709b8d3bfa126728a922decfde81b90bd3f67b5bee03661da1d8a83" exitCode=0 Mar 08 00:38:02.277401 master-0 
kubenswrapper[23041]: I0308 00:38:02.277336 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-5nbfk" event={"ID":"9d810f7f-258a-47ce-9f99-7b1d93388aee","Type":"ContainerDied","Data":"4ade0408e709b8d3bfa126728a922decfde81b90bd3f67b5bee03661da1d8a83"} Mar 08 00:38:02.278171 master-0 kubenswrapper[23041]: I0308 00:38:02.278118 23041 scope.go:117] "RemoveContainer" containerID="4ade0408e709b8d3bfa126728a922decfde81b90bd3f67b5bee03661da1d8a83" Mar 08 00:38:02.282018 master-0 kubenswrapper[23041]: I0308 00:38:02.281938 23041 generic.go:334] "Generic (PLEG): container finished" podID="5a229b84-65bd-493b-90dd-b8194f842dc8" containerID="40763ecf359c193fdc57eccfc3f99287edfc631f03df7363e0563b373121c528" exitCode=0 Mar 08 00:38:02.282018 master-0 kubenswrapper[23041]: I0308 00:38:02.281992 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-vm7rj" event={"ID":"5a229b84-65bd-493b-90dd-b8194f842dc8","Type":"ContainerDied","Data":"40763ecf359c193fdc57eccfc3f99287edfc631f03df7363e0563b373121c528"} Mar 08 00:38:02.282638 master-0 kubenswrapper[23041]: I0308 00:38:02.282595 23041 scope.go:117] "RemoveContainer" containerID="40763ecf359c193fdc57eccfc3f99287edfc631f03df7363e0563b373121c528" Mar 08 00:38:02.284287 master-0 kubenswrapper[23041]: I0308 00:38:02.284220 23041 generic.go:334] "Generic (PLEG): container finished" podID="c2ce2ea7-bd25-4294-8f3a-11ce53577830" containerID="632cf41c6d751c39c9bc533a8eb31489a926eb05ad69c14fc4cbdd3ab7d57165" exitCode=0 Mar 08 00:38:02.284435 master-0 kubenswrapper[23041]: I0308 00:38:02.284322 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-p8hlq" event={"ID":"c2ce2ea7-bd25-4294-8f3a-11ce53577830","Type":"ContainerDied","Data":"632cf41c6d751c39c9bc533a8eb31489a926eb05ad69c14fc4cbdd3ab7d57165"} Mar 08 00:38:02.285068 master-0 
kubenswrapper[23041]: I0308 00:38:02.284848 23041 scope.go:117] "RemoveContainer" containerID="632cf41c6d751c39c9bc533a8eb31489a926eb05ad69c14fc4cbdd3ab7d57165" Mar 08 00:38:02.290809 master-0 kubenswrapper[23041]: I0308 00:38:02.288314 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-854648ff6d-phgxj_8f71fd39-a16b-47d2-b781-c8ce37bcb9b2/package-server-manager/0.log" Mar 08 00:38:02.290809 master-0 kubenswrapper[23041]: I0308 00:38:02.288578 23041 generic.go:334] "Generic (PLEG): container finished" podID="8f71fd39-a16b-47d2-b781-c8ce37bcb9b2" containerID="7b9f0eb1c41cef5d8230e9e1038d90bce9d1d6ac13eb84abd28591cfa2cf66a5" exitCode=1 Mar 08 00:38:02.290809 master-0 kubenswrapper[23041]: I0308 00:38:02.288647 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj" event={"ID":"8f71fd39-a16b-47d2-b781-c8ce37bcb9b2","Type":"ContainerDied","Data":"7b9f0eb1c41cef5d8230e9e1038d90bce9d1d6ac13eb84abd28591cfa2cf66a5"} Mar 08 00:38:02.290809 master-0 kubenswrapper[23041]: I0308 00:38:02.289191 23041 scope.go:117] "RemoveContainer" containerID="7b9f0eb1c41cef5d8230e9e1038d90bce9d1d6ac13eb84abd28591cfa2cf66a5" Mar 08 00:38:02.296413 master-0 kubenswrapper[23041]: I0308 00:38:02.296391 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-69576476f7-dpg4q_3d2e1686-3a30-4021-9c03-02e472bc6ff3/cluster-autoscaler-operator/0.log" Mar 08 00:38:02.297402 master-0 kubenswrapper[23041]: I0308 00:38:02.297344 23041 generic.go:334] "Generic (PLEG): container finished" podID="3d2e1686-3a30-4021-9c03-02e472bc6ff3" containerID="34ce99c1480780527cadfa670226036ef9c17ba4caf6288b67da10db8e7da68e" exitCode=255 Mar 08 00:38:02.297518 master-0 kubenswrapper[23041]: I0308 00:38:02.297500 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dpg4q" event={"ID":"3d2e1686-3a30-4021-9c03-02e472bc6ff3","Type":"ContainerDied","Data":"34ce99c1480780527cadfa670226036ef9c17ba4caf6288b67da10db8e7da68e"} Mar 08 00:38:02.298285 master-0 kubenswrapper[23041]: I0308 00:38:02.298269 23041 scope.go:117] "RemoveContainer" containerID="34ce99c1480780527cadfa670226036ef9c17ba4caf6288b67da10db8e7da68e" Mar 08 00:38:02.325784 master-0 kubenswrapper[23041]: I0308 00:38:02.325732 23041 scope.go:117] "RemoveContainer" containerID="8c7c5dbb2587ce1659649afce2da4e5a5c04c0ab193dda1e438bb8ca083926e4" Mar 08 00:38:03.320419 master-0 kubenswrapper[23041]: I0308 00:38:03.320188 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-sdsks" event={"ID":"0f496486-70d5-4c5c-b4f3-6cc19427762f","Type":"ContainerStarted","Data":"c03f2daeb84dde79a22e743be473966d3efa3712aaa998dea48d580e7ee8578f"} Mar 08 00:38:03.327160 master-0 kubenswrapper[23041]: I0308 00:38:03.326568 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk" event={"ID":"2fbed2b8-f4c5-4f52-b29c-1907a2034f6f","Type":"ContainerStarted","Data":"d1308e682ea78cc21ba4cf7219f5398ebea174f17a39ce80fe442ed82b55c605"} Mar 08 00:38:03.330304 master-0 kubenswrapper[23041]: I0308 00:38:03.330246 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-5nbfk" event={"ID":"9d810f7f-258a-47ce-9f99-7b1d93388aee","Type":"ContainerStarted","Data":"97e49166a7207568ae2c86d0dde51013e67ac2af87d4eafd468385822d2dcba4"} Mar 08 00:38:03.337641 master-0 kubenswrapper[23041]: I0308 00:38:03.337579 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-vm7rj" 
event={"ID":"5a229b84-65bd-493b-90dd-b8194f842dc8","Type":"ContainerStarted","Data":"485c7de03b4d22713bfb8e28525343a513454265b98b440e4cea0e4dc53c45d7"} Mar 08 00:38:03.346464 master-0 kubenswrapper[23041]: I0308 00:38:03.344418 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-854648ff6d-phgxj_8f71fd39-a16b-47d2-b781-c8ce37bcb9b2/package-server-manager/0.log" Mar 08 00:38:03.346464 master-0 kubenswrapper[23041]: I0308 00:38:03.344801 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj" event={"ID":"8f71fd39-a16b-47d2-b781-c8ce37bcb9b2","Type":"ContainerStarted","Data":"fa65266ea7ec0f05a92ff8b98d039af7433be59d8e365a14fe449eb6c4aff371"} Mar 08 00:38:03.346464 master-0 kubenswrapper[23041]: I0308 00:38:03.345625 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj" Mar 08 00:38:03.349821 master-0 kubenswrapper[23041]: I0308 00:38:03.347675 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-p8hlq" event={"ID":"c2ce2ea7-bd25-4294-8f3a-11ce53577830","Type":"ContainerStarted","Data":"9e0e23831716915db776e01be4096d1cb715102f4807f9281bb2ed18e9a486ec"} Mar 08 00:38:03.349821 master-0 kubenswrapper[23041]: I0308 00:38:03.349729 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-69576476f7-dpg4q_3d2e1686-3a30-4021-9c03-02e472bc6ff3/cluster-autoscaler-operator/0.log" Mar 08 00:38:03.353154 master-0 kubenswrapper[23041]: I0308 00:38:03.350373 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dpg4q" 
event={"ID":"3d2e1686-3a30-4021-9c03-02e472bc6ff3","Type":"ContainerStarted","Data":"69eb93d84bc43063eb4a5f355ff35ed488649b9db3692543dc7a04fb17447945"} Mar 08 00:38:03.808936 master-0 kubenswrapper[23041]: I0308 00:38:03.808865 23041 scope.go:117] "RemoveContainer" containerID="002b35d9f13fd9b961ace368548088f2a157c38a2a4d1df7e9d9f528e36132e5" Mar 08 00:38:04.361096 master-0 kubenswrapper[23041]: I0308 00:38:04.361040 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-qldx6_84522c03-fd7b-4be7-9413-84e510b9dc5a/cluster-baremetal-operator/1.log" Mar 08 00:38:04.362001 master-0 kubenswrapper[23041]: I0308 00:38:04.361932 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-qldx6" event={"ID":"84522c03-fd7b-4be7-9413-84e510b9dc5a","Type":"ContainerStarted","Data":"95957947fe07d5c9d3e12dd79a8e7f6419f691fd5d8dacfae2dcf16a97b4bb7f"} Mar 08 00:38:06.304364 master-0 kubenswrapper[23041]: I0308 00:38:06.304302 23041 patch_prober.go:28] interesting pod/etcd-operator-5884b9cd56-27phk container/etcd-operator namespace/openshift-etcd-operator: Liveness probe status=failure output="Get \"https://10.128.0.10:8443/healthz\": dial tcp 10.128.0.10:8443: connect: connection refused" start-of-body= Mar 08 00:38:06.305122 master-0 kubenswrapper[23041]: I0308 00:38:06.304370 23041 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-27phk" podUID="2fbed2b8-f4c5-4f52-b29c-1907a2034f6f" containerName="etcd-operator" probeResult="failure" output="Get \"https://10.128.0.10:8443/healthz\": dial tcp 10.128.0.10:8443: connect: connection refused" Mar 08 00:38:07.172514 master-0 kubenswrapper[23041]: I0308 00:38:07.172446 23041 patch_prober.go:28] interesting pod/console-6479f6d896-j6kqz container/console namespace/openshift-console: Startup probe status=failure output="Get 
\"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body= Mar 08 00:38:07.172747 master-0 kubenswrapper[23041]: I0308 00:38:07.172527 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Mar 08 00:38:07.172747 master-0 kubenswrapper[23041]: I0308 00:38:07.172585 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6479f6d896-j6kqz" Mar 08 00:38:07.173408 master-0 kubenswrapper[23041]: I0308 00:38:07.173381 23041 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="console" containerStatusID={"Type":"cri-o","ID":"42d1b0d9a17b6b2ff8f7fdf2871fc4fcb4d92831ee2c4371c0b51fde6a93a0cf"} pod="openshift-console/console-6479f6d896-j6kqz" containerMessage="Container console failed startup probe, will be restarted" Mar 08 00:38:08.027846 master-0 kubenswrapper[23041]: E0308 00:38:08.027770 23041 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command 'sleep 25' exited with 137: " execCommand=["sleep","25"] containerName="console" pod="openshift-console/console-6479f6d896-j6kqz" message="" Mar 08 00:38:08.028259 master-0 kubenswrapper[23041]: E0308 00:38:08.027839 23041 kuberuntime_container.go:691] "PreStop hook failed" err="command 'sleep 25' exited with 137: " pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" containerID="cri-o://42d1b0d9a17b6b2ff8f7fdf2871fc4fcb4d92831ee2c4371c0b51fde6a93a0cf" Mar 08 00:38:08.028259 master-0 kubenswrapper[23041]: I0308 00:38:08.027902 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-6479f6d896-j6kqz" 
podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" containerID="cri-o://42d1b0d9a17b6b2ff8f7fdf2871fc4fcb4d92831ee2c4371c0b51fde6a93a0cf" gracePeriod=40 Mar 08 00:38:08.408326 master-0 kubenswrapper[23041]: I0308 00:38:08.408176 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6479f6d896-j6kqz_67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b/console/0.log" Mar 08 00:38:08.408326 master-0 kubenswrapper[23041]: I0308 00:38:08.408274 23041 generic.go:334] "Generic (PLEG): container finished" podID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerID="42d1b0d9a17b6b2ff8f7fdf2871fc4fcb4d92831ee2c4371c0b51fde6a93a0cf" exitCode=255 Mar 08 00:38:08.408326 master-0 kubenswrapper[23041]: I0308 00:38:08.408316 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6479f6d896-j6kqz" event={"ID":"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b","Type":"ContainerDied","Data":"42d1b0d9a17b6b2ff8f7fdf2871fc4fcb4d92831ee2c4371c0b51fde6a93a0cf"} Mar 08 00:38:08.408606 master-0 kubenswrapper[23041]: I0308 00:38:08.408351 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6479f6d896-j6kqz" event={"ID":"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b","Type":"ContainerStarted","Data":"192b68dedfebd4dee50599d2c7d025373ab38645d7fe97d9015fca7b54ac5478"} Mar 08 00:38:09.434652 master-0 kubenswrapper[23041]: I0308 00:38:09.434592 23041 patch_prober.go:28] interesting pod/console-c45bf598-vngbg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body= Mar 08 00:38:09.435169 master-0 kubenswrapper[23041]: I0308 00:38:09.434659 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" probeResult="failure" output="Get 
\"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" Mar 08 00:38:09.808879 master-0 kubenswrapper[23041]: I0308 00:38:09.808788 23041 scope.go:117] "RemoveContainer" containerID="3c8e1ec133c07a7bbcbdb1ac76d720856c26fe6e0b9e0de82b700d0e47ceeb09" Mar 08 00:38:09.809344 master-0 kubenswrapper[23041]: E0308 00:38:09.809295 23041 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(2ab662059bb326d13a07bf5700e4f545)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2ab662059bb326d13a07bf5700e4f545" Mar 08 00:38:14.456418 master-0 kubenswrapper[23041]: I0308 00:38:14.456297 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Mar 08 00:38:14.569074 master-0 kubenswrapper[23041]: I0308 00:38:14.567103 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/metrics-server-6474759988-dnw4m"] Mar 08 00:38:14.596892 master-0 kubenswrapper[23041]: I0308 00:38:14.591641 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/metrics-server-6474759988-dnw4m"] Mar 08 00:38:14.720057 master-0 kubenswrapper[23041]: I0308 00:38:14.719935 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Mar 08 00:38:14.726945 master-0 kubenswrapper[23041]: I0308 00:38:14.726898 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Mar 08 00:38:14.803489 master-0 kubenswrapper[23041]: I0308 00:38:14.798704 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6dc96f5b89-ctlsc"] Mar 08 00:38:14.824883 master-0 kubenswrapper[23041]: I0308 
00:38:14.824825 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0101c4ce-fd58-4ddb-94f7-abb8b2293cdb" path="/var/lib/kubelet/pods/0101c4ce-fd58-4ddb-94f7-abb8b2293cdb/volumes" Mar 08 00:38:14.825497 master-0 kubenswrapper[23041]: I0308 00:38:14.825471 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74512190-22e4-4648-8d1e-e487de48a124" path="/var/lib/kubelet/pods/74512190-22e4-4648-8d1e-e487de48a124/volumes" Mar 08 00:38:14.827135 master-0 kubenswrapper[23041]: I0308 00:38:14.827071 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6dc96f5b89-ctlsc"] Mar 08 00:38:14.970306 master-0 kubenswrapper[23041]: I0308 00:38:14.966328 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Mar 08 00:38:14.970306 master-0 kubenswrapper[23041]: I0308 00:38:14.969555 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Mar 08 00:38:15.990316 master-0 kubenswrapper[23041]: I0308 00:38:15.990175 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-5-master-0"] Mar 08 00:38:15.992038 master-0 kubenswrapper[23041]: E0308 00:38:15.991982 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="861ba34f-5174-4835-a9b9-dbc5eacd2963" containerName="installer" Mar 08 00:38:15.992092 master-0 kubenswrapper[23041]: I0308 00:38:15.992049 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="861ba34f-5174-4835-a9b9-dbc5eacd2963" containerName="installer" Mar 08 00:38:15.992130 master-0 kubenswrapper[23041]: E0308 00:38:15.992095 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="343c30a5-7bf7-49ef-a224-c39ca46a63f1" containerName="installer" Mar 08 00:38:15.992130 master-0 kubenswrapper[23041]: I0308 00:38:15.992114 23041 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="343c30a5-7bf7-49ef-a224-c39ca46a63f1" containerName="installer" Mar 08 00:38:15.992189 master-0 kubenswrapper[23041]: E0308 00:38:15.992159 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0101c4ce-fd58-4ddb-94f7-abb8b2293cdb" containerName="metrics-server" Mar 08 00:38:15.992189 master-0 kubenswrapper[23041]: I0308 00:38:15.992180 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="0101c4ce-fd58-4ddb-94f7-abb8b2293cdb" containerName="metrics-server" Mar 08 00:38:15.992267 master-0 kubenswrapper[23041]: E0308 00:38:15.992245 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0af76e72-367d-4d11-8c55-8758aa5003dd" containerName="installer" Mar 08 00:38:15.992300 master-0 kubenswrapper[23041]: I0308 00:38:15.992267 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="0af76e72-367d-4d11-8c55-8758aa5003dd" containerName="installer" Mar 08 00:38:15.992333 master-0 kubenswrapper[23041]: E0308 00:38:15.992304 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24264c1b-97df-4311-b7af-b205ac879381" containerName="console" Mar 08 00:38:15.992333 master-0 kubenswrapper[23041]: I0308 00:38:15.992321 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="24264c1b-97df-4311-b7af-b205ac879381" containerName="console" Mar 08 00:38:15.992390 master-0 kubenswrapper[23041]: E0308 00:38:15.992359 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74512190-22e4-4648-8d1e-e487de48a124" containerName="installer" Mar 08 00:38:15.992390 master-0 kubenswrapper[23041]: I0308 00:38:15.992376 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="74512190-22e4-4648-8d1e-e487de48a124" containerName="installer" Mar 08 00:38:15.992448 master-0 kubenswrapper[23041]: E0308 00:38:15.992420 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e9ee6f7-24ed-44b3-be57-a07a13e9e73b" containerName="installer" Mar 08 00:38:15.992448 master-0 kubenswrapper[23041]: 
I0308 00:38:15.992439 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e9ee6f7-24ed-44b3-be57-a07a13e9e73b" containerName="installer" Mar 08 00:38:15.992777 master-0 kubenswrapper[23041]: I0308 00:38:15.992728 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="861ba34f-5174-4835-a9b9-dbc5eacd2963" containerName="installer" Mar 08 00:38:15.992814 master-0 kubenswrapper[23041]: I0308 00:38:15.992790 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="24264c1b-97df-4311-b7af-b205ac879381" containerName="console" Mar 08 00:38:15.992878 master-0 kubenswrapper[23041]: I0308 00:38:15.992842 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="0101c4ce-fd58-4ddb-94f7-abb8b2293cdb" containerName="metrics-server" Mar 08 00:38:15.992913 master-0 kubenswrapper[23041]: I0308 00:38:15.992881 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="74512190-22e4-4648-8d1e-e487de48a124" containerName="installer" Mar 08 00:38:15.992945 master-0 kubenswrapper[23041]: I0308 00:38:15.992917 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e9ee6f7-24ed-44b3-be57-a07a13e9e73b" containerName="installer" Mar 08 00:38:15.992983 master-0 kubenswrapper[23041]: I0308 00:38:15.992958 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="343c30a5-7bf7-49ef-a224-c39ca46a63f1" containerName="installer" Mar 08 00:38:15.993017 master-0 kubenswrapper[23041]: I0308 00:38:15.992996 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="0af76e72-367d-4d11-8c55-8758aa5003dd" containerName="installer" Mar 08 00:38:15.994402 master-0 kubenswrapper[23041]: I0308 00:38:15.994350 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-5-master-0" Mar 08 00:38:16.006079 master-0 kubenswrapper[23041]: I0308 00:38:16.005789 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-5-master-0"] Mar 08 00:38:16.014289 master-0 kubenswrapper[23041]: I0308 00:38:16.014249 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 08 00:38:16.014481 master-0 kubenswrapper[23041]: I0308 00:38:16.014423 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-7rml7" Mar 08 00:38:16.033218 master-0 kubenswrapper[23041]: I0308 00:38:16.033114 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cfcc28b5-f88b-4ecf-b503-cf31d00e22eb-kube-api-access\") pod \"installer-5-master-0\" (UID: \"cfcc28b5-f88b-4ecf-b503-cf31d00e22eb\") " pod="openshift-kube-controller-manager/installer-5-master-0" Mar 08 00:38:16.033389 master-0 kubenswrapper[23041]: I0308 00:38:16.033250 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cfcc28b5-f88b-4ecf-b503-cf31d00e22eb-var-lock\") pod \"installer-5-master-0\" (UID: \"cfcc28b5-f88b-4ecf-b503-cf31d00e22eb\") " pod="openshift-kube-controller-manager/installer-5-master-0" Mar 08 00:38:16.033389 master-0 kubenswrapper[23041]: I0308 00:38:16.033323 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cfcc28b5-f88b-4ecf-b503-cf31d00e22eb-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"cfcc28b5-f88b-4ecf-b503-cf31d00e22eb\") " pod="openshift-kube-controller-manager/installer-5-master-0" Mar 08 00:38:16.134593 master-0 
kubenswrapper[23041]: I0308 00:38:16.134475 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cfcc28b5-f88b-4ecf-b503-cf31d00e22eb-kube-api-access\") pod \"installer-5-master-0\" (UID: \"cfcc28b5-f88b-4ecf-b503-cf31d00e22eb\") " pod="openshift-kube-controller-manager/installer-5-master-0" Mar 08 00:38:16.134909 master-0 kubenswrapper[23041]: I0308 00:38:16.134856 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cfcc28b5-f88b-4ecf-b503-cf31d00e22eb-var-lock\") pod \"installer-5-master-0\" (UID: \"cfcc28b5-f88b-4ecf-b503-cf31d00e22eb\") " pod="openshift-kube-controller-manager/installer-5-master-0" Mar 08 00:38:16.135025 master-0 kubenswrapper[23041]: I0308 00:38:16.134985 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cfcc28b5-f88b-4ecf-b503-cf31d00e22eb-var-lock\") pod \"installer-5-master-0\" (UID: \"cfcc28b5-f88b-4ecf-b503-cf31d00e22eb\") " pod="openshift-kube-controller-manager/installer-5-master-0" Mar 08 00:38:16.135124 master-0 kubenswrapper[23041]: I0308 00:38:16.135049 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cfcc28b5-f88b-4ecf-b503-cf31d00e22eb-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"cfcc28b5-f88b-4ecf-b503-cf31d00e22eb\") " pod="openshift-kube-controller-manager/installer-5-master-0" Mar 08 00:38:16.135180 master-0 kubenswrapper[23041]: I0308 00:38:16.135143 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cfcc28b5-f88b-4ecf-b503-cf31d00e22eb-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"cfcc28b5-f88b-4ecf-b503-cf31d00e22eb\") " pod="openshift-kube-controller-manager/installer-5-master-0" Mar 08 00:38:16.149117 
master-0 kubenswrapper[23041]: I0308 00:38:16.149062 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cfcc28b5-f88b-4ecf-b503-cf31d00e22eb-kube-api-access\") pod \"installer-5-master-0\" (UID: \"cfcc28b5-f88b-4ecf-b503-cf31d00e22eb\") " pod="openshift-kube-controller-manager/installer-5-master-0" Mar 08 00:38:16.343588 master-0 kubenswrapper[23041]: I0308 00:38:16.343509 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-5-master-0" Mar 08 00:38:16.827740 master-0 kubenswrapper[23041]: I0308 00:38:16.827674 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24264c1b-97df-4311-b7af-b205ac879381" path="/var/lib/kubelet/pods/24264c1b-97df-4311-b7af-b205ac879381/volumes" Mar 08 00:38:16.828752 master-0 kubenswrapper[23041]: I0308 00:38:16.828730 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="861ba34f-5174-4835-a9b9-dbc5eacd2963" path="/var/lib/kubelet/pods/861ba34f-5174-4835-a9b9-dbc5eacd2963/volumes" Mar 08 00:38:16.841509 master-0 kubenswrapper[23041]: I0308 00:38:16.834838 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-5-master-0"] Mar 08 00:38:17.173228 master-0 kubenswrapper[23041]: I0308 00:38:17.172260 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6479f6d896-j6kqz" Mar 08 00:38:17.173228 master-0 kubenswrapper[23041]: I0308 00:38:17.172337 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6479f6d896-j6kqz" Mar 08 00:38:17.173228 master-0 kubenswrapper[23041]: I0308 00:38:17.172732 23041 patch_prober.go:28] interesting pod/console-6479f6d896-j6kqz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 
10.128.0.103:8443: connect: connection refused" start-of-body= Mar 08 00:38:17.173228 master-0 kubenswrapper[23041]: I0308 00:38:17.172792 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Mar 08 00:38:17.494827 master-0 kubenswrapper[23041]: I0308 00:38:17.494763 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-5-master-0" event={"ID":"cfcc28b5-f88b-4ecf-b503-cf31d00e22eb","Type":"ContainerStarted","Data":"fb31a43ed3ac7714b18541cf7111615372c52c8cfe7bcdd50ef03f7df7aeec3a"} Mar 08 00:38:17.494827 master-0 kubenswrapper[23041]: I0308 00:38:17.494817 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-5-master-0" event={"ID":"cfcc28b5-f88b-4ecf-b503-cf31d00e22eb","Type":"ContainerStarted","Data":"8649b69decd2e3bce5880b3dba0a5e94005a51a39fa0f19e17c8e0fc08efadb7"} Mar 08 00:38:17.584335 master-0 kubenswrapper[23041]: I0308 00:38:17.584134 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-5-master-0" podStartSLOduration=2.5840976319999998 podStartE2EDuration="2.584097632s" podCreationTimestamp="2026-03-08 00:38:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:38:17.581768645 +0000 UTC m=+403.054605199" watchObservedRunningTime="2026-03-08 00:38:17.584097632 +0000 UTC m=+403.056934226" Mar 08 00:38:19.434818 master-0 kubenswrapper[23041]: I0308 00:38:19.434746 23041 patch_prober.go:28] interesting pod/console-c45bf598-vngbg container/console namespace/openshift-console: Startup probe status=failure output="Get 
\"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body= Mar 08 00:38:19.435609 master-0 kubenswrapper[23041]: I0308 00:38:19.434824 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" Mar 08 00:38:21.808364 master-0 kubenswrapper[23041]: I0308 00:38:21.808325 23041 scope.go:117] "RemoveContainer" containerID="3c8e1ec133c07a7bbcbdb1ac76d720856c26fe6e0b9e0de82b700d0e47ceeb09" Mar 08 00:38:22.532598 master-0 kubenswrapper[23041]: I0308 00:38:22.532542 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2ab662059bb326d13a07bf5700e4f545/cluster-policy-controller/3.log" Mar 08 00:38:22.533746 master-0 kubenswrapper[23041]: I0308 00:38:22.533702 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2ab662059bb326d13a07bf5700e4f545/kube-controller-manager/1.log" Mar 08 00:38:22.534560 master-0 kubenswrapper[23041]: I0308 00:38:22.534503 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2ab662059bb326d13a07bf5700e4f545","Type":"ContainerStarted","Data":"a84adaf243b849b02923619ad910297394291e7bdee3959d3f87661b7f9d9e1a"} Mar 08 00:38:22.693284 master-0 kubenswrapper[23041]: I0308 00:38:22.693197 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:38:22.693284 master-0 kubenswrapper[23041]: I0308 00:38:22.693281 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:38:22.828643 master-0 kubenswrapper[23041]: I0308 00:38:22.828453 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:38:27.173018 master-0 kubenswrapper[23041]: I0308 00:38:27.172928 23041 patch_prober.go:28] interesting pod/console-6479f6d896-j6kqz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body= Mar 08 00:38:27.173955 master-0 kubenswrapper[23041]: I0308 00:38:27.173020 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Mar 08 00:38:29.435859 master-0 kubenswrapper[23041]: I0308 00:38:29.435729 23041 patch_prober.go:28] interesting pod/console-c45bf598-vngbg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body= Mar 08 00:38:29.437027 master-0 kubenswrapper[23041]: I0308 00:38:29.435870 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" Mar 08 00:38:29.437027 master-0 kubenswrapper[23041]: I0308 00:38:29.435961 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-c45bf598-vngbg" Mar 08 00:38:29.437323 master-0 kubenswrapper[23041]: I0308 
00:38:29.437250 23041 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="console" containerStatusID={"Type":"cri-o","ID":"54cfef26a9a74f2e4d1e1e3bc7b1f428fedbd1ac36e2015bd2fca2afb1817c24"} pod="openshift-console/console-c45bf598-vngbg" containerMessage="Container console failed startup probe, will be restarted" Mar 08 00:38:30.130505 master-0 kubenswrapper[23041]: E0308 00:38:30.130403 23041 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command 'sleep 25' exited with 137: " execCommand=["sleep","25"] containerName="console" pod="openshift-console/console-c45bf598-vngbg" message="" Mar 08 00:38:30.130505 master-0 kubenswrapper[23041]: E0308 00:38:30.130485 23041 kuberuntime_container.go:691] "PreStop hook failed" err="command 'sleep 25' exited with 137: " pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" containerID="cri-o://54cfef26a9a74f2e4d1e1e3bc7b1f428fedbd1ac36e2015bd2fca2afb1817c24" Mar 08 00:38:30.131810 master-0 kubenswrapper[23041]: I0308 00:38:30.130562 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" containerID="cri-o://54cfef26a9a74f2e4d1e1e3bc7b1f428fedbd1ac36e2015bd2fca2afb1817c24" gracePeriod=40 Mar 08 00:38:30.639174 master-0 kubenswrapper[23041]: I0308 00:38:30.639062 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c45bf598-vngbg_4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac/console/0.log" Mar 08 00:38:30.639174 master-0 kubenswrapper[23041]: I0308 00:38:30.639154 23041 generic.go:334] "Generic (PLEG): container finished" podID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerID="54cfef26a9a74f2e4d1e1e3bc7b1f428fedbd1ac36e2015bd2fca2afb1817c24" exitCode=255 Mar 08 00:38:30.640517 master-0 kubenswrapper[23041]: I0308 00:38:30.639222 23041 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-console/console-c45bf598-vngbg" event={"ID":"4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac","Type":"ContainerDied","Data":"54cfef26a9a74f2e4d1e1e3bc7b1f428fedbd1ac36e2015bd2fca2afb1817c24"} Mar 08 00:38:30.640517 master-0 kubenswrapper[23041]: I0308 00:38:30.639277 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c45bf598-vngbg" event={"ID":"4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac","Type":"ContainerStarted","Data":"31c1ce47271da149372933b5669e24882a32895d1eabd2caa8d72dcabb6291e0"} Mar 08 00:38:31.173004 master-0 kubenswrapper[23041]: I0308 00:38:31.172677 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-5-retry-1-master-0"] Mar 08 00:38:31.175739 master-0 kubenswrapper[23041]: I0308 00:38:31.175672 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" Mar 08 00:38:31.182991 master-0 kubenswrapper[23041]: I0308 00:38:31.182905 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-4dxb7" Mar 08 00:38:31.183256 master-0 kubenswrapper[23041]: I0308 00:38:31.183033 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 08 00:38:31.203519 master-0 kubenswrapper[23041]: I0308 00:38:31.202818 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-5-retry-1-master-0"] Mar 08 00:38:31.298406 master-0 kubenswrapper[23041]: I0308 00:38:31.298322 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53c386ff-5ff0-4937-b909-5f800abdb600-kube-api-access\") pod \"installer-5-retry-1-master-0\" (UID: \"53c386ff-5ff0-4937-b909-5f800abdb600\") " pod="openshift-kube-apiserver/installer-5-retry-1-master-0" Mar 08 
00:38:31.298754 master-0 kubenswrapper[23041]: I0308 00:38:31.298521 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/53c386ff-5ff0-4937-b909-5f800abdb600-kubelet-dir\") pod \"installer-5-retry-1-master-0\" (UID: \"53c386ff-5ff0-4937-b909-5f800abdb600\") " pod="openshift-kube-apiserver/installer-5-retry-1-master-0"
Mar 08 00:38:31.298754 master-0 kubenswrapper[23041]: I0308 00:38:31.298566 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/53c386ff-5ff0-4937-b909-5f800abdb600-var-lock\") pod \"installer-5-retry-1-master-0\" (UID: \"53c386ff-5ff0-4937-b909-5f800abdb600\") " pod="openshift-kube-apiserver/installer-5-retry-1-master-0"
Mar 08 00:38:31.401512 master-0 kubenswrapper[23041]: I0308 00:38:31.401425 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53c386ff-5ff0-4937-b909-5f800abdb600-kube-api-access\") pod \"installer-5-retry-1-master-0\" (UID: \"53c386ff-5ff0-4937-b909-5f800abdb600\") " pod="openshift-kube-apiserver/installer-5-retry-1-master-0"
Mar 08 00:38:31.401512 master-0 kubenswrapper[23041]: I0308 00:38:31.401555 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/53c386ff-5ff0-4937-b909-5f800abdb600-kubelet-dir\") pod \"installer-5-retry-1-master-0\" (UID: \"53c386ff-5ff0-4937-b909-5f800abdb600\") " pod="openshift-kube-apiserver/installer-5-retry-1-master-0"
Mar 08 00:38:31.402048 master-0 kubenswrapper[23041]: I0308 00:38:31.401578 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/53c386ff-5ff0-4937-b909-5f800abdb600-var-lock\") pod \"installer-5-retry-1-master-0\" (UID: \"53c386ff-5ff0-4937-b909-5f800abdb600\") " pod="openshift-kube-apiserver/installer-5-retry-1-master-0"
Mar 08 00:38:31.402048 master-0 kubenswrapper[23041]: I0308 00:38:31.401679 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/53c386ff-5ff0-4937-b909-5f800abdb600-var-lock\") pod \"installer-5-retry-1-master-0\" (UID: \"53c386ff-5ff0-4937-b909-5f800abdb600\") " pod="openshift-kube-apiserver/installer-5-retry-1-master-0"
Mar 08 00:38:31.402176 master-0 kubenswrapper[23041]: I0308 00:38:31.402132 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/53c386ff-5ff0-4937-b909-5f800abdb600-kubelet-dir\") pod \"installer-5-retry-1-master-0\" (UID: \"53c386ff-5ff0-4937-b909-5f800abdb600\") " pod="openshift-kube-apiserver/installer-5-retry-1-master-0"
Mar 08 00:38:31.418865 master-0 kubenswrapper[23041]: I0308 00:38:31.418816 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53c386ff-5ff0-4937-b909-5f800abdb600-kube-api-access\") pod \"installer-5-retry-1-master-0\" (UID: \"53c386ff-5ff0-4937-b909-5f800abdb600\") " pod="openshift-kube-apiserver/installer-5-retry-1-master-0"
Mar 08 00:38:31.521120 master-0 kubenswrapper[23041]: I0308 00:38:31.521011 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-5-retry-1-master-0"
Mar 08 00:38:32.033250 master-0 kubenswrapper[23041]: I0308 00:38:32.031786 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-6-retry-1-master-0"]
Mar 08 00:38:32.035329 master-0 kubenswrapper[23041]: I0308 00:38:32.035270 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-6-retry-1-master-0"
Mar 08 00:38:32.038412 master-0 kubenswrapper[23041]: I0308 00:38:32.038338 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt"
Mar 08 00:38:32.038585 master-0 kubenswrapper[23041]: I0308 00:38:32.038540 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-q6gf6"
Mar 08 00:38:32.052802 master-0 kubenswrapper[23041]: I0308 00:38:32.052694 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-6-retry-1-master-0"]
Mar 08 00:38:32.058278 master-0 kubenswrapper[23041]: I0308 00:38:32.058168 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-5-retry-1-master-0"]
Mar 08 00:38:32.124724 master-0 kubenswrapper[23041]: I0308 00:38:32.124578 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd-kube-api-access\") pod \"installer-6-retry-1-master-0\" (UID: \"c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd\") " pod="openshift-kube-scheduler/installer-6-retry-1-master-0"
Mar 08 00:38:32.125076 master-0 kubenswrapper[23041]: I0308 00:38:32.124749 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd-kubelet-dir\") pod \"installer-6-retry-1-master-0\" (UID: \"c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd\") " pod="openshift-kube-scheduler/installer-6-retry-1-master-0"
Mar 08 00:38:32.125076 master-0 kubenswrapper[23041]: I0308 00:38:32.124856 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd-var-lock\") pod \"installer-6-retry-1-master-0\" (UID: \"c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd\") " pod="openshift-kube-scheduler/installer-6-retry-1-master-0"
Mar 08 00:38:32.226796 master-0 kubenswrapper[23041]: I0308 00:38:32.226682 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd-var-lock\") pod \"installer-6-retry-1-master-0\" (UID: \"c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd\") " pod="openshift-kube-scheduler/installer-6-retry-1-master-0"
Mar 08 00:38:32.227361 master-0 kubenswrapper[23041]: I0308 00:38:32.226868 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd-kube-api-access\") pod \"installer-6-retry-1-master-0\" (UID: \"c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd\") " pod="openshift-kube-scheduler/installer-6-retry-1-master-0"
Mar 08 00:38:32.227361 master-0 kubenswrapper[23041]: I0308 00:38:32.226915 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd-kubelet-dir\") pod \"installer-6-retry-1-master-0\" (UID: \"c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd\") " pod="openshift-kube-scheduler/installer-6-retry-1-master-0"
Mar 08 00:38:32.227361 master-0 kubenswrapper[23041]: I0308 00:38:32.227037 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd-kubelet-dir\") pod \"installer-6-retry-1-master-0\" (UID: \"c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd\") " pod="openshift-kube-scheduler/installer-6-retry-1-master-0"
Mar 08 00:38:32.227361 master-0 kubenswrapper[23041]: I0308 00:38:32.227097 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd-var-lock\") pod \"installer-6-retry-1-master-0\" (UID: \"c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd\") " pod="openshift-kube-scheduler/installer-6-retry-1-master-0"
Mar 08 00:38:32.249466 master-0 kubenswrapper[23041]: I0308 00:38:32.249410 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd-kube-api-access\") pod \"installer-6-retry-1-master-0\" (UID: \"c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd\") " pod="openshift-kube-scheduler/installer-6-retry-1-master-0"
Mar 08 00:38:32.426827 master-0 kubenswrapper[23041]: I0308 00:38:32.426700 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-6-retry-1-master-0"
Mar 08 00:38:32.698488 master-0 kubenswrapper[23041]: I0308 00:38:32.698424 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 08 00:38:32.737326 master-0 kubenswrapper[23041]: I0308 00:38:32.736499 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" event={"ID":"53c386ff-5ff0-4937-b909-5f800abdb600","Type":"ContainerStarted","Data":"5df41709079702c50acb60779330e91db01063d536dd6d33a7f4ae625ec12bfb"}
Mar 08 00:38:32.737326 master-0 kubenswrapper[23041]: I0308 00:38:32.736579 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" event={"ID":"53c386ff-5ff0-4937-b909-5f800abdb600","Type":"ContainerStarted","Data":"7893efb17bac0a7e0aafa5d282c528c161bb44feaebdb985d473fcd2ef95b3cf"}
Mar 08 00:38:32.774123 master-0 kubenswrapper[23041]: I0308 00:38:32.773971 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" podStartSLOduration=1.773939633 podStartE2EDuration="1.773939633s" podCreationTimestamp="2026-03-08 00:38:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:38:32.771690797 +0000 UTC m=+418.244527361" watchObservedRunningTime="2026-03-08 00:38:32.773939633 +0000 UTC m=+418.246776197"
Mar 08 00:38:32.959160 master-0 kubenswrapper[23041]: I0308 00:38:32.959111 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-6-retry-1-master-0"]
Mar 08 00:38:32.968158 master-0 kubenswrapper[23041]: W0308 00:38:32.968109 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc6c5913d_e562_49ea_a5cd_e6ad1d7fbdbd.slice/crio-3fbdbe705bdf8599d5669911e9d4c104357ab5d6c3d9003deaa90ba5992a3cd7 WatchSource:0}: Error finding container 3fbdbe705bdf8599d5669911e9d4c104357ab5d6c3d9003deaa90ba5992a3cd7: Status 404 returned error can't find the container with id 3fbdbe705bdf8599d5669911e9d4c104357ab5d6c3d9003deaa90ba5992a3cd7
Mar 08 00:38:33.749771 master-0 kubenswrapper[23041]: I0308 00:38:33.749706 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-6-retry-1-master-0" event={"ID":"c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd","Type":"ContainerStarted","Data":"11fecd12a25d5b03abbc9351dc9e60df62188f7ae3672fa56686080f6699546c"}
Mar 08 00:38:33.749771 master-0 kubenswrapper[23041]: I0308 00:38:33.749768 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-6-retry-1-master-0" event={"ID":"c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd","Type":"ContainerStarted","Data":"3fbdbe705bdf8599d5669911e9d4c104357ab5d6c3d9003deaa90ba5992a3cd7"}
Mar 08 00:38:33.776012 master-0 kubenswrapper[23041]: I0308 00:38:33.775452 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-6-retry-1-master-0" podStartSLOduration=1.775425861 podStartE2EDuration="1.775425861s" podCreationTimestamp="2026-03-08 00:38:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:38:33.773375081 +0000 UTC m=+419.246211675" watchObservedRunningTime="2026-03-08 00:38:33.775425861 +0000 UTC m=+419.248262425"
Mar 08 00:38:35.325252 master-0 kubenswrapper[23041]: I0308 00:38:35.325154 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-phgxj"
Mar 08 00:38:37.172759 master-0 kubenswrapper[23041]: I0308 00:38:37.172647 23041 patch_prober.go:28] interesting pod/console-6479f6d896-j6kqz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body=
Mar 08 00:38:37.172759 master-0 kubenswrapper[23041]: I0308 00:38:37.172725 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused"
Mar 08 00:38:39.434835 master-0 kubenswrapper[23041]: I0308 00:38:39.434775 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-c45bf598-vngbg"
Mar 08 00:38:39.434835 master-0 kubenswrapper[23041]: I0308 00:38:39.434834 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-c45bf598-vngbg"
Mar 08 00:38:39.435556 master-0 kubenswrapper[23041]: I0308 00:38:39.435141 23041 patch_prober.go:28] interesting pod/console-c45bf598-vngbg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body=
Mar 08 00:38:39.435556 master-0 kubenswrapper[23041]: I0308 00:38:39.435254 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused"
Mar 08 00:38:47.172820 master-0 kubenswrapper[23041]: I0308 00:38:47.172758 23041 patch_prober.go:28] interesting pod/console-6479f6d896-j6kqz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body=
Mar 08 00:38:47.173475 master-0 kubenswrapper[23041]: I0308 00:38:47.172833 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused"
Mar 08 00:38:49.434475 master-0 kubenswrapper[23041]: I0308 00:38:49.434411 23041 patch_prober.go:28] interesting pod/console-c45bf598-vngbg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body=
Mar 08 00:38:49.435007 master-0 kubenswrapper[23041]: I0308 00:38:49.434496 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused"
Mar 08 00:38:50.056872 master-0
kubenswrapper[23041]: I0308 00:38:50.056826 23041 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"]
Mar 08 00:38:50.057176 master-0 kubenswrapper[23041]: I0308 00:38:50.057125 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="kube-controller-manager-cert-syncer" containerID="cri-o://2ad65477b3234a5615722a158f160fb5dd52caa55583da9b6b87b3b65a179454" gracePeriod=30
Mar 08 00:38:50.057272 master-0 kubenswrapper[23041]: I0308 00:38:50.057182 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="kube-controller-manager" containerID="cri-o://2bd10a2ea7be92083a6fa078c362bedafbefb6666cce4a0e91ffc2ad0aeb3a3b" gracePeriod=30
Mar 08 00:38:50.057343 master-0 kubenswrapper[23041]: I0308 00:38:50.057274 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="cluster-policy-controller" containerID="cri-o://a84adaf243b849b02923619ad910297394291e7bdee3959d3f87661b7f9d9e1a" gracePeriod=30
Mar 08 00:38:50.057343 master-0 kubenswrapper[23041]: I0308 00:38:50.057225 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="kube-controller-manager-recovery-controller" containerID="cri-o://fd786e7f06eff90b20c4ed46d3ed52bbe237190f752b877bfff1fedbc785fa36" gracePeriod=30
Mar 08 00:38:50.060944 master-0 kubenswrapper[23041]: I0308 00:38:50.058952 23041 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"]
Mar 08 00:38:50.060944 master-0 kubenswrapper[23041]: E0308 00:38:50.059298 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="kube-controller-manager"
Mar 08 00:38:50.060944 master-0 kubenswrapper[23041]: I0308 00:38:50.059312 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="kube-controller-manager"
Mar 08 00:38:50.060944 master-0 kubenswrapper[23041]: E0308 00:38:50.059322 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="kube-controller-manager-recovery-controller"
Mar 08 00:38:50.060944 master-0 kubenswrapper[23041]: I0308 00:38:50.059329 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="kube-controller-manager-recovery-controller"
Mar 08 00:38:50.060944 master-0 kubenswrapper[23041]: E0308 00:38:50.059340 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="cluster-policy-controller"
Mar 08 00:38:50.060944 master-0 kubenswrapper[23041]: I0308 00:38:50.059348 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="cluster-policy-controller"
Mar 08 00:38:50.060944 master-0 kubenswrapper[23041]: E0308 00:38:50.059360 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="cluster-policy-controller"
Mar 08 00:38:50.060944 master-0 kubenswrapper[23041]: I0308 00:38:50.059368 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="cluster-policy-controller"
Mar 08 00:38:50.060944 master-0 kubenswrapper[23041]: E0308 00:38:50.059383 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="kube-controller-manager"
Mar 08 00:38:50.060944 master-0 kubenswrapper[23041]: I0308 00:38:50.059391 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="kube-controller-manager"
Mar 08 00:38:50.060944 master-0 kubenswrapper[23041]: E0308 00:38:50.059400 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="cluster-policy-controller"
Mar 08 00:38:50.060944 master-0 kubenswrapper[23041]: I0308 00:38:50.059407 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="cluster-policy-controller"
Mar 08 00:38:50.060944 master-0 kubenswrapper[23041]: E0308 00:38:50.059423 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="kube-controller-manager-cert-syncer"
Mar 08 00:38:50.060944 master-0 kubenswrapper[23041]: I0308 00:38:50.059430 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="kube-controller-manager-cert-syncer"
Mar 08 00:38:50.060944 master-0 kubenswrapper[23041]: E0308 00:38:50.059457 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="cluster-policy-controller"
Mar 08 00:38:50.060944 master-0 kubenswrapper[23041]: I0308 00:38:50.059464 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="cluster-policy-controller"
Mar 08 00:38:50.060944 master-0 kubenswrapper[23041]: I0308 00:38:50.059596 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="kube-controller-manager-recovery-controller"
Mar 08 00:38:50.060944 master-0 kubenswrapper[23041]: I0308 00:38:50.059604 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="cluster-policy-controller"
Mar 08 00:38:50.060944 master-0 kubenswrapper[23041]: I0308 00:38:50.059622 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="kube-controller-manager-cert-syncer"
Mar 08 00:38:50.060944 master-0 kubenswrapper[23041]: I0308 00:38:50.059637 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="kube-controller-manager"
Mar 08 00:38:50.060944 master-0 kubenswrapper[23041]: I0308 00:38:50.059658 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="cluster-policy-controller"
Mar 08 00:38:50.060944 master-0 kubenswrapper[23041]: I0308 00:38:50.059667 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="kube-controller-manager"
Mar 08 00:38:50.060944 master-0 kubenswrapper[23041]: I0308 00:38:50.059679 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="cluster-policy-controller"
Mar 08 00:38:50.060944 master-0 kubenswrapper[23041]: I0308 00:38:50.059690 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="cluster-policy-controller"
Mar 08 00:38:50.060944 master-0 kubenswrapper[23041]: E0308 00:38:50.059808 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="cluster-policy-controller"
Mar 08 00:38:50.060944 master-0 kubenswrapper[23041]: I0308 00:38:50.059817 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="cluster-policy-controller"
Mar 08 00:38:50.060944 master-0 kubenswrapper[23041]: E0308 00:38:50.059836 23041 cpu_manager.go:410] "RemoveStaleState:
removing container" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="kube-controller-manager"
Mar 08 00:38:50.060944 master-0 kubenswrapper[23041]: I0308 00:38:50.059842 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="kube-controller-manager"
Mar 08 00:38:50.060944 master-0 kubenswrapper[23041]: I0308 00:38:50.059971 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="kube-controller-manager"
Mar 08 00:38:50.060944 master-0 kubenswrapper[23041]: I0308 00:38:50.059997 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ab662059bb326d13a07bf5700e4f545" containerName="cluster-policy-controller"
Mar 08 00:38:50.091882 master-0 kubenswrapper[23041]: I0308 00:38:50.091837 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/36794fe98525730e06c774f84687b7f3-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"36794fe98525730e06c774f84687b7f3\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 08 00:38:50.092374 master-0 kubenswrapper[23041]: I0308 00:38:50.092355 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/36794fe98525730e06c774f84687b7f3-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"36794fe98525730e06c774f84687b7f3\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 08 00:38:50.194111 master-0 kubenswrapper[23041]: I0308 00:38:50.194051 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/36794fe98525730e06c774f84687b7f3-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"36794fe98525730e06c774f84687b7f3\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 08 00:38:50.194224 master-0 kubenswrapper[23041]: I0308 00:38:50.194124 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/36794fe98525730e06c774f84687b7f3-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"36794fe98525730e06c774f84687b7f3\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 08 00:38:50.194279 master-0 kubenswrapper[23041]: I0308 00:38:50.194243 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/36794fe98525730e06c774f84687b7f3-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"36794fe98525730e06c774f84687b7f3\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 08 00:38:50.194327 master-0 kubenswrapper[23041]: I0308 00:38:50.194286 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/36794fe98525730e06c774f84687b7f3-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"36794fe98525730e06c774f84687b7f3\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 08 00:38:50.232931 master-0 kubenswrapper[23041]: I0308 00:38:50.232866 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2ab662059bb326d13a07bf5700e4f545/cluster-policy-controller/3.log"
Mar 08 00:38:50.234038 master-0 kubenswrapper[23041]: I0308 00:38:50.234004 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2ab662059bb326d13a07bf5700e4f545/kube-controller-manager/1.log"
Mar 08 00:38:50.234955 master-0 kubenswrapper[23041]: I0308 00:38:50.234914 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2ab662059bb326d13a07bf5700e4f545/kube-controller-manager-cert-syncer/0.log"
Mar 08 00:38:50.235046 master-0 kubenswrapper[23041]: I0308 00:38:50.235019 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 08 00:38:50.238752 master-0 kubenswrapper[23041]: I0308 00:38:50.238651 23041 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="2ab662059bb326d13a07bf5700e4f545" podUID="36794fe98525730e06c774f84687b7f3"
Mar 08 00:38:50.294991 master-0 kubenswrapper[23041]: I0308 00:38:50.294935 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2ab662059bb326d13a07bf5700e4f545-cert-dir\") pod \"2ab662059bb326d13a07bf5700e4f545\" (UID: \"2ab662059bb326d13a07bf5700e4f545\") "
Mar 08 00:38:50.295230 master-0 kubenswrapper[23041]: I0308 00:38:50.295013 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2ab662059bb326d13a07bf5700e4f545-resource-dir\") pod \"2ab662059bb326d13a07bf5700e4f545\" (UID: \"2ab662059bb326d13a07bf5700e4f545\") "
Mar 08 00:38:50.295230 master-0 kubenswrapper[23041]: I0308 00:38:50.295071 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ab662059bb326d13a07bf5700e4f545-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "2ab662059bb326d13a07bf5700e4f545" (UID: "2ab662059bb326d13a07bf5700e4f545"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:38:50.295230 master-0 kubenswrapper[23041]: I0308 00:38:50.295177 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ab662059bb326d13a07bf5700e4f545-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "2ab662059bb326d13a07bf5700e4f545" (UID: "2ab662059bb326d13a07bf5700e4f545"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:38:50.295350 master-0 kubenswrapper[23041]: I0308 00:38:50.295284 23041 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2ab662059bb326d13a07bf5700e4f545-cert-dir\") on node \"master-0\" DevicePath \"\""
Mar 08 00:38:50.396765 master-0 kubenswrapper[23041]: I0308 00:38:50.396625 23041 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2ab662059bb326d13a07bf5700e4f545-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 08 00:38:50.817106 master-0 kubenswrapper[23041]: I0308 00:38:50.817051 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ab662059bb326d13a07bf5700e4f545" path="/var/lib/kubelet/pods/2ab662059bb326d13a07bf5700e4f545/volumes"
Mar 08 00:38:50.886054 master-0 kubenswrapper[23041]: I0308 00:38:50.886005 23041 generic.go:334] "Generic (PLEG): container finished" podID="cfcc28b5-f88b-4ecf-b503-cf31d00e22eb" containerID="fb31a43ed3ac7714b18541cf7111615372c52c8cfe7bcdd50ef03f7df7aeec3a" exitCode=0
Mar 08 00:38:50.886054 master-0 kubenswrapper[23041]: I0308 00:38:50.886051 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-5-master-0" event={"ID":"cfcc28b5-f88b-4ecf-b503-cf31d00e22eb","Type":"ContainerDied","Data":"fb31a43ed3ac7714b18541cf7111615372c52c8cfe7bcdd50ef03f7df7aeec3a"}
Mar 08 00:38:50.888931 master-0 kubenswrapper[23041]: I0308 00:38:50.888906 23041
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2ab662059bb326d13a07bf5700e4f545/cluster-policy-controller/3.log"
Mar 08 00:38:50.892747 master-0 kubenswrapper[23041]: I0308 00:38:50.892713 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2ab662059bb326d13a07bf5700e4f545/kube-controller-manager/1.log"
Mar 08 00:38:50.894491 master-0 kubenswrapper[23041]: I0308 00:38:50.894464 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2ab662059bb326d13a07bf5700e4f545/kube-controller-manager-cert-syncer/0.log"
Mar 08 00:38:50.894565 master-0 kubenswrapper[23041]: I0308 00:38:50.894523 23041 generic.go:334] "Generic (PLEG): container finished" podID="2ab662059bb326d13a07bf5700e4f545" containerID="a84adaf243b849b02923619ad910297394291e7bdee3959d3f87661b7f9d9e1a" exitCode=0
Mar 08 00:38:50.894565 master-0 kubenswrapper[23041]: I0308 00:38:50.894542 23041 generic.go:334] "Generic (PLEG): container finished" podID="2ab662059bb326d13a07bf5700e4f545" containerID="2bd10a2ea7be92083a6fa078c362bedafbefb6666cce4a0e91ffc2ad0aeb3a3b" exitCode=0
Mar 08 00:38:50.894565 master-0 kubenswrapper[23041]: I0308 00:38:50.894553 23041 generic.go:334] "Generic (PLEG): container finished" podID="2ab662059bb326d13a07bf5700e4f545" containerID="fd786e7f06eff90b20c4ed46d3ed52bbe237190f752b877bfff1fedbc785fa36" exitCode=0
Mar 08 00:38:50.894565 master-0 kubenswrapper[23041]: I0308 00:38:50.894561 23041 generic.go:334] "Generic (PLEG): container finished" podID="2ab662059bb326d13a07bf5700e4f545" containerID="2ad65477b3234a5615722a158f160fb5dd52caa55583da9b6b87b3b65a179454" exitCode=2
Mar 08 00:38:50.894702 master-0 kubenswrapper[23041]: I0308 00:38:50.894606 23041 scope.go:117] "RemoveContainer" containerID="a84adaf243b849b02923619ad910297394291e7bdee3959d3f87661b7f9d9e1a"
Mar 08 00:38:50.894702 master-0 kubenswrapper[23041]: I0308 00:38:50.894644 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 08 00:38:50.911406 master-0 kubenswrapper[23041]: I0308 00:38:50.911366 23041 scope.go:117] "RemoveContainer" containerID="3c8e1ec133c07a7bbcbdb1ac76d720856c26fe6e0b9e0de82b700d0e47ceeb09"
Mar 08 00:38:50.912278 master-0 kubenswrapper[23041]: I0308 00:38:50.912193 23041 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="2ab662059bb326d13a07bf5700e4f545" podUID="36794fe98525730e06c774f84687b7f3"
Mar 08 00:38:50.927693 master-0 kubenswrapper[23041]: I0308 00:38:50.927662 23041 scope.go:117] "RemoveContainer" containerID="2bd10a2ea7be92083a6fa078c362bedafbefb6666cce4a0e91ffc2ad0aeb3a3b"
Mar 08 00:38:50.948915 master-0 kubenswrapper[23041]: I0308 00:38:50.948872 23041 scope.go:117] "RemoveContainer" containerID="9c34514feba62dbc424465f89255c0d11d4ab193add728281e8a22d2de8c1410"
Mar 08 00:38:50.967842 master-0 kubenswrapper[23041]: I0308 00:38:50.967803 23041 scope.go:117] "RemoveContainer" containerID="fd786e7f06eff90b20c4ed46d3ed52bbe237190f752b877bfff1fedbc785fa36"
Mar 08 00:38:50.986948 master-0 kubenswrapper[23041]: I0308 00:38:50.986853 23041 scope.go:117] "RemoveContainer" containerID="2ad65477b3234a5615722a158f160fb5dd52caa55583da9b6b87b3b65a179454"
Mar 08 00:38:51.005754 master-0 kubenswrapper[23041]: I0308 00:38:51.005528 23041 scope.go:117] "RemoveContainer" containerID="a84adaf243b849b02923619ad910297394291e7bdee3959d3f87661b7f9d9e1a"
Mar 08 00:38:51.006132 master-0 kubenswrapper[23041]: E0308 00:38:51.006088 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a84adaf243b849b02923619ad910297394291e7bdee3959d3f87661b7f9d9e1a\": container with ID starting with a84adaf243b849b02923619ad910297394291e7bdee3959d3f87661b7f9d9e1a not found: ID does not exist" containerID="a84adaf243b849b02923619ad910297394291e7bdee3959d3f87661b7f9d9e1a"
Mar 08 00:38:51.006132 master-0 kubenswrapper[23041]: I0308 00:38:51.006122 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a84adaf243b849b02923619ad910297394291e7bdee3959d3f87661b7f9d9e1a"} err="failed to get container status \"a84adaf243b849b02923619ad910297394291e7bdee3959d3f87661b7f9d9e1a\": rpc error: code = NotFound desc = could not find container \"a84adaf243b849b02923619ad910297394291e7bdee3959d3f87661b7f9d9e1a\": container with ID starting with a84adaf243b849b02923619ad910297394291e7bdee3959d3f87661b7f9d9e1a not found: ID does not exist"
Mar 08 00:38:51.006327 master-0 kubenswrapper[23041]: I0308 00:38:51.006143 23041 scope.go:117] "RemoveContainer" containerID="3c8e1ec133c07a7bbcbdb1ac76d720856c26fe6e0b9e0de82b700d0e47ceeb09"
Mar 08 00:38:51.006555 master-0 kubenswrapper[23041]: E0308 00:38:51.006518 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c8e1ec133c07a7bbcbdb1ac76d720856c26fe6e0b9e0de82b700d0e47ceeb09\": container with ID starting with 3c8e1ec133c07a7bbcbdb1ac76d720856c26fe6e0b9e0de82b700d0e47ceeb09 not found: ID does not exist" containerID="3c8e1ec133c07a7bbcbdb1ac76d720856c26fe6e0b9e0de82b700d0e47ceeb09"
Mar 08 00:38:51.006555 master-0 kubenswrapper[23041]: I0308 00:38:51.006542 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c8e1ec133c07a7bbcbdb1ac76d720856c26fe6e0b9e0de82b700d0e47ceeb09"} err="failed to get container status \"3c8e1ec133c07a7bbcbdb1ac76d720856c26fe6e0b9e0de82b700d0e47ceeb09\": rpc error: code = NotFound desc = could not find container \"3c8e1ec133c07a7bbcbdb1ac76d720856c26fe6e0b9e0de82b700d0e47ceeb09\": container with ID starting with 3c8e1ec133c07a7bbcbdb1ac76d720856c26fe6e0b9e0de82b700d0e47ceeb09 not found: ID does not exist"
Mar 08 00:38:51.006555 master-0 kubenswrapper[23041]: I0308 00:38:51.006556 23041 scope.go:117] "RemoveContainer" containerID="2bd10a2ea7be92083a6fa078c362bedafbefb6666cce4a0e91ffc2ad0aeb3a3b"
Mar 08 00:38:51.006771 master-0 kubenswrapper[23041]: E0308 00:38:51.006734 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bd10a2ea7be92083a6fa078c362bedafbefb6666cce4a0e91ffc2ad0aeb3a3b\": container with ID starting with 2bd10a2ea7be92083a6fa078c362bedafbefb6666cce4a0e91ffc2ad0aeb3a3b not found: ID does not exist" containerID="2bd10a2ea7be92083a6fa078c362bedafbefb6666cce4a0e91ffc2ad0aeb3a3b"
Mar 08 00:38:51.006771 master-0 kubenswrapper[23041]: I0308 00:38:51.006758 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bd10a2ea7be92083a6fa078c362bedafbefb6666cce4a0e91ffc2ad0aeb3a3b"} err="failed to get container status \"2bd10a2ea7be92083a6fa078c362bedafbefb6666cce4a0e91ffc2ad0aeb3a3b\": rpc error: code = NotFound desc = could not find container \"2bd10a2ea7be92083a6fa078c362bedafbefb6666cce4a0e91ffc2ad0aeb3a3b\": container with ID starting with 2bd10a2ea7be92083a6fa078c362bedafbefb6666cce4a0e91ffc2ad0aeb3a3b not found: ID does not exist"
Mar 08 00:38:51.006771 master-0 kubenswrapper[23041]: I0308 00:38:51.006770 23041 scope.go:117] "RemoveContainer" containerID="9c34514feba62dbc424465f89255c0d11d4ab193add728281e8a22d2de8c1410"
Mar 08 00:38:51.007112 master-0 kubenswrapper[23041]: E0308 00:38:51.007077 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c34514feba62dbc424465f89255c0d11d4ab193add728281e8a22d2de8c1410\": container with ID starting with 9c34514feba62dbc424465f89255c0d11d4ab193add728281e8a22d2de8c1410 not found: ID does not exist" containerID="9c34514feba62dbc424465f89255c0d11d4ab193add728281e8a22d2de8c1410"
Mar 08 00:38:51.007112 master-0 kubenswrapper[23041]: I0308 00:38:51.007102 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c34514feba62dbc424465f89255c0d11d4ab193add728281e8a22d2de8c1410"} err="failed to get container status \"9c34514feba62dbc424465f89255c0d11d4ab193add728281e8a22d2de8c1410\": rpc error: code = NotFound desc = could not find container \"9c34514feba62dbc424465f89255c0d11d4ab193add728281e8a22d2de8c1410\": container with ID starting with 9c34514feba62dbc424465f89255c0d11d4ab193add728281e8a22d2de8c1410 not found: ID does not exist"
Mar 08 00:38:51.007273 master-0 kubenswrapper[23041]: I0308 00:38:51.007122 23041 scope.go:117] "RemoveContainer" containerID="fd786e7f06eff90b20c4ed46d3ed52bbe237190f752b877bfff1fedbc785fa36"
Mar 08 00:38:51.007356 master-0 kubenswrapper[23041]: E0308 00:38:51.007324 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd786e7f06eff90b20c4ed46d3ed52bbe237190f752b877bfff1fedbc785fa36\": container with ID starting with fd786e7f06eff90b20c4ed46d3ed52bbe237190f752b877bfff1fedbc785fa36 not found: ID does not exist" containerID="fd786e7f06eff90b20c4ed46d3ed52bbe237190f752b877bfff1fedbc785fa36"
Mar 08 00:38:51.007417 master-0 kubenswrapper[23041]: I0308 00:38:51.007358 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd786e7f06eff90b20c4ed46d3ed52bbe237190f752b877bfff1fedbc785fa36"} err="failed to get container status \"fd786e7f06eff90b20c4ed46d3ed52bbe237190f752b877bfff1fedbc785fa36\": rpc error: code = NotFound desc = could not find container \"fd786e7f06eff90b20c4ed46d3ed52bbe237190f752b877bfff1fedbc785fa36\": container with ID starting with
fd786e7f06eff90b20c4ed46d3ed52bbe237190f752b877bfff1fedbc785fa36 not found: ID does not exist" Mar 08 00:38:51.007417 master-0 kubenswrapper[23041]: I0308 00:38:51.007377 23041 scope.go:117] "RemoveContainer" containerID="2ad65477b3234a5615722a158f160fb5dd52caa55583da9b6b87b3b65a179454" Mar 08 00:38:51.007642 master-0 kubenswrapper[23041]: E0308 00:38:51.007598 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ad65477b3234a5615722a158f160fb5dd52caa55583da9b6b87b3b65a179454\": container with ID starting with 2ad65477b3234a5615722a158f160fb5dd52caa55583da9b6b87b3b65a179454 not found: ID does not exist" containerID="2ad65477b3234a5615722a158f160fb5dd52caa55583da9b6b87b3b65a179454" Mar 08 00:38:51.007642 master-0 kubenswrapper[23041]: I0308 00:38:51.007631 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ad65477b3234a5615722a158f160fb5dd52caa55583da9b6b87b3b65a179454"} err="failed to get container status \"2ad65477b3234a5615722a158f160fb5dd52caa55583da9b6b87b3b65a179454\": rpc error: code = NotFound desc = could not find container \"2ad65477b3234a5615722a158f160fb5dd52caa55583da9b6b87b3b65a179454\": container with ID starting with 2ad65477b3234a5615722a158f160fb5dd52caa55583da9b6b87b3b65a179454 not found: ID does not exist" Mar 08 00:38:51.007805 master-0 kubenswrapper[23041]: I0308 00:38:51.007649 23041 scope.go:117] "RemoveContainer" containerID="a84adaf243b849b02923619ad910297394291e7bdee3959d3f87661b7f9d9e1a" Mar 08 00:38:51.007966 master-0 kubenswrapper[23041]: I0308 00:38:51.007937 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a84adaf243b849b02923619ad910297394291e7bdee3959d3f87661b7f9d9e1a"} err="failed to get container status \"a84adaf243b849b02923619ad910297394291e7bdee3959d3f87661b7f9d9e1a\": rpc error: code = NotFound desc = could not find container 
\"a84adaf243b849b02923619ad910297394291e7bdee3959d3f87661b7f9d9e1a\": container with ID starting with a84adaf243b849b02923619ad910297394291e7bdee3959d3f87661b7f9d9e1a not found: ID does not exist" Mar 08 00:38:51.007966 master-0 kubenswrapper[23041]: I0308 00:38:51.007963 23041 scope.go:117] "RemoveContainer" containerID="3c8e1ec133c07a7bbcbdb1ac76d720856c26fe6e0b9e0de82b700d0e47ceeb09" Mar 08 00:38:51.008564 master-0 kubenswrapper[23041]: I0308 00:38:51.008522 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c8e1ec133c07a7bbcbdb1ac76d720856c26fe6e0b9e0de82b700d0e47ceeb09"} err="failed to get container status \"3c8e1ec133c07a7bbcbdb1ac76d720856c26fe6e0b9e0de82b700d0e47ceeb09\": rpc error: code = NotFound desc = could not find container \"3c8e1ec133c07a7bbcbdb1ac76d720856c26fe6e0b9e0de82b700d0e47ceeb09\": container with ID starting with 3c8e1ec133c07a7bbcbdb1ac76d720856c26fe6e0b9e0de82b700d0e47ceeb09 not found: ID does not exist" Mar 08 00:38:51.008564 master-0 kubenswrapper[23041]: I0308 00:38:51.008550 23041 scope.go:117] "RemoveContainer" containerID="2bd10a2ea7be92083a6fa078c362bedafbefb6666cce4a0e91ffc2ad0aeb3a3b" Mar 08 00:38:51.009545 master-0 kubenswrapper[23041]: I0308 00:38:51.009507 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bd10a2ea7be92083a6fa078c362bedafbefb6666cce4a0e91ffc2ad0aeb3a3b"} err="failed to get container status \"2bd10a2ea7be92083a6fa078c362bedafbefb6666cce4a0e91ffc2ad0aeb3a3b\": rpc error: code = NotFound desc = could not find container \"2bd10a2ea7be92083a6fa078c362bedafbefb6666cce4a0e91ffc2ad0aeb3a3b\": container with ID starting with 2bd10a2ea7be92083a6fa078c362bedafbefb6666cce4a0e91ffc2ad0aeb3a3b not found: ID does not exist" Mar 08 00:38:51.009545 master-0 kubenswrapper[23041]: I0308 00:38:51.009528 23041 scope.go:117] "RemoveContainer" containerID="9c34514feba62dbc424465f89255c0d11d4ab193add728281e8a22d2de8c1410" Mar 08 
00:38:51.009814 master-0 kubenswrapper[23041]: I0308 00:38:51.009778 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c34514feba62dbc424465f89255c0d11d4ab193add728281e8a22d2de8c1410"} err="failed to get container status \"9c34514feba62dbc424465f89255c0d11d4ab193add728281e8a22d2de8c1410\": rpc error: code = NotFound desc = could not find container \"9c34514feba62dbc424465f89255c0d11d4ab193add728281e8a22d2de8c1410\": container with ID starting with 9c34514feba62dbc424465f89255c0d11d4ab193add728281e8a22d2de8c1410 not found: ID does not exist" Mar 08 00:38:51.009814 master-0 kubenswrapper[23041]: I0308 00:38:51.009799 23041 scope.go:117] "RemoveContainer" containerID="fd786e7f06eff90b20c4ed46d3ed52bbe237190f752b877bfff1fedbc785fa36" Mar 08 00:38:51.010174 master-0 kubenswrapper[23041]: I0308 00:38:51.010101 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd786e7f06eff90b20c4ed46d3ed52bbe237190f752b877bfff1fedbc785fa36"} err="failed to get container status \"fd786e7f06eff90b20c4ed46d3ed52bbe237190f752b877bfff1fedbc785fa36\": rpc error: code = NotFound desc = could not find container \"fd786e7f06eff90b20c4ed46d3ed52bbe237190f752b877bfff1fedbc785fa36\": container with ID starting with fd786e7f06eff90b20c4ed46d3ed52bbe237190f752b877bfff1fedbc785fa36 not found: ID does not exist" Mar 08 00:38:51.010174 master-0 kubenswrapper[23041]: I0308 00:38:51.010165 23041 scope.go:117] "RemoveContainer" containerID="2ad65477b3234a5615722a158f160fb5dd52caa55583da9b6b87b3b65a179454" Mar 08 00:38:51.010535 master-0 kubenswrapper[23041]: I0308 00:38:51.010498 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ad65477b3234a5615722a158f160fb5dd52caa55583da9b6b87b3b65a179454"} err="failed to get container status \"2ad65477b3234a5615722a158f160fb5dd52caa55583da9b6b87b3b65a179454\": rpc error: code = NotFound desc = could not find 
container \"2ad65477b3234a5615722a158f160fb5dd52caa55583da9b6b87b3b65a179454\": container with ID starting with 2ad65477b3234a5615722a158f160fb5dd52caa55583da9b6b87b3b65a179454 not found: ID does not exist" Mar 08 00:38:51.010535 master-0 kubenswrapper[23041]: I0308 00:38:51.010518 23041 scope.go:117] "RemoveContainer" containerID="a84adaf243b849b02923619ad910297394291e7bdee3959d3f87661b7f9d9e1a" Mar 08 00:38:51.010778 master-0 kubenswrapper[23041]: I0308 00:38:51.010739 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a84adaf243b849b02923619ad910297394291e7bdee3959d3f87661b7f9d9e1a"} err="failed to get container status \"a84adaf243b849b02923619ad910297394291e7bdee3959d3f87661b7f9d9e1a\": rpc error: code = NotFound desc = could not find container \"a84adaf243b849b02923619ad910297394291e7bdee3959d3f87661b7f9d9e1a\": container with ID starting with a84adaf243b849b02923619ad910297394291e7bdee3959d3f87661b7f9d9e1a not found: ID does not exist" Mar 08 00:38:51.010778 master-0 kubenswrapper[23041]: I0308 00:38:51.010764 23041 scope.go:117] "RemoveContainer" containerID="3c8e1ec133c07a7bbcbdb1ac76d720856c26fe6e0b9e0de82b700d0e47ceeb09" Mar 08 00:38:51.011242 master-0 kubenswrapper[23041]: I0308 00:38:51.011177 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c8e1ec133c07a7bbcbdb1ac76d720856c26fe6e0b9e0de82b700d0e47ceeb09"} err="failed to get container status \"3c8e1ec133c07a7bbcbdb1ac76d720856c26fe6e0b9e0de82b700d0e47ceeb09\": rpc error: code = NotFound desc = could not find container \"3c8e1ec133c07a7bbcbdb1ac76d720856c26fe6e0b9e0de82b700d0e47ceeb09\": container with ID starting with 3c8e1ec133c07a7bbcbdb1ac76d720856c26fe6e0b9e0de82b700d0e47ceeb09 not found: ID does not exist" Mar 08 00:38:51.011322 master-0 kubenswrapper[23041]: I0308 00:38:51.011278 23041 scope.go:117] "RemoveContainer" containerID="2bd10a2ea7be92083a6fa078c362bedafbefb6666cce4a0e91ffc2ad0aeb3a3b" 
Mar 08 00:38:51.011523 master-0 kubenswrapper[23041]: I0308 00:38:51.011483 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bd10a2ea7be92083a6fa078c362bedafbefb6666cce4a0e91ffc2ad0aeb3a3b"} err="failed to get container status \"2bd10a2ea7be92083a6fa078c362bedafbefb6666cce4a0e91ffc2ad0aeb3a3b\": rpc error: code = NotFound desc = could not find container \"2bd10a2ea7be92083a6fa078c362bedafbefb6666cce4a0e91ffc2ad0aeb3a3b\": container with ID starting with 2bd10a2ea7be92083a6fa078c362bedafbefb6666cce4a0e91ffc2ad0aeb3a3b not found: ID does not exist" Mar 08 00:38:51.011523 master-0 kubenswrapper[23041]: I0308 00:38:51.011505 23041 scope.go:117] "RemoveContainer" containerID="9c34514feba62dbc424465f89255c0d11d4ab193add728281e8a22d2de8c1410" Mar 08 00:38:51.011735 master-0 kubenswrapper[23041]: I0308 00:38:51.011692 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c34514feba62dbc424465f89255c0d11d4ab193add728281e8a22d2de8c1410"} err="failed to get container status \"9c34514feba62dbc424465f89255c0d11d4ab193add728281e8a22d2de8c1410\": rpc error: code = NotFound desc = could not find container \"9c34514feba62dbc424465f89255c0d11d4ab193add728281e8a22d2de8c1410\": container with ID starting with 9c34514feba62dbc424465f89255c0d11d4ab193add728281e8a22d2de8c1410 not found: ID does not exist" Mar 08 00:38:51.011735 master-0 kubenswrapper[23041]: I0308 00:38:51.011716 23041 scope.go:117] "RemoveContainer" containerID="fd786e7f06eff90b20c4ed46d3ed52bbe237190f752b877bfff1fedbc785fa36" Mar 08 00:38:51.012024 master-0 kubenswrapper[23041]: I0308 00:38:51.011982 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd786e7f06eff90b20c4ed46d3ed52bbe237190f752b877bfff1fedbc785fa36"} err="failed to get container status \"fd786e7f06eff90b20c4ed46d3ed52bbe237190f752b877bfff1fedbc785fa36\": rpc error: code = NotFound desc = could not find 
container \"fd786e7f06eff90b20c4ed46d3ed52bbe237190f752b877bfff1fedbc785fa36\": container with ID starting with fd786e7f06eff90b20c4ed46d3ed52bbe237190f752b877bfff1fedbc785fa36 not found: ID does not exist" Mar 08 00:38:51.012024 master-0 kubenswrapper[23041]: I0308 00:38:51.012012 23041 scope.go:117] "RemoveContainer" containerID="2ad65477b3234a5615722a158f160fb5dd52caa55583da9b6b87b3b65a179454" Mar 08 00:38:51.012563 master-0 kubenswrapper[23041]: I0308 00:38:51.012534 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ad65477b3234a5615722a158f160fb5dd52caa55583da9b6b87b3b65a179454"} err="failed to get container status \"2ad65477b3234a5615722a158f160fb5dd52caa55583da9b6b87b3b65a179454\": rpc error: code = NotFound desc = could not find container \"2ad65477b3234a5615722a158f160fb5dd52caa55583da9b6b87b3b65a179454\": container with ID starting with 2ad65477b3234a5615722a158f160fb5dd52caa55583da9b6b87b3b65a179454 not found: ID does not exist" Mar 08 00:38:51.012563 master-0 kubenswrapper[23041]: I0308 00:38:51.012561 23041 scope.go:117] "RemoveContainer" containerID="a84adaf243b849b02923619ad910297394291e7bdee3959d3f87661b7f9d9e1a" Mar 08 00:38:51.012806 master-0 kubenswrapper[23041]: I0308 00:38:51.012777 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a84adaf243b849b02923619ad910297394291e7bdee3959d3f87661b7f9d9e1a"} err="failed to get container status \"a84adaf243b849b02923619ad910297394291e7bdee3959d3f87661b7f9d9e1a\": rpc error: code = NotFound desc = could not find container \"a84adaf243b849b02923619ad910297394291e7bdee3959d3f87661b7f9d9e1a\": container with ID starting with a84adaf243b849b02923619ad910297394291e7bdee3959d3f87661b7f9d9e1a not found: ID does not exist" Mar 08 00:38:51.012806 master-0 kubenswrapper[23041]: I0308 00:38:51.012804 23041 scope.go:117] "RemoveContainer" containerID="3c8e1ec133c07a7bbcbdb1ac76d720856c26fe6e0b9e0de82b700d0e47ceeb09" 
Mar 08 00:38:51.013080 master-0 kubenswrapper[23041]: I0308 00:38:51.013014 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c8e1ec133c07a7bbcbdb1ac76d720856c26fe6e0b9e0de82b700d0e47ceeb09"} err="failed to get container status \"3c8e1ec133c07a7bbcbdb1ac76d720856c26fe6e0b9e0de82b700d0e47ceeb09\": rpc error: code = NotFound desc = could not find container \"3c8e1ec133c07a7bbcbdb1ac76d720856c26fe6e0b9e0de82b700d0e47ceeb09\": container with ID starting with 3c8e1ec133c07a7bbcbdb1ac76d720856c26fe6e0b9e0de82b700d0e47ceeb09 not found: ID does not exist" Mar 08 00:38:51.013080 master-0 kubenswrapper[23041]: I0308 00:38:51.013043 23041 scope.go:117] "RemoveContainer" containerID="2bd10a2ea7be92083a6fa078c362bedafbefb6666cce4a0e91ffc2ad0aeb3a3b" Mar 08 00:38:51.013429 master-0 kubenswrapper[23041]: I0308 00:38:51.013386 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bd10a2ea7be92083a6fa078c362bedafbefb6666cce4a0e91ffc2ad0aeb3a3b"} err="failed to get container status \"2bd10a2ea7be92083a6fa078c362bedafbefb6666cce4a0e91ffc2ad0aeb3a3b\": rpc error: code = NotFound desc = could not find container \"2bd10a2ea7be92083a6fa078c362bedafbefb6666cce4a0e91ffc2ad0aeb3a3b\": container with ID starting with 2bd10a2ea7be92083a6fa078c362bedafbefb6666cce4a0e91ffc2ad0aeb3a3b not found: ID does not exist" Mar 08 00:38:51.013429 master-0 kubenswrapper[23041]: I0308 00:38:51.013417 23041 scope.go:117] "RemoveContainer" containerID="9c34514feba62dbc424465f89255c0d11d4ab193add728281e8a22d2de8c1410" Mar 08 00:38:51.013715 master-0 kubenswrapper[23041]: I0308 00:38:51.013686 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c34514feba62dbc424465f89255c0d11d4ab193add728281e8a22d2de8c1410"} err="failed to get container status \"9c34514feba62dbc424465f89255c0d11d4ab193add728281e8a22d2de8c1410\": rpc error: code = NotFound desc = could not find 
container \"9c34514feba62dbc424465f89255c0d11d4ab193add728281e8a22d2de8c1410\": container with ID starting with 9c34514feba62dbc424465f89255c0d11d4ab193add728281e8a22d2de8c1410 not found: ID does not exist" Mar 08 00:38:51.013715 master-0 kubenswrapper[23041]: I0308 00:38:51.013711 23041 scope.go:117] "RemoveContainer" containerID="fd786e7f06eff90b20c4ed46d3ed52bbe237190f752b877bfff1fedbc785fa36" Mar 08 00:38:51.013968 master-0 kubenswrapper[23041]: I0308 00:38:51.013940 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd786e7f06eff90b20c4ed46d3ed52bbe237190f752b877bfff1fedbc785fa36"} err="failed to get container status \"fd786e7f06eff90b20c4ed46d3ed52bbe237190f752b877bfff1fedbc785fa36\": rpc error: code = NotFound desc = could not find container \"fd786e7f06eff90b20c4ed46d3ed52bbe237190f752b877bfff1fedbc785fa36\": container with ID starting with fd786e7f06eff90b20c4ed46d3ed52bbe237190f752b877bfff1fedbc785fa36 not found: ID does not exist" Mar 08 00:38:51.013968 master-0 kubenswrapper[23041]: I0308 00:38:51.013965 23041 scope.go:117] "RemoveContainer" containerID="2ad65477b3234a5615722a158f160fb5dd52caa55583da9b6b87b3b65a179454" Mar 08 00:38:51.014246 master-0 kubenswrapper[23041]: I0308 00:38:51.014217 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ad65477b3234a5615722a158f160fb5dd52caa55583da9b6b87b3b65a179454"} err="failed to get container status \"2ad65477b3234a5615722a158f160fb5dd52caa55583da9b6b87b3b65a179454\": rpc error: code = NotFound desc = could not find container \"2ad65477b3234a5615722a158f160fb5dd52caa55583da9b6b87b3b65a179454\": container with ID starting with 2ad65477b3234a5615722a158f160fb5dd52caa55583da9b6b87b3b65a179454 not found: ID does not exist" Mar 08 00:38:52.199194 master-0 kubenswrapper[23041]: I0308 00:38:52.199161 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-5-master-0" Mar 08 00:38:52.230009 master-0 kubenswrapper[23041]: I0308 00:38:52.227913 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cfcc28b5-f88b-4ecf-b503-cf31d00e22eb-kubelet-dir\") pod \"cfcc28b5-f88b-4ecf-b503-cf31d00e22eb\" (UID: \"cfcc28b5-f88b-4ecf-b503-cf31d00e22eb\") " Mar 08 00:38:52.230009 master-0 kubenswrapper[23041]: I0308 00:38:52.228091 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cfcc28b5-f88b-4ecf-b503-cf31d00e22eb-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "cfcc28b5-f88b-4ecf-b503-cf31d00e22eb" (UID: "cfcc28b5-f88b-4ecf-b503-cf31d00e22eb"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:38:52.230009 master-0 kubenswrapper[23041]: I0308 00:38:52.228850 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cfcc28b5-f88b-4ecf-b503-cf31d00e22eb-kube-api-access\") pod \"cfcc28b5-f88b-4ecf-b503-cf31d00e22eb\" (UID: \"cfcc28b5-f88b-4ecf-b503-cf31d00e22eb\") " Mar 08 00:38:52.230009 master-0 kubenswrapper[23041]: I0308 00:38:52.228887 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cfcc28b5-f88b-4ecf-b503-cf31d00e22eb-var-lock\") pod \"cfcc28b5-f88b-4ecf-b503-cf31d00e22eb\" (UID: \"cfcc28b5-f88b-4ecf-b503-cf31d00e22eb\") " Mar 08 00:38:52.230009 master-0 kubenswrapper[23041]: I0308 00:38:52.229274 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cfcc28b5-f88b-4ecf-b503-cf31d00e22eb-var-lock" (OuterVolumeSpecName: "var-lock") pod "cfcc28b5-f88b-4ecf-b503-cf31d00e22eb" (UID: "cfcc28b5-f88b-4ecf-b503-cf31d00e22eb"). 
InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:38:52.232266 master-0 kubenswrapper[23041]: I0308 00:38:52.232169 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfcc28b5-f88b-4ecf-b503-cf31d00e22eb-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "cfcc28b5-f88b-4ecf-b503-cf31d00e22eb" (UID: "cfcc28b5-f88b-4ecf-b503-cf31d00e22eb"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:38:52.329935 master-0 kubenswrapper[23041]: I0308 00:38:52.329878 23041 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cfcc28b5-f88b-4ecf-b503-cf31d00e22eb-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 00:38:52.330251 master-0 kubenswrapper[23041]: I0308 00:38:52.330233 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cfcc28b5-f88b-4ecf-b503-cf31d00e22eb-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 00:38:52.330372 master-0 kubenswrapper[23041]: I0308 00:38:52.330355 23041 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cfcc28b5-f88b-4ecf-b503-cf31d00e22eb-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 00:38:52.916901 master-0 kubenswrapper[23041]: I0308 00:38:52.916843 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-5-master-0" event={"ID":"cfcc28b5-f88b-4ecf-b503-cf31d00e22eb","Type":"ContainerDied","Data":"8649b69decd2e3bce5880b3dba0a5e94005a51a39fa0f19e17c8e0fc08efadb7"} Mar 08 00:38:52.916901 master-0 kubenswrapper[23041]: I0308 00:38:52.916881 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-5-master-0" Mar 08 00:38:52.916901 master-0 kubenswrapper[23041]: I0308 00:38:52.916898 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8649b69decd2e3bce5880b3dba0a5e94005a51a39fa0f19e17c8e0fc08efadb7" Mar 08 00:38:57.173391 master-0 kubenswrapper[23041]: I0308 00:38:57.173294 23041 patch_prober.go:28] interesting pod/console-6479f6d896-j6kqz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body= Mar 08 00:38:57.174596 master-0 kubenswrapper[23041]: I0308 00:38:57.173411 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Mar 08 00:38:59.435038 master-0 kubenswrapper[23041]: I0308 00:38:59.434973 23041 patch_prober.go:28] interesting pod/console-c45bf598-vngbg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body= Mar 08 00:38:59.435561 master-0 kubenswrapper[23041]: I0308 00:38:59.435047 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" Mar 08 00:39:04.753718 master-0 kubenswrapper[23041]: I0308 00:39:04.753595 23041 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 08 00:39:04.755145 master-0 
kubenswrapper[23041]: I0308 00:39:04.754073 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="1453f6461bf5d599ad65a4656343ee91" containerName="kube-scheduler-recovery-controller" containerID="cri-o://c01e48ad99b01d18f3c32d8971fb8a634df39b838fcb697c02d699ac7e0bf59b" gracePeriod=30 Mar 08 00:39:04.755145 master-0 kubenswrapper[23041]: I0308 00:39:04.754224 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="1453f6461bf5d599ad65a4656343ee91" containerName="kube-scheduler" containerID="cri-o://667ebe7e35ed617b3ed91aceb47b77ab1ed599b695eddc56d9628db5ec3e0c66" gracePeriod=30 Mar 08 00:39:04.755145 master-0 kubenswrapper[23041]: I0308 00:39:04.754169 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="1453f6461bf5d599ad65a4656343ee91" containerName="kube-scheduler-cert-syncer" containerID="cri-o://21854ab4700806c891145dd8952eabc13b1e6e2a7204f749ddb9d22649d43a2f" gracePeriod=30 Mar 08 00:39:04.756680 master-0 kubenswrapper[23041]: I0308 00:39:04.756574 23041 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 08 00:39:04.757510 master-0 kubenswrapper[23041]: E0308 00:39:04.757450 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1453f6461bf5d599ad65a4656343ee91" containerName="kube-scheduler-cert-syncer" Mar 08 00:39:04.757510 master-0 kubenswrapper[23041]: I0308 00:39:04.757504 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="1453f6461bf5d599ad65a4656343ee91" containerName="kube-scheduler-cert-syncer" Mar 08 00:39:04.757722 master-0 kubenswrapper[23041]: E0308 00:39:04.757530 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfcc28b5-f88b-4ecf-b503-cf31d00e22eb" 
containerName="installer" Mar 08 00:39:04.757722 master-0 kubenswrapper[23041]: I0308 00:39:04.757549 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfcc28b5-f88b-4ecf-b503-cf31d00e22eb" containerName="installer" Mar 08 00:39:04.757722 master-0 kubenswrapper[23041]: E0308 00:39:04.757580 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1453f6461bf5d599ad65a4656343ee91" containerName="kube-scheduler" Mar 08 00:39:04.757722 master-0 kubenswrapper[23041]: I0308 00:39:04.757597 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="1453f6461bf5d599ad65a4656343ee91" containerName="kube-scheduler" Mar 08 00:39:04.757722 master-0 kubenswrapper[23041]: E0308 00:39:04.757629 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1453f6461bf5d599ad65a4656343ee91" containerName="kube-scheduler" Mar 08 00:39:04.757722 master-0 kubenswrapper[23041]: I0308 00:39:04.757644 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="1453f6461bf5d599ad65a4656343ee91" containerName="kube-scheduler" Mar 08 00:39:04.757722 master-0 kubenswrapper[23041]: E0308 00:39:04.757680 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1453f6461bf5d599ad65a4656343ee91" containerName="kube-scheduler-cert-syncer" Mar 08 00:39:04.757722 master-0 kubenswrapper[23041]: I0308 00:39:04.757698 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="1453f6461bf5d599ad65a4656343ee91" containerName="kube-scheduler-cert-syncer" Mar 08 00:39:04.757722 master-0 kubenswrapper[23041]: E0308 00:39:04.757722 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1453f6461bf5d599ad65a4656343ee91" containerName="kube-scheduler-recovery-controller" Mar 08 00:39:04.758374 master-0 kubenswrapper[23041]: I0308 00:39:04.757760 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="1453f6461bf5d599ad65a4656343ee91" containerName="kube-scheduler-recovery-controller" Mar 08 00:39:04.758374 master-0 
kubenswrapper[23041]: E0308 00:39:04.757827 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1453f6461bf5d599ad65a4656343ee91" containerName="wait-for-host-port"
Mar 08 00:39:04.758374 master-0 kubenswrapper[23041]: I0308 00:39:04.757849 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="1453f6461bf5d599ad65a4656343ee91" containerName="wait-for-host-port"
Mar 08 00:39:04.758374 master-0 kubenswrapper[23041]: I0308 00:39:04.758176 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfcc28b5-f88b-4ecf-b503-cf31d00e22eb" containerName="installer"
Mar 08 00:39:04.758374 master-0 kubenswrapper[23041]: I0308 00:39:04.758268 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="1453f6461bf5d599ad65a4656343ee91" containerName="kube-scheduler"
Mar 08 00:39:04.758374 master-0 kubenswrapper[23041]: I0308 00:39:04.758308 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="1453f6461bf5d599ad65a4656343ee91" containerName="kube-scheduler"
Mar 08 00:39:04.758374 master-0 kubenswrapper[23041]: I0308 00:39:04.758332 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="1453f6461bf5d599ad65a4656343ee91" containerName="kube-scheduler-recovery-controller"
Mar 08 00:39:04.758374 master-0 kubenswrapper[23041]: I0308 00:39:04.758385 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="1453f6461bf5d599ad65a4656343ee91" containerName="kube-scheduler-cert-syncer"
Mar 08 00:39:04.759066 master-0 kubenswrapper[23041]: I0308 00:39:04.758410 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="1453f6461bf5d599ad65a4656343ee91" containerName="kube-scheduler-cert-syncer"
Mar 08 00:39:04.811790 master-0 kubenswrapper[23041]: I0308 00:39:04.811717 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 08 00:39:04.815884 master-0 kubenswrapper[23041]: I0308 00:39:04.815836 23041 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="1453f6461bf5d599ad65a4656343ee91" podUID="aa6a75ab47c06be4e74d05f552da4470"
Mar 08 00:39:04.954548 master-0 kubenswrapper[23041]: I0308 00:39:04.954468 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/aa6a75ab47c06be4e74d05f552da4470-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"aa6a75ab47c06be4e74d05f552da4470\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 08 00:39:04.954722 master-0 kubenswrapper[23041]: I0308 00:39:04.954587 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/aa6a75ab47c06be4e74d05f552da4470-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"aa6a75ab47c06be4e74d05f552da4470\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 08 00:39:04.995904 master-0 kubenswrapper[23041]: I0308 00:39:04.995831 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_1453f6461bf5d599ad65a4656343ee91/kube-scheduler-cert-syncer/1.log"
Mar 08 00:39:04.996505 master-0 kubenswrapper[23041]: I0308 00:39:04.996463 23041 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="3af0e66f-1645-4bb9-8f4a-8471f775852b"
Mar 08 00:39:04.996505 master-0 kubenswrapper[23041]: I0308 00:39:04.996506 23041 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="3af0e66f-1645-4bb9-8f4a-8471f775852b"
Mar 08 00:39:04.999082 master-0 kubenswrapper[23041]: I0308 00:39:04.999052 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_1453f6461bf5d599ad65a4656343ee91/kube-scheduler-cert-syncer/0.log"
Mar 08 00:39:05.000583 master-0 kubenswrapper[23041]: I0308 00:39:05.000547 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_1453f6461bf5d599ad65a4656343ee91/kube-scheduler/0.log"
Mar 08 00:39:05.001379 master-0 kubenswrapper[23041]: I0308 00:39:05.001315 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 08 00:39:05.015500 master-0 kubenswrapper[23041]: I0308 00:39:05.007909 23041 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="1453f6461bf5d599ad65a4656343ee91" podUID="aa6a75ab47c06be4e74d05f552da4470"
Mar 08 00:39:05.017037 master-0 kubenswrapper[23041]: I0308 00:39:05.016989 23041 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 08 00:39:05.026294 master-0 kubenswrapper[23041]: I0308 00:39:05.025630 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"]
Mar 08 00:39:05.029964 master-0 kubenswrapper[23041]: I0308 00:39:05.029897 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"]
Mar 08 00:39:05.051458 master-0 kubenswrapper[23041]: I0308 00:39:05.051380 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 08 00:39:05.055311 master-0 kubenswrapper[23041]: I0308 00:39:05.052776 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_1453f6461bf5d599ad65a4656343ee91/kube-scheduler-cert-syncer/1.log"
Mar 08 00:39:05.057868 master-0 kubenswrapper[23041]: I0308 00:39:05.056993 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/aa6a75ab47c06be4e74d05f552da4470-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"aa6a75ab47c06be4e74d05f552da4470\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 08 00:39:05.057868 master-0 kubenswrapper[23041]: I0308 00:39:05.057034 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/aa6a75ab47c06be4e74d05f552da4470-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"aa6a75ab47c06be4e74d05f552da4470\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 08 00:39:05.057868 master-0 kubenswrapper[23041]: I0308 00:39:05.057218 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/aa6a75ab47c06be4e74d05f552da4470-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"aa6a75ab47c06be4e74d05f552da4470\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 08 00:39:05.057868 master-0 kubenswrapper[23041]: I0308 00:39:05.057358 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/aa6a75ab47c06be4e74d05f552da4470-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"aa6a75ab47c06be4e74d05f552da4470\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 08 00:39:05.060093 master-0 kubenswrapper[23041]: I0308 00:39:05.060020 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"]
Mar 08 00:39:05.060532 master-0 kubenswrapper[23041]: I0308 00:39:05.060497 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_1453f6461bf5d599ad65a4656343ee91/kube-scheduler-cert-syncer/0.log"
Mar 08 00:39:05.069495 master-0 kubenswrapper[23041]: I0308 00:39:05.069404 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_1453f6461bf5d599ad65a4656343ee91/kube-scheduler/0.log"
Mar 08 00:39:05.070484 master-0 kubenswrapper[23041]: I0308 00:39:05.070415 23041 generic.go:334] "Generic (PLEG): container finished" podID="1453f6461bf5d599ad65a4656343ee91" containerID="21854ab4700806c891145dd8952eabc13b1e6e2a7204f749ddb9d22649d43a2f" exitCode=2
Mar 08 00:39:05.070484 master-0 kubenswrapper[23041]: I0308 00:39:05.070467 23041 generic.go:334] "Generic (PLEG): container finished" podID="1453f6461bf5d599ad65a4656343ee91" containerID="667ebe7e35ed617b3ed91aceb47b77ab1ed599b695eddc56d9628db5ec3e0c66" exitCode=0
Mar 08 00:39:05.070484 master-0 kubenswrapper[23041]: I0308 00:39:05.070488 23041 generic.go:334] "Generic (PLEG): container finished" podID="1453f6461bf5d599ad65a4656343ee91" containerID="c01e48ad99b01d18f3c32d8971fb8a634df39b838fcb697c02d699ac7e0bf59b" exitCode=0
Mar 08 00:39:05.070771 master-0 kubenswrapper[23041]: I0308 00:39:05.070681 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 08 00:39:05.072187 master-0 kubenswrapper[23041]: I0308 00:39:05.071583 23041 scope.go:117] "RemoveContainer" containerID="21854ab4700806c891145dd8952eabc13b1e6e2a7204f749ddb9d22649d43a2f"
Mar 08 00:39:05.077385 master-0 kubenswrapper[23041]: I0308 00:39:05.075282 23041 generic.go:334] "Generic (PLEG): container finished" podID="c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd" containerID="11fecd12a25d5b03abbc9351dc9e60df62188f7ae3672fa56686080f6699546c" exitCode=0
Mar 08 00:39:05.077385 master-0 kubenswrapper[23041]: I0308 00:39:05.075353 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-6-retry-1-master-0" event={"ID":"c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd","Type":"ContainerDied","Data":"11fecd12a25d5b03abbc9351dc9e60df62188f7ae3672fa56686080f6699546c"}
Mar 08 00:39:05.115355 master-0 kubenswrapper[23041]: I0308 00:39:05.114253 23041 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="1453f6461bf5d599ad65a4656343ee91" podUID="aa6a75ab47c06be4e74d05f552da4470"
Mar 08 00:39:05.158354 master-0 kubenswrapper[23041]: I0308 00:39:05.158257 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-resource-dir\") pod \"1453f6461bf5d599ad65a4656343ee91\" (UID: \"1453f6461bf5d599ad65a4656343ee91\") "
Mar 08 00:39:05.158557 master-0 kubenswrapper[23041]: I0308 00:39:05.158377 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-cert-dir\") pod \"1453f6461bf5d599ad65a4656343ee91\" (UID: \"1453f6461bf5d599ad65a4656343ee91\") "
Mar 08 00:39:05.159370 master-0 kubenswrapper[23041]: I0308 00:39:05.159330 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "1453f6461bf5d599ad65a4656343ee91" (UID: "1453f6461bf5d599ad65a4656343ee91"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:39:05.159426 master-0 kubenswrapper[23041]: I0308 00:39:05.159380 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "1453f6461bf5d599ad65a4656343ee91" (UID: "1453f6461bf5d599ad65a4656343ee91"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:39:05.159559 master-0 kubenswrapper[23041]: I0308 00:39:05.159523 23041 scope.go:117] "RemoveContainer" containerID="667ebe7e35ed617b3ed91aceb47b77ab1ed599b695eddc56d9628db5ec3e0c66"
Mar 08 00:39:05.185939 master-0 kubenswrapper[23041]: I0308 00:39:05.185889 23041 scope.go:117] "RemoveContainer" containerID="c01e48ad99b01d18f3c32d8971fb8a634df39b838fcb697c02d699ac7e0bf59b"
Mar 08 00:39:05.211614 master-0 kubenswrapper[23041]: I0308 00:39:05.210843 23041 scope.go:117] "RemoveContainer" containerID="4a9781cd54b6849919a2e1ded759e631816b24203f18a3cce8ca11053a994a64"
Mar 08 00:39:05.246225 master-0 kubenswrapper[23041]: I0308 00:39:05.243274 23041 scope.go:117] "RemoveContainer" containerID="2bda97f02cc22c73814013d78c2e90a28eb3ed0437db127445efbed0e90aa23d"
Mar 08 00:39:05.260446 master-0 kubenswrapper[23041]: I0308 00:39:05.260368 23041 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 08 00:39:05.260446 master-0 kubenswrapper[23041]: I0308 00:39:05.260420 23041 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-cert-dir\") on node \"master-0\" DevicePath \"\""
Mar 08 00:39:05.338827 master-0 kubenswrapper[23041]: I0308 00:39:05.338691 23041 scope.go:117] "RemoveContainer" containerID="16143328d55448f305f6ab28c116011527d147a9f464f1696ddaa4f87b24902d"
Mar 08 00:39:05.378674 master-0 kubenswrapper[23041]: I0308 00:39:05.378350 23041 scope.go:117] "RemoveContainer" containerID="21854ab4700806c891145dd8952eabc13b1e6e2a7204f749ddb9d22649d43a2f"
Mar 08 00:39:05.381445 master-0 kubenswrapper[23041]: E0308 00:39:05.378956 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21854ab4700806c891145dd8952eabc13b1e6e2a7204f749ddb9d22649d43a2f\": container with ID starting with 21854ab4700806c891145dd8952eabc13b1e6e2a7204f749ddb9d22649d43a2f not found: ID does not exist" containerID="21854ab4700806c891145dd8952eabc13b1e6e2a7204f749ddb9d22649d43a2f"
Mar 08 00:39:05.381445 master-0 kubenswrapper[23041]: I0308 00:39:05.378984 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21854ab4700806c891145dd8952eabc13b1e6e2a7204f749ddb9d22649d43a2f"} err="failed to get container status \"21854ab4700806c891145dd8952eabc13b1e6e2a7204f749ddb9d22649d43a2f\": rpc error: code = NotFound desc = could not find container \"21854ab4700806c891145dd8952eabc13b1e6e2a7204f749ddb9d22649d43a2f\": container with ID starting with 21854ab4700806c891145dd8952eabc13b1e6e2a7204f749ddb9d22649d43a2f not found: ID does not exist"
Mar 08 00:39:05.381445 master-0 kubenswrapper[23041]: I0308 00:39:05.379009 23041 scope.go:117] "RemoveContainer" containerID="667ebe7e35ed617b3ed91aceb47b77ab1ed599b695eddc56d9628db5ec3e0c66"
Mar 08 00:39:05.381445 master-0 kubenswrapper[23041]: E0308 00:39:05.379363 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"667ebe7e35ed617b3ed91aceb47b77ab1ed599b695eddc56d9628db5ec3e0c66\": container with ID starting with 667ebe7e35ed617b3ed91aceb47b77ab1ed599b695eddc56d9628db5ec3e0c66 not found: ID does not exist" containerID="667ebe7e35ed617b3ed91aceb47b77ab1ed599b695eddc56d9628db5ec3e0c66"
Mar 08 00:39:05.381445 master-0 kubenswrapper[23041]: I0308 00:39:05.379385 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"667ebe7e35ed617b3ed91aceb47b77ab1ed599b695eddc56d9628db5ec3e0c66"} err="failed to get container status \"667ebe7e35ed617b3ed91aceb47b77ab1ed599b695eddc56d9628db5ec3e0c66\": rpc error: code = NotFound desc = could not find container \"667ebe7e35ed617b3ed91aceb47b77ab1ed599b695eddc56d9628db5ec3e0c66\": container with ID starting with 667ebe7e35ed617b3ed91aceb47b77ab1ed599b695eddc56d9628db5ec3e0c66 not found: ID does not exist"
Mar 08 00:39:05.381445 master-0 kubenswrapper[23041]: I0308 00:39:05.379401 23041 scope.go:117] "RemoveContainer" containerID="c01e48ad99b01d18f3c32d8971fb8a634df39b838fcb697c02d699ac7e0bf59b"
Mar 08 00:39:05.381445 master-0 kubenswrapper[23041]: E0308 00:39:05.379731 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c01e48ad99b01d18f3c32d8971fb8a634df39b838fcb697c02d699ac7e0bf59b\": container with ID starting with c01e48ad99b01d18f3c32d8971fb8a634df39b838fcb697c02d699ac7e0bf59b not found: ID does not exist" containerID="c01e48ad99b01d18f3c32d8971fb8a634df39b838fcb697c02d699ac7e0bf59b"
Mar 08 00:39:05.381445 master-0 kubenswrapper[23041]: I0308 00:39:05.379756 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c01e48ad99b01d18f3c32d8971fb8a634df39b838fcb697c02d699ac7e0bf59b"} err="failed to get container status \"c01e48ad99b01d18f3c32d8971fb8a634df39b838fcb697c02d699ac7e0bf59b\": rpc error: code = NotFound desc = could not find container \"c01e48ad99b01d18f3c32d8971fb8a634df39b838fcb697c02d699ac7e0bf59b\": container with ID starting with c01e48ad99b01d18f3c32d8971fb8a634df39b838fcb697c02d699ac7e0bf59b not found: ID does not exist"
Mar 08 00:39:05.381445 master-0 kubenswrapper[23041]: I0308 00:39:05.379770 23041 scope.go:117] "RemoveContainer" containerID="4a9781cd54b6849919a2e1ded759e631816b24203f18a3cce8ca11053a994a64"
Mar 08 00:39:05.381445 master-0 kubenswrapper[23041]: E0308 00:39:05.380006 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a9781cd54b6849919a2e1ded759e631816b24203f18a3cce8ca11053a994a64\": container with ID starting with 4a9781cd54b6849919a2e1ded759e631816b24203f18a3cce8ca11053a994a64 not found: ID does not exist" containerID="4a9781cd54b6849919a2e1ded759e631816b24203f18a3cce8ca11053a994a64"
Mar 08 00:39:05.381445 master-0 kubenswrapper[23041]: I0308 00:39:05.380028 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a9781cd54b6849919a2e1ded759e631816b24203f18a3cce8ca11053a994a64"} err="failed to get container status \"4a9781cd54b6849919a2e1ded759e631816b24203f18a3cce8ca11053a994a64\": rpc error: code = NotFound desc = could not find container \"4a9781cd54b6849919a2e1ded759e631816b24203f18a3cce8ca11053a994a64\": container with ID starting with 4a9781cd54b6849919a2e1ded759e631816b24203f18a3cce8ca11053a994a64 not found: ID does not exist"
Mar 08 00:39:05.381445 master-0 kubenswrapper[23041]: I0308 00:39:05.380045 23041 scope.go:117] "RemoveContainer" containerID="2bda97f02cc22c73814013d78c2e90a28eb3ed0437db127445efbed0e90aa23d"
Mar 08 00:39:05.381445 master-0 kubenswrapper[23041]: E0308 00:39:05.380367 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bda97f02cc22c73814013d78c2e90a28eb3ed0437db127445efbed0e90aa23d\": container with ID starting with 2bda97f02cc22c73814013d78c2e90a28eb3ed0437db127445efbed0e90aa23d not found: ID does not exist" containerID="2bda97f02cc22c73814013d78c2e90a28eb3ed0437db127445efbed0e90aa23d"
Mar 08 00:39:05.381445 master-0 kubenswrapper[23041]: I0308 00:39:05.380383 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bda97f02cc22c73814013d78c2e90a28eb3ed0437db127445efbed0e90aa23d"} err="failed to get container status \"2bda97f02cc22c73814013d78c2e90a28eb3ed0437db127445efbed0e90aa23d\": rpc error: code = NotFound desc = could not find container \"2bda97f02cc22c73814013d78c2e90a28eb3ed0437db127445efbed0e90aa23d\": container with ID starting with 2bda97f02cc22c73814013d78c2e90a28eb3ed0437db127445efbed0e90aa23d not found: ID does not exist"
Mar 08 00:39:05.381445 master-0 kubenswrapper[23041]: I0308 00:39:05.380399 23041 scope.go:117] "RemoveContainer" containerID="16143328d55448f305f6ab28c116011527d147a9f464f1696ddaa4f87b24902d"
Mar 08 00:39:05.381445 master-0 kubenswrapper[23041]: E0308 00:39:05.380614 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16143328d55448f305f6ab28c116011527d147a9f464f1696ddaa4f87b24902d\": container with ID starting with 16143328d55448f305f6ab28c116011527d147a9f464f1696ddaa4f87b24902d not found: ID does not exist" containerID="16143328d55448f305f6ab28c116011527d147a9f464f1696ddaa4f87b24902d"
Mar 08 00:39:05.381445 master-0 kubenswrapper[23041]: I0308 00:39:05.380628 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16143328d55448f305f6ab28c116011527d147a9f464f1696ddaa4f87b24902d"} err="failed to get container status \"16143328d55448f305f6ab28c116011527d147a9f464f1696ddaa4f87b24902d\": rpc error: code = NotFound desc = could not find container \"16143328d55448f305f6ab28c116011527d147a9f464f1696ddaa4f87b24902d\": container with ID starting with 16143328d55448f305f6ab28c116011527d147a9f464f1696ddaa4f87b24902d not found: ID does not exist"
Mar 08 00:39:05.381445 master-0 kubenswrapper[23041]: I0308 00:39:05.380641 23041 scope.go:117] "RemoveContainer" containerID="21854ab4700806c891145dd8952eabc13b1e6e2a7204f749ddb9d22649d43a2f"
Mar 08 00:39:05.381445 master-0 kubenswrapper[23041]: I0308 00:39:05.380799 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21854ab4700806c891145dd8952eabc13b1e6e2a7204f749ddb9d22649d43a2f"} err="failed to get container status \"21854ab4700806c891145dd8952eabc13b1e6e2a7204f749ddb9d22649d43a2f\": rpc error: code = NotFound desc = could not find container \"21854ab4700806c891145dd8952eabc13b1e6e2a7204f749ddb9d22649d43a2f\": container with ID starting with 21854ab4700806c891145dd8952eabc13b1e6e2a7204f749ddb9d22649d43a2f not found: ID does not exist"
Mar 08 00:39:05.381445 master-0 kubenswrapper[23041]: I0308 00:39:05.380816 23041 scope.go:117] "RemoveContainer" containerID="667ebe7e35ed617b3ed91aceb47b77ab1ed599b695eddc56d9628db5ec3e0c66"
Mar 08 00:39:05.381445 master-0 kubenswrapper[23041]: I0308 00:39:05.380970 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"667ebe7e35ed617b3ed91aceb47b77ab1ed599b695eddc56d9628db5ec3e0c66"} err="failed to get container status \"667ebe7e35ed617b3ed91aceb47b77ab1ed599b695eddc56d9628db5ec3e0c66\": rpc error: code = NotFound desc = could not find container \"667ebe7e35ed617b3ed91aceb47b77ab1ed599b695eddc56d9628db5ec3e0c66\": container with ID starting with 667ebe7e35ed617b3ed91aceb47b77ab1ed599b695eddc56d9628db5ec3e0c66 not found: ID does not exist"
Mar 08 00:39:05.381445 master-0 kubenswrapper[23041]: I0308 00:39:05.380984 23041 scope.go:117] "RemoveContainer" containerID="c01e48ad99b01d18f3c32d8971fb8a634df39b838fcb697c02d699ac7e0bf59b"
Mar 08 00:39:05.381445 master-0 kubenswrapper[23041]: I0308 00:39:05.381140 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c01e48ad99b01d18f3c32d8971fb8a634df39b838fcb697c02d699ac7e0bf59b"} err="failed to get container status \"c01e48ad99b01d18f3c32d8971fb8a634df39b838fcb697c02d699ac7e0bf59b\": rpc error: code = NotFound desc = could not find container \"c01e48ad99b01d18f3c32d8971fb8a634df39b838fcb697c02d699ac7e0bf59b\": container with ID starting with c01e48ad99b01d18f3c32d8971fb8a634df39b838fcb697c02d699ac7e0bf59b not found: ID does not exist"
Mar 08 00:39:05.381445 master-0 kubenswrapper[23041]: I0308 00:39:05.381156 23041 scope.go:117] "RemoveContainer" containerID="4a9781cd54b6849919a2e1ded759e631816b24203f18a3cce8ca11053a994a64"
Mar 08 00:39:05.382413 master-0 kubenswrapper[23041]: I0308 00:39:05.381496 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a9781cd54b6849919a2e1ded759e631816b24203f18a3cce8ca11053a994a64"} err="failed to get container status \"4a9781cd54b6849919a2e1ded759e631816b24203f18a3cce8ca11053a994a64\": rpc error: code = NotFound desc = could not find container \"4a9781cd54b6849919a2e1ded759e631816b24203f18a3cce8ca11053a994a64\": container with ID starting with 4a9781cd54b6849919a2e1ded759e631816b24203f18a3cce8ca11053a994a64 not found: ID does not exist"
Mar 08 00:39:05.382413 master-0 kubenswrapper[23041]: I0308 00:39:05.381509 23041 scope.go:117] "RemoveContainer" containerID="2bda97f02cc22c73814013d78c2e90a28eb3ed0437db127445efbed0e90aa23d"
Mar 08 00:39:05.382413 master-0 kubenswrapper[23041]: I0308 00:39:05.381695 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bda97f02cc22c73814013d78c2e90a28eb3ed0437db127445efbed0e90aa23d"} err="failed to get container status \"2bda97f02cc22c73814013d78c2e90a28eb3ed0437db127445efbed0e90aa23d\": rpc error: code = NotFound desc = could not find container \"2bda97f02cc22c73814013d78c2e90a28eb3ed0437db127445efbed0e90aa23d\": container with ID starting with 2bda97f02cc22c73814013d78c2e90a28eb3ed0437db127445efbed0e90aa23d not found: ID does not exist"
Mar 08 00:39:05.382413 master-0 kubenswrapper[23041]: I0308 00:39:05.381709 23041 scope.go:117] "RemoveContainer" containerID="16143328d55448f305f6ab28c116011527d147a9f464f1696ddaa4f87b24902d"
Mar 08 00:39:05.382722 master-0 kubenswrapper[23041]: I0308 00:39:05.382684 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16143328d55448f305f6ab28c116011527d147a9f464f1696ddaa4f87b24902d"} err="failed to get container status \"16143328d55448f305f6ab28c116011527d147a9f464f1696ddaa4f87b24902d\": rpc error: code = NotFound desc = could not find container \"16143328d55448f305f6ab28c116011527d147a9f464f1696ddaa4f87b24902d\": container with ID starting with 16143328d55448f305f6ab28c116011527d147a9f464f1696ddaa4f87b24902d not found: ID does not exist"
Mar 08 00:39:05.382722 master-0 kubenswrapper[23041]: I0308 00:39:05.382714 23041 scope.go:117] "RemoveContainer" containerID="21854ab4700806c891145dd8952eabc13b1e6e2a7204f749ddb9d22649d43a2f"
Mar 08 00:39:05.383061 master-0 kubenswrapper[23041]: I0308 00:39:05.383024 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21854ab4700806c891145dd8952eabc13b1e6e2a7204f749ddb9d22649d43a2f"} err="failed to get container status \"21854ab4700806c891145dd8952eabc13b1e6e2a7204f749ddb9d22649d43a2f\": rpc error: code = NotFound desc = could not find container \"21854ab4700806c891145dd8952eabc13b1e6e2a7204f749ddb9d22649d43a2f\": container with ID starting with 21854ab4700806c891145dd8952eabc13b1e6e2a7204f749ddb9d22649d43a2f not found: ID does not exist"
Mar 08 00:39:05.383061 master-0 kubenswrapper[23041]: I0308 00:39:05.383051 23041 scope.go:117] "RemoveContainer" containerID="667ebe7e35ed617b3ed91aceb47b77ab1ed599b695eddc56d9628db5ec3e0c66"
Mar 08 00:39:05.389130 master-0 kubenswrapper[23041]: I0308 00:39:05.389098 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"667ebe7e35ed617b3ed91aceb47b77ab1ed599b695eddc56d9628db5ec3e0c66"} err="failed to get container status \"667ebe7e35ed617b3ed91aceb47b77ab1ed599b695eddc56d9628db5ec3e0c66\": rpc error: code = NotFound desc = could not find container \"667ebe7e35ed617b3ed91aceb47b77ab1ed599b695eddc56d9628db5ec3e0c66\": container with ID starting with 667ebe7e35ed617b3ed91aceb47b77ab1ed599b695eddc56d9628db5ec3e0c66 not found: ID does not exist"
Mar 08 00:39:05.389232 master-0 kubenswrapper[23041]: I0308 00:39:05.389139 23041 scope.go:117] "RemoveContainer" containerID="c01e48ad99b01d18f3c32d8971fb8a634df39b838fcb697c02d699ac7e0bf59b"
Mar 08 00:39:05.389441 master-0 kubenswrapper[23041]: I0308 00:39:05.389412 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c01e48ad99b01d18f3c32d8971fb8a634df39b838fcb697c02d699ac7e0bf59b"} err="failed to get container status \"c01e48ad99b01d18f3c32d8971fb8a634df39b838fcb697c02d699ac7e0bf59b\": rpc error: code = NotFound desc = could not find container \"c01e48ad99b01d18f3c32d8971fb8a634df39b838fcb697c02d699ac7e0bf59b\": container with ID starting with c01e48ad99b01d18f3c32d8971fb8a634df39b838fcb697c02d699ac7e0bf59b not found: ID does not exist"
Mar 08 00:39:05.389441 master-0 kubenswrapper[23041]: I0308 00:39:05.389431 23041 scope.go:117] "RemoveContainer" containerID="4a9781cd54b6849919a2e1ded759e631816b24203f18a3cce8ca11053a994a64"
Mar 08 00:39:05.389773 master-0 kubenswrapper[23041]: I0308 00:39:05.389740 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a9781cd54b6849919a2e1ded759e631816b24203f18a3cce8ca11053a994a64"} err="failed to get container status \"4a9781cd54b6849919a2e1ded759e631816b24203f18a3cce8ca11053a994a64\": rpc error: code = NotFound desc = could not find container \"4a9781cd54b6849919a2e1ded759e631816b24203f18a3cce8ca11053a994a64\": container with ID starting with 4a9781cd54b6849919a2e1ded759e631816b24203f18a3cce8ca11053a994a64 not found: ID does not exist"
Mar 08 00:39:05.389773 master-0 kubenswrapper[23041]: I0308 00:39:05.389765 23041 scope.go:117] "RemoveContainer" containerID="2bda97f02cc22c73814013d78c2e90a28eb3ed0437db127445efbed0e90aa23d"
Mar 08 00:39:05.390062 master-0 kubenswrapper[23041]: I0308 00:39:05.390033 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bda97f02cc22c73814013d78c2e90a28eb3ed0437db127445efbed0e90aa23d"} err="failed to get container status \"2bda97f02cc22c73814013d78c2e90a28eb3ed0437db127445efbed0e90aa23d\": rpc error: code = NotFound desc = could not find container \"2bda97f02cc22c73814013d78c2e90a28eb3ed0437db127445efbed0e90aa23d\": container with ID starting with 2bda97f02cc22c73814013d78c2e90a28eb3ed0437db127445efbed0e90aa23d not found: ID does not exist"
Mar 08 00:39:05.390114 master-0 kubenswrapper[23041]: I0308 00:39:05.390054 23041 scope.go:117] "RemoveContainer" containerID="16143328d55448f305f6ab28c116011527d147a9f464f1696ddaa4f87b24902d"
Mar 08 00:39:05.390605 master-0 kubenswrapper[23041]: I0308 00:39:05.390555 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16143328d55448f305f6ab28c116011527d147a9f464f1696ddaa4f87b24902d"} err="failed to get container status \"16143328d55448f305f6ab28c116011527d147a9f464f1696ddaa4f87b24902d\": rpc error: code = NotFound desc = could not find container \"16143328d55448f305f6ab28c116011527d147a9f464f1696ddaa4f87b24902d\": container with ID starting with 16143328d55448f305f6ab28c116011527d147a9f464f1696ddaa4f87b24902d not found: ID does not exist"
Mar 08 00:39:05.411456 master-0 kubenswrapper[23041]: I0308 00:39:05.408308 23041 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="1453f6461bf5d599ad65a4656343ee91" podUID="aa6a75ab47c06be4e74d05f552da4470"
Mar 08 00:39:06.087426 master-0 kubenswrapper[23041]: I0308 00:39:06.087365 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"36794fe98525730e06c774f84687b7f3","Type":"ContainerStarted","Data":"46f94c6696da3cdb18fafdd01de081b1425db74d841c35f29dddaa722de5de25"}
Mar 08 00:39:06.087426 master-0 kubenswrapper[23041]: I0308 00:39:06.087423 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"36794fe98525730e06c774f84687b7f3","Type":"ContainerStarted","Data":"b892bd31ce0010be8f8533b886ae6082e69d4ae65694d6c988fea1a5f65b77f0"}
Mar 08 00:39:06.087426 master-0 kubenswrapper[23041]: I0308 00:39:06.087433 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"36794fe98525730e06c774f84687b7f3","Type":"ContainerStarted","Data":"88166af01937fc7588c2e907f9099d26e72a3000309f32239505c77ae6b8ec56"}
Mar 08 00:39:06.088089 master-0 kubenswrapper[23041]: I0308 00:39:06.087443 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"36794fe98525730e06c774f84687b7f3","Type":"ContainerStarted","Data":"de6d8a79c51ac59742e42e2de7927045e67e92d99c5dddbd965ceb3242e8ae2d"}
Mar 08 00:39:06.443119 master-0 kubenswrapper[23041]: I0308 00:39:06.443070 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-6-retry-1-master-0"
Mar 08 00:39:06.485061 master-0 kubenswrapper[23041]: I0308 00:39:06.485003 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd-kube-api-access\") pod \"c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd\" (UID: \"c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd\") "
Mar 08 00:39:06.485330 master-0 kubenswrapper[23041]: I0308 00:39:06.485073 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd-kubelet-dir\") pod \"c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd\" (UID: \"c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd\") "
Mar 08 00:39:06.485330 master-0 kubenswrapper[23041]: I0308 00:39:06.485102 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd-var-lock\") pod \"c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd\" (UID: \"c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd\") "
Mar 08 00:39:06.485491 master-0 kubenswrapper[23041]: I0308 00:39:06.485457 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd-var-lock" (OuterVolumeSpecName: "var-lock") pod "c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd" (UID: "c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:39:06.486000 master-0 kubenswrapper[23041]: I0308 00:39:06.485966 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd" (UID: "c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:39:06.488916 master-0 kubenswrapper[23041]: I0308 00:39:06.488851 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd" (UID: "c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:39:06.587747 master-0 kubenswrapper[23041]: I0308 00:39:06.587610 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 08 00:39:06.587747 master-0 kubenswrapper[23041]: I0308 00:39:06.587648 23041 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 08 00:39:06.587747 master-0 kubenswrapper[23041]: I0308 00:39:06.587673 23041 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 08 00:39:06.819980 master-0 kubenswrapper[23041]: I0308 00:39:06.819305 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1453f6461bf5d599ad65a4656343ee91" path="/var/lib/kubelet/pods/1453f6461bf5d599ad65a4656343ee91/volumes"
Mar 08 00:39:07.102315 master-0 kubenswrapper[23041]: I0308 00:39:07.102089 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-6-retry-1-master-0" event={"ID":"c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd","Type":"ContainerDied","Data":"3fbdbe705bdf8599d5669911e9d4c104357ab5d6c3d9003deaa90ba5992a3cd7"}
Mar 08 00:39:07.102315 master-0 kubenswrapper[23041]: I0308 00:39:07.102173 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3fbdbe705bdf8599d5669911e9d4c104357ab5d6c3d9003deaa90ba5992a3cd7"
Mar 08 00:39:07.102315 master-0 kubenswrapper[23041]: I0308 00:39:07.102120 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-6-retry-1-master-0"
Mar 08 00:39:07.105872 master-0 kubenswrapper[23041]: I0308 00:39:07.105783 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"36794fe98525730e06c774f84687b7f3","Type":"ContainerStarted","Data":"56e004b899ae83febf6a0424b5ef7aeba337324fe7c35db6f7a3abf5d009e1cd"}
Mar 08 00:39:07.134907 master-0 kubenswrapper[23041]: I0308 00:39:07.134804 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=2.134783087 podStartE2EDuration="2.134783087s" podCreationTimestamp="2026-03-08 00:39:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:39:07.131792484 +0000 UTC m=+452.604629118" watchObservedRunningTime="2026-03-08 00:39:07.134783087 +0000 UTC m=+452.607619661"
Mar 08 00:39:07.173818 master-0 kubenswrapper[23041]: I0308 00:39:07.173749 23041 patch_prober.go:28] interesting pod/console-6479f6d896-j6kqz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body=
Mar 08 00:39:07.174149 master-0 kubenswrapper[23041]: I0308 00:39:07.173823 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console"
probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Mar 08 00:39:09.434364 master-0 kubenswrapper[23041]: I0308 00:39:09.434286 23041 patch_prober.go:28] interesting pod/console-c45bf598-vngbg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body= Mar 08 00:39:09.434364 master-0 kubenswrapper[23041]: I0308 00:39:09.434361 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" Mar 08 00:39:09.808191 master-0 kubenswrapper[23041]: I0308 00:39:09.808136 23041 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="3263f927-5b7c-41e5-98b8-533a08784cb3" Mar 08 00:39:09.808191 master-0 kubenswrapper[23041]: I0308 00:39:09.808184 23041 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="3263f927-5b7c-41e5-98b8-533a08784cb3" Mar 08 00:39:09.826318 master-0 kubenswrapper[23041]: I0308 00:39:09.826250 23041 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-etcd/etcd-master-0" Mar 08 00:39:09.826714 master-0 kubenswrapper[23041]: I0308 00:39:09.826664 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/etcd-master-0"] Mar 08 00:39:09.833026 master-0 kubenswrapper[23041]: I0308 00:39:09.832972 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-etcd/etcd-master-0"] Mar 08 00:39:09.854792 master-0 kubenswrapper[23041]: I0308 00:39:09.854736 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0"] Mar 08 00:39:10.142608 master-0 kubenswrapper[23041]: I0308 
00:39:10.142461 23041 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="3263f927-5b7c-41e5-98b8-533a08784cb3" Mar 08 00:39:10.142962 master-0 kubenswrapper[23041]: I0308 00:39:10.142917 23041 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="3263f927-5b7c-41e5-98b8-533a08784cb3" Mar 08 00:39:10.535766 master-0 kubenswrapper[23041]: I0308 00:39:10.535670 23041 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 08 00:39:10.537060 master-0 kubenswrapper[23041]: E0308 00:39:10.536433 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd" containerName="installer" Mar 08 00:39:10.537060 master-0 kubenswrapper[23041]: I0308 00:39:10.536461 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd" containerName="installer" Mar 08 00:39:10.537060 master-0 kubenswrapper[23041]: I0308 00:39:10.536704 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6c5913d-e562-49ea-a5cd-e6ad1d7fbdbd" containerName="installer" Mar 08 00:39:10.537785 master-0 kubenswrapper[23041]: I0308 00:39:10.537738 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:39:10.542304 master-0 kubenswrapper[23041]: I0308 00:39:10.542190 23041 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 08 00:39:10.543118 master-0 kubenswrapper[23041]: I0308 00:39:10.543020 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver" containerID="cri-o://b772417e1e99d8ea0e7f16b30732d2d8fa0d59084df9326d11ee8f293502bf15" gracePeriod=15 Mar 08 00:39:10.543611 master-0 kubenswrapper[23041]: I0308 00:39:10.543563 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-check-endpoints" containerID="cri-o://20d63cb89e6de090d808330bf46fb0e0be192834ba95d40ddc9444894194c2fc" gracePeriod=15 Mar 08 00:39:10.552780 master-0 kubenswrapper[23041]: I0308 00:39:10.543699 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://e7984e9dd6f4f9e4be878ed8775f1cba364ff5628bee5337e37a1ab208526924" gracePeriod=15 Mar 08 00:39:10.552780 master-0 kubenswrapper[23041]: I0308 00:39:10.543685 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-syncer" containerID="cri-o://f5a01a96f572cf6cdc2165118b1618cfc34c74c159113a86d01ad4567971ce7c" gracePeriod=15 Mar 08 00:39:10.552780 master-0 kubenswrapper[23041]: I0308 00:39:10.544144 23041 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://6bc55a348461d3cbf163ebf709ddec0e4c002365488c110e26f97e8640a91aac" gracePeriod=15 Mar 08 00:39:10.552780 master-0 kubenswrapper[23041]: I0308 00:39:10.545101 23041 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 08 00:39:10.552780 master-0 kubenswrapper[23041]: E0308 00:39:10.545623 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="setup" Mar 08 00:39:10.552780 master-0 kubenswrapper[23041]: I0308 00:39:10.545642 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="setup" Mar 08 00:39:10.552780 master-0 kubenswrapper[23041]: E0308 00:39:10.545690 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-regeneration-controller" Mar 08 00:39:10.552780 master-0 kubenswrapper[23041]: I0308 00:39:10.545700 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-regeneration-controller" Mar 08 00:39:10.552780 master-0 kubenswrapper[23041]: E0308 00:39:10.545731 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-syncer" Mar 08 00:39:10.552780 master-0 kubenswrapper[23041]: I0308 00:39:10.545740 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-syncer" Mar 08 00:39:10.552780 master-0 kubenswrapper[23041]: E0308 00:39:10.545759 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-insecure-readyz" 
Mar 08 00:39:10.552780 master-0 kubenswrapper[23041]: I0308 00:39:10.545766 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-insecure-readyz" Mar 08 00:39:10.552780 master-0 kubenswrapper[23041]: E0308 00:39:10.545778 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-check-endpoints" Mar 08 00:39:10.552780 master-0 kubenswrapper[23041]: I0308 00:39:10.545784 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-check-endpoints" Mar 08 00:39:10.552780 master-0 kubenswrapper[23041]: E0308 00:39:10.545800 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver" Mar 08 00:39:10.552780 master-0 kubenswrapper[23041]: I0308 00:39:10.545808 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver" Mar 08 00:39:10.552780 master-0 kubenswrapper[23041]: I0308 00:39:10.545973 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-insecure-readyz" Mar 08 00:39:10.552780 master-0 kubenswrapper[23041]: I0308 00:39:10.546007 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-regeneration-controller" Mar 08 00:39:10.552780 master-0 kubenswrapper[23041]: I0308 00:39:10.546027 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-check-endpoints" Mar 08 00:39:10.552780 master-0 kubenswrapper[23041]: I0308 00:39:10.546044 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-syncer" 
Mar 08 00:39:10.552780 master-0 kubenswrapper[23041]: I0308 00:39:10.546058 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="setup" Mar 08 00:39:10.552780 master-0 kubenswrapper[23041]: I0308 00:39:10.546068 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver" Mar 08 00:39:10.552780 master-0 kubenswrapper[23041]: I0308 00:39:10.550887 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:39:10.552780 master-0 kubenswrapper[23041]: I0308 00:39:10.550992 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:39:10.552780 master-0 kubenswrapper[23041]: I0308 00:39:10.551132 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:39:10.552780 master-0 kubenswrapper[23041]: I0308 00:39:10.551172 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: 
\"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:39:10.552780 master-0 kubenswrapper[23041]: I0308 00:39:10.551248 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:39:10.552780 master-0 kubenswrapper[23041]: I0308 00:39:10.551282 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:39:10.552780 master-0 kubenswrapper[23041]: I0308 00:39:10.551322 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:39:10.552780 master-0 kubenswrapper[23041]: I0308 00:39:10.551360 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:39:10.601526 master-0 kubenswrapper[23041]: I0308 00:39:10.600624 23041 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-etcd/etcd-master-0" podStartSLOduration=1.6006036940000001 podStartE2EDuration="1.600603694s" podCreationTimestamp="2026-03-08 00:39:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:39:10.597475578 +0000 UTC m=+456.070312142" watchObservedRunningTime="2026-03-08 00:39:10.600603694 +0000 UTC m=+456.073440248" Mar 08 00:39:10.624229 master-0 kubenswrapper[23041]: I0308 00:39:10.622593 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 08 00:39:10.652970 master-0 kubenswrapper[23041]: I0308 00:39:10.652867 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:39:10.652970 master-0 kubenswrapper[23041]: I0308 00:39:10.652919 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:39:10.652970 master-0 kubenswrapper[23041]: I0308 00:39:10.652946 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:39:10.653151 master-0 kubenswrapper[23041]: I0308 00:39:10.652981 
23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:39:10.653151 master-0 kubenswrapper[23041]: I0308 00:39:10.653005 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:39:10.653151 master-0 kubenswrapper[23041]: I0308 00:39:10.653028 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:39:10.653151 master-0 kubenswrapper[23041]: I0308 00:39:10.653085 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:39:10.653151 master-0 kubenswrapper[23041]: I0308 00:39:10.653121 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:39:10.653393 master-0 kubenswrapper[23041]: I0308 00:39:10.653162 
23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:39:10.653393 master-0 kubenswrapper[23041]: I0308 00:39:10.653243 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:39:10.653393 master-0 kubenswrapper[23041]: I0308 00:39:10.653255 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:39:10.653393 master-0 kubenswrapper[23041]: I0308 00:39:10.653286 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:39:10.653393 master-0 kubenswrapper[23041]: I0308 00:39:10.653311 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:39:10.653393 master-0 
kubenswrapper[23041]: I0308 00:39:10.653267 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:39:10.653393 master-0 kubenswrapper[23041]: I0308 00:39:10.653356 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:39:10.653747 master-0 kubenswrapper[23041]: I0308 00:39:10.653403 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:39:10.912357 master-0 kubenswrapper[23041]: I0308 00:39:10.912069 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:39:10.939771 master-0 kubenswrapper[23041]: W0308 00:39:10.938049 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda814bd60de133d95cf99630a978c017e.slice/crio-e117c5d376f39196447b92f6a93d68f1523d06b0ca339f10b530b14fcff3ceda WatchSource:0}: Error finding container e117c5d376f39196447b92f6a93d68f1523d06b0ca339f10b530b14fcff3ceda: Status 404 returned error can't find the container with id e117c5d376f39196447b92f6a93d68f1523d06b0ca339f10b530b14fcff3ceda Mar 08 00:39:10.943077 master-0 kubenswrapper[23041]: E0308 00:39:10.942940 23041 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.189ab6c2c3665fdc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:a814bd60de133d95cf99630a978c017e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:39:10.941888476 +0000 UTC m=+456.414725030,LastTimestamp:2026-03-08 00:39:10.941888476 +0000 UTC m=+456.414725030,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:39:11.152052 master-0 kubenswrapper[23041]: I0308 00:39:11.151972 23041 generic.go:334] "Generic (PLEG): 
container finished" podID="53c386ff-5ff0-4937-b909-5f800abdb600" containerID="5df41709079702c50acb60779330e91db01063d536dd6d33a7f4ae625ec12bfb" exitCode=0 Mar 08 00:39:11.152052 master-0 kubenswrapper[23041]: I0308 00:39:11.152057 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" event={"ID":"53c386ff-5ff0-4937-b909-5f800abdb600","Type":"ContainerDied","Data":"5df41709079702c50acb60779330e91db01063d536dd6d33a7f4ae625ec12bfb"} Mar 08 00:39:11.153562 master-0 kubenswrapper[23041]: I0308 00:39:11.153516 23041 status_manager.go:851] "Failed to get status for pod" podUID="a814bd60de133d95cf99630a978c017e" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:11.154422 master-0 kubenswrapper[23041]: I0308 00:39:11.154384 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"a814bd60de133d95cf99630a978c017e","Type":"ContainerStarted","Data":"e117c5d376f39196447b92f6a93d68f1523d06b0ca339f10b530b14fcff3ceda"} Mar 08 00:39:11.154611 master-0 kubenswrapper[23041]: I0308 00:39:11.154536 23041 status_manager.go:851] "Failed to get status for pod" podUID="53c386ff-5ff0-4937-b909-5f800abdb600" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:11.158414 master-0 kubenswrapper[23041]: I0308 00:39:11.158357 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_cdcecc61ff5eeb08bd2a3ac12599e4f9/kube-apiserver-cert-syncer/0.log" Mar 08 
00:39:11.159228 master-0 kubenswrapper[23041]: I0308 00:39:11.159172 23041 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerID="20d63cb89e6de090d808330bf46fb0e0be192834ba95d40ddc9444894194c2fc" exitCode=0 Mar 08 00:39:11.159424 master-0 kubenswrapper[23041]: I0308 00:39:11.159396 23041 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerID="e7984e9dd6f4f9e4be878ed8775f1cba364ff5628bee5337e37a1ab208526924" exitCode=0 Mar 08 00:39:11.159598 master-0 kubenswrapper[23041]: I0308 00:39:11.159573 23041 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerID="6bc55a348461d3cbf163ebf709ddec0e4c002365488c110e26f97e8640a91aac" exitCode=0 Mar 08 00:39:11.159744 master-0 kubenswrapper[23041]: I0308 00:39:11.159721 23041 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerID="f5a01a96f572cf6cdc2165118b1618cfc34c74c159113a86d01ad4567971ce7c" exitCode=2 Mar 08 00:39:12.171587 master-0 kubenswrapper[23041]: I0308 00:39:12.171514 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"a814bd60de133d95cf99630a978c017e","Type":"ContainerStarted","Data":"a076b39f19134d81af0bd151f4f0b8f8c2a9f7a6c2a5b5a4719ba05826359e03"} Mar 08 00:39:12.173502 master-0 kubenswrapper[23041]: I0308 00:39:12.172972 23041 status_manager.go:851] "Failed to get status for pod" podUID="53c386ff-5ff0-4937-b909-5f800abdb600" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:12.173781 master-0 kubenswrapper[23041]: I0308 00:39:12.173659 23041 status_manager.go:851] "Failed to get status for pod" 
podUID="a814bd60de133d95cf99630a978c017e" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:12.625156 master-0 kubenswrapper[23041]: I0308 00:39:12.625104 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" Mar 08 00:39:12.626702 master-0 kubenswrapper[23041]: I0308 00:39:12.626598 23041 status_manager.go:851] "Failed to get status for pod" podUID="a814bd60de133d95cf99630a978c017e" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:12.627824 master-0 kubenswrapper[23041]: I0308 00:39:12.627765 23041 status_manager.go:851] "Failed to get status for pod" podUID="53c386ff-5ff0-4937-b909-5f800abdb600" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:12.691686 master-0 kubenswrapper[23041]: I0308 00:39:12.689299 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/53c386ff-5ff0-4937-b909-5f800abdb600-kubelet-dir\") pod \"53c386ff-5ff0-4937-b909-5f800abdb600\" (UID: \"53c386ff-5ff0-4937-b909-5f800abdb600\") " Mar 08 00:39:12.691686 master-0 kubenswrapper[23041]: I0308 00:39:12.689404 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/53c386ff-5ff0-4937-b909-5f800abdb600-kube-api-access\") pod \"53c386ff-5ff0-4937-b909-5f800abdb600\" (UID: \"53c386ff-5ff0-4937-b909-5f800abdb600\") " Mar 08 00:39:12.691686 master-0 kubenswrapper[23041]: I0308 00:39:12.689458 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/53c386ff-5ff0-4937-b909-5f800abdb600-var-lock\") pod \"53c386ff-5ff0-4937-b909-5f800abdb600\" (UID: \"53c386ff-5ff0-4937-b909-5f800abdb600\") " Mar 08 00:39:12.691686 master-0 kubenswrapper[23041]: I0308 00:39:12.689629 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/53c386ff-5ff0-4937-b909-5f800abdb600-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "53c386ff-5ff0-4937-b909-5f800abdb600" (UID: "53c386ff-5ff0-4937-b909-5f800abdb600"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:39:12.691686 master-0 kubenswrapper[23041]: I0308 00:39:12.689936 23041 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/53c386ff-5ff0-4937-b909-5f800abdb600-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 00:39:12.691686 master-0 kubenswrapper[23041]: I0308 00:39:12.690045 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/53c386ff-5ff0-4937-b909-5f800abdb600-var-lock" (OuterVolumeSpecName: "var-lock") pod "53c386ff-5ff0-4937-b909-5f800abdb600" (UID: "53c386ff-5ff0-4937-b909-5f800abdb600"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:39:12.709133 master-0 kubenswrapper[23041]: I0308 00:39:12.707970 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53c386ff-5ff0-4937-b909-5f800abdb600-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "53c386ff-5ff0-4937-b909-5f800abdb600" (UID: "53c386ff-5ff0-4937-b909-5f800abdb600"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:39:12.792789 master-0 kubenswrapper[23041]: I0308 00:39:12.792726 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53c386ff-5ff0-4937-b909-5f800abdb600-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 00:39:12.793121 master-0 kubenswrapper[23041]: I0308 00:39:12.792811 23041 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/53c386ff-5ff0-4937-b909-5f800abdb600-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 00:39:12.973641 master-0 kubenswrapper[23041]: E0308 00:39:12.973571 23041 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdcecc61ff5eeb08bd2a3ac12599e4f9.slice/crio-b772417e1e99d8ea0e7f16b30732d2d8fa0d59084df9326d11ee8f293502bf15.scope\": RecentStats: unable to find data in memory cache]" Mar 08 00:39:13.015950 master-0 kubenswrapper[23041]: I0308 00:39:13.015897 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_cdcecc61ff5eeb08bd2a3ac12599e4f9/kube-apiserver-cert-syncer/0.log" Mar 08 00:39:13.017156 master-0 kubenswrapper[23041]: I0308 00:39:13.017128 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:39:13.018218 master-0 kubenswrapper[23041]: I0308 00:39:13.018159 23041 status_manager.go:851] "Failed to get status for pod" podUID="53c386ff-5ff0-4937-b909-5f800abdb600" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:13.020095 master-0 kubenswrapper[23041]: I0308 00:39:13.020069 23041 status_manager.go:851] "Failed to get status for pod" podUID="a814bd60de133d95cf99630a978c017e" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:13.020706 master-0 kubenswrapper[23041]: I0308 00:39:13.020661 23041 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:13.096926 master-0 kubenswrapper[23041]: I0308 00:39:13.096735 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"cdcecc61ff5eeb08bd2a3ac12599e4f9\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " Mar 08 00:39:13.097382 master-0 kubenswrapper[23041]: I0308 00:39:13.096860 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir" (OuterVolumeSpecName: "cert-dir") pod 
"cdcecc61ff5eeb08bd2a3ac12599e4f9" (UID: "cdcecc61ff5eeb08bd2a3ac12599e4f9"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:39:13.097766 master-0 kubenswrapper[23041]: I0308 00:39:13.097738 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod \"cdcecc61ff5eeb08bd2a3ac12599e4f9\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " Mar 08 00:39:13.097957 master-0 kubenswrapper[23041]: I0308 00:39:13.097934 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") pod \"cdcecc61ff5eeb08bd2a3ac12599e4f9\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " Mar 08 00:39:13.098152 master-0 kubenswrapper[23041]: I0308 00:39:13.097823 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "cdcecc61ff5eeb08bd2a3ac12599e4f9" (UID: "cdcecc61ff5eeb08bd2a3ac12599e4f9"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:39:13.098152 master-0 kubenswrapper[23041]: I0308 00:39:13.097963 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "cdcecc61ff5eeb08bd2a3ac12599e4f9" (UID: "cdcecc61ff5eeb08bd2a3ac12599e4f9"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:39:13.098866 master-0 kubenswrapper[23041]: I0308 00:39:13.098838 23041 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 00:39:13.098994 master-0 kubenswrapper[23041]: I0308 00:39:13.098976 23041 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 00:39:13.099104 master-0 kubenswrapper[23041]: I0308 00:39:13.099088 23041 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 00:39:13.181838 master-0 kubenswrapper[23041]: I0308 00:39:13.181784 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" event={"ID":"53c386ff-5ff0-4937-b909-5f800abdb600","Type":"ContainerDied","Data":"7893efb17bac0a7e0aafa5d282c528c161bb44feaebdb985d473fcd2ef95b3cf"} Mar 08 00:39:13.181838 master-0 kubenswrapper[23041]: I0308 00:39:13.181839 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7893efb17bac0a7e0aafa5d282c528c161bb44feaebdb985d473fcd2ef95b3cf" Mar 08 00:39:13.182472 master-0 kubenswrapper[23041]: I0308 00:39:13.182349 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" Mar 08 00:39:13.186355 master-0 kubenswrapper[23041]: I0308 00:39:13.186310 23041 status_manager.go:851] "Failed to get status for pod" podUID="a814bd60de133d95cf99630a978c017e" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:13.186788 master-0 kubenswrapper[23041]: I0308 00:39:13.186756 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_cdcecc61ff5eeb08bd2a3ac12599e4f9/kube-apiserver-cert-syncer/0.log" Mar 08 00:39:13.186846 master-0 kubenswrapper[23041]: I0308 00:39:13.186803 23041 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:13.187284 master-0 kubenswrapper[23041]: I0308 00:39:13.187250 23041 status_manager.go:851] "Failed to get status for pod" podUID="53c386ff-5ff0-4937-b909-5f800abdb600" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:13.187845 master-0 kubenswrapper[23041]: I0308 00:39:13.187784 23041 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerID="b772417e1e99d8ea0e7f16b30732d2d8fa0d59084df9326d11ee8f293502bf15" exitCode=0 Mar 08 00:39:13.188351 master-0 kubenswrapper[23041]: I0308 00:39:13.188320 23041 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:39:13.189429 master-0 kubenswrapper[23041]: I0308 00:39:13.189363 23041 scope.go:117] "RemoveContainer" containerID="20d63cb89e6de090d808330bf46fb0e0be192834ba95d40ddc9444894194c2fc" Mar 08 00:39:13.201971 master-0 kubenswrapper[23041]: I0308 00:39:13.201901 23041 status_manager.go:851] "Failed to get status for pod" podUID="a814bd60de133d95cf99630a978c017e" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:13.202580 master-0 kubenswrapper[23041]: I0308 00:39:13.202496 23041 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:13.203136 master-0 kubenswrapper[23041]: I0308 00:39:13.203093 23041 status_manager.go:851] "Failed to get status for pod" podUID="53c386ff-5ff0-4937-b909-5f800abdb600" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:13.212129 master-0 kubenswrapper[23041]: I0308 00:39:13.212112 23041 scope.go:117] "RemoveContainer" containerID="e7984e9dd6f4f9e4be878ed8775f1cba364ff5628bee5337e37a1ab208526924" Mar 08 00:39:13.238254 master-0 kubenswrapper[23041]: I0308 00:39:13.238180 23041 scope.go:117] "RemoveContainer" containerID="6bc55a348461d3cbf163ebf709ddec0e4c002365488c110e26f97e8640a91aac" Mar 08 
00:39:13.273544 master-0 kubenswrapper[23041]: I0308 00:39:13.273499 23041 scope.go:117] "RemoveContainer" containerID="f5a01a96f572cf6cdc2165118b1618cfc34c74c159113a86d01ad4567971ce7c" Mar 08 00:39:13.274015 master-0 kubenswrapper[23041]: E0308 00:39:13.273973 23041 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:13.274695 master-0 kubenswrapper[23041]: E0308 00:39:13.274668 23041 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:13.275657 master-0 kubenswrapper[23041]: E0308 00:39:13.275549 23041 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:13.276710 master-0 kubenswrapper[23041]: E0308 00:39:13.276583 23041 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:13.277547 master-0 kubenswrapper[23041]: E0308 00:39:13.277505 23041 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:13.277633 master-0 kubenswrapper[23041]: I0308 00:39:13.277534 23041 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" 
Mar 08 00:39:13.278143 master-0 kubenswrapper[23041]: E0308 00:39:13.278094 23041 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms" Mar 08 00:39:13.293035 master-0 kubenswrapper[23041]: I0308 00:39:13.293006 23041 scope.go:117] "RemoveContainer" containerID="b772417e1e99d8ea0e7f16b30732d2d8fa0d59084df9326d11ee8f293502bf15" Mar 08 00:39:13.310383 master-0 kubenswrapper[23041]: I0308 00:39:13.310129 23041 scope.go:117] "RemoveContainer" containerID="f73d55f2e8434f88a6be502a595c0bcf07e53cfb094b52a7ac92890beaa91d58" Mar 08 00:39:13.325376 master-0 kubenswrapper[23041]: I0308 00:39:13.325328 23041 scope.go:117] "RemoveContainer" containerID="20d63cb89e6de090d808330bf46fb0e0be192834ba95d40ddc9444894194c2fc" Mar 08 00:39:13.325794 master-0 kubenswrapper[23041]: E0308 00:39:13.325746 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20d63cb89e6de090d808330bf46fb0e0be192834ba95d40ddc9444894194c2fc\": container with ID starting with 20d63cb89e6de090d808330bf46fb0e0be192834ba95d40ddc9444894194c2fc not found: ID does not exist" containerID="20d63cb89e6de090d808330bf46fb0e0be192834ba95d40ddc9444894194c2fc" Mar 08 00:39:13.325872 master-0 kubenswrapper[23041]: I0308 00:39:13.325793 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20d63cb89e6de090d808330bf46fb0e0be192834ba95d40ddc9444894194c2fc"} err="failed to get container status \"20d63cb89e6de090d808330bf46fb0e0be192834ba95d40ddc9444894194c2fc\": rpc error: code = NotFound desc = could not find container \"20d63cb89e6de090d808330bf46fb0e0be192834ba95d40ddc9444894194c2fc\": container with ID starting with 20d63cb89e6de090d808330bf46fb0e0be192834ba95d40ddc9444894194c2fc not 
found: ID does not exist" Mar 08 00:39:13.325872 master-0 kubenswrapper[23041]: I0308 00:39:13.325818 23041 scope.go:117] "RemoveContainer" containerID="e7984e9dd6f4f9e4be878ed8775f1cba364ff5628bee5337e37a1ab208526924" Mar 08 00:39:13.326435 master-0 kubenswrapper[23041]: E0308 00:39:13.326395 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7984e9dd6f4f9e4be878ed8775f1cba364ff5628bee5337e37a1ab208526924\": container with ID starting with e7984e9dd6f4f9e4be878ed8775f1cba364ff5628bee5337e37a1ab208526924 not found: ID does not exist" containerID="e7984e9dd6f4f9e4be878ed8775f1cba364ff5628bee5337e37a1ab208526924" Mar 08 00:39:13.326435 master-0 kubenswrapper[23041]: I0308 00:39:13.326424 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7984e9dd6f4f9e4be878ed8775f1cba364ff5628bee5337e37a1ab208526924"} err="failed to get container status \"e7984e9dd6f4f9e4be878ed8775f1cba364ff5628bee5337e37a1ab208526924\": rpc error: code = NotFound desc = could not find container \"e7984e9dd6f4f9e4be878ed8775f1cba364ff5628bee5337e37a1ab208526924\": container with ID starting with e7984e9dd6f4f9e4be878ed8775f1cba364ff5628bee5337e37a1ab208526924 not found: ID does not exist" Mar 08 00:39:13.326602 master-0 kubenswrapper[23041]: I0308 00:39:13.326481 23041 scope.go:117] "RemoveContainer" containerID="6bc55a348461d3cbf163ebf709ddec0e4c002365488c110e26f97e8640a91aac" Mar 08 00:39:13.326923 master-0 kubenswrapper[23041]: E0308 00:39:13.326855 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bc55a348461d3cbf163ebf709ddec0e4c002365488c110e26f97e8640a91aac\": container with ID starting with 6bc55a348461d3cbf163ebf709ddec0e4c002365488c110e26f97e8640a91aac not found: ID does not exist" containerID="6bc55a348461d3cbf163ebf709ddec0e4c002365488c110e26f97e8640a91aac" Mar 08 00:39:13.326997 
master-0 kubenswrapper[23041]: I0308 00:39:13.326927 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bc55a348461d3cbf163ebf709ddec0e4c002365488c110e26f97e8640a91aac"} err="failed to get container status \"6bc55a348461d3cbf163ebf709ddec0e4c002365488c110e26f97e8640a91aac\": rpc error: code = NotFound desc = could not find container \"6bc55a348461d3cbf163ebf709ddec0e4c002365488c110e26f97e8640a91aac\": container with ID starting with 6bc55a348461d3cbf163ebf709ddec0e4c002365488c110e26f97e8640a91aac not found: ID does not exist" Mar 08 00:39:13.326997 master-0 kubenswrapper[23041]: I0308 00:39:13.326978 23041 scope.go:117] "RemoveContainer" containerID="f5a01a96f572cf6cdc2165118b1618cfc34c74c159113a86d01ad4567971ce7c" Mar 08 00:39:13.327428 master-0 kubenswrapper[23041]: E0308 00:39:13.327382 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5a01a96f572cf6cdc2165118b1618cfc34c74c159113a86d01ad4567971ce7c\": container with ID starting with f5a01a96f572cf6cdc2165118b1618cfc34c74c159113a86d01ad4567971ce7c not found: ID does not exist" containerID="f5a01a96f572cf6cdc2165118b1618cfc34c74c159113a86d01ad4567971ce7c" Mar 08 00:39:13.327428 master-0 kubenswrapper[23041]: I0308 00:39:13.327415 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5a01a96f572cf6cdc2165118b1618cfc34c74c159113a86d01ad4567971ce7c"} err="failed to get container status \"f5a01a96f572cf6cdc2165118b1618cfc34c74c159113a86d01ad4567971ce7c\": rpc error: code = NotFound desc = could not find container \"f5a01a96f572cf6cdc2165118b1618cfc34c74c159113a86d01ad4567971ce7c\": container with ID starting with f5a01a96f572cf6cdc2165118b1618cfc34c74c159113a86d01ad4567971ce7c not found: ID does not exist" Mar 08 00:39:13.327548 master-0 kubenswrapper[23041]: I0308 00:39:13.327439 23041 scope.go:117] "RemoveContainer" 
containerID="b772417e1e99d8ea0e7f16b30732d2d8fa0d59084df9326d11ee8f293502bf15" Mar 08 00:39:13.327832 master-0 kubenswrapper[23041]: E0308 00:39:13.327731 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b772417e1e99d8ea0e7f16b30732d2d8fa0d59084df9326d11ee8f293502bf15\": container with ID starting with b772417e1e99d8ea0e7f16b30732d2d8fa0d59084df9326d11ee8f293502bf15 not found: ID does not exist" containerID="b772417e1e99d8ea0e7f16b30732d2d8fa0d59084df9326d11ee8f293502bf15" Mar 08 00:39:13.327832 master-0 kubenswrapper[23041]: I0308 00:39:13.327770 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b772417e1e99d8ea0e7f16b30732d2d8fa0d59084df9326d11ee8f293502bf15"} err="failed to get container status \"b772417e1e99d8ea0e7f16b30732d2d8fa0d59084df9326d11ee8f293502bf15\": rpc error: code = NotFound desc = could not find container \"b772417e1e99d8ea0e7f16b30732d2d8fa0d59084df9326d11ee8f293502bf15\": container with ID starting with b772417e1e99d8ea0e7f16b30732d2d8fa0d59084df9326d11ee8f293502bf15 not found: ID does not exist" Mar 08 00:39:13.327832 master-0 kubenswrapper[23041]: I0308 00:39:13.327795 23041 scope.go:117] "RemoveContainer" containerID="f73d55f2e8434f88a6be502a595c0bcf07e53cfb094b52a7ac92890beaa91d58" Mar 08 00:39:13.328122 master-0 kubenswrapper[23041]: E0308 00:39:13.328091 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f73d55f2e8434f88a6be502a595c0bcf07e53cfb094b52a7ac92890beaa91d58\": container with ID starting with f73d55f2e8434f88a6be502a595c0bcf07e53cfb094b52a7ac92890beaa91d58 not found: ID does not exist" containerID="f73d55f2e8434f88a6be502a595c0bcf07e53cfb094b52a7ac92890beaa91d58" Mar 08 00:39:13.328182 master-0 kubenswrapper[23041]: I0308 00:39:13.328126 23041 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f73d55f2e8434f88a6be502a595c0bcf07e53cfb094b52a7ac92890beaa91d58"} err="failed to get container status \"f73d55f2e8434f88a6be502a595c0bcf07e53cfb094b52a7ac92890beaa91d58\": rpc error: code = NotFound desc = could not find container \"f73d55f2e8434f88a6be502a595c0bcf07e53cfb094b52a7ac92890beaa91d58\": container with ID starting with f73d55f2e8434f88a6be502a595c0bcf07e53cfb094b52a7ac92890beaa91d58 not found: ID does not exist" Mar 08 00:39:13.480377 master-0 kubenswrapper[23041]: E0308 00:39:13.480093 23041 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms" Mar 08 00:39:13.881523 master-0 kubenswrapper[23041]: E0308 00:39:13.881449 23041 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms" Mar 08 00:39:14.482117 master-0 kubenswrapper[23041]: E0308 00:39:14.481951 23041 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.189ab6c2c3665fdc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:a814bd60de133d95cf99630a978c017e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:39:10.941888476 +0000 UTC m=+456.414725030,LastTimestamp:2026-03-08 00:39:10.941888476 +0000 UTC m=+456.414725030,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:39:14.683569 master-0 kubenswrapper[23041]: E0308 00:39:14.682862 23041 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s" Mar 08 00:39:14.817514 master-0 kubenswrapper[23041]: I0308 00:39:14.817386 23041 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:14.818379 master-0 kubenswrapper[23041]: I0308 00:39:14.818304 23041 status_manager.go:851] "Failed to get status for pod" podUID="53c386ff-5ff0-4937-b909-5f800abdb600" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:14.819227 master-0 kubenswrapper[23041]: I0308 00:39:14.819108 23041 status_manager.go:851] "Failed to get status for pod" podUID="a814bd60de133d95cf99630a978c017e" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:14.825134 master-0 kubenswrapper[23041]: I0308 00:39:14.825052 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" path="/var/lib/kubelet/pods/cdcecc61ff5eeb08bd2a3ac12599e4f9/volumes" Mar 08 00:39:15.052158 master-0 kubenswrapper[23041]: I0308 00:39:15.052047 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:39:15.052158 master-0 kubenswrapper[23041]: I0308 00:39:15.052108 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:39:15.052709 master-0 kubenswrapper[23041]: I0308 00:39:15.052601 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:39:15.052709 master-0 kubenswrapper[23041]: I0308 00:39:15.052659 23041 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body= Mar 08 00:39:15.052709 master-0 kubenswrapper[23041]: I0308 00:39:15.052705 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:39:15.053077 master-0 kubenswrapper[23041]: I0308 00:39:15.052745 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="36794fe98525730e06c774f84687b7f3" 
containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Mar 08 00:39:15.059343 master-0 kubenswrapper[23041]: I0308 00:39:15.059292 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:39:15.061237 master-0 kubenswrapper[23041]: I0308 00:39:15.061093 23041 status_manager.go:851] "Failed to get status for pod" podUID="53c386ff-5ff0-4937-b909-5f800abdb600" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:15.062287 master-0 kubenswrapper[23041]: I0308 00:39:15.062184 23041 status_manager.go:851] "Failed to get status for pod" podUID="a814bd60de133d95cf99630a978c017e" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:15.064670 master-0 kubenswrapper[23041]: I0308 00:39:15.064596 23041 status_manager.go:851] "Failed to get status for pod" podUID="36794fe98525730e06c774f84687b7f3" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:15.213271 master-0 kubenswrapper[23041]: I0308 00:39:15.213074 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:39:15.215261 master-0 
kubenswrapper[23041]: I0308 00:39:15.215115 23041 status_manager.go:851] "Failed to get status for pod" podUID="53c386ff-5ff0-4937-b909-5f800abdb600" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:15.216783 master-0 kubenswrapper[23041]: I0308 00:39:15.216687 23041 status_manager.go:851] "Failed to get status for pod" podUID="a814bd60de133d95cf99630a978c017e" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:15.218226 master-0 kubenswrapper[23041]: I0308 00:39:15.218155 23041 status_manager.go:851] "Failed to get status for pod" podUID="36794fe98525730e06c774f84687b7f3" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:16.284637 master-0 kubenswrapper[23041]: E0308 00:39:16.284516 23041 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s" Mar 08 00:39:16.807868 master-0 kubenswrapper[23041]: I0308 00:39:16.807821 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 00:39:16.809099 master-0 kubenswrapper[23041]: I0308 00:39:16.808992 23041 status_manager.go:851] "Failed to get status for pod" podUID="53c386ff-5ff0-4937-b909-5f800abdb600" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:16.810331 master-0 kubenswrapper[23041]: I0308 00:39:16.810163 23041 status_manager.go:851] "Failed to get status for pod" podUID="a814bd60de133d95cf99630a978c017e" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:16.814482 master-0 kubenswrapper[23041]: I0308 00:39:16.814358 23041 status_manager.go:851] "Failed to get status for pod" podUID="36794fe98525730e06c774f84687b7f3" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:16.839679 master-0 kubenswrapper[23041]: I0308 00:39:16.839566 23041 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="fd31aacf-8b72-4e88-bedc-d0c213078574" Mar 08 00:39:16.839899 master-0 kubenswrapper[23041]: I0308 00:39:16.839690 23041 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="fd31aacf-8b72-4e88-bedc-d0c213078574" Mar 08 00:39:16.841172 master-0 kubenswrapper[23041]: E0308 00:39:16.841120 23041 mirror_client.go:138] 
"Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 00:39:16.841562 master-0 kubenswrapper[23041]: I0308 00:39:16.841540 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 00:39:16.880693 master-0 kubenswrapper[23041]: W0308 00:39:16.880647 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa6a75ab47c06be4e74d05f552da4470.slice/crio-4fd8c0bdb9289891879ca97c8d72947b4bec57abe8439781947c4269eff8bc00 WatchSource:0}: Error finding container 4fd8c0bdb9289891879ca97c8d72947b4bec57abe8439781947c4269eff8bc00: Status 404 returned error can't find the container with id 4fd8c0bdb9289891879ca97c8d72947b4bec57abe8439781947c4269eff8bc00 Mar 08 00:39:17.172888 master-0 kubenswrapper[23041]: I0308 00:39:17.172717 23041 patch_prober.go:28] interesting pod/console-6479f6d896-j6kqz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body= Mar 08 00:39:17.172888 master-0 kubenswrapper[23041]: I0308 00:39:17.172815 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Mar 08 00:39:17.226620 master-0 kubenswrapper[23041]: I0308 00:39:17.226551 23041 generic.go:334] "Generic (PLEG): container finished" podID="aa6a75ab47c06be4e74d05f552da4470" 
containerID="eead6108212c95a9cef33ddb824829283ef37b6d54f0e8b8370e850bb99be9e1" exitCode=0 Mar 08 00:39:17.228507 master-0 kubenswrapper[23041]: I0308 00:39:17.228442 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"aa6a75ab47c06be4e74d05f552da4470","Type":"ContainerDied","Data":"eead6108212c95a9cef33ddb824829283ef37b6d54f0e8b8370e850bb99be9e1"} Mar 08 00:39:17.228638 master-0 kubenswrapper[23041]: I0308 00:39:17.228620 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"aa6a75ab47c06be4e74d05f552da4470","Type":"ContainerStarted","Data":"4fd8c0bdb9289891879ca97c8d72947b4bec57abe8439781947c4269eff8bc00"} Mar 08 00:39:17.229021 master-0 kubenswrapper[23041]: I0308 00:39:17.229004 23041 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="fd31aacf-8b72-4e88-bedc-d0c213078574" Mar 08 00:39:17.229158 master-0 kubenswrapper[23041]: I0308 00:39:17.229143 23041 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="fd31aacf-8b72-4e88-bedc-d0c213078574" Mar 08 00:39:17.230172 master-0 kubenswrapper[23041]: I0308 00:39:17.230123 23041 status_manager.go:851] "Failed to get status for pod" podUID="53c386ff-5ff0-4937-b909-5f800abdb600" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:17.230314 master-0 kubenswrapper[23041]: E0308 00:39:17.230157 23041 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: 
connection refused" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 00:39:17.231218 master-0 kubenswrapper[23041]: I0308 00:39:17.230999 23041 status_manager.go:851] "Failed to get status for pod" podUID="a814bd60de133d95cf99630a978c017e" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:17.231659 master-0 kubenswrapper[23041]: I0308 00:39:17.231624 23041 status_manager.go:851] "Failed to get status for pod" podUID="36794fe98525730e06c774f84687b7f3" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:18.237073 master-0 kubenswrapper[23041]: I0308 00:39:18.236921 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"aa6a75ab47c06be4e74d05f552da4470","Type":"ContainerStarted","Data":"49321dae68248a90140dcd4f07e44562e5b4db44455220da7ad5f43c6aee577d"} Mar 08 00:39:18.237073 master-0 kubenswrapper[23041]: I0308 00:39:18.236982 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"aa6a75ab47c06be4e74d05f552da4470","Type":"ContainerStarted","Data":"380503fd587956d881780c5e3c55b4b20a2d116b689282809e39f247ca9d9838"} Mar 08 00:39:18.237073 master-0 kubenswrapper[23041]: I0308 00:39:18.236992 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" 
event={"ID":"aa6a75ab47c06be4e74d05f552da4470","Type":"ContainerStarted","Data":"7dc2f4a6281f86a5201b8fd72041698ad96d876d48752ea224039fe894db864c"} Mar 08 00:39:18.237705 master-0 kubenswrapper[23041]: I0308 00:39:18.237171 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 00:39:18.237705 master-0 kubenswrapper[23041]: I0308 00:39:18.237308 23041 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="fd31aacf-8b72-4e88-bedc-d0c213078574" Mar 08 00:39:18.237705 master-0 kubenswrapper[23041]: I0308 00:39:18.237325 23041 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="fd31aacf-8b72-4e88-bedc-d0c213078574" Mar 08 00:39:18.238189 master-0 kubenswrapper[23041]: E0308 00:39:18.238149 23041 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 00:39:18.238333 master-0 kubenswrapper[23041]: I0308 00:39:18.238286 23041 status_manager.go:851] "Failed to get status for pod" podUID="53c386ff-5ff0-4937-b909-5f800abdb600" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:18.239016 master-0 kubenswrapper[23041]: I0308 00:39:18.238941 23041 status_manager.go:851] "Failed to get status for pod" podUID="a814bd60de133d95cf99630a978c017e" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:18.239727 master-0 kubenswrapper[23041]: I0308 00:39:18.239674 23041 status_manager.go:851] "Failed to get status for pod" podUID="36794fe98525730e06c774f84687b7f3" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:19.245972 master-0 kubenswrapper[23041]: I0308 00:39:19.245896 23041 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="fd31aacf-8b72-4e88-bedc-d0c213078574" Mar 08 00:39:19.245972 master-0 kubenswrapper[23041]: I0308 00:39:19.245951 23041 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="fd31aacf-8b72-4e88-bedc-d0c213078574" Mar 08 00:39:19.246927 master-0 kubenswrapper[23041]: E0308 00:39:19.246858 23041 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 00:39:19.434935 master-0 kubenswrapper[23041]: I0308 00:39:19.434867 23041 patch_prober.go:28] interesting pod/console-c45bf598-vngbg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body= Mar 08 00:39:19.434935 master-0 kubenswrapper[23041]: I0308 00:39:19.434928 23041 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" Mar 08 00:39:19.486121 master-0 kubenswrapper[23041]: E0308 00:39:19.486039 23041 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s" Mar 08 00:39:24.484767 master-0 kubenswrapper[23041]: E0308 00:39:24.484549 23041 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.189ab6c2c3665fdc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:a814bd60de133d95cf99630a978c017e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 00:39:10.941888476 +0000 UTC m=+456.414725030,LastTimestamp:2026-03-08 00:39:10.941888476 +0000 UTC m=+456.414725030,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 00:39:24.807999 master-0 kubenswrapper[23041]: I0308 00:39:24.807928 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:39:24.821190 master-0 kubenswrapper[23041]: I0308 00:39:24.821025 23041 status_manager.go:851] "Failed to get status for pod" podUID="aa6a75ab47c06be4e74d05f552da4470" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:24.822710 master-0 kubenswrapper[23041]: I0308 00:39:24.822629 23041 status_manager.go:851] "Failed to get status for pod" podUID="53c386ff-5ff0-4937-b909-5f800abdb600" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:24.823809 master-0 kubenswrapper[23041]: I0308 00:39:24.823743 23041 status_manager.go:851] "Failed to get status for pod" podUID="a814bd60de133d95cf99630a978c017e" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:24.824993 master-0 kubenswrapper[23041]: I0308 00:39:24.824931 23041 status_manager.go:851] "Failed to get status for pod" podUID="36794fe98525730e06c774f84687b7f3" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:24.825906 master-0 kubenswrapper[23041]: I0308 00:39:24.825844 23041 status_manager.go:851] "Failed to get status for pod" 
podUID="a814bd60de133d95cf99630a978c017e" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:24.826923 master-0 kubenswrapper[23041]: I0308 00:39:24.826856 23041 status_manager.go:851] "Failed to get status for pod" podUID="36794fe98525730e06c774f84687b7f3" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:24.828894 master-0 kubenswrapper[23041]: I0308 00:39:24.828829 23041 status_manager.go:851] "Failed to get status for pod" podUID="aa6a75ab47c06be4e74d05f552da4470" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:24.830416 master-0 kubenswrapper[23041]: I0308 00:39:24.830338 23041 status_manager.go:851] "Failed to get status for pod" podUID="53c386ff-5ff0-4937-b909-5f800abdb600" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:24.840598 master-0 kubenswrapper[23041]: I0308 00:39:24.840512 23041 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="73feff8f-02ce-48e8-b200-3fa2ef50bee3" Mar 08 00:39:24.840598 master-0 kubenswrapper[23041]: I0308 00:39:24.840589 23041 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="73feff8f-02ce-48e8-b200-3fa2ef50bee3" Mar 08 00:39:24.841766 master-0 kubenswrapper[23041]: E0308 00:39:24.841680 23041 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:39:24.842563 master-0 kubenswrapper[23041]: I0308 00:39:24.842542 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:39:24.881310 master-0 kubenswrapper[23041]: W0308 00:39:24.881120 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36d4251d3504cdc0ec85144c1379056c.slice/crio-1dabafd08be6158b3ded27722cb43506fbfc907b8b2c7a5290097dbd34cef377 WatchSource:0}: Error finding container 1dabafd08be6158b3ded27722cb43506fbfc907b8b2c7a5290097dbd34cef377: Status 404 returned error can't find the container with id 1dabafd08be6158b3ded27722cb43506fbfc907b8b2c7a5290097dbd34cef377 Mar 08 00:39:25.322858 master-0 kubenswrapper[23041]: I0308 00:39:25.322721 23041 generic.go:334] "Generic (PLEG): container finished" podID="36d4251d3504cdc0ec85144c1379056c" containerID="8521f334a23a2cdd8301c55d14c8fc4723415c56abbf9d4e805672ffebefb73e" exitCode=0 Mar 08 00:39:25.322858 master-0 kubenswrapper[23041]: I0308 00:39:25.322782 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"36d4251d3504cdc0ec85144c1379056c","Type":"ContainerDied","Data":"8521f334a23a2cdd8301c55d14c8fc4723415c56abbf9d4e805672ffebefb73e"} Mar 08 00:39:25.322858 master-0 kubenswrapper[23041]: I0308 00:39:25.322817 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"36d4251d3504cdc0ec85144c1379056c","Type":"ContainerStarted","Data":"1dabafd08be6158b3ded27722cb43506fbfc907b8b2c7a5290097dbd34cef377"} Mar 08 00:39:25.323174 master-0 kubenswrapper[23041]: I0308 00:39:25.323112 23041 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="73feff8f-02ce-48e8-b200-3fa2ef50bee3" Mar 08 00:39:25.323174 master-0 kubenswrapper[23041]: I0308 00:39:25.323130 23041 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="73feff8f-02ce-48e8-b200-3fa2ef50bee3" Mar 08 00:39:25.324004 master-0 kubenswrapper[23041]: E0308 00:39:25.323944 23041 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:39:25.324179 master-0 kubenswrapper[23041]: I0308 00:39:25.324118 23041 status_manager.go:851] "Failed to get status for pod" podUID="a814bd60de133d95cf99630a978c017e" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:25.326068 master-0 kubenswrapper[23041]: I0308 00:39:25.325500 23041 status_manager.go:851] "Failed to get status for pod" podUID="36794fe98525730e06c774f84687b7f3" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:25.326493 master-0 kubenswrapper[23041]: I0308 
00:39:25.326427 23041 status_manager.go:851] "Failed to get status for pod" podUID="aa6a75ab47c06be4e74d05f552da4470" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:25.327093 master-0 kubenswrapper[23041]: I0308 00:39:25.327023 23041 status_manager.go:851] "Failed to get status for pod" podUID="53c386ff-5ff0-4937-b909-5f800abdb600" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:25.351487 master-0 kubenswrapper[23041]: I0308 00:39:25.351419 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:39:25.354495 master-0 kubenswrapper[23041]: I0308 00:39:25.352821 23041 status_manager.go:851] "Failed to get status for pod" podUID="53c386ff-5ff0-4937-b909-5f800abdb600" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:25.354495 master-0 kubenswrapper[23041]: I0308 00:39:25.353751 23041 status_manager.go:851] "Failed to get status for pod" podUID="a814bd60de133d95cf99630a978c017e" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:25.355031 master-0 kubenswrapper[23041]: I0308 00:39:25.354632 23041 
status_manager.go:851] "Failed to get status for pod" podUID="36794fe98525730e06c774f84687b7f3" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:25.355753 master-0 kubenswrapper[23041]: I0308 00:39:25.355595 23041 status_manager.go:851] "Failed to get status for pod" podUID="aa6a75ab47c06be4e74d05f552da4470" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:25.358572 master-0 kubenswrapper[23041]: I0308 00:39:25.358542 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 00:39:25.359849 master-0 kubenswrapper[23041]: I0308 00:39:25.359807 23041 status_manager.go:851] "Failed to get status for pod" podUID="aa6a75ab47c06be4e74d05f552da4470" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:25.361753 master-0 kubenswrapper[23041]: I0308 00:39:25.361682 23041 status_manager.go:851] "Failed to get status for pod" podUID="53c386ff-5ff0-4937-b909-5f800abdb600" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:25.362833 master-0 kubenswrapper[23041]: I0308 00:39:25.362778 23041 
status_manager.go:851] "Failed to get status for pod" podUID="a814bd60de133d95cf99630a978c017e" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:25.363615 master-0 kubenswrapper[23041]: I0308 00:39:25.363570 23041 status_manager.go:851] "Failed to get status for pod" podUID="36794fe98525730e06c774f84687b7f3" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 00:39:26.337227 master-0 kubenswrapper[23041]: I0308 00:39:26.337164 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"36d4251d3504cdc0ec85144c1379056c","Type":"ContainerStarted","Data":"d2b7d95d6b571a27fcb87e1d10f7a914e92c04f55f020ca004f9f796c7a44718"} Mar 08 00:39:26.337227 master-0 kubenswrapper[23041]: I0308 00:39:26.337232 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"36d4251d3504cdc0ec85144c1379056c","Type":"ContainerStarted","Data":"9c2a243ab9e1aab9abdb21a8470ba65dc98d45790dcd57ed00499924dc759932"} Mar 08 00:39:26.337775 master-0 kubenswrapper[23041]: I0308 00:39:26.337243 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"36d4251d3504cdc0ec85144c1379056c","Type":"ContainerStarted","Data":"707f5a1f1740ae8fb6205e1ec2f2f14fb2d5c042a22da7753bca7c898122430e"} Mar 08 00:39:26.337775 master-0 kubenswrapper[23041]: I0308 00:39:26.337255 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"36d4251d3504cdc0ec85144c1379056c","Type":"ContainerStarted","Data":"64de82458ab4af624c35f20cf869d7e8719f240d8523e695a34272ec0cbeb70c"} Mar 08 00:39:27.173075 master-0 kubenswrapper[23041]: I0308 00:39:27.172988 23041 patch_prober.go:28] interesting pod/console-6479f6d896-j6kqz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body= Mar 08 00:39:27.173339 master-0 kubenswrapper[23041]: I0308 00:39:27.173081 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Mar 08 00:39:27.349058 master-0 kubenswrapper[23041]: I0308 00:39:27.348990 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"36d4251d3504cdc0ec85144c1379056c","Type":"ContainerStarted","Data":"07f9f4746921f92d4ef22e6fde84c37292d7a2a9d643aba70a17c1eb289b8d17"} Mar 08 00:39:27.350479 master-0 kubenswrapper[23041]: I0308 00:39:27.349240 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:39:27.350479 master-0 kubenswrapper[23041]: I0308 00:39:27.349390 23041 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="73feff8f-02ce-48e8-b200-3fa2ef50bee3" Mar 08 00:39:27.350479 master-0 kubenswrapper[23041]: I0308 00:39:27.349427 23041 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="73feff8f-02ce-48e8-b200-3fa2ef50bee3" Mar 08 00:39:29.435061 master-0 kubenswrapper[23041]: I0308 00:39:29.434975 23041 
patch_prober.go:28] interesting pod/console-c45bf598-vngbg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body= Mar 08 00:39:29.435794 master-0 kubenswrapper[23041]: I0308 00:39:29.435080 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" Mar 08 00:39:29.842876 master-0 kubenswrapper[23041]: I0308 00:39:29.842794 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:39:29.842876 master-0 kubenswrapper[23041]: I0308 00:39:29.842878 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:39:29.851797 master-0 kubenswrapper[23041]: I0308 00:39:29.851735 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:39:32.367855 master-0 kubenswrapper[23041]: I0308 00:39:32.367764 23041 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:39:33.394857 master-0 kubenswrapper[23041]: I0308 00:39:33.394808 23041 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="73feff8f-02ce-48e8-b200-3fa2ef50bee3" Mar 08 00:39:33.395493 master-0 kubenswrapper[23041]: I0308 00:39:33.395476 23041 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="73feff8f-02ce-48e8-b200-3fa2ef50bee3" Mar 08 00:39:33.398694 master-0 kubenswrapper[23041]: I0308 00:39:33.398642 23041 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 00:39:34.400529 master-0 kubenswrapper[23041]: I0308 00:39:34.400436 23041 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="73feff8f-02ce-48e8-b200-3fa2ef50bee3" Mar 08 00:39:34.400529 master-0 kubenswrapper[23041]: I0308 00:39:34.400475 23041 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="73feff8f-02ce-48e8-b200-3fa2ef50bee3" Mar 08 00:39:34.852267 master-0 kubenswrapper[23041]: I0308 00:39:34.852079 23041 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="36d4251d3504cdc0ec85144c1379056c" podUID="ab43a892-fcd8-48c1-8bed-b17b08bbece5" Mar 08 00:39:37.173247 master-0 kubenswrapper[23041]: I0308 00:39:37.173149 23041 patch_prober.go:28] interesting pod/console-6479f6d896-j6kqz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body= Mar 08 00:39:37.174164 master-0 kubenswrapper[23041]: I0308 00:39:37.174112 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Mar 08 00:39:39.434095 master-0 kubenswrapper[23041]: I0308 00:39:39.434033 23041 patch_prober.go:28] interesting pod/console-c45bf598-vngbg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body= Mar 08 00:39:39.434902 master-0 kubenswrapper[23041]: I0308 
00:39:39.434096 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused"
Mar 08 00:39:41.956829 master-0 kubenswrapper[23041]: I0308 00:39:41.956780 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web"
Mar 08 00:39:42.259829 master-0 kubenswrapper[23041]: I0308 00:39:42.259761 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 08 00:39:42.342599 master-0 kubenswrapper[23041]: I0308 00:39:42.342523 23041 scope.go:117] "RemoveContainer" containerID="ad59f0ee4ace09dae79cfc40c750720203b39cdfecc33e32dfaa1834966aad3c"
Mar 08 00:39:42.558947 master-0 kubenswrapper[23041]: I0308 00:39:42.558825 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 08 00:39:42.614227 master-0 kubenswrapper[23041]: I0308 00:39:42.614137 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 08 00:39:42.636963 master-0 kubenswrapper[23041]: I0308 00:39:42.636905 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 08 00:39:43.378963 master-0 kubenswrapper[23041]: I0308 00:39:43.378848 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4tn2t"
Mar 08 00:39:43.379825 master-0 kubenswrapper[23041]: I0308 00:39:43.379066 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 08 00:39:43.442734 master-0 kubenswrapper[23041]: I0308 00:39:43.442627 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy"
Mar 08 00:39:43.456814 master-0 kubenswrapper[23041]: I0308 00:39:43.456746 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-bmgck"
Mar 08 00:39:43.516733 master-0 kubenswrapper[23041]: I0308 00:39:43.516635 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 08 00:39:43.559998 master-0 kubenswrapper[23041]: I0308 00:39:43.559908 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt"
Mar 08 00:39:43.652880 master-0 kubenswrapper[23041]: I0308 00:39:43.652674 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 08 00:39:43.657176 master-0 kubenswrapper[23041]: I0308 00:39:43.657089 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 08 00:39:43.669703 master-0 kubenswrapper[23041]: I0308 00:39:43.669627 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 08 00:39:43.726737 master-0 kubenswrapper[23041]: I0308 00:39:43.726671 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 08 00:39:43.953420 master-0 kubenswrapper[23041]: I0308 00:39:43.953259 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-2mmjv"
Mar 08 00:39:43.962263 master-0 kubenswrapper[23041]: I0308 00:39:43.962194 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert"
Mar 08 00:39:44.039053 master-0 kubenswrapper[23041]: I0308 00:39:44.039001 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt"
Mar 08 00:39:44.370535 master-0 kubenswrapper[23041]: I0308 00:39:44.370476 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 08 00:39:44.464378 master-0 kubenswrapper[23041]: I0308 00:39:44.464300 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 08 00:39:44.550659 master-0 kubenswrapper[23041]: I0308 00:39:44.550556 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 08 00:39:44.596933 master-0 kubenswrapper[23041]: I0308 00:39:44.596857 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 08 00:39:44.663763 master-0 kubenswrapper[23041]: I0308 00:39:44.663629 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 08 00:39:44.783589 master-0 kubenswrapper[23041]: I0308 00:39:44.783541 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 08 00:39:44.892218 master-0 kubenswrapper[23041]: I0308 00:39:44.892149 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 08 00:39:44.994296 master-0 kubenswrapper[23041]: I0308 00:39:44.994239 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls"
Mar 08 00:39:45.010164 master-0 kubenswrapper[23041]: I0308 00:39:45.010097 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 08 00:39:45.024170 master-0 kubenswrapper[23041]: I0308 00:39:45.024116 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 08 00:39:45.024382 master-0 kubenswrapper[23041]: I0308 00:39:45.024132 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 08 00:39:45.024462 master-0 kubenswrapper[23041]: I0308 00:39:45.024392 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 08 00:39:45.045793 master-0 kubenswrapper[23041]: I0308 00:39:45.044553 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-ppd6p"
Mar 08 00:39:45.077191 master-0 kubenswrapper[23041]: I0308 00:39:45.077133 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 08 00:39:45.085257 master-0 kubenswrapper[23041]: I0308 00:39:45.083724 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Mar 08 00:39:45.144174 master-0 kubenswrapper[23041]: I0308 00:39:45.144103 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-7jhj9"
Mar 08 00:39:45.197614 master-0 kubenswrapper[23041]: I0308 00:39:45.197555 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 08 00:39:45.273245 master-0 kubenswrapper[23041]: I0308 00:39:45.273117 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config"
Mar 08 00:39:45.343830 master-0 kubenswrapper[23041]: I0308 00:39:45.343751 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 08 00:39:45.353116 master-0 kubenswrapper[23041]: I0308 00:39:45.353058 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 08 00:39:45.500592 master-0 kubenswrapper[23041]: I0308 00:39:45.500535 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 08 00:39:45.699775 master-0 kubenswrapper[23041]: I0308 00:39:45.698911 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert"
Mar 08 00:39:45.739930 master-0 kubenswrapper[23041]: I0308 00:39:45.739877 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt"
Mar 08 00:39:45.742528 master-0 kubenswrapper[23041]: I0308 00:39:45.742498 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 08 00:39:45.845092 master-0 kubenswrapper[23041]: I0308 00:39:45.845035 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-b4z2l"
Mar 08 00:39:45.890394 master-0 kubenswrapper[23041]: I0308 00:39:45.890309 23041 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 08 00:39:45.975101 master-0 kubenswrapper[23041]: I0308 00:39:45.974954 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 08 00:39:45.989919 master-0 kubenswrapper[23041]: I0308 00:39:45.989875 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert"
Mar 08 00:39:46.101694 master-0 kubenswrapper[23041]: I0308 00:39:46.101629 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 08 00:39:46.102763 master-0 kubenswrapper[23041]: I0308 00:39:46.102690 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 08 00:39:46.145377 master-0 kubenswrapper[23041]: I0308 00:39:46.145304 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 08 00:39:46.183440 master-0 kubenswrapper[23041]: I0308 00:39:46.183364 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file"
Mar 08 00:39:46.211343 master-0 kubenswrapper[23041]: I0308 00:39:46.211276 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt"
Mar 08 00:39:46.303850 master-0 kubenswrapper[23041]: I0308 00:39:46.303748 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 08 00:39:46.417281 master-0 kubenswrapper[23041]: I0308 00:39:46.417165 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert"
Mar 08 00:39:46.424540 master-0 kubenswrapper[23041]: I0308 00:39:46.424486 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 08 00:39:46.468106 master-0 kubenswrapper[23041]: I0308 00:39:46.468048 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca"
Mar 08 00:39:46.482543 master-0 kubenswrapper[23041]: I0308 00:39:46.482443 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config"
Mar 08 00:39:46.539326 master-0 kubenswrapper[23041]: I0308 00:39:46.539275 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics"
Mar 08 00:39:46.624332 master-0 kubenswrapper[23041]: I0308 00:39:46.624187 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 08 00:39:46.668446 master-0 kubenswrapper[23041]: I0308 00:39:46.668401 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt"
Mar 08 00:39:46.672588 master-0 kubenswrapper[23041]: I0308 00:39:46.672548 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules"
Mar 08 00:39:46.718750 master-0 kubenswrapper[23041]: I0308 00:39:46.718696 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 08 00:39:46.767016 master-0 kubenswrapper[23041]: I0308 00:39:46.766964 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 08 00:39:46.774189 master-0 kubenswrapper[23041]: I0308 00:39:46.774160 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 08 00:39:46.786637 master-0 kubenswrapper[23041]: I0308 00:39:46.786548 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 08 00:39:46.807849 master-0 kubenswrapper[23041]: I0308 00:39:46.807800 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 08 00:39:46.856610 master-0 kubenswrapper[23041]: I0308 00:39:46.856513 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 08 00:39:46.899045 master-0 kubenswrapper[23041]: I0308 00:39:46.898634 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 08 00:39:47.071607 master-0 kubenswrapper[23041]: I0308 00:39:47.071552 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 08 00:39:47.092174 master-0 kubenswrapper[23041]: I0308 00:39:47.092137 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 08 00:39:47.116817 master-0 kubenswrapper[23041]: I0308 00:39:47.116780 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 08 00:39:47.120010 master-0 kubenswrapper[23041]: I0308 00:39:47.119990 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 08 00:39:47.127412 master-0 kubenswrapper[23041]: I0308 00:39:47.127387 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 08 00:39:47.173347 master-0 kubenswrapper[23041]: I0308 00:39:47.173217 23041 patch_prober.go:28] interesting pod/console-6479f6d896-j6kqz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body=
Mar 08 00:39:47.173347 master-0 kubenswrapper[23041]: I0308 00:39:47.173280 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused"
Mar 08 00:39:47.198517 master-0 kubenswrapper[23041]: I0308 00:39:47.198469 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 08 00:39:47.312075 master-0 kubenswrapper[23041]: I0308 00:39:47.312011 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt"
Mar 08 00:39:47.331680 master-0 kubenswrapper[23041]: I0308 00:39:47.331620 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-5qzcm"
Mar 08 00:39:47.429998 master-0 kubenswrapper[23041]: I0308 00:39:47.429874 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 08 00:39:47.429998 master-0 kubenswrapper[23041]: I0308 00:39:47.429921 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-48mqvdnajl6js"
Mar 08 00:39:47.589665 master-0 kubenswrapper[23041]: I0308 00:39:47.589607 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client"
Mar 08 00:39:47.658409 master-0 kubenswrapper[23041]: I0308 00:39:47.658329 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt"
Mar 08 00:39:47.663754 master-0 kubenswrapper[23041]: I0308 00:39:47.663716 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 08 00:39:47.674872 master-0 kubenswrapper[23041]: I0308 00:39:47.674824 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric"
Mar 08 00:39:47.707642 master-0 kubenswrapper[23041]: I0308 00:39:47.707452 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-qlv59"
Mar 08 00:39:47.738926 master-0 kubenswrapper[23041]: I0308 00:39:47.738858 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 08 00:39:47.788230 master-0 kubenswrapper[23041]: I0308 00:39:47.788159 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 08 00:39:47.834491 master-0 kubenswrapper[23041]: I0308 00:39:47.834445 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 08 00:39:47.839525 master-0 kubenswrapper[23041]: I0308 00:39:47.839495 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-8nl72"
Mar 08 00:39:47.901540 master-0 kubenswrapper[23041]: I0308 00:39:47.901489 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-94fb4"
Mar 08 00:39:47.932750 master-0 kubenswrapper[23041]: I0308 00:39:47.932698 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 08 00:39:47.936799 master-0 kubenswrapper[23041]: I0308 00:39:47.936749 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 08 00:39:47.952709 master-0 kubenswrapper[23041]: I0308 00:39:47.952634 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-qvcg8"
Mar 08 00:39:47.969439 master-0 kubenswrapper[23041]: I0308 00:39:47.969266 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 08 00:39:48.003122 master-0 kubenswrapper[23041]: I0308 00:39:48.003034 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 08 00:39:48.040986 master-0 kubenswrapper[23041]: I0308 00:39:48.040912 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-2m7s0hn4nptd"
Mar 08 00:39:48.051929 master-0 kubenswrapper[23041]: I0308 00:39:48.051863 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 08 00:39:48.092443 master-0 kubenswrapper[23041]: I0308 00:39:48.092354 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 08 00:39:48.111097 master-0 kubenswrapper[23041]: I0308 00:39:48.111027 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 08 00:39:48.217307 master-0 kubenswrapper[23041]: I0308 00:39:48.217197 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 08 00:39:48.219793 master-0 kubenswrapper[23041]: I0308 00:39:48.219700 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-xsc4j"
Mar 08 00:39:48.248025 master-0 kubenswrapper[23041]: I0308 00:39:48.247946 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 08 00:39:48.287799 master-0 kubenswrapper[23041]: I0308 00:39:48.287711 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 08 00:39:48.288461 master-0 kubenswrapper[23041]: I0308 00:39:48.288416 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 08 00:39:48.305752 master-0 kubenswrapper[23041]: I0308 00:39:48.305163 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 08 00:39:48.307568 master-0 kubenswrapper[23041]: I0308 00:39:48.307284 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 08 00:39:48.326829 master-0 kubenswrapper[23041]: I0308 00:39:48.326743 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt"
Mar 08 00:39:48.337124 master-0 kubenswrapper[23041]: I0308 00:39:48.337056 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-7gfkc"
Mar 08 00:39:48.350322 master-0 kubenswrapper[23041]: I0308 00:39:48.350263 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap"
Mar 08 00:39:48.399133 master-0 kubenswrapper[23041]: I0308 00:39:48.399085 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-dockercfg-v5zml"
Mar 08 00:39:48.435137 master-0 kubenswrapper[23041]: I0308 00:39:48.435074 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 08 00:39:48.612777 master-0 kubenswrapper[23041]: I0308 00:39:48.612735 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 08 00:39:48.835542 master-0 kubenswrapper[23041]: I0308 00:39:48.835265 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls"
Mar 08 00:39:48.984158 master-0 kubenswrapper[23041]: I0308 00:39:48.984037 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-g5h9b"
Mar 08 00:39:48.995147 master-0 kubenswrapper[23041]: I0308 00:39:48.995103 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-trusted-ca-bundle-8i12ta5c71j38"
Mar 08 00:39:49.008998 master-0 kubenswrapper[23041]: I0308 00:39:49.008943 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 08 00:39:49.059811 master-0 kubenswrapper[23041]: I0308 00:39:49.059618 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls"
Mar 08 00:39:49.121844 master-0 kubenswrapper[23041]: I0308 00:39:49.121787 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 08 00:39:49.129520 master-0 kubenswrapper[23041]: I0308 00:39:49.129488 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 08 00:39:49.147401 master-0 kubenswrapper[23041]: I0308 00:39:49.147257 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle"
Mar 08 00:39:49.189017 master-0 kubenswrapper[23041]: I0308 00:39:49.188954 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 08 00:39:49.215137 master-0 kubenswrapper[23041]: I0308 00:39:49.215105 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-bkprm"
Mar 08 00:39:49.235670 master-0 kubenswrapper[23041]: I0308 00:39:49.235564 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 08 00:39:49.258753 master-0 kubenswrapper[23041]: I0308 00:39:49.258687 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 08 00:39:49.278318 master-0 kubenswrapper[23041]: I0308 00:39:49.278218 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 08 00:39:49.340038 master-0 kubenswrapper[23041]: I0308 00:39:49.339974 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 08 00:39:49.399331 master-0 kubenswrapper[23041]: I0308 00:39:49.399279 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 08 00:39:49.434540 master-0 kubenswrapper[23041]: I0308 00:39:49.434479 23041 patch_prober.go:28] interesting pod/console-c45bf598-vngbg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body=
Mar 08 00:39:49.434778 master-0 kubenswrapper[23041]: I0308 00:39:49.434544 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused"
Mar 08 00:39:49.461337 master-0 kubenswrapper[23041]: I0308 00:39:49.461263 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls"
Mar 08 00:39:49.487157 master-0 kubenswrapper[23041]: I0308 00:39:49.487014 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca"
Mar 08 00:39:49.599176 master-0 kubenswrapper[23041]: I0308 00:39:49.599120 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-dockercfg-cm5m6"
Mar 08 00:39:49.629480 master-0 kubenswrapper[23041]: I0308 00:39:49.629411 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs"
Mar 08 00:39:49.668651 master-0 kubenswrapper[23041]: I0308 00:39:49.668597 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 08 00:39:49.729162 master-0 kubenswrapper[23041]: I0308 00:39:49.729097 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-fsd5q"
Mar 08 00:39:49.780608 master-0 kubenswrapper[23041]: I0308 00:39:49.780553 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 08 00:39:49.826012 master-0 kubenswrapper[23041]: I0308 00:39:49.825967 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 08 00:39:49.984542 master-0 kubenswrapper[23041]: I0308 00:39:49.984491 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 08 00:39:50.004263 master-0 kubenswrapper[23041]: I0308 00:39:50.004194 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 08 00:39:50.009558 master-0 kubenswrapper[23041]: I0308 00:39:50.009507 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 08 00:39:50.026501 master-0 kubenswrapper[23041]: I0308 00:39:50.026449 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 08 00:39:50.038626 master-0 kubenswrapper[23041]: I0308 00:39:50.038522 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 08 00:39:50.041083 master-0 kubenswrapper[23041]: I0308 00:39:50.041058 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-8z76k"
Mar 08 00:39:50.045821 master-0 kubenswrapper[23041]: I0308 00:39:50.045770 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web"
Mar 08 00:39:50.067870 master-0 kubenswrapper[23041]: I0308 00:39:50.067828 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 08 00:39:50.073334 master-0 kubenswrapper[23041]: I0308 00:39:50.073299 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt"
Mar 08 00:39:50.081017 master-0 kubenswrapper[23041]: I0308 00:39:50.080990 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt"
Mar 08 00:39:50.097815 master-0 kubenswrapper[23041]: I0308 00:39:50.097771 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0"
Mar 08 00:39:50.104684 master-0 kubenswrapper[23041]: I0308 00:39:50.104655 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 08 00:39:50.108343 master-0 kubenswrapper[23041]: I0308 00:39:50.108296 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 08 00:39:50.123176 master-0 kubenswrapper[23041]: I0308 00:39:50.123120 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 08 00:39:50.232368 master-0 kubenswrapper[23041]: I0308 00:39:50.232334 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 08 00:39:50.241696 master-0 kubenswrapper[23041]: I0308 00:39:50.241647 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 08 00:39:50.285429 master-0 kubenswrapper[23041]: I0308 00:39:50.285387 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 08 00:39:50.327254 master-0 kubenswrapper[23041]: I0308 00:39:50.327120 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-dockercfg-nppj6"
Mar 08 00:39:50.392229 master-0 kubenswrapper[23041]: I0308 00:39:50.392186 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 08 00:39:50.412025 master-0 kubenswrapper[23041]: I0308 00:39:50.411977 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 08 00:39:50.531222 master-0 kubenswrapper[23041]: I0308 00:39:50.531156 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 08 00:39:50.630755 master-0 kubenswrapper[23041]: I0308 00:39:50.630652 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 08 00:39:50.655513 master-0 kubenswrapper[23041]: I0308 00:39:50.655470 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 08 00:39:50.686453 master-0 kubenswrapper[23041]: I0308 00:39:50.686390 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config"
Mar 08 00:39:50.720064 master-0 kubenswrapper[23041]: I0308 00:39:50.720014 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt"
Mar 08 00:39:50.777331 master-0 kubenswrapper[23041]: I0308 00:39:50.777287 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 08 00:39:50.781925 master-0 kubenswrapper[23041]: I0308 00:39:50.781894 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 08 00:39:50.858504 master-0 kubenswrapper[23041]: I0308 00:39:50.858046 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 08 00:39:50.861542 master-0 kubenswrapper[23041]: I0308 00:39:50.861505 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 08 00:39:50.883341 master-0 kubenswrapper[23041]: I0308 00:39:50.883194 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 08 00:39:50.925218 master-0 kubenswrapper[23041]: I0308 00:39:50.925141 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 08 00:39:50.940447 master-0 kubenswrapper[23041]: I0308 00:39:50.940402 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 08 00:39:50.977590 master-0 kubenswrapper[23041]: I0308 00:39:50.977532 23041 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 08 00:39:50.984047 master-0 kubenswrapper[23041]: I0308 00:39:50.983973 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podStartSLOduration=40.983953065 podStartE2EDuration="40.983953065s" podCreationTimestamp="2026-03-08 00:39:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:39:32.439581216 +0000 UTC m=+477.912417780" watchObservedRunningTime="2026-03-08 00:39:50.983953065 +0000 UTC m=+496.456789639"
Mar 08 00:39:50.986427 master-0 kubenswrapper[23041]: I0308 00:39:50.986136 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 08 00:39:50.986427 master-0 kubenswrapper[23041]: I0308 00:39:50.986189 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 08 00:39:50.991276 master-0 kubenswrapper[23041]: I0308 00:39:50.990527 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 00:39:51.001775 master-0 kubenswrapper[23041]: I0308 00:39:51.001658 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 08 00:39:51.012535 master-0 kubenswrapper[23041]: I0308 00:39:51.012378 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=19.012349356 podStartE2EDuration="19.012349356s" podCreationTimestamp="2026-03-08 00:39:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:39:51.009079006 +0000 UTC m=+496.481915570" watchObservedRunningTime="2026-03-08 00:39:51.012349356 +0000 UTC m=+496.485185920"
Mar 08 00:39:51.039203 master-0 kubenswrapper[23041]: I0308 00:39:51.039165 23041 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 08 00:39:51.166606 master-0 kubenswrapper[23041]: I0308 00:39:51.166355 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca"
Mar 08 00:39:51.225732 master-0 kubenswrapper[23041]: I0308 00:39:51.225622 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 08 00:39:51.241345 master-0 kubenswrapper[23041]: I0308 00:39:51.241313 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 08 00:39:51.335481 master-0 kubenswrapper[23041]: I0308 00:39:51.335435 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-hgdt2"
Mar 08 00:39:51.350272 master-0 kubenswrapper[23041]: I0308 00:39:51.350214 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 08 00:39:51.399829 master-0 kubenswrapper[23041]: I0308 00:39:51.399767 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 08 00:39:51.448634 master-0 kubenswrapper[23041]: I0308 00:39:51.448488 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0"
Mar 08 00:39:51.455265 master-0 kubenswrapper[23041]: I0308 00:39:51.455233 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-5fe8510kelpgf"
Mar 08 00:39:51.539943 master-0 kubenswrapper[23041]: I0308 00:39:51.539872 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 08 00:39:51.578157 master-0 kubenswrapper[23041]: I0308 00:39:51.578111 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-68jqh"
Mar 08 00:39:51.599524 master-0 kubenswrapper[23041]: I0308 00:39:51.599453 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 08 00:39:51.683316 master-0 kubenswrapper[23041]: I0308 00:39:51.683268 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 08
00:39:51.683843 master-0 kubenswrapper[23041]: I0308 00:39:51.683519 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 08 00:39:51.691298 master-0 kubenswrapper[23041]: I0308 00:39:51.690997 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 08 00:39:51.694862 master-0 kubenswrapper[23041]: I0308 00:39:51.694833 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-fp767" Mar 08 00:39:51.710904 master-0 kubenswrapper[23041]: I0308 00:39:51.710655 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 08 00:39:51.723683 master-0 kubenswrapper[23041]: I0308 00:39:51.723641 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 08 00:39:51.732914 master-0 kubenswrapper[23041]: I0308 00:39:51.732876 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-gqbjw" Mar 08 00:39:51.770014 master-0 kubenswrapper[23041]: I0308 00:39:51.769952 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 08 00:39:51.770014 master-0 kubenswrapper[23041]: I0308 00:39:51.770007 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-94mhc" Mar 08 00:39:51.772665 master-0 kubenswrapper[23041]: I0308 00:39:51.772610 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Mar 08 00:39:51.787247 master-0 kubenswrapper[23041]: I0308 00:39:51.787187 23041 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert" Mar 08 00:39:51.791738 master-0 kubenswrapper[23041]: I0308 00:39:51.791700 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Mar 08 00:39:51.806949 master-0 kubenswrapper[23041]: I0308 00:39:51.806878 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt" Mar 08 00:39:51.811780 master-0 kubenswrapper[23041]: I0308 00:39:51.811731 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator" Mar 08 00:39:51.829927 master-0 kubenswrapper[23041]: I0308 00:39:51.829873 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 08 00:39:51.939411 master-0 kubenswrapper[23041]: I0308 00:39:51.939328 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-jt6pk" Mar 08 00:39:52.014441 master-0 kubenswrapper[23041]: I0308 00:39:52.014370 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Mar 08 00:39:52.052228 master-0 kubenswrapper[23041]: I0308 00:39:52.051904 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 08 00:39:52.054244 master-0 kubenswrapper[23041]: I0308 00:39:52.054228 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Mar 08 00:39:52.115308 master-0 kubenswrapper[23041]: I0308 00:39:52.115266 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 08 00:39:52.202718 master-0 kubenswrapper[23041]: 
I0308 00:39:52.202678 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle" Mar 08 00:39:52.228508 master-0 kubenswrapper[23041]: I0308 00:39:52.228452 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 08 00:39:52.237635 master-0 kubenswrapper[23041]: I0308 00:39:52.237562 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 08 00:39:52.247619 master-0 kubenswrapper[23041]: I0308 00:39:52.247557 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Mar 08 00:39:52.292127 master-0 kubenswrapper[23041]: I0308 00:39:52.291999 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Mar 08 00:39:52.363195 master-0 kubenswrapper[23041]: I0308 00:39:52.363150 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 08 00:39:52.396649 master-0 kubenswrapper[23041]: I0308 00:39:52.396577 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 08 00:39:52.410343 master-0 kubenswrapper[23041]: I0308 00:39:52.410262 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-l6qr9" Mar 08 00:39:52.435492 master-0 kubenswrapper[23041]: I0308 00:39:52.435402 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 08 00:39:52.473348 master-0 kubenswrapper[23041]: I0308 00:39:52.473260 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 08 00:39:52.475963 master-0 kubenswrapper[23041]: I0308 00:39:52.475905 23041 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 08 00:39:52.483521 master-0 kubenswrapper[23041]: I0308 00:39:52.483476 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 08 00:39:52.588033 master-0 kubenswrapper[23041]: I0308 00:39:52.587913 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 08 00:39:52.593709 master-0 kubenswrapper[23041]: I0308 00:39:52.593678 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 08 00:39:52.686752 master-0 kubenswrapper[23041]: I0308 00:39:52.686644 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Mar 08 00:39:52.696025 master-0 kubenswrapper[23041]: I0308 00:39:52.695973 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 08 00:39:52.701638 master-0 kubenswrapper[23041]: I0308 00:39:52.701581 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Mar 08 00:39:52.712888 master-0 kubenswrapper[23041]: I0308 00:39:52.712853 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-client-serving-certs-ca-bundle" Mar 08 00:39:52.727657 master-0 kubenswrapper[23041]: I0308 00:39:52.727589 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 08 00:39:52.746211 master-0 kubenswrapper[23041]: I0308 00:39:52.746140 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 08 00:39:52.827861 master-0 kubenswrapper[23041]: I0308 00:39:52.827797 23041 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 08 00:39:52.910322 master-0 kubenswrapper[23041]: I0308 00:39:52.910110 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Mar 08 00:39:52.926463 master-0 kubenswrapper[23041]: I0308 00:39:52.926380 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 08 00:39:52.969520 master-0 kubenswrapper[23041]: I0308 00:39:52.969454 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Mar 08 00:39:53.037384 master-0 kubenswrapper[23041]: I0308 00:39:53.037314 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Mar 08 00:39:53.044349 master-0 kubenswrapper[23041]: I0308 00:39:53.044318 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle" Mar 08 00:39:53.103046 master-0 kubenswrapper[23041]: I0308 00:39:53.102988 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-49cm5" Mar 08 00:39:53.170683 master-0 kubenswrapper[23041]: I0308 00:39:53.170532 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-867kk" Mar 08 00:39:53.194629 master-0 kubenswrapper[23041]: I0308 00:39:53.194559 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 08 00:39:53.212279 master-0 kubenswrapper[23041]: I0308 00:39:53.212174 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 08 00:39:53.214157 master-0 kubenswrapper[23041]: I0308 
00:39:53.214093 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 08 00:39:53.214402 master-0 kubenswrapper[23041]: I0308 00:39:53.214358 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Mar 08 00:39:53.253513 master-0 kubenswrapper[23041]: I0308 00:39:53.253418 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 08 00:39:53.283844 master-0 kubenswrapper[23041]: I0308 00:39:53.283777 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-kube-rbac-proxy-config" Mar 08 00:39:53.328447 master-0 kubenswrapper[23041]: I0308 00:39:53.328375 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Mar 08 00:39:53.405028 master-0 kubenswrapper[23041]: I0308 00:39:53.404973 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 08 00:39:53.489908 master-0 kubenswrapper[23041]: I0308 00:39:53.489784 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert" Mar 08 00:39:53.493615 master-0 kubenswrapper[23041]: I0308 00:39:53.493589 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 08 00:39:53.552136 master-0 kubenswrapper[23041]: I0308 00:39:53.552096 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 08 00:39:53.562499 master-0 kubenswrapper[23041]: I0308 00:39:53.562411 23041 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-monitoring"/"serving-certs-ca-bundle" Mar 08 00:39:53.563439 master-0 kubenswrapper[23041]: I0308 00:39:53.563389 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-7cd6d" Mar 08 00:39:53.582932 master-0 kubenswrapper[23041]: I0308 00:39:53.582855 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Mar 08 00:39:53.613231 master-0 kubenswrapper[23041]: I0308 00:39:53.613135 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 08 00:39:53.678584 master-0 kubenswrapper[23041]: I0308 00:39:53.678490 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Mar 08 00:39:53.752113 master-0 kubenswrapper[23041]: I0308 00:39:53.752067 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Mar 08 00:39:53.753953 master-0 kubenswrapper[23041]: I0308 00:39:53.753919 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 08 00:39:53.794495 master-0 kubenswrapper[23041]: I0308 00:39:53.794454 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images" Mar 08 00:39:53.899955 master-0 kubenswrapper[23041]: I0308 00:39:53.899903 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Mar 08 00:39:54.003645 master-0 kubenswrapper[23041]: I0308 00:39:54.003529 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 08 00:39:54.068422 master-0 kubenswrapper[23041]: I0308 00:39:54.068341 23041 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"openshift-service-ca.crt" Mar 08 00:39:54.149716 master-0 kubenswrapper[23041]: I0308 00:39:54.149615 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Mar 08 00:39:54.189579 master-0 kubenswrapper[23041]: I0308 00:39:54.189511 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-ldbrl" Mar 08 00:39:54.209748 master-0 kubenswrapper[23041]: I0308 00:39:54.209679 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-tls" Mar 08 00:39:54.213919 master-0 kubenswrapper[23041]: I0308 00:39:54.213892 23041 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 08 00:39:54.263356 master-0 kubenswrapper[23041]: I0308 00:39:54.263157 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Mar 08 00:39:54.269102 master-0 kubenswrapper[23041]: I0308 00:39:54.269059 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Mar 08 00:39:54.352893 master-0 kubenswrapper[23041]: I0308 00:39:54.352829 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Mar 08 00:39:54.378405 master-0 kubenswrapper[23041]: I0308 00:39:54.378358 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 08 00:39:54.400497 master-0 kubenswrapper[23041]: I0308 00:39:54.400408 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 08 00:39:54.456365 master-0 kubenswrapper[23041]: I0308 00:39:54.456302 23041 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 08 00:39:54.554202 master-0 kubenswrapper[23041]: I0308 00:39:54.554074 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Mar 08 00:39:54.574581 master-0 kubenswrapper[23041]: I0308 00:39:54.574507 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 08 00:39:54.692137 master-0 kubenswrapper[23041]: I0308 00:39:54.692071 23041 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 08 00:39:54.692477 master-0 kubenswrapper[23041]: I0308 00:39:54.692386 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="a814bd60de133d95cf99630a978c017e" containerName="startup-monitor" containerID="cri-o://a076b39f19134d81af0bd151f4f0b8f8c2a9f7a6c2a5b5a4719ba05826359e03" gracePeriod=5 Mar 08 00:39:54.746600 master-0 kubenswrapper[23041]: I0308 00:39:54.746538 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Mar 08 00:39:54.902606 master-0 kubenswrapper[23041]: I0308 00:39:54.902499 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 08 00:39:55.083854 master-0 kubenswrapper[23041]: I0308 00:39:55.083796 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 08 00:39:55.109675 master-0 kubenswrapper[23041]: I0308 00:39:55.109607 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 08 00:39:55.131774 master-0 kubenswrapper[23041]: I0308 00:39:55.131695 23041 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-apiserver"/"serving-cert" Mar 08 00:39:55.136625 master-0 kubenswrapper[23041]: I0308 00:39:55.136576 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 08 00:39:55.150765 master-0 kubenswrapper[23041]: I0308 00:39:55.150671 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 08 00:39:55.207728 master-0 kubenswrapper[23041]: I0308 00:39:55.207561 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 08 00:39:55.311997 master-0 kubenswrapper[23041]: I0308 00:39:55.311564 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 08 00:39:55.335039 master-0 kubenswrapper[23041]: I0308 00:39:55.334942 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Mar 08 00:39:55.362532 master-0 kubenswrapper[23041]: I0308 00:39:55.362483 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 08 00:39:55.401904 master-0 kubenswrapper[23041]: I0308 00:39:55.401843 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 08 00:39:55.476904 master-0 kubenswrapper[23041]: I0308 00:39:55.476799 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 08 00:39:55.495828 master-0 kubenswrapper[23041]: I0308 00:39:55.495797 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 08 00:39:55.521804 master-0 kubenswrapper[23041]: I0308 00:39:55.521770 23041 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Mar 08 00:39:55.584499 master-0 kubenswrapper[23041]: I0308 00:39:55.584374 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 08 00:39:55.664179 master-0 kubenswrapper[23041]: I0308 00:39:55.664091 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 08 00:39:55.676299 master-0 kubenswrapper[23041]: I0308 00:39:55.675396 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 08 00:39:55.748002 master-0 kubenswrapper[23041]: I0308 00:39:55.747951 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 08 00:39:55.795358 master-0 kubenswrapper[23041]: I0308 00:39:55.795272 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 08 00:39:55.914995 master-0 kubenswrapper[23041]: I0308 00:39:55.914949 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Mar 08 00:39:55.929095 master-0 kubenswrapper[23041]: I0308 00:39:55.929056 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Mar 08 00:39:55.948482 master-0 kubenswrapper[23041]: I0308 00:39:55.948425 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Mar 08 00:39:55.957196 master-0 kubenswrapper[23041]: I0308 00:39:55.957137 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 08 00:39:55.981871 master-0 kubenswrapper[23041]: I0308 00:39:55.981539 23041 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-wlrqc" Mar 08 00:39:56.014787 master-0 kubenswrapper[23041]: I0308 00:39:56.014649 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-kbbf9" Mar 08 00:39:56.073433 master-0 kubenswrapper[23041]: I0308 00:39:56.073383 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 08 00:39:56.190311 master-0 kubenswrapper[23041]: I0308 00:39:56.190251 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 08 00:39:56.310270 master-0 kubenswrapper[23041]: I0308 00:39:56.310195 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Mar 08 00:39:56.355923 master-0 kubenswrapper[23041]: I0308 00:39:56.355841 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 08 00:39:56.403526 master-0 kubenswrapper[23041]: I0308 00:39:56.403444 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 08 00:39:56.404253 master-0 kubenswrapper[23041]: I0308 00:39:56.404111 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 08 00:39:56.499734 master-0 kubenswrapper[23041]: I0308 00:39:56.499630 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Mar 08 00:39:56.605776 master-0 kubenswrapper[23041]: I0308 00:39:56.605630 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 08 00:39:56.617137 master-0 kubenswrapper[23041]: I0308 00:39:56.617063 23041 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 08 00:39:56.628061 master-0 kubenswrapper[23041]: I0308 00:39:56.628021 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 08 00:39:56.702124 master-0 kubenswrapper[23041]: I0308 00:39:56.700973 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle" Mar 08 00:39:56.733818 master-0 kubenswrapper[23041]: I0308 00:39:56.733409 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Mar 08 00:39:56.747127 master-0 kubenswrapper[23041]: I0308 00:39:56.747099 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Mar 08 00:39:56.859285 master-0 kubenswrapper[23041]: I0308 00:39:56.859119 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 08 00:39:56.903085 master-0 kubenswrapper[23041]: I0308 00:39:56.871779 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 00:39:56.945941 master-0 kubenswrapper[23041]: I0308 00:39:56.945885 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 08 00:39:57.155486 master-0 kubenswrapper[23041]: I0308 00:39:57.155350 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 08 00:39:57.173080 master-0 kubenswrapper[23041]: I0308 00:39:57.173024 23041 patch_prober.go:28] interesting pod/console-6479f6d896-j6kqz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" 
start-of-body= Mar 08 00:39:57.173280 master-0 kubenswrapper[23041]: I0308 00:39:57.173076 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Mar 08 00:39:57.264443 master-0 kubenswrapper[23041]: I0308 00:39:57.264399 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 08 00:39:57.313760 master-0 kubenswrapper[23041]: I0308 00:39:57.313700 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-njqpw" Mar 08 00:39:57.392184 master-0 kubenswrapper[23041]: I0308 00:39:57.392113 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 08 00:39:57.392458 master-0 kubenswrapper[23041]: I0308 00:39:57.392414 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 08 00:39:57.413819 master-0 kubenswrapper[23041]: I0308 00:39:57.413672 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Mar 08 00:39:57.535942 master-0 kubenswrapper[23041]: I0308 00:39:57.535850 23041 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 08 00:39:57.697014 master-0 kubenswrapper[23041]: I0308 00:39:57.696930 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 08 00:39:57.894293 master-0 kubenswrapper[23041]: I0308 00:39:57.894188 23041 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 08 00:39:57.927361 master-0 kubenswrapper[23041]: I0308 00:39:57.927180 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 08 00:39:58.011342 master-0 kubenswrapper[23041]: I0308 00:39:58.011288 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Mar 08 00:39:58.019425 master-0 kubenswrapper[23041]: I0308 00:39:58.019385 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Mar 08 00:39:58.047063 master-0 kubenswrapper[23041]: I0308 00:39:58.046622 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 08 00:39:58.197627 master-0 kubenswrapper[23041]: I0308 00:39:58.197490 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"federate-client-certs" Mar 08 00:39:58.368651 master-0 kubenswrapper[23041]: I0308 00:39:58.368581 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 08 00:39:58.626889 master-0 kubenswrapper[23041]: I0308 00:39:58.626804 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 08 00:39:58.917336 master-0 kubenswrapper[23041]: I0308 00:39:58.917192 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Mar 08 00:39:59.147096 master-0 kubenswrapper[23041]: I0308 00:39:59.147034 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 08 00:39:59.385970 master-0 kubenswrapper[23041]: I0308 00:39:59.385909 23041 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 08 00:39:59.434185 master-0 kubenswrapper[23041]: I0308 00:39:59.434117 23041 patch_prober.go:28] interesting pod/console-c45bf598-vngbg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body= Mar 08 00:39:59.434425 master-0 kubenswrapper[23041]: I0308 00:39:59.434210 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" Mar 08 00:39:59.449189 master-0 kubenswrapper[23041]: I0308 00:39:59.449133 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-wkgmq" Mar 08 00:39:59.837753 master-0 kubenswrapper[23041]: I0308 00:39:59.837707 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_a814bd60de133d95cf99630a978c017e/startup-monitor/0.log" Mar 08 00:39:59.837960 master-0 kubenswrapper[23041]: I0308 00:39:59.837787 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:39:59.955721 master-0 kubenswrapper[23041]: I0308 00:39:59.955519 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-log\") pod \"a814bd60de133d95cf99630a978c017e\" (UID: \"a814bd60de133d95cf99630a978c017e\") " Mar 08 00:39:59.955721 master-0 kubenswrapper[23041]: I0308 00:39:59.955601 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-resource-dir\") pod \"a814bd60de133d95cf99630a978c017e\" (UID: \"a814bd60de133d95cf99630a978c017e\") " Mar 08 00:39:59.955721 master-0 kubenswrapper[23041]: I0308 00:39:59.955631 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-manifests\") pod \"a814bd60de133d95cf99630a978c017e\" (UID: \"a814bd60de133d95cf99630a978c017e\") " Mar 08 00:39:59.956677 master-0 kubenswrapper[23041]: I0308 00:39:59.955841 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-pod-resource-dir\") pod \"a814bd60de133d95cf99630a978c017e\" (UID: \"a814bd60de133d95cf99630a978c017e\") " Mar 08 00:39:59.956677 master-0 kubenswrapper[23041]: I0308 00:39:59.955905 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-lock\") pod \"a814bd60de133d95cf99630a978c017e\" (UID: \"a814bd60de133d95cf99630a978c017e\") " Mar 08 00:39:59.956883 master-0 kubenswrapper[23041]: I0308 00:39:59.956856 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-lock" (OuterVolumeSpecName: "var-lock") pod "a814bd60de133d95cf99630a978c017e" (UID: "a814bd60de133d95cf99630a978c017e"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:39:59.956949 master-0 kubenswrapper[23041]: I0308 00:39:59.956900 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "a814bd60de133d95cf99630a978c017e" (UID: "a814bd60de133d95cf99630a978c017e"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:39:59.957034 master-0 kubenswrapper[23041]: I0308 00:39:59.957010 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-manifests" (OuterVolumeSpecName: "manifests") pod "a814bd60de133d95cf99630a978c017e" (UID: "a814bd60de133d95cf99630a978c017e"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:39:59.957096 master-0 kubenswrapper[23041]: I0308 00:39:59.957049 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-log" (OuterVolumeSpecName: "var-log") pod "a814bd60de133d95cf99630a978c017e" (UID: "a814bd60de133d95cf99630a978c017e"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:39:59.958396 master-0 kubenswrapper[23041]: I0308 00:39:59.958367 23041 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 00:39:59.958396 master-0 kubenswrapper[23041]: I0308 00:39:59.958394 23041 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-manifests\") on node \"master-0\" DevicePath \"\"" Mar 08 00:39:59.958483 master-0 kubenswrapper[23041]: I0308 00:39:59.958407 23041 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 00:39:59.958483 master-0 kubenswrapper[23041]: I0308 00:39:59.958419 23041 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-log\") on node \"master-0\" DevicePath \"\"" Mar 08 00:39:59.967483 master-0 kubenswrapper[23041]: I0308 00:39:59.967405 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "a814bd60de133d95cf99630a978c017e" (UID: "a814bd60de133d95cf99630a978c017e"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:40:00.059551 master-0 kubenswrapper[23041]: I0308 00:40:00.059483 23041 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-pod-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 00:40:00.607377 master-0 kubenswrapper[23041]: I0308 00:40:00.607321 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_a814bd60de133d95cf99630a978c017e/startup-monitor/0.log" Mar 08 00:40:00.607377 master-0 kubenswrapper[23041]: I0308 00:40:00.607378 23041 generic.go:334] "Generic (PLEG): container finished" podID="a814bd60de133d95cf99630a978c017e" containerID="a076b39f19134d81af0bd151f4f0b8f8c2a9f7a6c2a5b5a4719ba05826359e03" exitCode=137 Mar 08 00:40:00.608169 master-0 kubenswrapper[23041]: I0308 00:40:00.607428 23041 scope.go:117] "RemoveContainer" containerID="a076b39f19134d81af0bd151f4f0b8f8c2a9f7a6c2a5b5a4719ba05826359e03" Mar 08 00:40:00.608169 master-0 kubenswrapper[23041]: I0308 00:40:00.607455 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 00:40:00.624974 master-0 kubenswrapper[23041]: I0308 00:40:00.624926 23041 scope.go:117] "RemoveContainer" containerID="a076b39f19134d81af0bd151f4f0b8f8c2a9f7a6c2a5b5a4719ba05826359e03" Mar 08 00:40:00.625442 master-0 kubenswrapper[23041]: E0308 00:40:00.625379 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a076b39f19134d81af0bd151f4f0b8f8c2a9f7a6c2a5b5a4719ba05826359e03\": container with ID starting with a076b39f19134d81af0bd151f4f0b8f8c2a9f7a6c2a5b5a4719ba05826359e03 not found: ID does not exist" containerID="a076b39f19134d81af0bd151f4f0b8f8c2a9f7a6c2a5b5a4719ba05826359e03" Mar 08 00:40:00.625508 master-0 kubenswrapper[23041]: I0308 00:40:00.625448 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a076b39f19134d81af0bd151f4f0b8f8c2a9f7a6c2a5b5a4719ba05826359e03"} err="failed to get container status \"a076b39f19134d81af0bd151f4f0b8f8c2a9f7a6c2a5b5a4719ba05826359e03\": rpc error: code = NotFound desc = could not find container \"a076b39f19134d81af0bd151f4f0b8f8c2a9f7a6c2a5b5a4719ba05826359e03\": container with ID starting with a076b39f19134d81af0bd151f4f0b8f8c2a9f7a6c2a5b5a4719ba05826359e03 not found: ID does not exist" Mar 08 00:40:00.817634 master-0 kubenswrapper[23041]: I0308 00:40:00.817578 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a814bd60de133d95cf99630a978c017e" path="/var/lib/kubelet/pods/a814bd60de133d95cf99630a978c017e/volumes" Mar 08 00:40:00.817938 master-0 kubenswrapper[23041]: I0308 00:40:00.817838 23041 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="" Mar 08 00:40:00.837218 master-0 kubenswrapper[23041]: I0308 00:40:00.837129 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 08 00:40:00.837218 master-0 kubenswrapper[23041]: I0308 00:40:00.837167 23041 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="74045be0-cd86-46bf-971e-1d3663d2b656" Mar 08 00:40:00.842111 master-0 kubenswrapper[23041]: I0308 00:40:00.842073 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 08 00:40:00.842111 master-0 kubenswrapper[23041]: I0308 00:40:00.842105 23041 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="74045be0-cd86-46bf-971e-1d3663d2b656" Mar 08 00:40:06.854563 master-0 kubenswrapper[23041]: I0308 00:40:06.854500 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 00:40:06.855677 master-0 kubenswrapper[23041]: I0308 00:40:06.855623 23041 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="fd31aacf-8b72-4e88-bedc-d0c213078574" Mar 08 00:40:06.855677 master-0 kubenswrapper[23041]: I0308 00:40:06.855667 23041 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="fd31aacf-8b72-4e88-bedc-d0c213078574" Mar 08 00:40:06.873357 master-0 kubenswrapper[23041]: I0308 00:40:06.873297 23041 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 00:40:06.882224 master-0 kubenswrapper[23041]: I0308 00:40:06.882033 23041 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd31aacf-8b72-4e88-bedc-d0c213078574\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:40:06Z\\\",\\\"message\\\":null,\\\"reason\\\":null,\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:40:06Z\\\",\\\"message\\\":null,\\\"reason\\\":null,\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dc2f4a6281f86a5201b8fd72041698ad96d876d48752ea224039fe894db864c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://380503fd587956d881780c5e3c55b4b20a2d116b689282809e39f247ca9d9838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76b719f5bd541eb1a8bae124d650896b533e7bc3107be536e598b3ab4e135282\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76b719f5bd541eb1a8bae124d650896b533e7bc3107be536e598b3ab4e135282\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49321dae68248a90140dcd4f07e44562e5b4db44455220da7ad5f43c6aee577d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76b719f5bd541eb1a8bae124d650896b533e7bc3107be536e598b3ab4e135282\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76b719f5bd541eb1a8bae124d650896b533e7bc3107be536e598b3ab4e135282\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}]}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-master-0\": pods \"openshift-kube-scheduler-master-0\" not found" Mar 08 00:40:06.884348 master-0 kubenswrapper[23041]: I0308 00:40:06.884309 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 08 00:40:06.897034 master-0 kubenswrapper[23041]: I0308 00:40:06.896955 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 08 00:40:06.904335 master-0 kubenswrapper[23041]: I0308 00:40:06.904272 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 08 00:40:07.179572 master-0 kubenswrapper[23041]: I0308 00:40:07.179179 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-console/console-6479f6d896-j6kqz" Mar 08 00:40:07.187651 master-0 kubenswrapper[23041]: I0308 00:40:07.187590 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6479f6d896-j6kqz" Mar 08 00:40:07.251332 master-0 kubenswrapper[23041]: I0308 00:40:07.251229 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podStartSLOduration=1.25118842 podStartE2EDuration="1.25118842s" podCreationTimestamp="2026-03-08 00:40:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:40:07.23635978 +0000 UTC m=+512.709196354" watchObservedRunningTime="2026-03-08 00:40:07.25118842 +0000 UTC m=+512.724024974" Mar 08 00:40:07.661020 master-0 kubenswrapper[23041]: I0308 00:40:07.660966 23041 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="fd31aacf-8b72-4e88-bedc-d0c213078574" Mar 08 00:40:07.661020 master-0 kubenswrapper[23041]: I0308 00:40:07.661001 23041 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="fd31aacf-8b72-4e88-bedc-d0c213078574" Mar 08 00:40:09.439910 master-0 kubenswrapper[23041]: I0308 00:40:09.439848 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-c45bf598-vngbg" Mar 08 00:40:09.443729 master-0 kubenswrapper[23041]: I0308 00:40:09.443690 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-c45bf598-vngbg" Mar 08 00:40:09.526848 master-0 kubenswrapper[23041]: I0308 00:40:09.526741 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6479f6d896-j6kqz"] Mar 08 00:40:34.579075 master-0 kubenswrapper[23041]: I0308 00:40:34.578966 23041 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-6479f6d896-j6kqz" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" containerID="cri-o://192b68dedfebd4dee50599d2c7d025373ab38645d7fe97d9015fca7b54ac5478" gracePeriod=15 Mar 08 00:40:34.924723 master-0 kubenswrapper[23041]: I0308 00:40:34.924597 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6479f6d896-j6kqz_67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b/console/1.log" Mar 08 00:40:34.925287 master-0 kubenswrapper[23041]: I0308 00:40:34.925224 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6479f6d896-j6kqz_67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b/console/0.log" Mar 08 00:40:34.925287 master-0 kubenswrapper[23041]: I0308 00:40:34.925270 23041 generic.go:334] "Generic (PLEG): container finished" podID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerID="192b68dedfebd4dee50599d2c7d025373ab38645d7fe97d9015fca7b54ac5478" exitCode=2 Mar 08 00:40:34.925386 master-0 kubenswrapper[23041]: I0308 00:40:34.925303 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6479f6d896-j6kqz" event={"ID":"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b","Type":"ContainerDied","Data":"192b68dedfebd4dee50599d2c7d025373ab38645d7fe97d9015fca7b54ac5478"} Mar 08 00:40:34.925386 master-0 kubenswrapper[23041]: I0308 00:40:34.925346 23041 scope.go:117] "RemoveContainer" containerID="42d1b0d9a17b6b2ff8f7fdf2871fc4fcb4d92831ee2c4371c0b51fde6a93a0cf" Mar 08 00:40:35.076545 master-0 kubenswrapper[23041]: I0308 00:40:35.075933 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6479f6d896-j6kqz_67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b/console/1.log" Mar 08 00:40:35.076545 master-0 kubenswrapper[23041]: I0308 00:40:35.076035 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6479f6d896-j6kqz" Mar 08 00:40:35.199743 master-0 kubenswrapper[23041]: I0308 00:40:35.198942 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-service-ca\") pod \"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b\" (UID: \"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b\") " Mar 08 00:40:35.199743 master-0 kubenswrapper[23041]: I0308 00:40:35.199041 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-trusted-ca-bundle\") pod \"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b\" (UID: \"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b\") " Mar 08 00:40:35.199743 master-0 kubenswrapper[23041]: I0308 00:40:35.199097 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztglc\" (UniqueName: \"kubernetes.io/projected/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-kube-api-access-ztglc\") pod \"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b\" (UID: \"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b\") " Mar 08 00:40:35.199743 master-0 kubenswrapper[23041]: I0308 00:40:35.199136 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-oauth-serving-cert\") pod \"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b\" (UID: \"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b\") " Mar 08 00:40:35.199743 master-0 kubenswrapper[23041]: I0308 00:40:35.199539 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-service-ca" (OuterVolumeSpecName: "service-ca") pod "67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" (UID: "67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:40:35.203661 master-0 kubenswrapper[23041]: I0308 00:40:35.199831 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" (UID: "67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:40:35.203661 master-0 kubenswrapper[23041]: I0308 00:40:35.199930 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-console-config\") pod \"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b\" (UID: \"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b\") " Mar 08 00:40:35.203661 master-0 kubenswrapper[23041]: I0308 00:40:35.200024 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-console-oauth-config\") pod \"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b\" (UID: \"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b\") " Mar 08 00:40:35.203661 master-0 kubenswrapper[23041]: I0308 00:40:35.200065 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-console-serving-cert\") pod \"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b\" (UID: \"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b\") " Mar 08 00:40:35.203661 master-0 kubenswrapper[23041]: I0308 00:40:35.201513 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-console-config" (OuterVolumeSpecName: "console-config") pod "67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" (UID: 
"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:40:35.207126 master-0 kubenswrapper[23041]: I0308 00:40:35.205108 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-kube-api-access-ztglc" (OuterVolumeSpecName: "kube-api-access-ztglc") pod "67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" (UID: "67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b"). InnerVolumeSpecName "kube-api-access-ztglc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:40:35.207126 master-0 kubenswrapper[23041]: I0308 00:40:35.200231 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" (UID: "67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:40:35.207126 master-0 kubenswrapper[23041]: I0308 00:40:35.206175 23041 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-console-config\") on node \"master-0\" DevicePath \"\"" Mar 08 00:40:35.207126 master-0 kubenswrapper[23041]: I0308 00:40:35.206227 23041 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 00:40:35.207126 master-0 kubenswrapper[23041]: I0308 00:40:35.206239 23041 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 00:40:35.207126 master-0 kubenswrapper[23041]: I0308 00:40:35.206262 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztglc\" (UniqueName: \"kubernetes.io/projected/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-kube-api-access-ztglc\") on node \"master-0\" DevicePath \"\"" Mar 08 00:40:35.207126 master-0 kubenswrapper[23041]: I0308 00:40:35.206272 23041 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 00:40:35.208031 master-0 kubenswrapper[23041]: I0308 00:40:35.207828 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" (UID: "67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:40:35.230060 master-0 kubenswrapper[23041]: I0308 00:40:35.229941 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" (UID: "67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:40:35.308236 master-0 kubenswrapper[23041]: I0308 00:40:35.308145 23041 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Mar 08 00:40:35.308236 master-0 kubenswrapper[23041]: I0308 00:40:35.308238 23041 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 00:40:35.934861 master-0 kubenswrapper[23041]: I0308 00:40:35.934822 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6479f6d896-j6kqz_67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b/console/1.log" Mar 08 00:40:35.935488 master-0 kubenswrapper[23041]: I0308 00:40:35.935462 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6479f6d896-j6kqz" event={"ID":"67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b","Type":"ContainerDied","Data":"e794297247665d49796affaaf41c36b9c7d953b2c1882b909e7bef0eadebff8a"} Mar 08 00:40:35.935588 master-0 kubenswrapper[23041]: I0308 00:40:35.935563 23041 scope.go:117] "RemoveContainer" containerID="192b68dedfebd4dee50599d2c7d025373ab38645d7fe97d9015fca7b54ac5478" Mar 08 00:40:35.935774 master-0 kubenswrapper[23041]: I0308 00:40:35.935761 23041 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-console/console-6479f6d896-j6kqz" Mar 08 00:40:35.987257 master-0 kubenswrapper[23041]: I0308 00:40:35.985589 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6479f6d896-j6kqz"] Mar 08 00:40:35.994526 master-0 kubenswrapper[23041]: I0308 00:40:35.993386 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6479f6d896-j6kqz"] Mar 08 00:40:36.818333 master-0 kubenswrapper[23041]: I0308 00:40:36.818254 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" path="/var/lib/kubelet/pods/67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b/volumes" Mar 08 00:41:19.721051 master-0 kubenswrapper[23041]: I0308 00:41:19.720745 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/revision-pruner-6-master-0"] Mar 08 00:41:19.722356 master-0 kubenswrapper[23041]: E0308 00:41:19.721949 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" Mar 08 00:41:19.722356 master-0 kubenswrapper[23041]: I0308 00:41:19.722012 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" Mar 08 00:41:19.722356 master-0 kubenswrapper[23041]: E0308 00:41:19.722024 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a814bd60de133d95cf99630a978c017e" containerName="startup-monitor" Mar 08 00:41:19.722356 master-0 kubenswrapper[23041]: I0308 00:41:19.722030 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="a814bd60de133d95cf99630a978c017e" containerName="startup-monitor" Mar 08 00:41:19.722356 master-0 kubenswrapper[23041]: E0308 00:41:19.722040 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53c386ff-5ff0-4937-b909-5f800abdb600" containerName="installer" Mar 08 00:41:19.722356 master-0 kubenswrapper[23041]: I0308 
00:41:19.722047 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="53c386ff-5ff0-4937-b909-5f800abdb600" containerName="installer" Mar 08 00:41:19.722356 master-0 kubenswrapper[23041]: I0308 00:41:19.722244 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="53c386ff-5ff0-4937-b909-5f800abdb600" containerName="installer" Mar 08 00:41:19.722356 master-0 kubenswrapper[23041]: I0308 00:41:19.722257 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" Mar 08 00:41:19.722356 master-0 kubenswrapper[23041]: I0308 00:41:19.722267 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="a814bd60de133d95cf99630a978c017e" containerName="startup-monitor" Mar 08 00:41:19.722356 master-0 kubenswrapper[23041]: I0308 00:41:19.722286 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" Mar 08 00:41:19.722810 master-0 kubenswrapper[23041]: I0308 00:41:19.722770 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/revision-pruner-6-master-0" Mar 08 00:41:19.730494 master-0 kubenswrapper[23041]: I0308 00:41:19.730414 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-q6gf6" Mar 08 00:41:19.730675 master-0 kubenswrapper[23041]: I0308 00:41:19.730426 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Mar 08 00:41:19.744908 master-0 kubenswrapper[23041]: I0308 00:41:19.744836 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/revision-pruner-6-master-0"] Mar 08 00:41:19.762773 master-0 kubenswrapper[23041]: I0308 00:41:19.757548 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/19d57203-85e2-4423-94ab-bbff7ff5c24b-kubelet-dir\") pod \"revision-pruner-6-master-0\" (UID: \"19d57203-85e2-4423-94ab-bbff7ff5c24b\") " pod="openshift-kube-scheduler/revision-pruner-6-master-0" Mar 08 00:41:19.762773 master-0 kubenswrapper[23041]: I0308 00:41:19.757864 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/19d57203-85e2-4423-94ab-bbff7ff5c24b-kube-api-access\") pod \"revision-pruner-6-master-0\" (UID: \"19d57203-85e2-4423-94ab-bbff7ff5c24b\") " pod="openshift-kube-scheduler/revision-pruner-6-master-0" Mar 08 00:41:19.861493 master-0 kubenswrapper[23041]: I0308 00:41:19.861396 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/19d57203-85e2-4423-94ab-bbff7ff5c24b-kubelet-dir\") pod \"revision-pruner-6-master-0\" (UID: \"19d57203-85e2-4423-94ab-bbff7ff5c24b\") " pod="openshift-kube-scheduler/revision-pruner-6-master-0" Mar 08 00:41:19.862097 master-0 kubenswrapper[23041]: I0308 
00:41:19.861479 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/19d57203-85e2-4423-94ab-bbff7ff5c24b-kubelet-dir\") pod \"revision-pruner-6-master-0\" (UID: \"19d57203-85e2-4423-94ab-bbff7ff5c24b\") " pod="openshift-kube-scheduler/revision-pruner-6-master-0" Mar 08 00:41:19.862170 master-0 kubenswrapper[23041]: I0308 00:41:19.862082 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/19d57203-85e2-4423-94ab-bbff7ff5c24b-kube-api-access\") pod \"revision-pruner-6-master-0\" (UID: \"19d57203-85e2-4423-94ab-bbff7ff5c24b\") " pod="openshift-kube-scheduler/revision-pruner-6-master-0" Mar 08 00:41:19.899439 master-0 kubenswrapper[23041]: I0308 00:41:19.894941 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/19d57203-85e2-4423-94ab-bbff7ff5c24b-kube-api-access\") pod \"revision-pruner-6-master-0\" (UID: \"19d57203-85e2-4423-94ab-bbff7ff5c24b\") " pod="openshift-kube-scheduler/revision-pruner-6-master-0" Mar 08 00:41:20.073875 master-0 kubenswrapper[23041]: I0308 00:41:20.073771 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/revision-pruner-6-master-0" Mar 08 00:41:20.526922 master-0 kubenswrapper[23041]: I0308 00:41:20.526859 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/revision-pruner-6-master-0"] Mar 08 00:41:21.307374 master-0 kubenswrapper[23041]: I0308 00:41:21.305189 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/revision-pruner-6-master-0" event={"ID":"19d57203-85e2-4423-94ab-bbff7ff5c24b","Type":"ContainerStarted","Data":"0b07dcfe63b59a1500d002a8509e9dc56f2f91e3b557cb4d4fb00de3073c67cb"} Mar 08 00:41:21.307374 master-0 kubenswrapper[23041]: I0308 00:41:21.305319 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/revision-pruner-6-master-0" event={"ID":"19d57203-85e2-4423-94ab-bbff7ff5c24b","Type":"ContainerStarted","Data":"e75c4a3408e12bd7a0529ec05d3ea89d9b916861068bd1895dc70b4de4abc068"} Mar 08 00:41:21.339223 master-0 kubenswrapper[23041]: I0308 00:41:21.339096 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/revision-pruner-6-master-0" podStartSLOduration=2.339061283 podStartE2EDuration="2.339061283s" podCreationTimestamp="2026-03-08 00:41:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:41:21.333081068 +0000 UTC m=+586.805917642" watchObservedRunningTime="2026-03-08 00:41:21.339061283 +0000 UTC m=+586.811897827" Mar 08 00:41:22.312570 master-0 kubenswrapper[23041]: I0308 00:41:22.312520 23041 generic.go:334] "Generic (PLEG): container finished" podID="19d57203-85e2-4423-94ab-bbff7ff5c24b" containerID="0b07dcfe63b59a1500d002a8509e9dc56f2f91e3b557cb4d4fb00de3073c67cb" exitCode=0 Mar 08 00:41:22.313110 master-0 kubenswrapper[23041]: I0308 00:41:22.312605 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/revision-pruner-6-master-0" event={"ID":"19d57203-85e2-4423-94ab-bbff7ff5c24b","Type":"ContainerDied","Data":"0b07dcfe63b59a1500d002a8509e9dc56f2f91e3b557cb4d4fb00de3073c67cb"} Mar 08 00:41:23.638007 master-0 kubenswrapper[23041]: I0308 00:41:23.637324 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/revision-pruner-6-master-0" Mar 08 00:41:23.739010 master-0 kubenswrapper[23041]: I0308 00:41:23.738535 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/19d57203-85e2-4423-94ab-bbff7ff5c24b-kube-api-access\") pod \"19d57203-85e2-4423-94ab-bbff7ff5c24b\" (UID: \"19d57203-85e2-4423-94ab-bbff7ff5c24b\") " Mar 08 00:41:23.739010 master-0 kubenswrapper[23041]: I0308 00:41:23.738729 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/19d57203-85e2-4423-94ab-bbff7ff5c24b-kubelet-dir\") pod \"19d57203-85e2-4423-94ab-bbff7ff5c24b\" (UID: \"19d57203-85e2-4423-94ab-bbff7ff5c24b\") " Mar 08 00:41:23.740604 master-0 kubenswrapper[23041]: I0308 00:41:23.738906 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/19d57203-85e2-4423-94ab-bbff7ff5c24b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "19d57203-85e2-4423-94ab-bbff7ff5c24b" (UID: "19d57203-85e2-4423-94ab-bbff7ff5c24b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:41:23.743726 master-0 kubenswrapper[23041]: I0308 00:41:23.743666 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19d57203-85e2-4423-94ab-bbff7ff5c24b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "19d57203-85e2-4423-94ab-bbff7ff5c24b" (UID: "19d57203-85e2-4423-94ab-bbff7ff5c24b"). 
InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:41:23.842973 master-0 kubenswrapper[23041]: I0308 00:41:23.842776 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/19d57203-85e2-4423-94ab-bbff7ff5c24b-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 00:41:23.842973 master-0 kubenswrapper[23041]: I0308 00:41:23.842836 23041 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/19d57203-85e2-4423-94ab-bbff7ff5c24b-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 00:41:24.331510 master-0 kubenswrapper[23041]: I0308 00:41:24.331432 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/revision-pruner-6-master-0" event={"ID":"19d57203-85e2-4423-94ab-bbff7ff5c24b","Type":"ContainerDied","Data":"e75c4a3408e12bd7a0529ec05d3ea89d9b916861068bd1895dc70b4de4abc068"} Mar 08 00:41:24.331510 master-0 kubenswrapper[23041]: I0308 00:41:24.331507 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e75c4a3408e12bd7a0529ec05d3ea89d9b916861068bd1895dc70b4de4abc068" Mar 08 00:41:24.331799 master-0 kubenswrapper[23041]: I0308 00:41:24.331614 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/revision-pruner-6-master-0" Mar 08 00:42:36.801979 master-0 kubenswrapper[23041]: I0308 00:42:36.801876 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["sushy-emulator/sushy-emulator-78f6d7d749-mx5qs"] Mar 08 00:42:36.802883 master-0 kubenswrapper[23041]: E0308 00:42:36.802477 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" Mar 08 00:42:36.802883 master-0 kubenswrapper[23041]: I0308 00:42:36.802504 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="67e3ebe9-49b1-4c2b-8e98-8ac4bf9ec07b" containerName="console" Mar 08 00:42:36.802883 master-0 kubenswrapper[23041]: E0308 00:42:36.802529 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19d57203-85e2-4423-94ab-bbff7ff5c24b" containerName="pruner" Mar 08 00:42:36.802883 master-0 kubenswrapper[23041]: I0308 00:42:36.802542 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="19d57203-85e2-4423-94ab-bbff7ff5c24b" containerName="pruner" Mar 08 00:42:36.802883 master-0 kubenswrapper[23041]: I0308 00:42:36.802828 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="19d57203-85e2-4423-94ab-bbff7ff5c24b" containerName="pruner" Mar 08 00:42:36.803922 master-0 kubenswrapper[23041]: I0308 00:42:36.803883 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/sushy-emulator-78f6d7d749-mx5qs" Mar 08 00:42:36.806789 master-0 kubenswrapper[23041]: I0308 00:42:36.806728 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"sushy-emulator"/"kube-root-ca.crt" Mar 08 00:42:36.807093 master-0 kubenswrapper[23041]: I0308 00:42:36.807065 23041 reflector.go:368] Caches populated for *v1.Secret from object-"sushy-emulator"/"os-client-config" Mar 08 00:42:36.814760 master-0 kubenswrapper[23041]: I0308 00:42:36.813521 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"sushy-emulator"/"openshift-service-ca.crt" Mar 08 00:42:36.818834 master-0 kubenswrapper[23041]: I0308 00:42:36.818775 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-78f6d7d749-mx5qs"] Mar 08 00:42:36.819082 master-0 kubenswrapper[23041]: I0308 00:42:36.818976 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"sushy-emulator"/"sushy-emulator-config" Mar 08 00:42:36.939842 master-0 kubenswrapper[23041]: I0308 00:42:36.939721 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/4f21bb0f-7fc4-43de-9212-1685450891b3-os-client-config\") pod \"sushy-emulator-78f6d7d749-mx5qs\" (UID: \"4f21bb0f-7fc4-43de-9212-1685450891b3\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-mx5qs" Mar 08 00:42:36.940071 master-0 kubenswrapper[23041]: I0308 00:42:36.939957 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/4f21bb0f-7fc4-43de-9212-1685450891b3-sushy-emulator-config\") pod \"sushy-emulator-78f6d7d749-mx5qs\" (UID: \"4f21bb0f-7fc4-43de-9212-1685450891b3\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-mx5qs" Mar 08 00:42:36.940116 master-0 kubenswrapper[23041]: I0308 00:42:36.940080 23041 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k86ff\" (UniqueName: \"kubernetes.io/projected/4f21bb0f-7fc4-43de-9212-1685450891b3-kube-api-access-k86ff\") pod \"sushy-emulator-78f6d7d749-mx5qs\" (UID: \"4f21bb0f-7fc4-43de-9212-1685450891b3\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-mx5qs" Mar 08 00:42:37.041423 master-0 kubenswrapper[23041]: I0308 00:42:37.041358 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/4f21bb0f-7fc4-43de-9212-1685450891b3-sushy-emulator-config\") pod \"sushy-emulator-78f6d7d749-mx5qs\" (UID: \"4f21bb0f-7fc4-43de-9212-1685450891b3\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-mx5qs" Mar 08 00:42:37.041632 master-0 kubenswrapper[23041]: I0308 00:42:37.041441 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k86ff\" (UniqueName: \"kubernetes.io/projected/4f21bb0f-7fc4-43de-9212-1685450891b3-kube-api-access-k86ff\") pod \"sushy-emulator-78f6d7d749-mx5qs\" (UID: \"4f21bb0f-7fc4-43de-9212-1685450891b3\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-mx5qs" Mar 08 00:42:37.041632 master-0 kubenswrapper[23041]: I0308 00:42:37.041494 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/4f21bb0f-7fc4-43de-9212-1685450891b3-os-client-config\") pod \"sushy-emulator-78f6d7d749-mx5qs\" (UID: \"4f21bb0f-7fc4-43de-9212-1685450891b3\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-mx5qs" Mar 08 00:42:37.042479 master-0 kubenswrapper[23041]: I0308 00:42:37.042443 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/4f21bb0f-7fc4-43de-9212-1685450891b3-sushy-emulator-config\") pod \"sushy-emulator-78f6d7d749-mx5qs\" (UID: \"4f21bb0f-7fc4-43de-9212-1685450891b3\") " 
pod="sushy-emulator/sushy-emulator-78f6d7d749-mx5qs" Mar 08 00:42:37.049582 master-0 kubenswrapper[23041]: I0308 00:42:37.049518 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/4f21bb0f-7fc4-43de-9212-1685450891b3-os-client-config\") pod \"sushy-emulator-78f6d7d749-mx5qs\" (UID: \"4f21bb0f-7fc4-43de-9212-1685450891b3\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-mx5qs" Mar 08 00:42:37.057837 master-0 kubenswrapper[23041]: I0308 00:42:37.057770 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k86ff\" (UniqueName: \"kubernetes.io/projected/4f21bb0f-7fc4-43de-9212-1685450891b3-kube-api-access-k86ff\") pod \"sushy-emulator-78f6d7d749-mx5qs\" (UID: \"4f21bb0f-7fc4-43de-9212-1685450891b3\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-mx5qs" Mar 08 00:42:37.138818 master-0 kubenswrapper[23041]: I0308 00:42:37.138740 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/sushy-emulator-78f6d7d749-mx5qs" Mar 08 00:42:37.560634 master-0 kubenswrapper[23041]: I0308 00:42:37.560580 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-78f6d7d749-mx5qs"] Mar 08 00:42:37.564434 master-0 kubenswrapper[23041]: I0308 00:42:37.564374 23041 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 00:42:38.011708 master-0 kubenswrapper[23041]: I0308 00:42:38.011627 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-78f6d7d749-mx5qs" event={"ID":"4f21bb0f-7fc4-43de-9212-1685450891b3","Type":"ContainerStarted","Data":"319a1f16c2c3595d4f8e4beda3c218be05b8b0297c5dce7fd1c1cb9fcbe4306e"} Mar 08 00:42:46.084001 master-0 kubenswrapper[23041]: I0308 00:42:46.083844 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-78f6d7d749-mx5qs" event={"ID":"4f21bb0f-7fc4-43de-9212-1685450891b3","Type":"ContainerStarted","Data":"5c9bfa4d8345f65b9b28045656b02c8296a7bfb60ba78fb5d9472cdf870cb78b"} Mar 08 00:42:46.153559 master-0 kubenswrapper[23041]: I0308 00:42:46.153399 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="sushy-emulator/sushy-emulator-78f6d7d749-mx5qs" podStartSLOduration=2.180969234 podStartE2EDuration="10.153362744s" podCreationTimestamp="2026-03-08 00:42:36 +0000 UTC" firstStartedPulling="2026-03-08 00:42:37.564340646 +0000 UTC m=+663.037177210" lastFinishedPulling="2026-03-08 00:42:45.536734156 +0000 UTC m=+671.009570720" observedRunningTime="2026-03-08 00:42:46.147730526 +0000 UTC m=+671.620567120" watchObservedRunningTime="2026-03-08 00:42:46.153362744 +0000 UTC m=+671.626199308" Mar 08 00:42:47.142390 master-0 kubenswrapper[23041]: I0308 00:42:47.140586 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="sushy-emulator/sushy-emulator-78f6d7d749-mx5qs" Mar 08 00:42:47.142390 master-0 
kubenswrapper[23041]: I0308 00:42:47.140683 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="sushy-emulator/sushy-emulator-78f6d7d749-mx5qs" Mar 08 00:42:47.152471 master-0 kubenswrapper[23041]: I0308 00:42:47.152379 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="sushy-emulator/sushy-emulator-78f6d7d749-mx5qs" Mar 08 00:42:48.110277 master-0 kubenswrapper[23041]: I0308 00:42:48.110162 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="sushy-emulator/sushy-emulator-78f6d7d749-mx5qs" Mar 08 00:43:06.464168 master-0 kubenswrapper[23041]: I0308 00:43:06.464065 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["sushy-emulator/nova-console-poller-5959594f9c-mqqwc"] Mar 08 00:43:06.468116 master-0 kubenswrapper[23041]: I0308 00:43:06.466008 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/nova-console-poller-5959594f9c-mqqwc" Mar 08 00:43:06.480331 master-0 kubenswrapper[23041]: I0308 00:43:06.476043 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/nova-console-poller-5959594f9c-mqqwc"] Mar 08 00:43:06.489088 master-0 kubenswrapper[23041]: I0308 00:43:06.489023 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/53bc837d-0a84-4044-bec0-d6ecd3223e91-os-client-config\") pod \"nova-console-poller-5959594f9c-mqqwc\" (UID: \"53bc837d-0a84-4044-bec0-d6ecd3223e91\") " pod="sushy-emulator/nova-console-poller-5959594f9c-mqqwc" Mar 08 00:43:06.489196 master-0 kubenswrapper[23041]: I0308 00:43:06.489133 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnsbk\" (UniqueName: \"kubernetes.io/projected/53bc837d-0a84-4044-bec0-d6ecd3223e91-kube-api-access-bnsbk\") pod \"nova-console-poller-5959594f9c-mqqwc\" (UID: 
\"53bc837d-0a84-4044-bec0-d6ecd3223e91\") " pod="sushy-emulator/nova-console-poller-5959594f9c-mqqwc" Mar 08 00:43:06.590888 master-0 kubenswrapper[23041]: I0308 00:43:06.590801 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/53bc837d-0a84-4044-bec0-d6ecd3223e91-os-client-config\") pod \"nova-console-poller-5959594f9c-mqqwc\" (UID: \"53bc837d-0a84-4044-bec0-d6ecd3223e91\") " pod="sushy-emulator/nova-console-poller-5959594f9c-mqqwc" Mar 08 00:43:06.590888 master-0 kubenswrapper[23041]: I0308 00:43:06.590875 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnsbk\" (UniqueName: \"kubernetes.io/projected/53bc837d-0a84-4044-bec0-d6ecd3223e91-kube-api-access-bnsbk\") pod \"nova-console-poller-5959594f9c-mqqwc\" (UID: \"53bc837d-0a84-4044-bec0-d6ecd3223e91\") " pod="sushy-emulator/nova-console-poller-5959594f9c-mqqwc" Mar 08 00:43:06.594736 master-0 kubenswrapper[23041]: I0308 00:43:06.594702 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/53bc837d-0a84-4044-bec0-d6ecd3223e91-os-client-config\") pod \"nova-console-poller-5959594f9c-mqqwc\" (UID: \"53bc837d-0a84-4044-bec0-d6ecd3223e91\") " pod="sushy-emulator/nova-console-poller-5959594f9c-mqqwc" Mar 08 00:43:06.607120 master-0 kubenswrapper[23041]: I0308 00:43:06.607075 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnsbk\" (UniqueName: \"kubernetes.io/projected/53bc837d-0a84-4044-bec0-d6ecd3223e91-kube-api-access-bnsbk\") pod \"nova-console-poller-5959594f9c-mqqwc\" (UID: \"53bc837d-0a84-4044-bec0-d6ecd3223e91\") " pod="sushy-emulator/nova-console-poller-5959594f9c-mqqwc" Mar 08 00:43:06.795469 master-0 kubenswrapper[23041]: I0308 00:43:06.795406 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/nova-console-poller-5959594f9c-mqqwc" Mar 08 00:43:07.199458 master-0 kubenswrapper[23041]: I0308 00:43:07.199368 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/nova-console-poller-5959594f9c-mqqwc"] Mar 08 00:43:07.205918 master-0 kubenswrapper[23041]: W0308 00:43:07.205856 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53bc837d_0a84_4044_bec0_d6ecd3223e91.slice/crio-207d9c83757eaf8ed950628b136c756435d21848b002b44a668df20780cc7235 WatchSource:0}: Error finding container 207d9c83757eaf8ed950628b136c756435d21848b002b44a668df20780cc7235: Status 404 returned error can't find the container with id 207d9c83757eaf8ed950628b136c756435d21848b002b44a668df20780cc7235 Mar 08 00:43:07.240923 master-0 kubenswrapper[23041]: I0308 00:43:07.240850 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-poller-5959594f9c-mqqwc" event={"ID":"53bc837d-0a84-4044-bec0-d6ecd3223e91","Type":"ContainerStarted","Data":"207d9c83757eaf8ed950628b136c756435d21848b002b44a668df20780cc7235"} Mar 08 00:43:13.295565 master-0 kubenswrapper[23041]: I0308 00:43:13.295457 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-poller-5959594f9c-mqqwc" event={"ID":"53bc837d-0a84-4044-bec0-d6ecd3223e91","Type":"ContainerStarted","Data":"e4de8c07ecc42e694da2c2725a26c4e006b93069ecd08858225e6010f12ce434"} Mar 08 00:43:14.310149 master-0 kubenswrapper[23041]: I0308 00:43:14.310068 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-poller-5959594f9c-mqqwc" event={"ID":"53bc837d-0a84-4044-bec0-d6ecd3223e91","Type":"ContainerStarted","Data":"3a60aad66a3bcecce68f8237fa86f959c1f3b1b8bd189ef03fd0eb95a3f05852"} Mar 08 00:43:14.339637 master-0 kubenswrapper[23041]: I0308 00:43:14.339511 23041 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="sushy-emulator/nova-console-poller-5959594f9c-mqqwc" podStartSLOduration=2.231010648 podStartE2EDuration="8.339482853s" podCreationTimestamp="2026-03-08 00:43:06 +0000 UTC" firstStartedPulling="2026-03-08 00:43:07.208112653 +0000 UTC m=+692.680949207" lastFinishedPulling="2026-03-08 00:43:13.316584858 +0000 UTC m=+698.789421412" observedRunningTime="2026-03-08 00:43:14.332245785 +0000 UTC m=+699.805082369" watchObservedRunningTime="2026-03-08 00:43:14.339482853 +0000 UTC m=+699.812319417" Mar 08 00:43:38.135427 master-0 kubenswrapper[23041]: I0308 00:43:38.135240 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["sushy-emulator/nova-console-recorder-7bdc7f66d5-t9l4t"] Mar 08 00:43:38.137296 master-0 kubenswrapper[23041]: I0308 00:43:38.137070 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/nova-console-recorder-7bdc7f66d5-t9l4t" Mar 08 00:43:38.178621 master-0 kubenswrapper[23041]: I0308 00:43:38.178566 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/nova-console-recorder-7bdc7f66d5-t9l4t"] Mar 08 00:43:38.234928 master-0 kubenswrapper[23041]: I0308 00:43:38.234857 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-console-recordings-pv\" (UniqueName: \"kubernetes.io/nfs/3877af8f-1419-4c3b-9bb5-dcaeb859e043-nova-console-recordings-pv\") pod \"nova-console-recorder-7bdc7f66d5-t9l4t\" (UID: \"3877af8f-1419-4c3b-9bb5-dcaeb859e043\") " pod="sushy-emulator/nova-console-recorder-7bdc7f66d5-t9l4t" Mar 08 00:43:38.235470 master-0 kubenswrapper[23041]: I0308 00:43:38.235444 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpbqd\" (UniqueName: \"kubernetes.io/projected/3877af8f-1419-4c3b-9bb5-dcaeb859e043-kube-api-access-hpbqd\") pod \"nova-console-recorder-7bdc7f66d5-t9l4t\" (UID: \"3877af8f-1419-4c3b-9bb5-dcaeb859e043\") " 
pod="sushy-emulator/nova-console-recorder-7bdc7f66d5-t9l4t" Mar 08 00:43:38.235637 master-0 kubenswrapper[23041]: I0308 00:43:38.235618 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/3877af8f-1419-4c3b-9bb5-dcaeb859e043-os-client-config\") pod \"nova-console-recorder-7bdc7f66d5-t9l4t\" (UID: \"3877af8f-1419-4c3b-9bb5-dcaeb859e043\") " pod="sushy-emulator/nova-console-recorder-7bdc7f66d5-t9l4t" Mar 08 00:43:38.338122 master-0 kubenswrapper[23041]: I0308 00:43:38.338029 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-console-recordings-pv\" (UniqueName: \"kubernetes.io/nfs/3877af8f-1419-4c3b-9bb5-dcaeb859e043-nova-console-recordings-pv\") pod \"nova-console-recorder-7bdc7f66d5-t9l4t\" (UID: \"3877af8f-1419-4c3b-9bb5-dcaeb859e043\") " pod="sushy-emulator/nova-console-recorder-7bdc7f66d5-t9l4t" Mar 08 00:43:38.338561 master-0 kubenswrapper[23041]: I0308 00:43:38.338261 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpbqd\" (UniqueName: \"kubernetes.io/projected/3877af8f-1419-4c3b-9bb5-dcaeb859e043-kube-api-access-hpbqd\") pod \"nova-console-recorder-7bdc7f66d5-t9l4t\" (UID: \"3877af8f-1419-4c3b-9bb5-dcaeb859e043\") " pod="sushy-emulator/nova-console-recorder-7bdc7f66d5-t9l4t" Mar 08 00:43:38.338561 master-0 kubenswrapper[23041]: I0308 00:43:38.338339 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/3877af8f-1419-4c3b-9bb5-dcaeb859e043-os-client-config\") pod \"nova-console-recorder-7bdc7f66d5-t9l4t\" (UID: \"3877af8f-1419-4c3b-9bb5-dcaeb859e043\") " pod="sushy-emulator/nova-console-recorder-7bdc7f66d5-t9l4t" Mar 08 00:43:38.343959 master-0 kubenswrapper[23041]: I0308 00:43:38.343890 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-client-config\" 
(UniqueName: \"kubernetes.io/secret/3877af8f-1419-4c3b-9bb5-dcaeb859e043-os-client-config\") pod \"nova-console-recorder-7bdc7f66d5-t9l4t\" (UID: \"3877af8f-1419-4c3b-9bb5-dcaeb859e043\") " pod="sushy-emulator/nova-console-recorder-7bdc7f66d5-t9l4t" Mar 08 00:43:38.366375 master-0 kubenswrapper[23041]: I0308 00:43:38.366256 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpbqd\" (UniqueName: \"kubernetes.io/projected/3877af8f-1419-4c3b-9bb5-dcaeb859e043-kube-api-access-hpbqd\") pod \"nova-console-recorder-7bdc7f66d5-t9l4t\" (UID: \"3877af8f-1419-4c3b-9bb5-dcaeb859e043\") " pod="sushy-emulator/nova-console-recorder-7bdc7f66d5-t9l4t" Mar 08 00:43:39.043536 master-0 kubenswrapper[23041]: I0308 00:43:39.043444 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-console-recordings-pv\" (UniqueName: \"kubernetes.io/nfs/3877af8f-1419-4c3b-9bb5-dcaeb859e043-nova-console-recordings-pv\") pod \"nova-console-recorder-7bdc7f66d5-t9l4t\" (UID: \"3877af8f-1419-4c3b-9bb5-dcaeb859e043\") " pod="sushy-emulator/nova-console-recorder-7bdc7f66d5-t9l4t" Mar 08 00:43:39.076592 master-0 kubenswrapper[23041]: I0308 00:43:39.076497 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/nova-console-recorder-7bdc7f66d5-t9l4t" Mar 08 00:43:39.557709 master-0 kubenswrapper[23041]: I0308 00:43:39.557571 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/nova-console-recorder-7bdc7f66d5-t9l4t"] Mar 08 00:43:39.559150 master-0 kubenswrapper[23041]: W0308 00:43:39.559067 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3877af8f_1419_4c3b_9bb5_dcaeb859e043.slice/crio-af177c27c097da2e53399191363ab104ddad2be01baf183844c30653492094d5 WatchSource:0}: Error finding container af177c27c097da2e53399191363ab104ddad2be01baf183844c30653492094d5: Status 404 returned error can't find the container with id af177c27c097da2e53399191363ab104ddad2be01baf183844c30653492094d5 Mar 08 00:43:40.185388 master-0 kubenswrapper[23041]: I0308 00:43:40.185286 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-recorder-7bdc7f66d5-t9l4t" event={"ID":"3877af8f-1419-4c3b-9bb5-dcaeb859e043","Type":"ContainerStarted","Data":"af177c27c097da2e53399191363ab104ddad2be01baf183844c30653492094d5"} Mar 08 00:43:49.272611 master-0 kubenswrapper[23041]: I0308 00:43:49.272496 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-recorder-7bdc7f66d5-t9l4t" event={"ID":"3877af8f-1419-4c3b-9bb5-dcaeb859e043","Type":"ContainerStarted","Data":"8a2e30435bb9af462250d078ecabac710466f71fe329a3775b7f55497c7891ec"} Mar 08 00:43:50.281039 master-0 kubenswrapper[23041]: I0308 00:43:50.280979 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-recorder-7bdc7f66d5-t9l4t" event={"ID":"3877af8f-1419-4c3b-9bb5-dcaeb859e043","Type":"ContainerStarted","Data":"a96739e226ac260a953e4c9d01e9e3712e5ceb6ed57135b764590c700092fc68"} Mar 08 00:45:35.701583 master-0 kubenswrapper[23041]: I0308 00:45:35.701492 23041 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="sushy-emulator/nova-console-recorder-7bdc7f66d5-t9l4t" podStartSLOduration=107.80477452 podStartE2EDuration="1m57.701466406s" podCreationTimestamp="2026-03-08 00:43:38 +0000 UTC" firstStartedPulling="2026-03-08 00:43:39.562414397 +0000 UTC m=+725.035250961" lastFinishedPulling="2026-03-08 00:43:49.459106283 +0000 UTC m=+734.931942847" observedRunningTime="2026-03-08 00:43:50.306484771 +0000 UTC m=+735.779321335" watchObservedRunningTime="2026-03-08 00:45:35.701466406 +0000 UTC m=+841.174302960" Mar 08 00:45:35.702579 master-0 kubenswrapper[23041]: I0308 00:45:35.702549 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4zjths"] Mar 08 00:45:35.704133 master-0 kubenswrapper[23041]: I0308 00:45:35.704101 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4zjths" Mar 08 00:45:35.715369 master-0 kubenswrapper[23041]: I0308 00:45:35.715317 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4zjths"] Mar 08 00:45:35.793400 master-0 kubenswrapper[23041]: I0308 00:45:35.793312 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4zjths\" (UID: \"3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4zjths" Mar 08 00:45:35.793400 master-0 kubenswrapper[23041]: I0308 00:45:35.793395 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf-bundle\") pod 
\"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4zjths\" (UID: \"3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4zjths" Mar 08 00:45:35.793699 master-0 kubenswrapper[23041]: I0308 00:45:35.793437 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kck5x\" (UniqueName: \"kubernetes.io/projected/3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf-kube-api-access-kck5x\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4zjths\" (UID: \"3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4zjths" Mar 08 00:45:35.894739 master-0 kubenswrapper[23041]: I0308 00:45:35.894658 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4zjths\" (UID: \"3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4zjths" Mar 08 00:45:35.894739 master-0 kubenswrapper[23041]: I0308 00:45:35.894748 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4zjths\" (UID: \"3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4zjths" Mar 08 00:45:35.895038 master-0 kubenswrapper[23041]: I0308 00:45:35.894797 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kck5x\" (UniqueName: \"kubernetes.io/projected/3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf-kube-api-access-kck5x\") pod 
\"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4zjths\" (UID: \"3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4zjths" Mar 08 00:45:35.895658 master-0 kubenswrapper[23041]: I0308 00:45:35.895350 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4zjths\" (UID: \"3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4zjths" Mar 08 00:45:35.895658 master-0 kubenswrapper[23041]: I0308 00:45:35.895599 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4zjths\" (UID: \"3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4zjths" Mar 08 00:45:35.912017 master-0 kubenswrapper[23041]: I0308 00:45:35.911967 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kck5x\" (UniqueName: \"kubernetes.io/projected/3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf-kube-api-access-kck5x\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4zjths\" (UID: \"3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4zjths" Mar 08 00:45:36.026461 master-0 kubenswrapper[23041]: I0308 00:45:36.026404 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4zjths" Mar 08 00:45:36.487493 master-0 kubenswrapper[23041]: I0308 00:45:36.487446 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4zjths"] Mar 08 00:45:36.525173 master-0 kubenswrapper[23041]: I0308 00:45:36.525105 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4zjths" event={"ID":"3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf","Type":"ContainerStarted","Data":"a8c6b32404d3b9fe6623d516f60f04ef014f3f2f0480a84865c3360605442892"} Mar 08 00:45:37.538896 master-0 kubenswrapper[23041]: I0308 00:45:37.538838 23041 generic.go:334] "Generic (PLEG): container finished" podID="3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf" containerID="28f9f6f72d6a2f14e6dc364f6baa2bc74655a194f512417aff05d18ae48c2d20" exitCode=0 Mar 08 00:45:37.539533 master-0 kubenswrapper[23041]: I0308 00:45:37.539485 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4zjths" event={"ID":"3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf","Type":"ContainerDied","Data":"28f9f6f72d6a2f14e6dc364f6baa2bc74655a194f512417aff05d18ae48c2d20"} Mar 08 00:45:39.553230 master-0 kubenswrapper[23041]: I0308 00:45:39.553134 23041 generic.go:334] "Generic (PLEG): container finished" podID="3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf" containerID="cde781af0c4aaf6c6940a54cebc203f1ecb0f9a3e073edb355af75a7ee0fa2cf" exitCode=0 Mar 08 00:45:39.553886 master-0 kubenswrapper[23041]: I0308 00:45:39.553188 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4zjths" 
event={"ID":"3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf","Type":"ContainerDied","Data":"cde781af0c4aaf6c6940a54cebc203f1ecb0f9a3e073edb355af75a7ee0fa2cf"} Mar 08 00:45:40.569886 master-0 kubenswrapper[23041]: I0308 00:45:40.569797 23041 generic.go:334] "Generic (PLEG): container finished" podID="3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf" containerID="e74e72e13da8dd8db1cee23d809b395846b8cb61365c3b4a02069c1c26a5158e" exitCode=0 Mar 08 00:45:40.569886 master-0 kubenswrapper[23041]: I0308 00:45:40.569874 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4zjths" event={"ID":"3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf","Type":"ContainerDied","Data":"e74e72e13da8dd8db1cee23d809b395846b8cb61365c3b4a02069c1c26a5158e"} Mar 08 00:45:41.890297 master-0 kubenswrapper[23041]: I0308 00:45:41.890227 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4zjths" Mar 08 00:45:41.991125 master-0 kubenswrapper[23041]: I0308 00:45:41.991021 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf-bundle\") pod \"3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf\" (UID: \"3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf\") " Mar 08 00:45:41.991386 master-0 kubenswrapper[23041]: I0308 00:45:41.991255 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf-util\") pod \"3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf\" (UID: \"3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf\") " Mar 08 00:45:41.991386 master-0 kubenswrapper[23041]: I0308 00:45:41.991311 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kck5x\" (UniqueName: 
\"kubernetes.io/projected/3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf-kube-api-access-kck5x\") pod \"3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf\" (UID: \"3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf\") " Mar 08 00:45:41.992152 master-0 kubenswrapper[23041]: I0308 00:45:41.992097 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf-bundle" (OuterVolumeSpecName: "bundle") pod "3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf" (UID: "3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:45:42.089624 master-0 kubenswrapper[23041]: I0308 00:45:42.089517 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf-kube-api-access-kck5x" (OuterVolumeSpecName: "kube-api-access-kck5x") pod "3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf" (UID: "3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf"). InnerVolumeSpecName "kube-api-access-kck5x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:45:42.093075 master-0 kubenswrapper[23041]: I0308 00:45:42.093006 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kck5x\" (UniqueName: \"kubernetes.io/projected/3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf-kube-api-access-kck5x\") on node \"master-0\" DevicePath \"\"" Mar 08 00:45:42.093075 master-0 kubenswrapper[23041]: I0308 00:45:42.093058 23041 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 00:45:42.299531 master-0 kubenswrapper[23041]: I0308 00:45:42.299457 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf-util" (OuterVolumeSpecName: "util") pod "3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf" (UID: "3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:45:42.396428 master-0 kubenswrapper[23041]: I0308 00:45:42.396321 23041 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf-util\") on node \"master-0\" DevicePath \"\"" Mar 08 00:45:42.584653 master-0 kubenswrapper[23041]: I0308 00:45:42.584567 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4zjths" event={"ID":"3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf","Type":"ContainerDied","Data":"a8c6b32404d3b9fe6623d516f60f04ef014f3f2f0480a84865c3360605442892"} Mar 08 00:45:42.584653 master-0 kubenswrapper[23041]: I0308 00:45:42.584633 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8c6b32404d3b9fe6623d516f60f04ef014f3f2f0480a84865c3360605442892" Mar 08 00:45:42.584653 master-0 kubenswrapper[23041]: I0308 
00:45:42.584650 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4zjths" Mar 08 00:45:48.958044 master-0 kubenswrapper[23041]: I0308 00:45:48.957980 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-storage/lvms-operator-fcd55dd45-6z56x"] Mar 08 00:45:48.969268 master-0 kubenswrapper[23041]: E0308 00:45:48.965809 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf" containerName="pull" Mar 08 00:45:48.969268 master-0 kubenswrapper[23041]: I0308 00:45:48.965869 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf" containerName="pull" Mar 08 00:45:48.969268 master-0 kubenswrapper[23041]: E0308 00:45:48.965908 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf" containerName="extract" Mar 08 00:45:48.969268 master-0 kubenswrapper[23041]: I0308 00:45:48.965915 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf" containerName="extract" Mar 08 00:45:48.969268 master-0 kubenswrapper[23041]: E0308 00:45:48.965935 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf" containerName="util" Mar 08 00:45:48.969268 master-0 kubenswrapper[23041]: I0308 00:45:48.965941 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf" containerName="util" Mar 08 00:45:48.969268 master-0 kubenswrapper[23041]: I0308 00:45:48.966101 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bb4f179-44d5-4fd1-9dbb-e1d576b0cabf" containerName="extract" Mar 08 00:45:48.969268 master-0 kubenswrapper[23041]: I0308 00:45:48.966715 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-storage/lvms-operator-fcd55dd45-6z56x" Mar 08 00:45:48.969821 master-0 kubenswrapper[23041]: I0308 00:45:48.969668 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-metrics-cert" Mar 08 00:45:48.973239 master-0 kubenswrapper[23041]: I0308 00:45:48.970029 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-storage"/"kube-root-ca.crt" Mar 08 00:45:48.973239 master-0 kubenswrapper[23041]: I0308 00:45:48.970187 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-storage"/"openshift-service-ca.crt" Mar 08 00:45:48.973239 master-0 kubenswrapper[23041]: I0308 00:45:48.970335 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-webhook-server-cert" Mar 08 00:45:48.973239 master-0 kubenswrapper[23041]: I0308 00:45:48.971373 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-service-cert" Mar 08 00:45:48.985634 master-0 kubenswrapper[23041]: I0308 00:45:48.985572 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/lvms-operator-fcd55dd45-6z56x"] Mar 08 00:45:49.111221 master-0 kubenswrapper[23041]: I0308 00:45:49.111143 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/f340eeda-004c-4e90-b22c-ee94accbda26-socket-dir\") pod \"lvms-operator-fcd55dd45-6z56x\" (UID: \"f340eeda-004c-4e90-b22c-ee94accbda26\") " pod="openshift-storage/lvms-operator-fcd55dd45-6z56x" Mar 08 00:45:49.111475 master-0 kubenswrapper[23041]: I0308 00:45:49.111273 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/f340eeda-004c-4e90-b22c-ee94accbda26-metrics-cert\") pod \"lvms-operator-fcd55dd45-6z56x\" (UID: 
\"f340eeda-004c-4e90-b22c-ee94accbda26\") " pod="openshift-storage/lvms-operator-fcd55dd45-6z56x" Mar 08 00:45:49.111475 master-0 kubenswrapper[23041]: I0308 00:45:49.111291 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f340eeda-004c-4e90-b22c-ee94accbda26-webhook-cert\") pod \"lvms-operator-fcd55dd45-6z56x\" (UID: \"f340eeda-004c-4e90-b22c-ee94accbda26\") " pod="openshift-storage/lvms-operator-fcd55dd45-6z56x" Mar 08 00:45:49.111570 master-0 kubenswrapper[23041]: I0308 00:45:49.111490 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f340eeda-004c-4e90-b22c-ee94accbda26-apiservice-cert\") pod \"lvms-operator-fcd55dd45-6z56x\" (UID: \"f340eeda-004c-4e90-b22c-ee94accbda26\") " pod="openshift-storage/lvms-operator-fcd55dd45-6z56x" Mar 08 00:45:49.111881 master-0 kubenswrapper[23041]: I0308 00:45:49.111835 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqs4w\" (UniqueName: \"kubernetes.io/projected/f340eeda-004c-4e90-b22c-ee94accbda26-kube-api-access-gqs4w\") pod \"lvms-operator-fcd55dd45-6z56x\" (UID: \"f340eeda-004c-4e90-b22c-ee94accbda26\") " pod="openshift-storage/lvms-operator-fcd55dd45-6z56x" Mar 08 00:45:49.213730 master-0 kubenswrapper[23041]: I0308 00:45:49.213532 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqs4w\" (UniqueName: \"kubernetes.io/projected/f340eeda-004c-4e90-b22c-ee94accbda26-kube-api-access-gqs4w\") pod \"lvms-operator-fcd55dd45-6z56x\" (UID: \"f340eeda-004c-4e90-b22c-ee94accbda26\") " pod="openshift-storage/lvms-operator-fcd55dd45-6z56x" Mar 08 00:45:49.213730 master-0 kubenswrapper[23041]: I0308 00:45:49.213652 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" 
(UniqueName: \"kubernetes.io/empty-dir/f340eeda-004c-4e90-b22c-ee94accbda26-socket-dir\") pod \"lvms-operator-fcd55dd45-6z56x\" (UID: \"f340eeda-004c-4e90-b22c-ee94accbda26\") " pod="openshift-storage/lvms-operator-fcd55dd45-6z56x" Mar 08 00:45:49.213730 master-0 kubenswrapper[23041]: I0308 00:45:49.213720 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/f340eeda-004c-4e90-b22c-ee94accbda26-metrics-cert\") pod \"lvms-operator-fcd55dd45-6z56x\" (UID: \"f340eeda-004c-4e90-b22c-ee94accbda26\") " pod="openshift-storage/lvms-operator-fcd55dd45-6z56x" Mar 08 00:45:49.214280 master-0 kubenswrapper[23041]: I0308 00:45:49.213750 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f340eeda-004c-4e90-b22c-ee94accbda26-webhook-cert\") pod \"lvms-operator-fcd55dd45-6z56x\" (UID: \"f340eeda-004c-4e90-b22c-ee94accbda26\") " pod="openshift-storage/lvms-operator-fcd55dd45-6z56x" Mar 08 00:45:49.214280 master-0 kubenswrapper[23041]: I0308 00:45:49.213797 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f340eeda-004c-4e90-b22c-ee94accbda26-apiservice-cert\") pod \"lvms-operator-fcd55dd45-6z56x\" (UID: \"f340eeda-004c-4e90-b22c-ee94accbda26\") " pod="openshift-storage/lvms-operator-fcd55dd45-6z56x" Mar 08 00:45:49.214997 master-0 kubenswrapper[23041]: I0308 00:45:49.214948 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/f340eeda-004c-4e90-b22c-ee94accbda26-socket-dir\") pod \"lvms-operator-fcd55dd45-6z56x\" (UID: \"f340eeda-004c-4e90-b22c-ee94accbda26\") " pod="openshift-storage/lvms-operator-fcd55dd45-6z56x" Mar 08 00:45:49.218537 master-0 kubenswrapper[23041]: I0308 00:45:49.218481 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/f340eeda-004c-4e90-b22c-ee94accbda26-metrics-cert\") pod \"lvms-operator-fcd55dd45-6z56x\" (UID: \"f340eeda-004c-4e90-b22c-ee94accbda26\") " pod="openshift-storage/lvms-operator-fcd55dd45-6z56x" Mar 08 00:45:49.219174 master-0 kubenswrapper[23041]: I0308 00:45:49.219085 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f340eeda-004c-4e90-b22c-ee94accbda26-webhook-cert\") pod \"lvms-operator-fcd55dd45-6z56x\" (UID: \"f340eeda-004c-4e90-b22c-ee94accbda26\") " pod="openshift-storage/lvms-operator-fcd55dd45-6z56x" Mar 08 00:45:49.221145 master-0 kubenswrapper[23041]: I0308 00:45:49.221093 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f340eeda-004c-4e90-b22c-ee94accbda26-apiservice-cert\") pod \"lvms-operator-fcd55dd45-6z56x\" (UID: \"f340eeda-004c-4e90-b22c-ee94accbda26\") " pod="openshift-storage/lvms-operator-fcd55dd45-6z56x" Mar 08 00:45:49.232092 master-0 kubenswrapper[23041]: I0308 00:45:49.232034 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqs4w\" (UniqueName: \"kubernetes.io/projected/f340eeda-004c-4e90-b22c-ee94accbda26-kube-api-access-gqs4w\") pod \"lvms-operator-fcd55dd45-6z56x\" (UID: \"f340eeda-004c-4e90-b22c-ee94accbda26\") " pod="openshift-storage/lvms-operator-fcd55dd45-6z56x" Mar 08 00:45:49.291232 master-0 kubenswrapper[23041]: I0308 00:45:49.289535 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-storage/lvms-operator-fcd55dd45-6z56x" Mar 08 00:45:49.833879 master-0 kubenswrapper[23041]: I0308 00:45:49.833834 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/lvms-operator-fcd55dd45-6z56x"] Mar 08 00:45:49.837137 master-0 kubenswrapper[23041]: W0308 00:45:49.837047 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf340eeda_004c_4e90_b22c_ee94accbda26.slice/crio-df1f593272dca13b9d8e00487db077753cc347f422d91106bf67669090f18147 WatchSource:0}: Error finding container df1f593272dca13b9d8e00487db077753cc347f422d91106bf67669090f18147: Status 404 returned error can't find the container with id df1f593272dca13b9d8e00487db077753cc347f422d91106bf67669090f18147 Mar 08 00:45:50.645077 master-0 kubenswrapper[23041]: I0308 00:45:50.645023 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/lvms-operator-fcd55dd45-6z56x" event={"ID":"f340eeda-004c-4e90-b22c-ee94accbda26","Type":"ContainerStarted","Data":"df1f593272dca13b9d8e00487db077753cc347f422d91106bf67669090f18147"} Mar 08 00:45:55.693103 master-0 kubenswrapper[23041]: I0308 00:45:55.693047 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/lvms-operator-fcd55dd45-6z56x" event={"ID":"f340eeda-004c-4e90-b22c-ee94accbda26","Type":"ContainerStarted","Data":"79f2ecda895aa17e03cf14c6fdd26291b5bb2d99f885475da4e4768f5d458f98"} Mar 08 00:45:55.693837 master-0 kubenswrapper[23041]: I0308 00:45:55.693250 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-storage/lvms-operator-fcd55dd45-6z56x" Mar 08 00:45:55.718301 master-0 kubenswrapper[23041]: I0308 00:45:55.718181 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-storage/lvms-operator-fcd55dd45-6z56x" podStartSLOduration=2.488442475 podStartE2EDuration="7.718160328s" podCreationTimestamp="2026-03-08 
00:45:48 +0000 UTC" firstStartedPulling="2026-03-08 00:45:49.839478334 +0000 UTC m=+855.312314898" lastFinishedPulling="2026-03-08 00:45:55.069196177 +0000 UTC m=+860.542032751" observedRunningTime="2026-03-08 00:45:55.715076493 +0000 UTC m=+861.187913067" watchObservedRunningTime="2026-03-08 00:45:55.718160328 +0000 UTC m=+861.190996882" Mar 08 00:45:56.704679 master-0 kubenswrapper[23041]: I0308 00:45:56.704616 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-storage/lvms-operator-fcd55dd45-6z56x" Mar 08 00:46:00.216709 master-0 kubenswrapper[23041]: I0308 00:46:00.216582 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54k84h"] Mar 08 00:46:00.218742 master-0 kubenswrapper[23041]: I0308 00:46:00.218677 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54k84h" Mar 08 00:46:00.232180 master-0 kubenswrapper[23041]: I0308 00:46:00.232115 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54k84h"] Mar 08 00:46:00.324727 master-0 kubenswrapper[23041]: I0308 00:46:00.324658 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz947\" (UniqueName: \"kubernetes.io/projected/87ad2056-7e0e-4c58-997a-50f86cf2384a-kube-api-access-rz947\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54k84h\" (UID: \"87ad2056-7e0e-4c58-997a-50f86cf2384a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54k84h" Mar 08 00:46:00.325020 master-0 kubenswrapper[23041]: I0308 00:46:00.324800 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/87ad2056-7e0e-4c58-997a-50f86cf2384a-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54k84h\" (UID: \"87ad2056-7e0e-4c58-997a-50f86cf2384a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54k84h" Mar 08 00:46:00.325020 master-0 kubenswrapper[23041]: I0308 00:46:00.324833 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/87ad2056-7e0e-4c58-997a-50f86cf2384a-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54k84h\" (UID: \"87ad2056-7e0e-4c58-997a-50f86cf2384a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54k84h" Mar 08 00:46:00.425943 master-0 kubenswrapper[23041]: I0308 00:46:00.425866 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/87ad2056-7e0e-4c58-997a-50f86cf2384a-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54k84h\" (UID: \"87ad2056-7e0e-4c58-997a-50f86cf2384a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54k84h" Mar 08 00:46:00.426286 master-0 kubenswrapper[23041]: I0308 00:46:00.425965 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/87ad2056-7e0e-4c58-997a-50f86cf2384a-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54k84h\" (UID: \"87ad2056-7e0e-4c58-997a-50f86cf2384a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54k84h" Mar 08 00:46:00.426286 master-0 kubenswrapper[23041]: I0308 00:46:00.426019 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz947\" (UniqueName: \"kubernetes.io/projected/87ad2056-7e0e-4c58-997a-50f86cf2384a-kube-api-access-rz947\") pod 
\"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54k84h\" (UID: \"87ad2056-7e0e-4c58-997a-50f86cf2384a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54k84h" Mar 08 00:46:00.426535 master-0 kubenswrapper[23041]: I0308 00:46:00.426495 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/87ad2056-7e0e-4c58-997a-50f86cf2384a-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54k84h\" (UID: \"87ad2056-7e0e-4c58-997a-50f86cf2384a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54k84h" Mar 08 00:46:00.426714 master-0 kubenswrapper[23041]: I0308 00:46:00.426671 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/87ad2056-7e0e-4c58-997a-50f86cf2384a-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54k84h\" (UID: \"87ad2056-7e0e-4c58-997a-50f86cf2384a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54k84h" Mar 08 00:46:00.466232 master-0 kubenswrapper[23041]: I0308 00:46:00.465977 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz947\" (UniqueName: \"kubernetes.io/projected/87ad2056-7e0e-4c58-997a-50f86cf2384a-kube-api-access-rz947\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54k84h\" (UID: \"87ad2056-7e0e-4c58-997a-50f86cf2384a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54k84h" Mar 08 00:46:00.534464 master-0 kubenswrapper[23041]: I0308 00:46:00.534412 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54k84h" Mar 08 00:46:00.616633 master-0 kubenswrapper[23041]: I0308 00:46:00.616575 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4z6tq8"] Mar 08 00:46:00.634270 master-0 kubenswrapper[23041]: I0308 00:46:00.619831 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4z6tq8" Mar 08 00:46:00.634270 master-0 kubenswrapper[23041]: I0308 00:46:00.631122 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4z6tq8"] Mar 08 00:46:00.734163 master-0 kubenswrapper[23041]: I0308 00:46:00.734104 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b43c2152-0b42-449e-8649-44b77a0affb4-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4z6tq8\" (UID: \"b43c2152-0b42-449e-8649-44b77a0affb4\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4z6tq8" Mar 08 00:46:00.734638 master-0 kubenswrapper[23041]: I0308 00:46:00.734581 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v52qt\" (UniqueName: \"kubernetes.io/projected/b43c2152-0b42-449e-8649-44b77a0affb4-kube-api-access-v52qt\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4z6tq8\" (UID: \"b43c2152-0b42-449e-8649-44b77a0affb4\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4z6tq8" Mar 08 00:46:00.734974 master-0 kubenswrapper[23041]: I0308 00:46:00.734928 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" 
(UniqueName: \"kubernetes.io/empty-dir/b43c2152-0b42-449e-8649-44b77a0affb4-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4z6tq8\" (UID: \"b43c2152-0b42-449e-8649-44b77a0affb4\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4z6tq8" Mar 08 00:46:00.837523 master-0 kubenswrapper[23041]: I0308 00:46:00.837177 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b43c2152-0b42-449e-8649-44b77a0affb4-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4z6tq8\" (UID: \"b43c2152-0b42-449e-8649-44b77a0affb4\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4z6tq8" Mar 08 00:46:00.837859 master-0 kubenswrapper[23041]: I0308 00:46:00.837753 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v52qt\" (UniqueName: \"kubernetes.io/projected/b43c2152-0b42-449e-8649-44b77a0affb4-kube-api-access-v52qt\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4z6tq8\" (UID: \"b43c2152-0b42-449e-8649-44b77a0affb4\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4z6tq8" Mar 08 00:46:00.838002 master-0 kubenswrapper[23041]: I0308 00:46:00.837949 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b43c2152-0b42-449e-8649-44b77a0affb4-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4z6tq8\" (UID: \"b43c2152-0b42-449e-8649-44b77a0affb4\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4z6tq8" Mar 08 00:46:00.838700 master-0 kubenswrapper[23041]: I0308 00:46:00.838662 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b43c2152-0b42-449e-8649-44b77a0affb4-bundle\") pod 
\"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4z6tq8\" (UID: \"b43c2152-0b42-449e-8649-44b77a0affb4\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4z6tq8" Mar 08 00:46:00.838879 master-0 kubenswrapper[23041]: I0308 00:46:00.838806 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b43c2152-0b42-449e-8649-44b77a0affb4-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4z6tq8\" (UID: \"b43c2152-0b42-449e-8649-44b77a0affb4\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4z6tq8" Mar 08 00:46:00.856025 master-0 kubenswrapper[23041]: I0308 00:46:00.855678 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v52qt\" (UniqueName: \"kubernetes.io/projected/b43c2152-0b42-449e-8649-44b77a0affb4-kube-api-access-v52qt\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4z6tq8\" (UID: \"b43c2152-0b42-449e-8649-44b77a0affb4\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4z6tq8" Mar 08 00:46:00.930982 master-0 kubenswrapper[23041]: I0308 00:46:00.930914 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54k84h"] Mar 08 00:46:00.932661 master-0 kubenswrapper[23041]: W0308 00:46:00.932599 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87ad2056_7e0e_4c58_997a_50f86cf2384a.slice/crio-de3905058b4f2f25675d0c8c5cebcf9078c7b02cd7d3b5aff253d39f2f8174b2 WatchSource:0}: Error finding container de3905058b4f2f25675d0c8c5cebcf9078c7b02cd7d3b5aff253d39f2f8174b2: Status 404 returned error can't find the container with id de3905058b4f2f25675d0c8c5cebcf9078c7b02cd7d3b5aff253d39f2f8174b2 Mar 08 00:46:00.945489 master-0 kubenswrapper[23041]: I0308 
00:46:00.945420 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4z6tq8" Mar 08 00:46:01.672846 master-0 kubenswrapper[23041]: I0308 00:46:01.672781 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4z6tq8"] Mar 08 00:46:01.680952 master-0 kubenswrapper[23041]: W0308 00:46:01.680875 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb43c2152_0b42_449e_8649_44b77a0affb4.slice/crio-fa395f20408e9f3a8044ea42f86431bb95a77981f4297058c39d3216452b2243 WatchSource:0}: Error finding container fa395f20408e9f3a8044ea42f86431bb95a77981f4297058c39d3216452b2243: Status 404 returned error can't find the container with id fa395f20408e9f3a8044ea42f86431bb95a77981f4297058c39d3216452b2243 Mar 08 00:46:01.764122 master-0 kubenswrapper[23041]: I0308 00:46:01.763989 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4z6tq8" event={"ID":"b43c2152-0b42-449e-8649-44b77a0affb4","Type":"ContainerStarted","Data":"fa395f20408e9f3a8044ea42f86431bb95a77981f4297058c39d3216452b2243"} Mar 08 00:46:01.777824 master-0 kubenswrapper[23041]: I0308 00:46:01.776677 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54k84h" event={"ID":"87ad2056-7e0e-4c58-997a-50f86cf2384a","Type":"ContainerDied","Data":"9a60502ae02e96f6a16715ae952102f83e912fc8d8470919795e7a30638e8de5"} Mar 08 00:46:01.777824 master-0 kubenswrapper[23041]: I0308 00:46:01.776605 23041 generic.go:334] "Generic (PLEG): container finished" podID="87ad2056-7e0e-4c58-997a-50f86cf2384a" containerID="9a60502ae02e96f6a16715ae952102f83e912fc8d8470919795e7a30638e8de5" exitCode=0 Mar 08 
00:46:01.777824 master-0 kubenswrapper[23041]: I0308 00:46:01.777417 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54k84h" event={"ID":"87ad2056-7e0e-4c58-997a-50f86cf2384a","Type":"ContainerStarted","Data":"de3905058b4f2f25675d0c8c5cebcf9078c7b02cd7d3b5aff253d39f2f8174b2"} Mar 08 00:46:02.598371 master-0 kubenswrapper[23041]: I0308 00:46:02.598299 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xqr76"] Mar 08 00:46:02.600010 master-0 kubenswrapper[23041]: I0308 00:46:02.599971 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xqr76" Mar 08 00:46:02.615894 master-0 kubenswrapper[23041]: I0308 00:46:02.615823 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xqr76"] Mar 08 00:46:02.664176 master-0 kubenswrapper[23041]: I0308 00:46:02.664085 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqjqg\" (UniqueName: \"kubernetes.io/projected/bf5cf42d-f955-443d-985a-4f3463067b7e-kube-api-access-zqjqg\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xqr76\" (UID: \"bf5cf42d-f955-443d-985a-4f3463067b7e\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xqr76" Mar 08 00:46:02.664176 master-0 kubenswrapper[23041]: I0308 00:46:02.664134 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf5cf42d-f955-443d-985a-4f3463067b7e-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xqr76\" (UID: \"bf5cf42d-f955-443d-985a-4f3463067b7e\") " 
pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xqr76" Mar 08 00:46:02.664795 master-0 kubenswrapper[23041]: I0308 00:46:02.664717 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf5cf42d-f955-443d-985a-4f3463067b7e-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xqr76\" (UID: \"bf5cf42d-f955-443d-985a-4f3463067b7e\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xqr76" Mar 08 00:46:02.767533 master-0 kubenswrapper[23041]: I0308 00:46:02.767476 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqjqg\" (UniqueName: \"kubernetes.io/projected/bf5cf42d-f955-443d-985a-4f3463067b7e-kube-api-access-zqjqg\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xqr76\" (UID: \"bf5cf42d-f955-443d-985a-4f3463067b7e\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xqr76" Mar 08 00:46:02.768387 master-0 kubenswrapper[23041]: I0308 00:46:02.768353 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf5cf42d-f955-443d-985a-4f3463067b7e-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xqr76\" (UID: \"bf5cf42d-f955-443d-985a-4f3463067b7e\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xqr76" Mar 08 00:46:02.768831 master-0 kubenswrapper[23041]: I0308 00:46:02.768801 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf5cf42d-f955-443d-985a-4f3463067b7e-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xqr76\" (UID: \"bf5cf42d-f955-443d-985a-4f3463067b7e\") " 
pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xqr76" Mar 08 00:46:02.769051 master-0 kubenswrapper[23041]: I0308 00:46:02.768995 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf5cf42d-f955-443d-985a-4f3463067b7e-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xqr76\" (UID: \"bf5cf42d-f955-443d-985a-4f3463067b7e\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xqr76" Mar 08 00:46:02.769235 master-0 kubenswrapper[23041]: I0308 00:46:02.769168 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf5cf42d-f955-443d-985a-4f3463067b7e-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xqr76\" (UID: \"bf5cf42d-f955-443d-985a-4f3463067b7e\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xqr76" Mar 08 00:46:02.791388 master-0 kubenswrapper[23041]: I0308 00:46:02.791322 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqjqg\" (UniqueName: \"kubernetes.io/projected/bf5cf42d-f955-443d-985a-4f3463067b7e-kube-api-access-zqjqg\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xqr76\" (UID: \"bf5cf42d-f955-443d-985a-4f3463067b7e\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xqr76" Mar 08 00:46:02.795317 master-0 kubenswrapper[23041]: I0308 00:46:02.795250 23041 generic.go:334] "Generic (PLEG): container finished" podID="b43c2152-0b42-449e-8649-44b77a0affb4" containerID="609be4e5c0de1f98ff6dd4c2b78bb3133e247b6616fe600e141283d73adf584a" exitCode=0 Mar 08 00:46:02.795317 master-0 kubenswrapper[23041]: I0308 00:46:02.795311 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4z6tq8" 
event={"ID":"b43c2152-0b42-449e-8649-44b77a0affb4","Type":"ContainerDied","Data":"609be4e5c0de1f98ff6dd4c2b78bb3133e247b6616fe600e141283d73adf584a"} Mar 08 00:46:02.930364 master-0 kubenswrapper[23041]: I0308 00:46:02.930195 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xqr76" Mar 08 00:46:03.508652 master-0 kubenswrapper[23041]: I0308 00:46:03.508250 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xqr76"] Mar 08 00:46:03.516520 master-0 kubenswrapper[23041]: W0308 00:46:03.516049 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf5cf42d_f955_443d_985a_4f3463067b7e.slice/crio-463bcb37f094c134b9d81cb9b8c02a4376f4653bed6ea22c07e5b430e4d3b43a WatchSource:0}: Error finding container 463bcb37f094c134b9d81cb9b8c02a4376f4653bed6ea22c07e5b430e4d3b43a: Status 404 returned error can't find the container with id 463bcb37f094c134b9d81cb9b8c02a4376f4653bed6ea22c07e5b430e4d3b43a Mar 08 00:46:03.804237 master-0 kubenswrapper[23041]: I0308 00:46:03.803934 23041 generic.go:334] "Generic (PLEG): container finished" podID="bf5cf42d-f955-443d-985a-4f3463067b7e" containerID="b52005bb4af44fa04edc92608edfb08507b372c060d5e78752e8baa81559f045" exitCode=0 Mar 08 00:46:03.804237 master-0 kubenswrapper[23041]: I0308 00:46:03.804011 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xqr76" event={"ID":"bf5cf42d-f955-443d-985a-4f3463067b7e","Type":"ContainerDied","Data":"b52005bb4af44fa04edc92608edfb08507b372c060d5e78752e8baa81559f045"} Mar 08 00:46:03.804237 master-0 kubenswrapper[23041]: I0308 00:46:03.804083 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xqr76" event={"ID":"bf5cf42d-f955-443d-985a-4f3463067b7e","Type":"ContainerStarted","Data":"463bcb37f094c134b9d81cb9b8c02a4376f4653bed6ea22c07e5b430e4d3b43a"} Mar 08 00:46:05.824102 master-0 kubenswrapper[23041]: I0308 00:46:05.824025 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xqr76" event={"ID":"bf5cf42d-f955-443d-985a-4f3463067b7e","Type":"ContainerStarted","Data":"69e0bff7e338a2439e7c9cc3e82537e1ba07bce2154408e210719250236cb709"} Mar 08 00:46:05.829598 master-0 kubenswrapper[23041]: I0308 00:46:05.829542 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54k84h" event={"ID":"87ad2056-7e0e-4c58-997a-50f86cf2384a","Type":"ContainerStarted","Data":"a209b84aa788493e9c5c7374f0154023cf5cdc0b2b9663d08f4ab8a43045b532"} Mar 08 00:46:05.832035 master-0 kubenswrapper[23041]: I0308 00:46:05.831994 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4z6tq8" event={"ID":"b43c2152-0b42-449e-8649-44b77a0affb4","Type":"ContainerStarted","Data":"045931e8ae19078f36ac1e7a7c476f70ffd78369d510091d19b4c83d7f491abe"} Mar 08 00:46:06.194290 master-0 kubenswrapper[23041]: I0308 00:46:06.194056 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f089lppz"] Mar 08 00:46:06.200386 master-0 kubenswrapper[23041]: I0308 00:46:06.196759 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f089lppz" Mar 08 00:46:06.200520 master-0 kubenswrapper[23041]: I0308 00:46:06.200486 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f089lppz"] Mar 08 00:46:06.227440 master-0 kubenswrapper[23041]: I0308 00:46:06.227374 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr4n9\" (UniqueName: \"kubernetes.io/projected/fe65d415-e10c-4257-a9d8-d392c20f2a9d-kube-api-access-lr4n9\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f089lppz\" (UID: \"fe65d415-e10c-4257-a9d8-d392c20f2a9d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f089lppz" Mar 08 00:46:06.227638 master-0 kubenswrapper[23041]: I0308 00:46:06.227485 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe65d415-e10c-4257-a9d8-d392c20f2a9d-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f089lppz\" (UID: \"fe65d415-e10c-4257-a9d8-d392c20f2a9d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f089lppz" Mar 08 00:46:06.227638 master-0 kubenswrapper[23041]: I0308 00:46:06.227589 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe65d415-e10c-4257-a9d8-d392c20f2a9d-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f089lppz\" (UID: \"fe65d415-e10c-4257-a9d8-d392c20f2a9d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f089lppz" Mar 08 00:46:06.329515 master-0 kubenswrapper[23041]: I0308 00:46:06.329397 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-lr4n9\" (UniqueName: \"kubernetes.io/projected/fe65d415-e10c-4257-a9d8-d392c20f2a9d-kube-api-access-lr4n9\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f089lppz\" (UID: \"fe65d415-e10c-4257-a9d8-d392c20f2a9d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f089lppz" Mar 08 00:46:06.329515 master-0 kubenswrapper[23041]: I0308 00:46:06.329467 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe65d415-e10c-4257-a9d8-d392c20f2a9d-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f089lppz\" (UID: \"fe65d415-e10c-4257-a9d8-d392c20f2a9d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f089lppz" Mar 08 00:46:06.329515 master-0 kubenswrapper[23041]: I0308 00:46:06.329512 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe65d415-e10c-4257-a9d8-d392c20f2a9d-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f089lppz\" (UID: \"fe65d415-e10c-4257-a9d8-d392c20f2a9d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f089lppz" Mar 08 00:46:06.330032 master-0 kubenswrapper[23041]: I0308 00:46:06.330003 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe65d415-e10c-4257-a9d8-d392c20f2a9d-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f089lppz\" (UID: \"fe65d415-e10c-4257-a9d8-d392c20f2a9d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f089lppz" Mar 08 00:46:06.330770 master-0 kubenswrapper[23041]: I0308 00:46:06.330726 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe65d415-e10c-4257-a9d8-d392c20f2a9d-util\") pod 
\"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f089lppz\" (UID: \"fe65d415-e10c-4257-a9d8-d392c20f2a9d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f089lppz" Mar 08 00:46:06.345521 master-0 kubenswrapper[23041]: I0308 00:46:06.345474 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr4n9\" (UniqueName: \"kubernetes.io/projected/fe65d415-e10c-4257-a9d8-d392c20f2a9d-kube-api-access-lr4n9\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f089lppz\" (UID: \"fe65d415-e10c-4257-a9d8-d392c20f2a9d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f089lppz" Mar 08 00:46:06.515291 master-0 kubenswrapper[23041]: I0308 00:46:06.515240 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f089lppz" Mar 08 00:46:06.845720 master-0 kubenswrapper[23041]: I0308 00:46:06.845583 23041 generic.go:334] "Generic (PLEG): container finished" podID="b43c2152-0b42-449e-8649-44b77a0affb4" containerID="045931e8ae19078f36ac1e7a7c476f70ffd78369d510091d19b4c83d7f491abe" exitCode=0 Mar 08 00:46:06.845720 master-0 kubenswrapper[23041]: I0308 00:46:06.845686 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4z6tq8" event={"ID":"b43c2152-0b42-449e-8649-44b77a0affb4","Type":"ContainerDied","Data":"045931e8ae19078f36ac1e7a7c476f70ffd78369d510091d19b4c83d7f491abe"} Mar 08 00:46:06.848950 master-0 kubenswrapper[23041]: I0308 00:46:06.848878 23041 generic.go:334] "Generic (PLEG): container finished" podID="bf5cf42d-f955-443d-985a-4f3463067b7e" containerID="69e0bff7e338a2439e7c9cc3e82537e1ba07bce2154408e210719250236cb709" exitCode=0 Mar 08 00:46:06.849572 master-0 kubenswrapper[23041]: I0308 00:46:06.848924 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xqr76" event={"ID":"bf5cf42d-f955-443d-985a-4f3463067b7e","Type":"ContainerDied","Data":"69e0bff7e338a2439e7c9cc3e82537e1ba07bce2154408e210719250236cb709"} Mar 08 00:46:06.851555 master-0 kubenswrapper[23041]: I0308 00:46:06.851506 23041 generic.go:334] "Generic (PLEG): container finished" podID="87ad2056-7e0e-4c58-997a-50f86cf2384a" containerID="a209b84aa788493e9c5c7374f0154023cf5cdc0b2b9663d08f4ab8a43045b532" exitCode=0 Mar 08 00:46:06.851634 master-0 kubenswrapper[23041]: I0308 00:46:06.851558 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54k84h" event={"ID":"87ad2056-7e0e-4c58-997a-50f86cf2384a","Type":"ContainerDied","Data":"a209b84aa788493e9c5c7374f0154023cf5cdc0b2b9663d08f4ab8a43045b532"} Mar 08 00:46:06.947649 master-0 kubenswrapper[23041]: I0308 00:46:06.947592 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f089lppz"] Mar 08 00:46:06.955065 master-0 kubenswrapper[23041]: W0308 00:46:06.955022 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe65d415_e10c_4257_a9d8_d392c20f2a9d.slice/crio-e1f6625f02187ca6bf0097e19c07cb132df00197df0f32fe811d67fc9db07dbc WatchSource:0}: Error finding container e1f6625f02187ca6bf0097e19c07cb132df00197df0f32fe811d67fc9db07dbc: Status 404 returned error can't find the container with id e1f6625f02187ca6bf0097e19c07cb132df00197df0f32fe811d67fc9db07dbc Mar 08 00:46:07.861135 master-0 kubenswrapper[23041]: I0308 00:46:07.861002 23041 generic.go:334] "Generic (PLEG): container finished" podID="87ad2056-7e0e-4c58-997a-50f86cf2384a" containerID="f9bf92e550a59fc53804207efa5cd79a72a3102938898a1780d0b17a60d0bef3" exitCode=0 Mar 08 00:46:07.861135 master-0 kubenswrapper[23041]: I0308 
00:46:07.861084 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54k84h" event={"ID":"87ad2056-7e0e-4c58-997a-50f86cf2384a","Type":"ContainerDied","Data":"f9bf92e550a59fc53804207efa5cd79a72a3102938898a1780d0b17a60d0bef3"} Mar 08 00:46:07.863600 master-0 kubenswrapper[23041]: I0308 00:46:07.863535 23041 generic.go:334] "Generic (PLEG): container finished" podID="b43c2152-0b42-449e-8649-44b77a0affb4" containerID="41ad30c6c4864e632ec4a0ff0b6959845545d367fd7350c3b69c2ad5d2e4fc8c" exitCode=0 Mar 08 00:46:07.863685 master-0 kubenswrapper[23041]: I0308 00:46:07.863660 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4z6tq8" event={"ID":"b43c2152-0b42-449e-8649-44b77a0affb4","Type":"ContainerDied","Data":"41ad30c6c4864e632ec4a0ff0b6959845545d367fd7350c3b69c2ad5d2e4fc8c"} Mar 08 00:46:07.865525 master-0 kubenswrapper[23041]: I0308 00:46:07.865471 23041 generic.go:334] "Generic (PLEG): container finished" podID="fe65d415-e10c-4257-a9d8-d392c20f2a9d" containerID="1464f1efefc1292aa2e6a198ff5d6e8064d7fc3690d0927ae30559b8eaa253b9" exitCode=0 Mar 08 00:46:07.865628 master-0 kubenswrapper[23041]: I0308 00:46:07.865564 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f089lppz" event={"ID":"fe65d415-e10c-4257-a9d8-d392c20f2a9d","Type":"ContainerDied","Data":"1464f1efefc1292aa2e6a198ff5d6e8064d7fc3690d0927ae30559b8eaa253b9"} Mar 08 00:46:07.865628 master-0 kubenswrapper[23041]: I0308 00:46:07.865601 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f089lppz" event={"ID":"fe65d415-e10c-4257-a9d8-d392c20f2a9d","Type":"ContainerStarted","Data":"e1f6625f02187ca6bf0097e19c07cb132df00197df0f32fe811d67fc9db07dbc"} Mar 08 00:46:07.869584 
master-0 kubenswrapper[23041]: I0308 00:46:07.869545 23041 generic.go:334] "Generic (PLEG): container finished" podID="bf5cf42d-f955-443d-985a-4f3463067b7e" containerID="266be903c9effa6b978a48a188b5fdd9978ab791e4ff85d4377a541cb68a5f6e" exitCode=0 Mar 08 00:46:07.869696 master-0 kubenswrapper[23041]: I0308 00:46:07.869600 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xqr76" event={"ID":"bf5cf42d-f955-443d-985a-4f3463067b7e","Type":"ContainerDied","Data":"266be903c9effa6b978a48a188b5fdd9978ab791e4ff85d4377a541cb68a5f6e"} Mar 08 00:46:09.434479 master-0 kubenswrapper[23041]: I0308 00:46:09.433669 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4z6tq8" Mar 08 00:46:09.452782 master-0 kubenswrapper[23041]: I0308 00:46:09.452658 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54k84h" Mar 08 00:46:09.470850 master-0 kubenswrapper[23041]: I0308 00:46:09.470804 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xqr76" Mar 08 00:46:09.485150 master-0 kubenswrapper[23041]: I0308 00:46:09.485109 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/87ad2056-7e0e-4c58-997a-50f86cf2384a-bundle\") pod \"87ad2056-7e0e-4c58-997a-50f86cf2384a\" (UID: \"87ad2056-7e0e-4c58-997a-50f86cf2384a\") " Mar 08 00:46:09.485250 master-0 kubenswrapper[23041]: I0308 00:46:09.485172 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b43c2152-0b42-449e-8649-44b77a0affb4-util\") pod \"b43c2152-0b42-449e-8649-44b77a0affb4\" (UID: \"b43c2152-0b42-449e-8649-44b77a0affb4\") " Mar 08 00:46:09.486382 master-0 kubenswrapper[23041]: I0308 00:46:09.485196 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqjqg\" (UniqueName: \"kubernetes.io/projected/bf5cf42d-f955-443d-985a-4f3463067b7e-kube-api-access-zqjqg\") pod \"bf5cf42d-f955-443d-985a-4f3463067b7e\" (UID: \"bf5cf42d-f955-443d-985a-4f3463067b7e\") " Mar 08 00:46:09.486769 master-0 kubenswrapper[23041]: I0308 00:46:09.486751 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rz947\" (UniqueName: \"kubernetes.io/projected/87ad2056-7e0e-4c58-997a-50f86cf2384a-kube-api-access-rz947\") pod \"87ad2056-7e0e-4c58-997a-50f86cf2384a\" (UID: \"87ad2056-7e0e-4c58-997a-50f86cf2384a\") " Mar 08 00:46:09.490745 master-0 kubenswrapper[23041]: I0308 00:46:09.490720 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf5cf42d-f955-443d-985a-4f3463067b7e-bundle\") pod \"bf5cf42d-f955-443d-985a-4f3463067b7e\" (UID: \"bf5cf42d-f955-443d-985a-4f3463067b7e\") " Mar 08 00:46:09.490914 master-0 kubenswrapper[23041]: I0308 
00:46:09.486419 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87ad2056-7e0e-4c58-997a-50f86cf2384a-bundle" (OuterVolumeSpecName: "bundle") pod "87ad2056-7e0e-4c58-997a-50f86cf2384a" (UID: "87ad2056-7e0e-4c58-997a-50f86cf2384a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:46:09.490914 master-0 kubenswrapper[23041]: I0308 00:46:09.490865 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/87ad2056-7e0e-4c58-997a-50f86cf2384a-util\") pod \"87ad2056-7e0e-4c58-997a-50f86cf2384a\" (UID: \"87ad2056-7e0e-4c58-997a-50f86cf2384a\") " Mar 08 00:46:09.491002 master-0 kubenswrapper[23041]: I0308 00:46:09.490278 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf5cf42d-f955-443d-985a-4f3463067b7e-kube-api-access-zqjqg" (OuterVolumeSpecName: "kube-api-access-zqjqg") pod "bf5cf42d-f955-443d-985a-4f3463067b7e" (UID: "bf5cf42d-f955-443d-985a-4f3463067b7e"). InnerVolumeSpecName "kube-api-access-zqjqg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:46:09.491052 master-0 kubenswrapper[23041]: I0308 00:46:09.491023 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf5cf42d-f955-443d-985a-4f3463067b7e-util\") pod \"bf5cf42d-f955-443d-985a-4f3463067b7e\" (UID: \"bf5cf42d-f955-443d-985a-4f3463067b7e\") " Mar 08 00:46:09.491147 master-0 kubenswrapper[23041]: I0308 00:46:09.491123 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b43c2152-0b42-449e-8649-44b77a0affb4-bundle\") pod \"b43c2152-0b42-449e-8649-44b77a0affb4\" (UID: \"b43c2152-0b42-449e-8649-44b77a0affb4\") " Mar 08 00:46:09.491247 master-0 kubenswrapper[23041]: I0308 00:46:09.491225 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v52qt\" (UniqueName: \"kubernetes.io/projected/b43c2152-0b42-449e-8649-44b77a0affb4-kube-api-access-v52qt\") pod \"b43c2152-0b42-449e-8649-44b77a0affb4\" (UID: \"b43c2152-0b42-449e-8649-44b77a0affb4\") " Mar 08 00:46:09.491850 master-0 kubenswrapper[23041]: I0308 00:46:09.491814 23041 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/87ad2056-7e0e-4c58-997a-50f86cf2384a-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 00:46:09.491850 master-0 kubenswrapper[23041]: I0308 00:46:09.491842 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqjqg\" (UniqueName: \"kubernetes.io/projected/bf5cf42d-f955-443d-985a-4f3463067b7e-kube-api-access-zqjqg\") on node \"master-0\" DevicePath \"\"" Mar 08 00:46:09.492964 master-0 kubenswrapper[23041]: I0308 00:46:09.492918 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf5cf42d-f955-443d-985a-4f3463067b7e-bundle" (OuterVolumeSpecName: "bundle") pod 
"bf5cf42d-f955-443d-985a-4f3463067b7e" (UID: "bf5cf42d-f955-443d-985a-4f3463067b7e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:46:09.494717 master-0 kubenswrapper[23041]: I0308 00:46:09.494412 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b43c2152-0b42-449e-8649-44b77a0affb4-kube-api-access-v52qt" (OuterVolumeSpecName: "kube-api-access-v52qt") pod "b43c2152-0b42-449e-8649-44b77a0affb4" (UID: "b43c2152-0b42-449e-8649-44b77a0affb4"). InnerVolumeSpecName "kube-api-access-v52qt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:46:09.495111 master-0 kubenswrapper[23041]: I0308 00:46:09.495033 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87ad2056-7e0e-4c58-997a-50f86cf2384a-kube-api-access-rz947" (OuterVolumeSpecName: "kube-api-access-rz947") pod "87ad2056-7e0e-4c58-997a-50f86cf2384a" (UID: "87ad2056-7e0e-4c58-997a-50f86cf2384a"). InnerVolumeSpecName "kube-api-access-rz947". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:46:09.497432 master-0 kubenswrapper[23041]: I0308 00:46:09.497376 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b43c2152-0b42-449e-8649-44b77a0affb4-bundle" (OuterVolumeSpecName: "bundle") pod "b43c2152-0b42-449e-8649-44b77a0affb4" (UID: "b43c2152-0b42-449e-8649-44b77a0affb4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:46:09.506764 master-0 kubenswrapper[23041]: I0308 00:46:09.506724 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87ad2056-7e0e-4c58-997a-50f86cf2384a-util" (OuterVolumeSpecName: "util") pod "87ad2056-7e0e-4c58-997a-50f86cf2384a" (UID: "87ad2056-7e0e-4c58-997a-50f86cf2384a"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:46:09.507088 master-0 kubenswrapper[23041]: I0308 00:46:09.507042 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf5cf42d-f955-443d-985a-4f3463067b7e-util" (OuterVolumeSpecName: "util") pod "bf5cf42d-f955-443d-985a-4f3463067b7e" (UID: "bf5cf42d-f955-443d-985a-4f3463067b7e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:46:09.508881 master-0 kubenswrapper[23041]: I0308 00:46:09.508853 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b43c2152-0b42-449e-8649-44b77a0affb4-util" (OuterVolumeSpecName: "util") pod "b43c2152-0b42-449e-8649-44b77a0affb4" (UID: "b43c2152-0b42-449e-8649-44b77a0affb4"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:46:09.593009 master-0 kubenswrapper[23041]: I0308 00:46:09.592941 23041 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b43c2152-0b42-449e-8649-44b77a0affb4-util\") on node \"master-0\" DevicePath \"\"" Mar 08 00:46:09.593170 master-0 kubenswrapper[23041]: I0308 00:46:09.593028 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rz947\" (UniqueName: \"kubernetes.io/projected/87ad2056-7e0e-4c58-997a-50f86cf2384a-kube-api-access-rz947\") on node \"master-0\" DevicePath \"\"" Mar 08 00:46:09.593170 master-0 kubenswrapper[23041]: I0308 00:46:09.593046 23041 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf5cf42d-f955-443d-985a-4f3463067b7e-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 00:46:09.593170 master-0 kubenswrapper[23041]: I0308 00:46:09.593060 23041 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/87ad2056-7e0e-4c58-997a-50f86cf2384a-util\") on node \"master-0\" 
DevicePath \"\""
Mar 08 00:46:09.593170 master-0 kubenswrapper[23041]: I0308 00:46:09.593072 23041 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf5cf42d-f955-443d-985a-4f3463067b7e-util\") on node \"master-0\" DevicePath \"\""
Mar 08 00:46:09.593170 master-0 kubenswrapper[23041]: I0308 00:46:09.593084 23041 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b43c2152-0b42-449e-8649-44b77a0affb4-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 00:46:09.593170 master-0 kubenswrapper[23041]: I0308 00:46:09.593096 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v52qt\" (UniqueName: \"kubernetes.io/projected/b43c2152-0b42-449e-8649-44b77a0affb4-kube-api-access-v52qt\") on node \"master-0\" DevicePath \"\""
Mar 08 00:46:09.885660 master-0 kubenswrapper[23041]: I0308 00:46:09.885598 23041 generic.go:334] "Generic (PLEG): container finished" podID="fe65d415-e10c-4257-a9d8-d392c20f2a9d" containerID="49269f048fd67e6578e1c8d08129e8509bed6f06c5cd96773d759ab1ac0333ab" exitCode=0
Mar 08 00:46:09.885660 master-0 kubenswrapper[23041]: I0308 00:46:09.885670 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f089lppz" event={"ID":"fe65d415-e10c-4257-a9d8-d392c20f2a9d","Type":"ContainerDied","Data":"49269f048fd67e6578e1c8d08129e8509bed6f06c5cd96773d759ab1ac0333ab"}
Mar 08 00:46:09.888269 master-0 kubenswrapper[23041]: I0308 00:46:09.887938 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xqr76" event={"ID":"bf5cf42d-f955-443d-985a-4f3463067b7e","Type":"ContainerDied","Data":"463bcb37f094c134b9d81cb9b8c02a4376f4653bed6ea22c07e5b430e4d3b43a"}
Mar 08 00:46:09.888269 master-0 kubenswrapper[23041]: I0308 00:46:09.887987 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="463bcb37f094c134b9d81cb9b8c02a4376f4653bed6ea22c07e5b430e4d3b43a"
Mar 08 00:46:09.888269 master-0 kubenswrapper[23041]: I0308 00:46:09.888010 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xqr76"
Mar 08 00:46:09.893514 master-0 kubenswrapper[23041]: I0308 00:46:09.893479 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54k84h" event={"ID":"87ad2056-7e0e-4c58-997a-50f86cf2384a","Type":"ContainerDied","Data":"de3905058b4f2f25675d0c8c5cebcf9078c7b02cd7d3b5aff253d39f2f8174b2"}
Mar 08 00:46:09.893514 master-0 kubenswrapper[23041]: I0308 00:46:09.893515 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de3905058b4f2f25675d0c8c5cebcf9078c7b02cd7d3b5aff253d39f2f8174b2"
Mar 08 00:46:09.893690 master-0 kubenswrapper[23041]: I0308 00:46:09.893661 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54k84h"
Mar 08 00:46:09.897788 master-0 kubenswrapper[23041]: I0308 00:46:09.897741 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4z6tq8" event={"ID":"b43c2152-0b42-449e-8649-44b77a0affb4","Type":"ContainerDied","Data":"fa395f20408e9f3a8044ea42f86431bb95a77981f4297058c39d3216452b2243"}
Mar 08 00:46:09.898025 master-0 kubenswrapper[23041]: I0308 00:46:09.897923 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa395f20408e9f3a8044ea42f86431bb95a77981f4297058c39d3216452b2243"
Mar 08 00:46:09.898025 master-0 kubenswrapper[23041]: I0308 00:46:09.897784 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4z6tq8"
Mar 08 00:46:10.906216 master-0 kubenswrapper[23041]: I0308 00:46:10.906127 23041 generic.go:334] "Generic (PLEG): container finished" podID="fe65d415-e10c-4257-a9d8-d392c20f2a9d" containerID="1807d94541fb6451285e71652763fa9469396334a7f693fa2c86240c013d4fdb" exitCode=0
Mar 08 00:46:10.906216 master-0 kubenswrapper[23041]: I0308 00:46:10.906179 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f089lppz" event={"ID":"fe65d415-e10c-4257-a9d8-d392c20f2a9d","Type":"ContainerDied","Data":"1807d94541fb6451285e71652763fa9469396334a7f693fa2c86240c013d4fdb"}
Mar 08 00:46:12.289968 master-0 kubenswrapper[23041]: I0308 00:46:12.289925 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f089lppz"
Mar 08 00:46:12.348332 master-0 kubenswrapper[23041]: I0308 00:46:12.347603 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe65d415-e10c-4257-a9d8-d392c20f2a9d-util\") pod \"fe65d415-e10c-4257-a9d8-d392c20f2a9d\" (UID: \"fe65d415-e10c-4257-a9d8-d392c20f2a9d\") "
Mar 08 00:46:12.348332 master-0 kubenswrapper[23041]: I0308 00:46:12.347766 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe65d415-e10c-4257-a9d8-d392c20f2a9d-bundle\") pod \"fe65d415-e10c-4257-a9d8-d392c20f2a9d\" (UID: \"fe65d415-e10c-4257-a9d8-d392c20f2a9d\") "
Mar 08 00:46:12.348332 master-0 kubenswrapper[23041]: I0308 00:46:12.347872 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lr4n9\" (UniqueName: \"kubernetes.io/projected/fe65d415-e10c-4257-a9d8-d392c20f2a9d-kube-api-access-lr4n9\") pod \"fe65d415-e10c-4257-a9d8-d392c20f2a9d\" (UID: \"fe65d415-e10c-4257-a9d8-d392c20f2a9d\") "
Mar 08 00:46:12.357277 master-0 kubenswrapper[23041]: I0308 00:46:12.356442 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe65d415-e10c-4257-a9d8-d392c20f2a9d-bundle" (OuterVolumeSpecName: "bundle") pod "fe65d415-e10c-4257-a9d8-d392c20f2a9d" (UID: "fe65d415-e10c-4257-a9d8-d392c20f2a9d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:46:12.360858 master-0 kubenswrapper[23041]: I0308 00:46:12.360813 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe65d415-e10c-4257-a9d8-d392c20f2a9d-kube-api-access-lr4n9" (OuterVolumeSpecName: "kube-api-access-lr4n9") pod "fe65d415-e10c-4257-a9d8-d392c20f2a9d" (UID: "fe65d415-e10c-4257-a9d8-d392c20f2a9d"). InnerVolumeSpecName "kube-api-access-lr4n9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:46:12.387337 master-0 kubenswrapper[23041]: I0308 00:46:12.387220 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe65d415-e10c-4257-a9d8-d392c20f2a9d-util" (OuterVolumeSpecName: "util") pod "fe65d415-e10c-4257-a9d8-d392c20f2a9d" (UID: "fe65d415-e10c-4257-a9d8-d392c20f2a9d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:46:12.450177 master-0 kubenswrapper[23041]: I0308 00:46:12.449304 23041 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe65d415-e10c-4257-a9d8-d392c20f2a9d-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 00:46:12.450177 master-0 kubenswrapper[23041]: I0308 00:46:12.449400 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lr4n9\" (UniqueName: \"kubernetes.io/projected/fe65d415-e10c-4257-a9d8-d392c20f2a9d-kube-api-access-lr4n9\") on node \"master-0\" DevicePath \"\""
Mar 08 00:46:12.450177 master-0 kubenswrapper[23041]: I0308 00:46:12.449416 23041 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe65d415-e10c-4257-a9d8-d392c20f2a9d-util\") on node \"master-0\" DevicePath \"\""
Mar 08 00:46:12.919674 master-0 kubenswrapper[23041]: I0308 00:46:12.919602 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f089lppz" event={"ID":"fe65d415-e10c-4257-a9d8-d392c20f2a9d","Type":"ContainerDied","Data":"e1f6625f02187ca6bf0097e19c07cb132df00197df0f32fe811d67fc9db07dbc"}
Mar 08 00:46:12.919674 master-0 kubenswrapper[23041]: I0308 00:46:12.919650 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1f6625f02187ca6bf0097e19c07cb132df00197df0f32fe811d67fc9db07dbc"
Mar 08 00:46:12.919674 master-0 kubenswrapper[23041]: I0308 00:46:12.919674 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f089lppz"
Mar 08 00:46:15.132102 master-0 kubenswrapper[23041]: I0308 00:46:15.132039 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-kbq78"]
Mar 08 00:46:15.132849 master-0 kubenswrapper[23041]: E0308 00:46:15.132352 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe65d415-e10c-4257-a9d8-d392c20f2a9d" containerName="util"
Mar 08 00:46:15.132849 master-0 kubenswrapper[23041]: I0308 00:46:15.132366 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe65d415-e10c-4257-a9d8-d392c20f2a9d" containerName="util"
Mar 08 00:46:15.132849 master-0 kubenswrapper[23041]: E0308 00:46:15.132375 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf5cf42d-f955-443d-985a-4f3463067b7e" containerName="extract"
Mar 08 00:46:15.132849 master-0 kubenswrapper[23041]: I0308 00:46:15.132381 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf5cf42d-f955-443d-985a-4f3463067b7e" containerName="extract"
Mar 08 00:46:15.132849 master-0 kubenswrapper[23041]: E0308 00:46:15.132395 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87ad2056-7e0e-4c58-997a-50f86cf2384a" containerName="pull"
Mar 08 00:46:15.132849 master-0 kubenswrapper[23041]: I0308 00:46:15.132402 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="87ad2056-7e0e-4c58-997a-50f86cf2384a" containerName="pull"
Mar 08 00:46:15.132849 master-0 kubenswrapper[23041]: E0308 00:46:15.132410 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b43c2152-0b42-449e-8649-44b77a0affb4" containerName="pull"
Mar 08 00:46:15.132849 master-0 kubenswrapper[23041]: I0308 00:46:15.132416 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="b43c2152-0b42-449e-8649-44b77a0affb4" containerName="pull"
Mar 08 00:46:15.132849 master-0 kubenswrapper[23041]: E0308 00:46:15.132430 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf5cf42d-f955-443d-985a-4f3463067b7e" containerName="util"
Mar 08 00:46:15.132849 master-0 kubenswrapper[23041]: I0308 00:46:15.132436 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf5cf42d-f955-443d-985a-4f3463067b7e" containerName="util"
Mar 08 00:46:15.132849 master-0 kubenswrapper[23041]: E0308 00:46:15.132447 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe65d415-e10c-4257-a9d8-d392c20f2a9d" containerName="extract"
Mar 08 00:46:15.132849 master-0 kubenswrapper[23041]: I0308 00:46:15.132452 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe65d415-e10c-4257-a9d8-d392c20f2a9d" containerName="extract"
Mar 08 00:46:15.132849 master-0 kubenswrapper[23041]: E0308 00:46:15.132460 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf5cf42d-f955-443d-985a-4f3463067b7e" containerName="pull"
Mar 08 00:46:15.132849 master-0 kubenswrapper[23041]: I0308 00:46:15.132466 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf5cf42d-f955-443d-985a-4f3463067b7e" containerName="pull"
Mar 08 00:46:15.132849 master-0 kubenswrapper[23041]: E0308 00:46:15.132480 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b43c2152-0b42-449e-8649-44b77a0affb4" containerName="util"
Mar 08 00:46:15.132849 master-0 kubenswrapper[23041]: I0308 00:46:15.132486 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="b43c2152-0b42-449e-8649-44b77a0affb4" containerName="util"
Mar 08 00:46:15.132849 master-0 kubenswrapper[23041]: E0308 00:46:15.132494 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b43c2152-0b42-449e-8649-44b77a0affb4" containerName="extract"
Mar 08 00:46:15.132849 master-0 kubenswrapper[23041]: I0308 00:46:15.132500 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="b43c2152-0b42-449e-8649-44b77a0affb4" containerName="extract"
Mar 08 00:46:15.132849 master-0 kubenswrapper[23041]: E0308 00:46:15.132524 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87ad2056-7e0e-4c58-997a-50f86cf2384a" containerName="util"
Mar 08 00:46:15.132849 master-0 kubenswrapper[23041]: I0308 00:46:15.132542 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="87ad2056-7e0e-4c58-997a-50f86cf2384a" containerName="util"
Mar 08 00:46:15.132849 master-0 kubenswrapper[23041]: E0308 00:46:15.132551 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe65d415-e10c-4257-a9d8-d392c20f2a9d" containerName="pull"
Mar 08 00:46:15.132849 master-0 kubenswrapper[23041]: I0308 00:46:15.132557 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe65d415-e10c-4257-a9d8-d392c20f2a9d" containerName="pull"
Mar 08 00:46:15.132849 master-0 kubenswrapper[23041]: E0308 00:46:15.132566 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87ad2056-7e0e-4c58-997a-50f86cf2384a" containerName="extract"
Mar 08 00:46:15.132849 master-0 kubenswrapper[23041]: I0308 00:46:15.132571 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="87ad2056-7e0e-4c58-997a-50f86cf2384a" containerName="extract"
Mar 08 00:46:15.132849 master-0 kubenswrapper[23041]: I0308 00:46:15.132706 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf5cf42d-f955-443d-985a-4f3463067b7e" containerName="extract"
Mar 08 00:46:15.132849 master-0 kubenswrapper[23041]: I0308 00:46:15.132723 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="87ad2056-7e0e-4c58-997a-50f86cf2384a" containerName="extract"
Mar 08 00:46:15.132849 master-0 kubenswrapper[23041]: I0308 00:46:15.132744 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe65d415-e10c-4257-a9d8-d392c20f2a9d" containerName="extract"
Mar 08 00:46:15.132849 master-0 kubenswrapper[23041]: I0308 00:46:15.132758 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="b43c2152-0b42-449e-8649-44b77a0affb4" containerName="extract"
Mar 08 00:46:15.134048 master-0 kubenswrapper[23041]: I0308 00:46:15.133233 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-kbq78"
Mar 08 00:46:15.134886 master-0 kubenswrapper[23041]: I0308 00:46:15.134599 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt"
Mar 08 00:46:15.135070 master-0 kubenswrapper[23041]: I0308 00:46:15.135032 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt"
Mar 08 00:46:15.154996 master-0 kubenswrapper[23041]: I0308 00:46:15.154928 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-kbq78"]
Mar 08 00:46:15.200227 master-0 kubenswrapper[23041]: I0308 00:46:15.199722 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8c05afc7-833d-4360-a6ef-12d897480e98-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-kbq78\" (UID: \"8c05afc7-833d-4360-a6ef-12d897480e98\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-kbq78"
Mar 08 00:46:15.200227 master-0 kubenswrapper[23041]: I0308 00:46:15.199837 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnmjb\" (UniqueName: \"kubernetes.io/projected/8c05afc7-833d-4360-a6ef-12d897480e98-kube-api-access-cnmjb\") pod \"cert-manager-operator-controller-manager-66c8bdd694-kbq78\" (UID: \"8c05afc7-833d-4360-a6ef-12d897480e98\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-kbq78"
Mar 08 00:46:15.301166 master-0 kubenswrapper[23041]: I0308 00:46:15.301101 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8c05afc7-833d-4360-a6ef-12d897480e98-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-kbq78\" (UID: \"8c05afc7-833d-4360-a6ef-12d897480e98\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-kbq78"
Mar 08 00:46:15.301408 master-0 kubenswrapper[23041]: I0308 00:46:15.301259 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnmjb\" (UniqueName: \"kubernetes.io/projected/8c05afc7-833d-4360-a6ef-12d897480e98-kube-api-access-cnmjb\") pod \"cert-manager-operator-controller-manager-66c8bdd694-kbq78\" (UID: \"8c05afc7-833d-4360-a6ef-12d897480e98\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-kbq78"
Mar 08 00:46:15.301721 master-0 kubenswrapper[23041]: I0308 00:46:15.301678 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8c05afc7-833d-4360-a6ef-12d897480e98-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-kbq78\" (UID: \"8c05afc7-833d-4360-a6ef-12d897480e98\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-kbq78"
Mar 08 00:46:15.322982 master-0 kubenswrapper[23041]: I0308 00:46:15.322944 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnmjb\" (UniqueName: \"kubernetes.io/projected/8c05afc7-833d-4360-a6ef-12d897480e98-kube-api-access-cnmjb\") pod \"cert-manager-operator-controller-manager-66c8bdd694-kbq78\" (UID: \"8c05afc7-833d-4360-a6ef-12d897480e98\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-kbq78"
Mar 08 00:46:15.451684 master-0 kubenswrapper[23041]: I0308 00:46:15.451557 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-kbq78"
Mar 08 00:46:15.985355 master-0 kubenswrapper[23041]: I0308 00:46:15.984320 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-kbq78"]
Mar 08 00:46:16.004798 master-0 kubenswrapper[23041]: W0308 00:46:16.004741 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c05afc7_833d_4360_a6ef_12d897480e98.slice/crio-40b63444ea11ee863ba2cbf226630876f4413f6a949e7381532de845f92cfe98 WatchSource:0}: Error finding container 40b63444ea11ee863ba2cbf226630876f4413f6a949e7381532de845f92cfe98: Status 404 returned error can't find the container with id 40b63444ea11ee863ba2cbf226630876f4413f6a949e7381532de845f92cfe98
Mar 08 00:46:16.950873 master-0 kubenswrapper[23041]: I0308 00:46:16.950812 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-kbq78" event={"ID":"8c05afc7-833d-4360-a6ef-12d897480e98","Type":"ContainerStarted","Data":"40b63444ea11ee863ba2cbf226630876f4413f6a949e7381532de845f92cfe98"}
Mar 08 00:46:22.728912 master-0 kubenswrapper[23041]: I0308 00:46:22.728851 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-kbq78" event={"ID":"8c05afc7-833d-4360-a6ef-12d897480e98","Type":"ContainerStarted","Data":"0196a0805e132da40f0042efa0da436fbf909d4a9dfab8566a8f02536b4003ac"}
Mar 08 00:46:22.769823 master-0 kubenswrapper[23041]: I0308 00:46:22.769725 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-kbq78" podStartSLOduration=2.249804166 podStartE2EDuration="7.769702528s" podCreationTimestamp="2026-03-08 00:46:15 +0000 UTC" firstStartedPulling="2026-03-08 00:46:16.008249088 +0000 UTC m=+881.481085642" lastFinishedPulling="2026-03-08 00:46:21.52814744 +0000 UTC m=+887.000984004" observedRunningTime="2026-03-08 00:46:22.7605632 +0000 UTC m=+888.233399754" watchObservedRunningTime="2026-03-08 00:46:22.769702528 +0000 UTC m=+888.242539082"
Mar 08 00:46:24.875520 master-0 kubenswrapper[23041]: I0308 00:46:24.875442 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-lzq2v"]
Mar 08 00:46:24.876417 master-0 kubenswrapper[23041]: I0308 00:46:24.876387 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-lzq2v"
Mar 08 00:46:24.884606 master-0 kubenswrapper[23041]: I0308 00:46:24.884552 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Mar 08 00:46:24.884879 master-0 kubenswrapper[23041]: I0308 00:46:24.884830 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Mar 08 00:46:24.900481 master-0 kubenswrapper[23041]: I0308 00:46:24.900417 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-lzq2v"]
Mar 08 00:46:24.941227 master-0 kubenswrapper[23041]: I0308 00:46:24.940972 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfksq\" (UniqueName: \"kubernetes.io/projected/5fdcf562-ce16-409a-a8d7-ac64e860cc26-kube-api-access-zfksq\") pod \"cert-manager-webhook-6888856db4-lzq2v\" (UID: \"5fdcf562-ce16-409a-a8d7-ac64e860cc26\") " pod="cert-manager/cert-manager-webhook-6888856db4-lzq2v"
Mar 08 00:46:24.941227 master-0 kubenswrapper[23041]: I0308 00:46:24.941110 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5fdcf562-ce16-409a-a8d7-ac64e860cc26-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-lzq2v\" (UID: \"5fdcf562-ce16-409a-a8d7-ac64e860cc26\") " pod="cert-manager/cert-manager-webhook-6888856db4-lzq2v"
Mar 08 00:46:25.043115 master-0 kubenswrapper[23041]: I0308 00:46:25.043044 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5fdcf562-ce16-409a-a8d7-ac64e860cc26-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-lzq2v\" (UID: \"5fdcf562-ce16-409a-a8d7-ac64e860cc26\") " pod="cert-manager/cert-manager-webhook-6888856db4-lzq2v"
Mar 08 00:46:25.043331 master-0 kubenswrapper[23041]: I0308 00:46:25.043171 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfksq\" (UniqueName: \"kubernetes.io/projected/5fdcf562-ce16-409a-a8d7-ac64e860cc26-kube-api-access-zfksq\") pod \"cert-manager-webhook-6888856db4-lzq2v\" (UID: \"5fdcf562-ce16-409a-a8d7-ac64e860cc26\") " pod="cert-manager/cert-manager-webhook-6888856db4-lzq2v"
Mar 08 00:46:25.076401 master-0 kubenswrapper[23041]: I0308 00:46:25.076361 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5fdcf562-ce16-409a-a8d7-ac64e860cc26-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-lzq2v\" (UID: \"5fdcf562-ce16-409a-a8d7-ac64e860cc26\") " pod="cert-manager/cert-manager-webhook-6888856db4-lzq2v"
Mar 08 00:46:25.087430 master-0 kubenswrapper[23041]: I0308 00:46:25.084543 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfksq\" (UniqueName: \"kubernetes.io/projected/5fdcf562-ce16-409a-a8d7-ac64e860cc26-kube-api-access-zfksq\") pod \"cert-manager-webhook-6888856db4-lzq2v\" (UID: \"5fdcf562-ce16-409a-a8d7-ac64e860cc26\") " pod="cert-manager/cert-manager-webhook-6888856db4-lzq2v"
Mar 08 00:46:25.192907 master-0 kubenswrapper[23041]: I0308 00:46:25.192784 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-lzq2v"
Mar 08 00:46:25.643991 master-0 kubenswrapper[23041]: I0308 00:46:25.643912 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-lzq2v"]
Mar 08 00:46:25.755225 master-0 kubenswrapper[23041]: I0308 00:46:25.754645 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-lzq2v" event={"ID":"5fdcf562-ce16-409a-a8d7-ac64e860cc26","Type":"ContainerStarted","Data":"34b06002de5b52382a1fef6954d803a271eefe308d1d141d67d6ed253a384c0e"}
Mar 08 00:46:27.165855 master-0 kubenswrapper[23041]: I0308 00:46:27.158673 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-vc4q5"]
Mar 08 00:46:27.165855 master-0 kubenswrapper[23041]: I0308 00:46:27.159942 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-vc4q5"
Mar 08 00:46:27.194001 master-0 kubenswrapper[23041]: I0308 00:46:27.193941 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-vc4q5"]
Mar 08 00:46:27.293558 master-0 kubenswrapper[23041]: I0308 00:46:27.293495 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k67st\" (UniqueName: \"kubernetes.io/projected/809cf7c2-3d55-44c8-ace4-a3f27244d1a7-kube-api-access-k67st\") pod \"cert-manager-cainjector-5545bd876-vc4q5\" (UID: \"809cf7c2-3d55-44c8-ace4-a3f27244d1a7\") " pod="cert-manager/cert-manager-cainjector-5545bd876-vc4q5"
Mar 08 00:46:27.293558 master-0 kubenswrapper[23041]: I0308 00:46:27.293557 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/809cf7c2-3d55-44c8-ace4-a3f27244d1a7-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-vc4q5\" (UID: \"809cf7c2-3d55-44c8-ace4-a3f27244d1a7\") " pod="cert-manager/cert-manager-cainjector-5545bd876-vc4q5"
Mar 08 00:46:27.395604 master-0 kubenswrapper[23041]: I0308 00:46:27.395519 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k67st\" (UniqueName: \"kubernetes.io/projected/809cf7c2-3d55-44c8-ace4-a3f27244d1a7-kube-api-access-k67st\") pod \"cert-manager-cainjector-5545bd876-vc4q5\" (UID: \"809cf7c2-3d55-44c8-ace4-a3f27244d1a7\") " pod="cert-manager/cert-manager-cainjector-5545bd876-vc4q5"
Mar 08 00:46:27.395604 master-0 kubenswrapper[23041]: I0308 00:46:27.395591 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/809cf7c2-3d55-44c8-ace4-a3f27244d1a7-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-vc4q5\" (UID: \"809cf7c2-3d55-44c8-ace4-a3f27244d1a7\") " pod="cert-manager/cert-manager-cainjector-5545bd876-vc4q5"
Mar 08 00:46:27.423511 master-0 kubenswrapper[23041]: I0308 00:46:27.422459 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k67st\" (UniqueName: \"kubernetes.io/projected/809cf7c2-3d55-44c8-ace4-a3f27244d1a7-kube-api-access-k67st\") pod \"cert-manager-cainjector-5545bd876-vc4q5\" (UID: \"809cf7c2-3d55-44c8-ace4-a3f27244d1a7\") " pod="cert-manager/cert-manager-cainjector-5545bd876-vc4q5"
Mar 08 00:46:27.428389 master-0 kubenswrapper[23041]: I0308 00:46:27.426545 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/809cf7c2-3d55-44c8-ace4-a3f27244d1a7-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-vc4q5\" (UID: \"809cf7c2-3d55-44c8-ace4-a3f27244d1a7\") " pod="cert-manager/cert-manager-cainjector-5545bd876-vc4q5"
Mar 08 00:46:27.483863 master-0 kubenswrapper[23041]: I0308 00:46:27.483802 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-vc4q5"
Mar 08 00:46:27.976652 master-0 kubenswrapper[23041]: I0308 00:46:27.976576 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-vc4q5"]
Mar 08 00:46:27.992225 master-0 kubenswrapper[23041]: W0308 00:46:27.986582 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod809cf7c2_3d55_44c8_ace4_a3f27244d1a7.slice/crio-f717432262b7a0046015eac6b260db0e16ea95c6d65ce165ef953b7da0891e44 WatchSource:0}: Error finding container f717432262b7a0046015eac6b260db0e16ea95c6d65ce165ef953b7da0891e44: Status 404 returned error can't find the container with id f717432262b7a0046015eac6b260db0e16ea95c6d65ce165ef953b7da0891e44
Mar 08 00:46:28.802224 master-0 kubenswrapper[23041]: I0308 00:46:28.801510 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-vc4q5" event={"ID":"809cf7c2-3d55-44c8-ace4-a3f27244d1a7","Type":"ContainerStarted","Data":"f717432262b7a0046015eac6b260db0e16ea95c6d65ce165ef953b7da0891e44"}
Mar 08 00:46:30.108226 master-0 kubenswrapper[23041]: I0308 00:46:30.105506 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-qs2gr"]
Mar 08 00:46:30.108226 master-0 kubenswrapper[23041]: I0308 00:46:30.106498 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-qs2gr"
Mar 08 00:46:30.109762 master-0 kubenswrapper[23041]: I0308 00:46:30.109658 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Mar 08 00:46:30.114222 master-0 kubenswrapper[23041]: I0308 00:46:30.109917 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Mar 08 00:46:30.123226 master-0 kubenswrapper[23041]: I0308 00:46:30.121134 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-qs2gr"]
Mar 08 00:46:30.194040 master-0 kubenswrapper[23041]: I0308 00:46:30.184665 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spkdf\" (UniqueName: \"kubernetes.io/projected/c655a8f2-359b-449c-96f6-1b34aa0ca204-kube-api-access-spkdf\") pod \"nmstate-operator-75c5dccd6c-qs2gr\" (UID: \"c655a8f2-359b-449c-96f6-1b34aa0ca204\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-qs2gr"
Mar 08 00:46:30.286347 master-0 kubenswrapper[23041]: I0308 00:46:30.286273 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spkdf\" (UniqueName: \"kubernetes.io/projected/c655a8f2-359b-449c-96f6-1b34aa0ca204-kube-api-access-spkdf\") pod \"nmstate-operator-75c5dccd6c-qs2gr\" (UID: \"c655a8f2-359b-449c-96f6-1b34aa0ca204\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-qs2gr"
Mar 08 00:46:30.309769 master-0 kubenswrapper[23041]: I0308 00:46:30.309004 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spkdf\" (UniqueName: \"kubernetes.io/projected/c655a8f2-359b-449c-96f6-1b34aa0ca204-kube-api-access-spkdf\") pod \"nmstate-operator-75c5dccd6c-qs2gr\" (UID: \"c655a8f2-359b-449c-96f6-1b34aa0ca204\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-qs2gr"
Mar 08 00:46:30.436991 master-0 kubenswrapper[23041]: I0308 00:46:30.436659 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-qs2gr"
Mar 08 00:46:31.918068 master-0 kubenswrapper[23041]: I0308 00:46:31.917770 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-86db79fc85-g44m9"]
Mar 08 00:46:31.918773 master-0 kubenswrapper[23041]: I0308 00:46:31.918741 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-86db79fc85-g44m9"
Mar 08 00:46:31.925234 master-0 kubenswrapper[23041]: I0308 00:46:31.923148 23041 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Mar 08 00:46:31.925234 master-0 kubenswrapper[23041]: I0308 00:46:31.923573 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Mar 08 00:46:31.925234 master-0 kubenswrapper[23041]: I0308 00:46:31.923743 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Mar 08 00:46:31.925234 master-0 kubenswrapper[23041]: I0308 00:46:31.924311 23041 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Mar 08 00:46:31.940228 master-0 kubenswrapper[23041]: I0308 00:46:31.938108 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-86db79fc85-g44m9"]
Mar 08 00:46:32.066793 master-0 kubenswrapper[23041]: I0308 00:46:32.066739 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mrcl\" (UniqueName: \"kubernetes.io/projected/91aedad8-165a-49f5-9aa2-e87be58d353e-kube-api-access-8mrcl\") pod \"metallb-operator-controller-manager-86db79fc85-g44m9\" (UID: \"91aedad8-165a-49f5-9aa2-e87be58d353e\") " pod="metallb-system/metallb-operator-controller-manager-86db79fc85-g44m9"
Mar 08 00:46:32.067005 master-0 kubenswrapper[23041]: I0308 00:46:32.066808 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/91aedad8-165a-49f5-9aa2-e87be58d353e-webhook-cert\") pod \"metallb-operator-controller-manager-86db79fc85-g44m9\" (UID: \"91aedad8-165a-49f5-9aa2-e87be58d353e\") " pod="metallb-system/metallb-operator-controller-manager-86db79fc85-g44m9"
Mar 08 00:46:32.067005 master-0 kubenswrapper[23041]: I0308 00:46:32.066899 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/91aedad8-165a-49f5-9aa2-e87be58d353e-apiservice-cert\") pod \"metallb-operator-controller-manager-86db79fc85-g44m9\" (UID: \"91aedad8-165a-49f5-9aa2-e87be58d353e\") " pod="metallb-system/metallb-operator-controller-manager-86db79fc85-g44m9"
Mar 08 00:46:32.169119 master-0 kubenswrapper[23041]: I0308 00:46:32.168992 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mrcl\" (UniqueName: \"kubernetes.io/projected/91aedad8-165a-49f5-9aa2-e87be58d353e-kube-api-access-8mrcl\") pod \"metallb-operator-controller-manager-86db79fc85-g44m9\" (UID: \"91aedad8-165a-49f5-9aa2-e87be58d353e\") " pod="metallb-system/metallb-operator-controller-manager-86db79fc85-g44m9"
Mar 08 00:46:32.169119 master-0 kubenswrapper[23041]: I0308 00:46:32.169059 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/91aedad8-165a-49f5-9aa2-e87be58d353e-webhook-cert\") pod \"metallb-operator-controller-manager-86db79fc85-g44m9\" (UID: \"91aedad8-165a-49f5-9aa2-e87be58d353e\") " pod="metallb-system/metallb-operator-controller-manager-86db79fc85-g44m9"
Mar 08 00:46:32.169119 master-0 kubenswrapper[23041]: I0308 00:46:32.169115 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/91aedad8-165a-49f5-9aa2-e87be58d353e-apiservice-cert\") pod \"metallb-operator-controller-manager-86db79fc85-g44m9\" (UID: \"91aedad8-165a-49f5-9aa2-e87be58d353e\") " pod="metallb-system/metallb-operator-controller-manager-86db79fc85-g44m9"
Mar 08 00:46:32.179241 master-0 kubenswrapper[23041]: I0308 00:46:32.177123 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/91aedad8-165a-49f5-9aa2-e87be58d353e-webhook-cert\") pod \"metallb-operator-controller-manager-86db79fc85-g44m9\" (UID: \"91aedad8-165a-49f5-9aa2-e87be58d353e\") " pod="metallb-system/metallb-operator-controller-manager-86db79fc85-g44m9"
Mar 08 00:46:32.182961 master-0 kubenswrapper[23041]: I0308 00:46:32.182920 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/91aedad8-165a-49f5-9aa2-e87be58d353e-apiservice-cert\") pod \"metallb-operator-controller-manager-86db79fc85-g44m9\" (UID: \"91aedad8-165a-49f5-9aa2-e87be58d353e\") " pod="metallb-system/metallb-operator-controller-manager-86db79fc85-g44m9"
Mar 08 00:46:32.205245 master-0 kubenswrapper[23041]: I0308 00:46:32.204566 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mrcl\" (UniqueName: \"kubernetes.io/projected/91aedad8-165a-49f5-9aa2-e87be58d353e-kube-api-access-8mrcl\") pod \"metallb-operator-controller-manager-86db79fc85-g44m9\" (UID: \"91aedad8-165a-49f5-9aa2-e87be58d353e\") " pod="metallb-system/metallb-operator-controller-manager-86db79fc85-g44m9"
Mar 08 00:46:32.259012 master-0 kubenswrapper[23041]: I0308 00:46:32.258235 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-86db79fc85-g44m9"
Mar 08 00:46:32.784411 master-0 kubenswrapper[23041]: I0308 00:46:32.784335 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-5bc86b5b94-cmsdd"]
Mar 08 00:46:32.785365 master-0 kubenswrapper[23041]: I0308 00:46:32.785334 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5bc86b5b94-cmsdd"
Mar 08 00:46:32.800637 master-0 kubenswrapper[23041]: I0308 00:46:32.800564 23041 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Mar 08 00:46:32.800938 master-0 kubenswrapper[23041]: I0308 00:46:32.800911 23041 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Mar 08 00:46:32.873193 master-0 kubenswrapper[23041]: I0308 00:46:32.873121 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5bc86b5b94-cmsdd"]
Mar 08 00:46:32.893225 master-0 kubenswrapper[23041]: I0308 00:46:32.890776 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2d5f2073-ca3c-42a0-820f-25a949718639-webhook-cert\") pod \"metallb-operator-webhook-server-5bc86b5b94-cmsdd\" (UID: \"2d5f2073-ca3c-42a0-820f-25a949718639\") " pod="metallb-system/metallb-operator-webhook-server-5bc86b5b94-cmsdd"
Mar 08 00:46:32.893225 master-0 kubenswrapper[23041]: I0308 00:46:32.890866 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9qwh\" (UniqueName: \"kubernetes.io/projected/2d5f2073-ca3c-42a0-820f-25a949718639-kube-api-access-n9qwh\") pod \"metallb-operator-webhook-server-5bc86b5b94-cmsdd\" (UID: \"2d5f2073-ca3c-42a0-820f-25a949718639\") "
pod="metallb-system/metallb-operator-webhook-server-5bc86b5b94-cmsdd" Mar 08 00:46:32.893225 master-0 kubenswrapper[23041]: I0308 00:46:32.890928 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2d5f2073-ca3c-42a0-820f-25a949718639-apiservice-cert\") pod \"metallb-operator-webhook-server-5bc86b5b94-cmsdd\" (UID: \"2d5f2073-ca3c-42a0-820f-25a949718639\") " pod="metallb-system/metallb-operator-webhook-server-5bc86b5b94-cmsdd" Mar 08 00:46:32.992229 master-0 kubenswrapper[23041]: I0308 00:46:32.992144 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9qwh\" (UniqueName: \"kubernetes.io/projected/2d5f2073-ca3c-42a0-820f-25a949718639-kube-api-access-n9qwh\") pod \"metallb-operator-webhook-server-5bc86b5b94-cmsdd\" (UID: \"2d5f2073-ca3c-42a0-820f-25a949718639\") " pod="metallb-system/metallb-operator-webhook-server-5bc86b5b94-cmsdd" Mar 08 00:46:32.992869 master-0 kubenswrapper[23041]: I0308 00:46:32.992273 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2d5f2073-ca3c-42a0-820f-25a949718639-apiservice-cert\") pod \"metallb-operator-webhook-server-5bc86b5b94-cmsdd\" (UID: \"2d5f2073-ca3c-42a0-820f-25a949718639\") " pod="metallb-system/metallb-operator-webhook-server-5bc86b5b94-cmsdd" Mar 08 00:46:32.992869 master-0 kubenswrapper[23041]: I0308 00:46:32.992325 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2d5f2073-ca3c-42a0-820f-25a949718639-webhook-cert\") pod \"metallb-operator-webhook-server-5bc86b5b94-cmsdd\" (UID: \"2d5f2073-ca3c-42a0-820f-25a949718639\") " pod="metallb-system/metallb-operator-webhook-server-5bc86b5b94-cmsdd" Mar 08 00:46:32.997776 master-0 kubenswrapper[23041]: I0308 00:46:32.997747 23041 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2d5f2073-ca3c-42a0-820f-25a949718639-webhook-cert\") pod \"metallb-operator-webhook-server-5bc86b5b94-cmsdd\" (UID: \"2d5f2073-ca3c-42a0-820f-25a949718639\") " pod="metallb-system/metallb-operator-webhook-server-5bc86b5b94-cmsdd" Mar 08 00:46:33.005881 master-0 kubenswrapper[23041]: I0308 00:46:33.005839 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2d5f2073-ca3c-42a0-820f-25a949718639-apiservice-cert\") pod \"metallb-operator-webhook-server-5bc86b5b94-cmsdd\" (UID: \"2d5f2073-ca3c-42a0-820f-25a949718639\") " pod="metallb-system/metallb-operator-webhook-server-5bc86b5b94-cmsdd" Mar 08 00:46:33.606783 master-0 kubenswrapper[23041]: I0308 00:46:33.606603 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9qwh\" (UniqueName: \"kubernetes.io/projected/2d5f2073-ca3c-42a0-820f-25a949718639-kube-api-access-n9qwh\") pod \"metallb-operator-webhook-server-5bc86b5b94-cmsdd\" (UID: \"2d5f2073-ca3c-42a0-820f-25a949718639\") " pod="metallb-system/metallb-operator-webhook-server-5bc86b5b94-cmsdd" Mar 08 00:46:33.753042 master-0 kubenswrapper[23041]: I0308 00:46:33.752975 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5bc86b5b94-cmsdd" Mar 08 00:46:34.911402 master-0 kubenswrapper[23041]: I0308 00:46:34.910965 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-lzq2v" event={"ID":"5fdcf562-ce16-409a-a8d7-ac64e860cc26","Type":"ContainerStarted","Data":"1b50938ea412c27d32e3a6db89977900e478b707f501c3b43ddfa144f0201bd8"} Mar 08 00:46:34.911402 master-0 kubenswrapper[23041]: I0308 00:46:34.911070 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-lzq2v" Mar 08 00:46:34.914038 master-0 kubenswrapper[23041]: I0308 00:46:34.913856 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-vc4q5" event={"ID":"809cf7c2-3d55-44c8-ace4-a3f27244d1a7","Type":"ContainerStarted","Data":"ea667195595736788e0f5692d8744de74148847edaf3a044f27809a9085461fb"} Mar 08 00:46:34.939745 master-0 kubenswrapper[23041]: I0308 00:46:34.939471 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-lzq2v" podStartSLOduration=2.117275349 podStartE2EDuration="10.939443671s" podCreationTimestamp="2026-03-08 00:46:24 +0000 UTC" firstStartedPulling="2026-03-08 00:46:25.643146729 +0000 UTC m=+891.115983283" lastFinishedPulling="2026-03-08 00:46:34.465315051 +0000 UTC m=+899.938151605" observedRunningTime="2026-03-08 00:46:34.931604552 +0000 UTC m=+900.404441106" watchObservedRunningTime="2026-03-08 00:46:34.939443671 +0000 UTC m=+900.412280235" Mar 08 00:46:34.963305 master-0 kubenswrapper[23041]: I0308 00:46:34.963218 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-vc4q5" podStartSLOduration=1.384387571 podStartE2EDuration="7.963180274s" podCreationTimestamp="2026-03-08 00:46:27 +0000 UTC" firstStartedPulling="2026-03-08 
00:46:27.989661964 +0000 UTC m=+893.462498518" lastFinishedPulling="2026-03-08 00:46:34.568454667 +0000 UTC m=+900.041291221" observedRunningTime="2026-03-08 00:46:34.962425757 +0000 UTC m=+900.435262311" watchObservedRunningTime="2026-03-08 00:46:34.963180274 +0000 UTC m=+900.436016838" Mar 08 00:46:35.080311 master-0 kubenswrapper[23041]: I0308 00:46:35.079314 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5bc86b5b94-cmsdd"] Mar 08 00:46:35.222226 master-0 kubenswrapper[23041]: I0308 00:46:35.221374 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-86db79fc85-g44m9"] Mar 08 00:46:35.238574 master-0 kubenswrapper[23041]: I0308 00:46:35.236069 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-qs2gr"] Mar 08 00:46:35.240244 master-0 kubenswrapper[23041]: W0308 00:46:35.238647 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc655a8f2_359b_449c_96f6_1b34aa0ca204.slice/crio-f576741d8130db4b708a20e4541b6663357e92ad0f1db7e541b690cb43b1d198 WatchSource:0}: Error finding container f576741d8130db4b708a20e4541b6663357e92ad0f1db7e541b690cb43b1d198: Status 404 returned error can't find the container with id f576741d8130db4b708a20e4541b6663357e92ad0f1db7e541b690cb43b1d198 Mar 08 00:46:35.922621 master-0 kubenswrapper[23041]: I0308 00:46:35.922544 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5bc86b5b94-cmsdd" event={"ID":"2d5f2073-ca3c-42a0-820f-25a949718639","Type":"ContainerStarted","Data":"9a19aafb7136eaf133bd01ed9cc4caf2bf56a3a009de9150a90fffc268a25c76"} Mar 08 00:46:35.924397 master-0 kubenswrapper[23041]: I0308 00:46:35.924339 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-qs2gr" 
event={"ID":"c655a8f2-359b-449c-96f6-1b34aa0ca204","Type":"ContainerStarted","Data":"f576741d8130db4b708a20e4541b6663357e92ad0f1db7e541b690cb43b1d198"} Mar 08 00:46:35.925875 master-0 kubenswrapper[23041]: I0308 00:46:35.925848 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-86db79fc85-g44m9" event={"ID":"91aedad8-165a-49f5-9aa2-e87be58d353e","Type":"ContainerStarted","Data":"67aa560aca3e6e4ee05d879c6b080bb43b940cab685f23e10af2b5f1f648a564"} Mar 08 00:46:38.742240 master-0 kubenswrapper[23041]: I0308 00:46:38.741622 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-8h4v6"] Mar 08 00:46:38.742804 master-0 kubenswrapper[23041]: I0308 00:46:38.742709 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-8h4v6" Mar 08 00:46:38.778474 master-0 kubenswrapper[23041]: I0308 00:46:38.778388 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-8h4v6"] Mar 08 00:46:38.876915 master-0 kubenswrapper[23041]: I0308 00:46:38.876857 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65xkp\" (UniqueName: \"kubernetes.io/projected/4c115e48-294c-4e9d-aaaa-028b464c9e85-kube-api-access-65xkp\") pod \"cert-manager-545d4d4674-8h4v6\" (UID: \"4c115e48-294c-4e9d-aaaa-028b464c9e85\") " pod="cert-manager/cert-manager-545d4d4674-8h4v6" Mar 08 00:46:38.877183 master-0 kubenswrapper[23041]: I0308 00:46:38.876965 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4c115e48-294c-4e9d-aaaa-028b464c9e85-bound-sa-token\") pod \"cert-manager-545d4d4674-8h4v6\" (UID: \"4c115e48-294c-4e9d-aaaa-028b464c9e85\") " pod="cert-manager/cert-manager-545d4d4674-8h4v6" Mar 08 00:46:38.981526 master-0 kubenswrapper[23041]: 
I0308 00:46:38.978765 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4c115e48-294c-4e9d-aaaa-028b464c9e85-bound-sa-token\") pod \"cert-manager-545d4d4674-8h4v6\" (UID: \"4c115e48-294c-4e9d-aaaa-028b464c9e85\") " pod="cert-manager/cert-manager-545d4d4674-8h4v6" Mar 08 00:46:38.981526 master-0 kubenswrapper[23041]: I0308 00:46:38.978945 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65xkp\" (UniqueName: \"kubernetes.io/projected/4c115e48-294c-4e9d-aaaa-028b464c9e85-kube-api-access-65xkp\") pod \"cert-manager-545d4d4674-8h4v6\" (UID: \"4c115e48-294c-4e9d-aaaa-028b464c9e85\") " pod="cert-manager/cert-manager-545d4d4674-8h4v6" Mar 08 00:46:39.002777 master-0 kubenswrapper[23041]: I0308 00:46:39.002706 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65xkp\" (UniqueName: \"kubernetes.io/projected/4c115e48-294c-4e9d-aaaa-028b464c9e85-kube-api-access-65xkp\") pod \"cert-manager-545d4d4674-8h4v6\" (UID: \"4c115e48-294c-4e9d-aaaa-028b464c9e85\") " pod="cert-manager/cert-manager-545d4d4674-8h4v6" Mar 08 00:46:39.007835 master-0 kubenswrapper[23041]: I0308 00:46:39.007810 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4c115e48-294c-4e9d-aaaa-028b464c9e85-bound-sa-token\") pod \"cert-manager-545d4d4674-8h4v6\" (UID: \"4c115e48-294c-4e9d-aaaa-028b464c9e85\") " pod="cert-manager/cert-manager-545d4d4674-8h4v6" Mar 08 00:46:39.073073 master-0 kubenswrapper[23041]: I0308 00:46:39.072994 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-8h4v6" Mar 08 00:46:40.058224 master-0 kubenswrapper[23041]: I0308 00:46:40.053274 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-lzbg5"] Mar 08 00:46:40.058224 master-0 kubenswrapper[23041]: I0308 00:46:40.054475 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-lzbg5" Mar 08 00:46:40.058224 master-0 kubenswrapper[23041]: I0308 00:46:40.056928 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 08 00:46:40.058871 master-0 kubenswrapper[23041]: I0308 00:46:40.058619 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 08 00:46:40.076511 master-0 kubenswrapper[23041]: I0308 00:46:40.075771 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-lzbg5"] Mar 08 00:46:40.179882 master-0 kubenswrapper[23041]: I0308 00:46:40.178465 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7764df74c5-vtt2x"] Mar 08 00:46:40.180109 master-0 kubenswrapper[23041]: I0308 00:46:40.179649 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7764df74c5-vtt2x" Mar 08 00:46:40.183353 master-0 kubenswrapper[23041]: I0308 00:46:40.182594 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 08 00:46:40.199443 master-0 kubenswrapper[23041]: I0308 00:46:40.197455 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7764df74c5-mfxhq"] Mar 08 00:46:40.199443 master-0 kubenswrapper[23041]: I0308 00:46:40.198368 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7764df74c5-mfxhq" Mar 08 00:46:40.199443 master-0 kubenswrapper[23041]: I0308 00:46:40.198894 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-lzq2v" Mar 08 00:46:40.205905 master-0 kubenswrapper[23041]: I0308 00:46:40.205732 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7764df74c5-vtt2x"] Mar 08 00:46:40.213878 master-0 kubenswrapper[23041]: I0308 00:46:40.213817 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7764df74c5-mfxhq"] Mar 08 00:46:40.214716 master-0 kubenswrapper[23041]: I0308 00:46:40.214407 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e0de08e1-7a01-443b-b705-ad74efbe8ef7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7764df74c5-vtt2x\" (UID: \"e0de08e1-7a01-443b-b705-ad74efbe8ef7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7764df74c5-vtt2x" Mar 08 00:46:40.214716 master-0 kubenswrapper[23041]: I0308 00:46:40.214454 23041 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e0de08e1-7a01-443b-b705-ad74efbe8ef7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7764df74c5-vtt2x\" (UID: \"e0de08e1-7a01-443b-b705-ad74efbe8ef7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7764df74c5-vtt2x" Mar 08 00:46:40.214716 master-0 kubenswrapper[23041]: I0308 00:46:40.214477 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1a07c6c9-1b5b-4cf9-b807-2d134b67f9c4-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7764df74c5-mfxhq\" (UID: \"1a07c6c9-1b5b-4cf9-b807-2d134b67f9c4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7764df74c5-mfxhq" Mar 08 00:46:40.214716 master-0 kubenswrapper[23041]: I0308 00:46:40.214538 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clfhn\" (UniqueName: \"kubernetes.io/projected/1683d6f8-ac08-41c3-b6a3-be9d61231bb6-kube-api-access-clfhn\") pod \"obo-prometheus-operator-68bc856cb9-lzbg5\" (UID: \"1683d6f8-ac08-41c3-b6a3-be9d61231bb6\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-lzbg5" Mar 08 00:46:40.214716 master-0 kubenswrapper[23041]: I0308 00:46:40.214561 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1a07c6c9-1b5b-4cf9-b807-2d134b67f9c4-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7764df74c5-mfxhq\" (UID: \"1a07c6c9-1b5b-4cf9-b807-2d134b67f9c4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7764df74c5-mfxhq" Mar 08 00:46:40.316285 master-0 kubenswrapper[23041]: I0308 00:46:40.316152 23041 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e0de08e1-7a01-443b-b705-ad74efbe8ef7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7764df74c5-vtt2x\" (UID: \"e0de08e1-7a01-443b-b705-ad74efbe8ef7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7764df74c5-vtt2x" Mar 08 00:46:40.316537 master-0 kubenswrapper[23041]: I0308 00:46:40.316519 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e0de08e1-7a01-443b-b705-ad74efbe8ef7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7764df74c5-vtt2x\" (UID: \"e0de08e1-7a01-443b-b705-ad74efbe8ef7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7764df74c5-vtt2x" Mar 08 00:46:40.316638 master-0 kubenswrapper[23041]: I0308 00:46:40.316623 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1a07c6c9-1b5b-4cf9-b807-2d134b67f9c4-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7764df74c5-mfxhq\" (UID: \"1a07c6c9-1b5b-4cf9-b807-2d134b67f9c4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7764df74c5-mfxhq" Mar 08 00:46:40.316823 master-0 kubenswrapper[23041]: I0308 00:46:40.316807 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clfhn\" (UniqueName: \"kubernetes.io/projected/1683d6f8-ac08-41c3-b6a3-be9d61231bb6-kube-api-access-clfhn\") pod \"obo-prometheus-operator-68bc856cb9-lzbg5\" (UID: \"1683d6f8-ac08-41c3-b6a3-be9d61231bb6\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-lzbg5" Mar 08 00:46:40.316905 master-0 kubenswrapper[23041]: I0308 00:46:40.316892 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1a07c6c9-1b5b-4cf9-b807-2d134b67f9c4-webhook-cert\") pod 
\"obo-prometheus-operator-admission-webhook-7764df74c5-mfxhq\" (UID: \"1a07c6c9-1b5b-4cf9-b807-2d134b67f9c4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7764df74c5-mfxhq" Mar 08 00:46:40.325055 master-0 kubenswrapper[23041]: I0308 00:46:40.324886 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e0de08e1-7a01-443b-b705-ad74efbe8ef7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7764df74c5-vtt2x\" (UID: \"e0de08e1-7a01-443b-b705-ad74efbe8ef7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7764df74c5-vtt2x" Mar 08 00:46:40.325055 master-0 kubenswrapper[23041]: I0308 00:46:40.324882 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e0de08e1-7a01-443b-b705-ad74efbe8ef7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7764df74c5-vtt2x\" (UID: \"e0de08e1-7a01-443b-b705-ad74efbe8ef7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7764df74c5-vtt2x" Mar 08 00:46:40.330700 master-0 kubenswrapper[23041]: I0308 00:46:40.328669 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1a07c6c9-1b5b-4cf9-b807-2d134b67f9c4-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7764df74c5-mfxhq\" (UID: \"1a07c6c9-1b5b-4cf9-b807-2d134b67f9c4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7764df74c5-mfxhq" Mar 08 00:46:40.331477 master-0 kubenswrapper[23041]: I0308 00:46:40.331405 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1a07c6c9-1b5b-4cf9-b807-2d134b67f9c4-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7764df74c5-mfxhq\" (UID: \"1a07c6c9-1b5b-4cf9-b807-2d134b67f9c4\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-7764df74c5-mfxhq" Mar 08 00:46:40.342968 master-0 kubenswrapper[23041]: I0308 00:46:40.342904 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clfhn\" (UniqueName: \"kubernetes.io/projected/1683d6f8-ac08-41c3-b6a3-be9d61231bb6-kube-api-access-clfhn\") pod \"obo-prometheus-operator-68bc856cb9-lzbg5\" (UID: \"1683d6f8-ac08-41c3-b6a3-be9d61231bb6\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-lzbg5" Mar 08 00:46:40.387121 master-0 kubenswrapper[23041]: I0308 00:46:40.387055 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-7ldjw"] Mar 08 00:46:40.388652 master-0 kubenswrapper[23041]: I0308 00:46:40.388596 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-7ldjw" Mar 08 00:46:40.393092 master-0 kubenswrapper[23041]: I0308 00:46:40.392736 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 08 00:46:40.405109 master-0 kubenswrapper[23041]: I0308 00:46:40.404557 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-lzbg5" Mar 08 00:46:40.424235 master-0 kubenswrapper[23041]: I0308 00:46:40.423657 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-7ldjw"] Mar 08 00:46:40.515345 master-0 kubenswrapper[23041]: I0308 00:46:40.510482 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7764df74c5-vtt2x" Mar 08 00:46:40.537187 master-0 kubenswrapper[23041]: I0308 00:46:40.536777 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-rm2zk"] Mar 08 00:46:40.540301 master-0 kubenswrapper[23041]: I0308 00:46:40.540247 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzlh4\" (UniqueName: \"kubernetes.io/projected/c9a83886-cd95-45fe-961f-7a273313b310-kube-api-access-dzlh4\") pod \"observability-operator-59bdc8b94-7ldjw\" (UID: \"c9a83886-cd95-45fe-961f-7a273313b310\") " pod="openshift-operators/observability-operator-59bdc8b94-7ldjw" Mar 08 00:46:40.540445 master-0 kubenswrapper[23041]: I0308 00:46:40.540408 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/c9a83886-cd95-45fe-961f-7a273313b310-observability-operator-tls\") pod \"observability-operator-59bdc8b94-7ldjw\" (UID: \"c9a83886-cd95-45fe-961f-7a273313b310\") " pod="openshift-operators/observability-operator-59bdc8b94-7ldjw" Mar 08 00:46:40.546212 master-0 kubenswrapper[23041]: I0308 00:46:40.545674 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7764df74c5-mfxhq" Mar 08 00:46:40.559275 master-0 kubenswrapper[23041]: I0308 00:46:40.558331 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-rm2zk"] Mar 08 00:46:40.559275 master-0 kubenswrapper[23041]: I0308 00:46:40.558517 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-rm2zk" Mar 08 00:46:40.648241 master-0 kubenswrapper[23041]: I0308 00:46:40.648171 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzlh4\" (UniqueName: \"kubernetes.io/projected/c9a83886-cd95-45fe-961f-7a273313b310-kube-api-access-dzlh4\") pod \"observability-operator-59bdc8b94-7ldjw\" (UID: \"c9a83886-cd95-45fe-961f-7a273313b310\") " pod="openshift-operators/observability-operator-59bdc8b94-7ldjw" Mar 08 00:46:40.648241 master-0 kubenswrapper[23041]: I0308 00:46:40.648245 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkr9w\" (UniqueName: \"kubernetes.io/projected/a6baaf31-c5c2-4642-abbd-a0fd0b48528b-kube-api-access-bkr9w\") pod \"perses-operator-5bf474d74f-rm2zk\" (UID: \"a6baaf31-c5c2-4642-abbd-a0fd0b48528b\") " pod="openshift-operators/perses-operator-5bf474d74f-rm2zk" Mar 08 00:46:40.648555 master-0 kubenswrapper[23041]: I0308 00:46:40.648523 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/a6baaf31-c5c2-4642-abbd-a0fd0b48528b-openshift-service-ca\") pod \"perses-operator-5bf474d74f-rm2zk\" (UID: \"a6baaf31-c5c2-4642-abbd-a0fd0b48528b\") " pod="openshift-operators/perses-operator-5bf474d74f-rm2zk" Mar 08 00:46:40.648629 master-0 kubenswrapper[23041]: I0308 00:46:40.648589 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/c9a83886-cd95-45fe-961f-7a273313b310-observability-operator-tls\") pod \"observability-operator-59bdc8b94-7ldjw\" (UID: \"c9a83886-cd95-45fe-961f-7a273313b310\") " pod="openshift-operators/observability-operator-59bdc8b94-7ldjw" Mar 08 00:46:40.658280 master-0 kubenswrapper[23041]: I0308 00:46:40.653147 23041 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/c9a83886-cd95-45fe-961f-7a273313b310-observability-operator-tls\") pod \"observability-operator-59bdc8b94-7ldjw\" (UID: \"c9a83886-cd95-45fe-961f-7a273313b310\") " pod="openshift-operators/observability-operator-59bdc8b94-7ldjw" Mar 08 00:46:40.667894 master-0 kubenswrapper[23041]: I0308 00:46:40.667846 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzlh4\" (UniqueName: \"kubernetes.io/projected/c9a83886-cd95-45fe-961f-7a273313b310-kube-api-access-dzlh4\") pod \"observability-operator-59bdc8b94-7ldjw\" (UID: \"c9a83886-cd95-45fe-961f-7a273313b310\") " pod="openshift-operators/observability-operator-59bdc8b94-7ldjw" Mar 08 00:46:40.731794 master-0 kubenswrapper[23041]: I0308 00:46:40.730589 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-7ldjw" Mar 08 00:46:40.750381 master-0 kubenswrapper[23041]: I0308 00:46:40.750333 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkr9w\" (UniqueName: \"kubernetes.io/projected/a6baaf31-c5c2-4642-abbd-a0fd0b48528b-kube-api-access-bkr9w\") pod \"perses-operator-5bf474d74f-rm2zk\" (UID: \"a6baaf31-c5c2-4642-abbd-a0fd0b48528b\") " pod="openshift-operators/perses-operator-5bf474d74f-rm2zk" Mar 08 00:46:40.750575 master-0 kubenswrapper[23041]: I0308 00:46:40.750528 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/a6baaf31-c5c2-4642-abbd-a0fd0b48528b-openshift-service-ca\") pod \"perses-operator-5bf474d74f-rm2zk\" (UID: \"a6baaf31-c5c2-4642-abbd-a0fd0b48528b\") " pod="openshift-operators/perses-operator-5bf474d74f-rm2zk" Mar 08 00:46:40.751413 master-0 kubenswrapper[23041]: I0308 00:46:40.751370 23041 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/a6baaf31-c5c2-4642-abbd-a0fd0b48528b-openshift-service-ca\") pod \"perses-operator-5bf474d74f-rm2zk\" (UID: \"a6baaf31-c5c2-4642-abbd-a0fd0b48528b\") " pod="openshift-operators/perses-operator-5bf474d74f-rm2zk" Mar 08 00:46:40.773144 master-0 kubenswrapper[23041]: I0308 00:46:40.773089 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkr9w\" (UniqueName: \"kubernetes.io/projected/a6baaf31-c5c2-4642-abbd-a0fd0b48528b-kube-api-access-bkr9w\") pod \"perses-operator-5bf474d74f-rm2zk\" (UID: \"a6baaf31-c5c2-4642-abbd-a0fd0b48528b\") " pod="openshift-operators/perses-operator-5bf474d74f-rm2zk" Mar 08 00:46:40.910284 master-0 kubenswrapper[23041]: I0308 00:46:40.910096 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-rm2zk" Mar 08 00:46:43.484273 master-0 kubenswrapper[23041]: I0308 00:46:43.484190 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-7ldjw"] Mar 08 00:46:43.666800 master-0 kubenswrapper[23041]: I0308 00:46:43.666738 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7764df74c5-mfxhq"] Mar 08 00:46:43.959634 master-0 kubenswrapper[23041]: I0308 00:46:43.958117 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-8h4v6"] Mar 08 00:46:43.991511 master-0 kubenswrapper[23041]: I0308 00:46:43.990433 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-rm2zk"] Mar 08 00:46:44.058267 master-0 kubenswrapper[23041]: I0308 00:46:44.055468 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-86db79fc85-g44m9" 
event={"ID":"91aedad8-165a-49f5-9aa2-e87be58d353e","Type":"ContainerStarted","Data":"93ccd58ac97265b64041642c3d2d2c3e1eaf93fae8581e06489328e1c5d37818"} Mar 08 00:46:44.058267 master-0 kubenswrapper[23041]: I0308 00:46:44.056511 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-86db79fc85-g44m9" Mar 08 00:46:44.060118 master-0 kubenswrapper[23041]: I0308 00:46:44.060086 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-8h4v6" event={"ID":"4c115e48-294c-4e9d-aaaa-028b464c9e85","Type":"ContainerStarted","Data":"096fc665466d0fbdb64355df798ecf02a46b217f254ab44edcd15bfb3fc04c74"} Mar 08 00:46:44.064265 master-0 kubenswrapper[23041]: I0308 00:46:44.063607 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-lzbg5"] Mar 08 00:46:44.066733 master-0 kubenswrapper[23041]: W0308 00:46:44.066651 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1683d6f8_ac08_41c3_b6a3_be9d61231bb6.slice/crio-ae71b3437d56c752a9c0f387d7587f333fdf754b6d6ca4fb6c3f8ffeabc5a682 WatchSource:0}: Error finding container ae71b3437d56c752a9c0f387d7587f333fdf754b6d6ca4fb6c3f8ffeabc5a682: Status 404 returned error can't find the container with id ae71b3437d56c752a9c0f387d7587f333fdf754b6d6ca4fb6c3f8ffeabc5a682 Mar 08 00:46:44.067331 master-0 kubenswrapper[23041]: I0308 00:46:44.067300 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-7ldjw" event={"ID":"c9a83886-cd95-45fe-961f-7a273313b310","Type":"ContainerStarted","Data":"08194d586bf00b5e4cafa13228aed4cd199265cf83fcf554d7d379a1d6a8cfa0"} Mar 08 00:46:44.095278 master-0 kubenswrapper[23041]: I0308 00:46:44.089227 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7764df74c5-vtt2x"] Mar 08 00:46:44.113246 master-0 kubenswrapper[23041]: I0308 00:46:44.112446 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7764df74c5-mfxhq" event={"ID":"1a07c6c9-1b5b-4cf9-b807-2d134b67f9c4","Type":"ContainerStarted","Data":"aeabe3c7edcbe2164baa7b922cca23286625459fc023d4f92200b1b3a0c91c6d"} Mar 08 00:46:44.113514 master-0 kubenswrapper[23041]: I0308 00:46:44.113424 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-86db79fc85-g44m9" podStartSLOduration=5.409644263 podStartE2EDuration="13.113406121s" podCreationTimestamp="2026-03-08 00:46:31 +0000 UTC" firstStartedPulling="2026-03-08 00:46:35.239081296 +0000 UTC m=+900.711917850" lastFinishedPulling="2026-03-08 00:46:42.942843154 +0000 UTC m=+908.415679708" observedRunningTime="2026-03-08 00:46:44.112361757 +0000 UTC m=+909.585198331" watchObservedRunningTime="2026-03-08 00:46:44.113406121 +0000 UTC m=+909.586242665" Mar 08 00:46:44.119368 master-0 kubenswrapper[23041]: I0308 00:46:44.119284 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-rm2zk" event={"ID":"a6baaf31-c5c2-4642-abbd-a0fd0b48528b","Type":"ContainerStarted","Data":"5b567aee48e1a9d1cf23e87682939f2b9457f5b548e6e435f3075a30d21e9a8d"} Mar 08 00:46:44.146575 master-0 kubenswrapper[23041]: I0308 00:46:44.146504 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5bc86b5b94-cmsdd" event={"ID":"2d5f2073-ca3c-42a0-820f-25a949718639","Type":"ContainerStarted","Data":"5d5f215d99dafdb6579c0e0c9991907117ef0f333d4b86bc9827bf9336b667ff"} Mar 08 00:46:44.147570 master-0 kubenswrapper[23041]: I0308 00:46:44.147540 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-webhook-server-5bc86b5b94-cmsdd" Mar 08 00:46:44.177674 master-0 kubenswrapper[23041]: I0308 00:46:44.177023 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-qs2gr" event={"ID":"c655a8f2-359b-449c-96f6-1b34aa0ca204","Type":"ContainerStarted","Data":"0091e66cd50d3283a74719b492ee613707fbb4bf75570a85e1ce6b78f5be82c5"} Mar 08 00:46:44.202728 master-0 kubenswrapper[23041]: I0308 00:46:44.202633 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-5bc86b5b94-cmsdd" podStartSLOduration=4.345703362 podStartE2EDuration="12.202236129s" podCreationTimestamp="2026-03-08 00:46:32 +0000 UTC" firstStartedPulling="2026-03-08 00:46:35.108395691 +0000 UTC m=+900.581232245" lastFinishedPulling="2026-03-08 00:46:42.964928458 +0000 UTC m=+908.437765012" observedRunningTime="2026-03-08 00:46:44.195870594 +0000 UTC m=+909.668707148" watchObservedRunningTime="2026-03-08 00:46:44.202236129 +0000 UTC m=+909.675072703" Mar 08 00:46:44.840188 master-0 kubenswrapper[23041]: I0308 00:46:44.840087 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-qs2gr" podStartSLOduration=7.15257867 podStartE2EDuration="14.840068597s" podCreationTimestamp="2026-03-08 00:46:30 +0000 UTC" firstStartedPulling="2026-03-08 00:46:35.241989692 +0000 UTC m=+900.714826246" lastFinishedPulling="2026-03-08 00:46:42.929479619 +0000 UTC m=+908.402316173" observedRunningTime="2026-03-08 00:46:44.226678578 +0000 UTC m=+909.699515162" watchObservedRunningTime="2026-03-08 00:46:44.840068597 +0000 UTC m=+910.312905151" Mar 08 00:46:45.208456 master-0 kubenswrapper[23041]: I0308 00:46:45.205186 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7764df74c5-vtt2x" 
event={"ID":"e0de08e1-7a01-443b-b705-ad74efbe8ef7","Type":"ContainerStarted","Data":"464efe0dacd46aec1ab43092c8c0fdfff58e4221ff4bb646425d4d52f7a817db"} Mar 08 00:46:45.210521 master-0 kubenswrapper[23041]: I0308 00:46:45.210459 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-lzbg5" event={"ID":"1683d6f8-ac08-41c3-b6a3-be9d61231bb6","Type":"ContainerStarted","Data":"ae71b3437d56c752a9c0f387d7587f333fdf754b6d6ca4fb6c3f8ffeabc5a682"} Mar 08 00:46:45.219722 master-0 kubenswrapper[23041]: I0308 00:46:45.219670 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-8h4v6" event={"ID":"4c115e48-294c-4e9d-aaaa-028b464c9e85","Type":"ContainerStarted","Data":"5beca4ae95c592097f72afc9f68c0a10e42cb1560bcda69eee4a58d84b41ad98"} Mar 08 00:46:45.266279 master-0 kubenswrapper[23041]: I0308 00:46:45.266184 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-8h4v6" podStartSLOduration=7.266164629 podStartE2EDuration="7.266164629s" podCreationTimestamp="2026-03-08 00:46:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:46:45.261331439 +0000 UTC m=+910.734168003" watchObservedRunningTime="2026-03-08 00:46:45.266164629 +0000 UTC m=+910.739001173" Mar 08 00:46:53.763226 master-0 kubenswrapper[23041]: I0308 00:46:53.762475 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5bc86b5b94-cmsdd" Mar 08 00:46:57.350730 master-0 kubenswrapper[23041]: I0308 00:46:57.350663 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7764df74c5-vtt2x" 
event={"ID":"e0de08e1-7a01-443b-b705-ad74efbe8ef7","Type":"ContainerStarted","Data":"abdf055793b790fd3e433fb2f4bbe50912ac7466abdb8c5ade5e38e1f8c32306"} Mar 08 00:46:57.352831 master-0 kubenswrapper[23041]: I0308 00:46:57.352778 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-lzbg5" event={"ID":"1683d6f8-ac08-41c3-b6a3-be9d61231bb6","Type":"ContainerStarted","Data":"154365d52547464f933cad7b56b1faf36c8046125df0bf9fc41b2e664379e1f7"} Mar 08 00:46:57.354593 master-0 kubenswrapper[23041]: I0308 00:46:57.354569 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-7ldjw" event={"ID":"c9a83886-cd95-45fe-961f-7a273313b310","Type":"ContainerStarted","Data":"536ea05474b821ba323b323f86bb5d190829b791429d566b9f93f6b519a2171a"} Mar 08 00:46:57.354866 master-0 kubenswrapper[23041]: I0308 00:46:57.354832 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-7ldjw" Mar 08 00:46:57.357329 master-0 kubenswrapper[23041]: I0308 00:46:57.357296 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7764df74c5-mfxhq" event={"ID":"1a07c6c9-1b5b-4cf9-b807-2d134b67f9c4","Type":"ContainerStarted","Data":"c1ec55a152fc5683e9589901db2f6f1c33435219538bacee8a1aa3c251efebc8"} Mar 08 00:46:57.359092 master-0 kubenswrapper[23041]: I0308 00:46:57.359051 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-rm2zk" event={"ID":"a6baaf31-c5c2-4642-abbd-a0fd0b48528b","Type":"ContainerStarted","Data":"9e13ef10deb86a79fa5d049e363e364219106992f914d1a764d38a82b40d7589"} Mar 08 00:46:57.359233 master-0 kubenswrapper[23041]: I0308 00:46:57.359186 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-rm2zk" Mar 08 
00:46:57.563249 master-0 kubenswrapper[23041]: I0308 00:46:57.563164 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-7ldjw" Mar 08 00:46:58.114370 master-0 kubenswrapper[23041]: I0308 00:46:58.114279 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7764df74c5-vtt2x" podStartSLOduration=5.728546002 podStartE2EDuration="18.114253887s" podCreationTimestamp="2026-03-08 00:46:40 +0000 UTC" firstStartedPulling="2026-03-08 00:46:44.118741662 +0000 UTC m=+909.591578216" lastFinishedPulling="2026-03-08 00:46:56.504449547 +0000 UTC m=+921.977286101" observedRunningTime="2026-03-08 00:46:58.102643442 +0000 UTC m=+923.575480066" watchObservedRunningTime="2026-03-08 00:46:58.114253887 +0000 UTC m=+923.587090461" Mar 08 00:46:58.258222 master-0 kubenswrapper[23041]: I0308 00:46:58.257965 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7764df74c5-mfxhq" podStartSLOduration=5.419378471 podStartE2EDuration="18.257936999s" podCreationTimestamp="2026-03-08 00:46:40 +0000 UTC" firstStartedPulling="2026-03-08 00:46:43.641893361 +0000 UTC m=+909.114729915" lastFinishedPulling="2026-03-08 00:46:56.480451889 +0000 UTC m=+921.953288443" observedRunningTime="2026-03-08 00:46:58.247689405 +0000 UTC m=+923.720525959" watchObservedRunningTime="2026-03-08 00:46:58.257936999 +0000 UTC m=+923.730773573" Mar 08 00:46:58.376316 master-0 kubenswrapper[23041]: I0308 00:46:58.376090 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-lzbg5" podStartSLOduration=5.954943193 podStartE2EDuration="18.376054957s" podCreationTimestamp="2026-03-08 00:46:40 +0000 UTC" firstStartedPulling="2026-03-08 00:46:44.082382402 +0000 UTC m=+909.555218956" 
lastFinishedPulling="2026-03-08 00:46:56.503494166 +0000 UTC m=+921.976330720" observedRunningTime="2026-03-08 00:46:58.339319257 +0000 UTC m=+923.812155811" watchObservedRunningTime="2026-03-08 00:46:58.376054957 +0000 UTC m=+923.848891511" Mar 08 00:46:58.383522 master-0 kubenswrapper[23041]: I0308 00:46:58.383445 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-rm2zk" podStartSLOduration=5.904763467 podStartE2EDuration="18.383427045s" podCreationTimestamp="2026-03-08 00:46:40 +0000 UTC" firstStartedPulling="2026-03-08 00:46:44.005548007 +0000 UTC m=+909.478384561" lastFinishedPulling="2026-03-08 00:46:56.484211585 +0000 UTC m=+921.957048139" observedRunningTime="2026-03-08 00:46:58.375999815 +0000 UTC m=+923.848836389" watchObservedRunningTime="2026-03-08 00:46:58.383427045 +0000 UTC m=+923.856263599" Mar 08 00:46:58.401220 master-0 kubenswrapper[23041]: I0308 00:46:58.400773 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-7ldjw" podStartSLOduration=5.333330045 podStartE2EDuration="18.400754901s" podCreationTimestamp="2026-03-08 00:46:40 +0000 UTC" firstStartedPulling="2026-03-08 00:46:43.492845166 +0000 UTC m=+908.965681720" lastFinishedPulling="2026-03-08 00:46:56.560270022 +0000 UTC m=+922.033106576" observedRunningTime="2026-03-08 00:46:58.395835128 +0000 UTC m=+923.868671702" watchObservedRunningTime="2026-03-08 00:46:58.400754901 +0000 UTC m=+923.873591455" Mar 08 00:47:10.914589 master-0 kubenswrapper[23041]: I0308 00:47:10.914483 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-rm2zk" Mar 08 00:47:22.264469 master-0 kubenswrapper[23041]: I0308 00:47:22.264267 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-86db79fc85-g44m9" Mar 08 
00:47:29.544674 master-0 kubenswrapper[23041]: I0308 00:47:29.544595 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-njhxq"] Mar 08 00:47:29.546280 master-0 kubenswrapper[23041]: I0308 00:47:29.546247 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-njhxq" Mar 08 00:47:29.549260 master-0 kubenswrapper[23041]: I0308 00:47:29.549226 23041 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 08 00:47:29.574376 master-0 kubenswrapper[23041]: I0308 00:47:29.574297 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-njhxq"] Mar 08 00:47:29.661360 master-0 kubenswrapper[23041]: I0308 00:47:29.660333 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-vb6dz"] Mar 08 00:47:29.670242 master-0 kubenswrapper[23041]: I0308 00:47:29.665611 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a9da3ea4-66bd-4ea1-8e24-26958bded14c-cert\") pod \"frr-k8s-webhook-server-7f989f654f-njhxq\" (UID: \"a9da3ea4-66bd-4ea1-8e24-26958bded14c\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-njhxq" Mar 08 00:47:29.670846 master-0 kubenswrapper[23041]: I0308 00:47:29.670792 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxpn5\" (UniqueName: \"kubernetes.io/projected/a9da3ea4-66bd-4ea1-8e24-26958bded14c-kube-api-access-zxpn5\") pod \"frr-k8s-webhook-server-7f989f654f-njhxq\" (UID: \"a9da3ea4-66bd-4ea1-8e24-26958bded14c\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-njhxq" Mar 08 00:47:29.710311 master-0 kubenswrapper[23041]: I0308 00:47:29.710241 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-vb6dz" Mar 08 00:47:29.723238 master-0 kubenswrapper[23041]: I0308 00:47:29.721574 23041 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 08 00:47:29.723618 master-0 kubenswrapper[23041]: I0308 00:47:29.723327 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 08 00:47:29.751000 master-0 kubenswrapper[23041]: I0308 00:47:29.750870 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-zhcsd"] Mar 08 00:47:29.757686 master-0 kubenswrapper[23041]: I0308 00:47:29.752252 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-zhcsd" Mar 08 00:47:29.758054 master-0 kubenswrapper[23041]: I0308 00:47:29.757889 23041 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 08 00:47:29.758054 master-0 kubenswrapper[23041]: I0308 00:47:29.757996 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 08 00:47:29.761456 master-0 kubenswrapper[23041]: I0308 00:47:29.760324 23041 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 08 00:47:29.774584 master-0 kubenswrapper[23041]: I0308 00:47:29.773375 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/70492151-82d8-4a5a-bd83-4971d2fec18a-metrics\") pod \"frr-k8s-vb6dz\" (UID: \"70492151-82d8-4a5a-bd83-4971d2fec18a\") " pod="metallb-system/frr-k8s-vb6dz" Mar 08 00:47:29.774584 master-0 kubenswrapper[23041]: I0308 00:47:29.773461 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/70492151-82d8-4a5a-bd83-4971d2fec18a-frr-startup\") pod 
\"frr-k8s-vb6dz\" (UID: \"70492151-82d8-4a5a-bd83-4971d2fec18a\") " pod="metallb-system/frr-k8s-vb6dz" Mar 08 00:47:29.774584 master-0 kubenswrapper[23041]: I0308 00:47:29.773490 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/70492151-82d8-4a5a-bd83-4971d2fec18a-reloader\") pod \"frr-k8s-vb6dz\" (UID: \"70492151-82d8-4a5a-bd83-4971d2fec18a\") " pod="metallb-system/frr-k8s-vb6dz" Mar 08 00:47:29.774584 master-0 kubenswrapper[23041]: I0308 00:47:29.773529 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/70492151-82d8-4a5a-bd83-4971d2fec18a-frr-conf\") pod \"frr-k8s-vb6dz\" (UID: \"70492151-82d8-4a5a-bd83-4971d2fec18a\") " pod="metallb-system/frr-k8s-vb6dz" Mar 08 00:47:29.774584 master-0 kubenswrapper[23041]: I0308 00:47:29.773563 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a9da3ea4-66bd-4ea1-8e24-26958bded14c-cert\") pod \"frr-k8s-webhook-server-7f989f654f-njhxq\" (UID: \"a9da3ea4-66bd-4ea1-8e24-26958bded14c\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-njhxq" Mar 08 00:47:29.774584 master-0 kubenswrapper[23041]: I0308 00:47:29.773614 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxpn5\" (UniqueName: \"kubernetes.io/projected/a9da3ea4-66bd-4ea1-8e24-26958bded14c-kube-api-access-zxpn5\") pod \"frr-k8s-webhook-server-7f989f654f-njhxq\" (UID: \"a9da3ea4-66bd-4ea1-8e24-26958bded14c\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-njhxq" Mar 08 00:47:29.774584 master-0 kubenswrapper[23041]: I0308 00:47:29.773645 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/70492151-82d8-4a5a-bd83-4971d2fec18a-metrics-certs\") pod \"frr-k8s-vb6dz\" (UID: \"70492151-82d8-4a5a-bd83-4971d2fec18a\") " pod="metallb-system/frr-k8s-vb6dz" Mar 08 00:47:29.774584 master-0 kubenswrapper[23041]: I0308 00:47:29.773685 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr89m\" (UniqueName: \"kubernetes.io/projected/70492151-82d8-4a5a-bd83-4971d2fec18a-kube-api-access-rr89m\") pod \"frr-k8s-vb6dz\" (UID: \"70492151-82d8-4a5a-bd83-4971d2fec18a\") " pod="metallb-system/frr-k8s-vb6dz" Mar 08 00:47:29.774584 master-0 kubenswrapper[23041]: I0308 00:47:29.773714 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/70492151-82d8-4a5a-bd83-4971d2fec18a-frr-sockets\") pod \"frr-k8s-vb6dz\" (UID: \"70492151-82d8-4a5a-bd83-4971d2fec18a\") " pod="metallb-system/frr-k8s-vb6dz" Mar 08 00:47:29.774584 master-0 kubenswrapper[23041]: E0308 00:47:29.773924 23041 secret.go:189] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Mar 08 00:47:29.774584 master-0 kubenswrapper[23041]: E0308 00:47:29.773972 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9da3ea4-66bd-4ea1-8e24-26958bded14c-cert podName:a9da3ea4-66bd-4ea1-8e24-26958bded14c nodeName:}" failed. No retries permitted until 2026-03-08 00:47:30.27395363 +0000 UTC m=+955.746790184 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a9da3ea4-66bd-4ea1-8e24-26958bded14c-cert") pod "frr-k8s-webhook-server-7f989f654f-njhxq" (UID: "a9da3ea4-66bd-4ea1-8e24-26958bded14c") : secret "frr-k8s-webhook-server-cert" not found Mar 08 00:47:29.783720 master-0 kubenswrapper[23041]: I0308 00:47:29.783670 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-mpsmp"] Mar 08 00:47:29.785068 master-0 kubenswrapper[23041]: I0308 00:47:29.785040 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-86ddb6bd46-mpsmp" Mar 08 00:47:29.796592 master-0 kubenswrapper[23041]: I0308 00:47:29.793945 23041 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 08 00:47:29.824047 master-0 kubenswrapper[23041]: I0308 00:47:29.823973 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-mpsmp"] Mar 08 00:47:29.830884 master-0 kubenswrapper[23041]: I0308 00:47:29.827157 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxpn5\" (UniqueName: \"kubernetes.io/projected/a9da3ea4-66bd-4ea1-8e24-26958bded14c-kube-api-access-zxpn5\") pod \"frr-k8s-webhook-server-7f989f654f-njhxq\" (UID: \"a9da3ea4-66bd-4ea1-8e24-26958bded14c\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-njhxq" Mar 08 00:47:29.881429 master-0 kubenswrapper[23041]: I0308 00:47:29.881354 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/70492151-82d8-4a5a-bd83-4971d2fec18a-metrics\") pod \"frr-k8s-vb6dz\" (UID: \"70492151-82d8-4a5a-bd83-4971d2fec18a\") " pod="metallb-system/frr-k8s-vb6dz" Mar 08 00:47:29.881429 master-0 kubenswrapper[23041]: I0308 00:47:29.881435 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" 
(UniqueName: \"kubernetes.io/configmap/70492151-82d8-4a5a-bd83-4971d2fec18a-frr-startup\") pod \"frr-k8s-vb6dz\" (UID: \"70492151-82d8-4a5a-bd83-4971d2fec18a\") " pod="metallb-system/frr-k8s-vb6dz" Mar 08 00:47:29.881804 master-0 kubenswrapper[23041]: I0308 00:47:29.881500 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvd2h\" (UniqueName: \"kubernetes.io/projected/74c0b2e9-7baa-4f33-b250-4c1284eada74-kube-api-access-xvd2h\") pod \"controller-86ddb6bd46-mpsmp\" (UID: \"74c0b2e9-7baa-4f33-b250-4c1284eada74\") " pod="metallb-system/controller-86ddb6bd46-mpsmp" Mar 08 00:47:29.881804 master-0 kubenswrapper[23041]: I0308 00:47:29.881539 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/70492151-82d8-4a5a-bd83-4971d2fec18a-reloader\") pod \"frr-k8s-vb6dz\" (UID: \"70492151-82d8-4a5a-bd83-4971d2fec18a\") " pod="metallb-system/frr-k8s-vb6dz" Mar 08 00:47:29.881804 master-0 kubenswrapper[23041]: I0308 00:47:29.881565 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/4eda3bf2-0d01-41dc-89aa-91375f84b6e9-metallb-excludel2\") pod \"speaker-zhcsd\" (UID: \"4eda3bf2-0d01-41dc-89aa-91375f84b6e9\") " pod="metallb-system/speaker-zhcsd" Mar 08 00:47:29.881804 master-0 kubenswrapper[23041]: I0308 00:47:29.881603 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsxfj\" (UniqueName: \"kubernetes.io/projected/4eda3bf2-0d01-41dc-89aa-91375f84b6e9-kube-api-access-tsxfj\") pod \"speaker-zhcsd\" (UID: \"4eda3bf2-0d01-41dc-89aa-91375f84b6e9\") " pod="metallb-system/speaker-zhcsd" Mar 08 00:47:29.881804 master-0 kubenswrapper[23041]: I0308 00:47:29.881626 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" 
(UniqueName: \"kubernetes.io/empty-dir/70492151-82d8-4a5a-bd83-4971d2fec18a-frr-conf\") pod \"frr-k8s-vb6dz\" (UID: \"70492151-82d8-4a5a-bd83-4971d2fec18a\") " pod="metallb-system/frr-k8s-vb6dz" Mar 08 00:47:29.881804 master-0 kubenswrapper[23041]: I0308 00:47:29.881643 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/74c0b2e9-7baa-4f33-b250-4c1284eada74-metrics-certs\") pod \"controller-86ddb6bd46-mpsmp\" (UID: \"74c0b2e9-7baa-4f33-b250-4c1284eada74\") " pod="metallb-system/controller-86ddb6bd46-mpsmp" Mar 08 00:47:29.881804 master-0 kubenswrapper[23041]: I0308 00:47:29.881663 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4eda3bf2-0d01-41dc-89aa-91375f84b6e9-metrics-certs\") pod \"speaker-zhcsd\" (UID: \"4eda3bf2-0d01-41dc-89aa-91375f84b6e9\") " pod="metallb-system/speaker-zhcsd" Mar 08 00:47:29.881804 master-0 kubenswrapper[23041]: I0308 00:47:29.881775 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/70492151-82d8-4a5a-bd83-4971d2fec18a-metrics-certs\") pod \"frr-k8s-vb6dz\" (UID: \"70492151-82d8-4a5a-bd83-4971d2fec18a\") " pod="metallb-system/frr-k8s-vb6dz" Mar 08 00:47:29.881804 master-0 kubenswrapper[23041]: I0308 00:47:29.881809 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74c0b2e9-7baa-4f33-b250-4c1284eada74-cert\") pod \"controller-86ddb6bd46-mpsmp\" (UID: \"74c0b2e9-7baa-4f33-b250-4c1284eada74\") " pod="metallb-system/controller-86ddb6bd46-mpsmp" Mar 08 00:47:29.882110 master-0 kubenswrapper[23041]: I0308 00:47:29.881845 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr89m\" (UniqueName: 
\"kubernetes.io/projected/70492151-82d8-4a5a-bd83-4971d2fec18a-kube-api-access-rr89m\") pod \"frr-k8s-vb6dz\" (UID: \"70492151-82d8-4a5a-bd83-4971d2fec18a\") " pod="metallb-system/frr-k8s-vb6dz" Mar 08 00:47:29.882110 master-0 kubenswrapper[23041]: I0308 00:47:29.881896 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/70492151-82d8-4a5a-bd83-4971d2fec18a-frr-sockets\") pod \"frr-k8s-vb6dz\" (UID: \"70492151-82d8-4a5a-bd83-4971d2fec18a\") " pod="metallb-system/frr-k8s-vb6dz" Mar 08 00:47:29.882110 master-0 kubenswrapper[23041]: I0308 00:47:29.881938 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4eda3bf2-0d01-41dc-89aa-91375f84b6e9-memberlist\") pod \"speaker-zhcsd\" (UID: \"4eda3bf2-0d01-41dc-89aa-91375f84b6e9\") " pod="metallb-system/speaker-zhcsd" Mar 08 00:47:29.884670 master-0 kubenswrapper[23041]: I0308 00:47:29.883038 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/70492151-82d8-4a5a-bd83-4971d2fec18a-metrics\") pod \"frr-k8s-vb6dz\" (UID: \"70492151-82d8-4a5a-bd83-4971d2fec18a\") " pod="metallb-system/frr-k8s-vb6dz" Mar 08 00:47:29.884670 master-0 kubenswrapper[23041]: I0308 00:47:29.883966 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/70492151-82d8-4a5a-bd83-4971d2fec18a-frr-startup\") pod \"frr-k8s-vb6dz\" (UID: \"70492151-82d8-4a5a-bd83-4971d2fec18a\") " pod="metallb-system/frr-k8s-vb6dz" Mar 08 00:47:29.884670 master-0 kubenswrapper[23041]: I0308 00:47:29.884167 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/70492151-82d8-4a5a-bd83-4971d2fec18a-reloader\") pod \"frr-k8s-vb6dz\" (UID: \"70492151-82d8-4a5a-bd83-4971d2fec18a\") " 
pod="metallb-system/frr-k8s-vb6dz" Mar 08 00:47:29.884670 master-0 kubenswrapper[23041]: I0308 00:47:29.884585 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/70492151-82d8-4a5a-bd83-4971d2fec18a-frr-conf\") pod \"frr-k8s-vb6dz\" (UID: \"70492151-82d8-4a5a-bd83-4971d2fec18a\") " pod="metallb-system/frr-k8s-vb6dz" Mar 08 00:47:29.885523 master-0 kubenswrapper[23041]: I0308 00:47:29.885311 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/70492151-82d8-4a5a-bd83-4971d2fec18a-frr-sockets\") pod \"frr-k8s-vb6dz\" (UID: \"70492151-82d8-4a5a-bd83-4971d2fec18a\") " pod="metallb-system/frr-k8s-vb6dz" Mar 08 00:47:29.889064 master-0 kubenswrapper[23041]: I0308 00:47:29.889031 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/70492151-82d8-4a5a-bd83-4971d2fec18a-metrics-certs\") pod \"frr-k8s-vb6dz\" (UID: \"70492151-82d8-4a5a-bd83-4971d2fec18a\") " pod="metallb-system/frr-k8s-vb6dz" Mar 08 00:47:29.903160 master-0 kubenswrapper[23041]: I0308 00:47:29.903108 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr89m\" (UniqueName: \"kubernetes.io/projected/70492151-82d8-4a5a-bd83-4971d2fec18a-kube-api-access-rr89m\") pod \"frr-k8s-vb6dz\" (UID: \"70492151-82d8-4a5a-bd83-4971d2fec18a\") " pod="metallb-system/frr-k8s-vb6dz" Mar 08 00:47:29.983799 master-0 kubenswrapper[23041]: I0308 00:47:29.983702 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/74c0b2e9-7baa-4f33-b250-4c1284eada74-metrics-certs\") pod \"controller-86ddb6bd46-mpsmp\" (UID: \"74c0b2e9-7baa-4f33-b250-4c1284eada74\") " pod="metallb-system/controller-86ddb6bd46-mpsmp" Mar 08 00:47:29.983799 master-0 kubenswrapper[23041]: I0308 00:47:29.983791 23041 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4eda3bf2-0d01-41dc-89aa-91375f84b6e9-metrics-certs\") pod \"speaker-zhcsd\" (UID: \"4eda3bf2-0d01-41dc-89aa-91375f84b6e9\") " pod="metallb-system/speaker-zhcsd" Mar 08 00:47:29.984180 master-0 kubenswrapper[23041]: I0308 00:47:29.983891 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74c0b2e9-7baa-4f33-b250-4c1284eada74-cert\") pod \"controller-86ddb6bd46-mpsmp\" (UID: \"74c0b2e9-7baa-4f33-b250-4c1284eada74\") " pod="metallb-system/controller-86ddb6bd46-mpsmp" Mar 08 00:47:29.984180 master-0 kubenswrapper[23041]: I0308 00:47:29.983942 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4eda3bf2-0d01-41dc-89aa-91375f84b6e9-memberlist\") pod \"speaker-zhcsd\" (UID: \"4eda3bf2-0d01-41dc-89aa-91375f84b6e9\") " pod="metallb-system/speaker-zhcsd" Mar 08 00:47:29.984180 master-0 kubenswrapper[23041]: I0308 00:47:29.984032 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvd2h\" (UniqueName: \"kubernetes.io/projected/74c0b2e9-7baa-4f33-b250-4c1284eada74-kube-api-access-xvd2h\") pod \"controller-86ddb6bd46-mpsmp\" (UID: \"74c0b2e9-7baa-4f33-b250-4c1284eada74\") " pod="metallb-system/controller-86ddb6bd46-mpsmp" Mar 08 00:47:29.984180 master-0 kubenswrapper[23041]: I0308 00:47:29.984061 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/4eda3bf2-0d01-41dc-89aa-91375f84b6e9-metallb-excludel2\") pod \"speaker-zhcsd\" (UID: \"4eda3bf2-0d01-41dc-89aa-91375f84b6e9\") " pod="metallb-system/speaker-zhcsd" Mar 08 00:47:29.984180 master-0 kubenswrapper[23041]: I0308 00:47:29.984094 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-tsxfj\" (UniqueName: \"kubernetes.io/projected/4eda3bf2-0d01-41dc-89aa-91375f84b6e9-kube-api-access-tsxfj\") pod \"speaker-zhcsd\" (UID: \"4eda3bf2-0d01-41dc-89aa-91375f84b6e9\") " pod="metallb-system/speaker-zhcsd" Mar 08 00:47:29.984897 master-0 kubenswrapper[23041]: E0308 00:47:29.984715 23041 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 08 00:47:29.984897 master-0 kubenswrapper[23041]: E0308 00:47:29.984841 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4eda3bf2-0d01-41dc-89aa-91375f84b6e9-memberlist podName:4eda3bf2-0d01-41dc-89aa-91375f84b6e9 nodeName:}" failed. No retries permitted until 2026-03-08 00:47:30.484817876 +0000 UTC m=+955.957654520 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/4eda3bf2-0d01-41dc-89aa-91375f84b6e9-memberlist") pod "speaker-zhcsd" (UID: "4eda3bf2-0d01-41dc-89aa-91375f84b6e9") : secret "metallb-memberlist" not found Mar 08 00:47:29.988434 master-0 kubenswrapper[23041]: I0308 00:47:29.985486 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/4eda3bf2-0d01-41dc-89aa-91375f84b6e9-metallb-excludel2\") pod \"speaker-zhcsd\" (UID: \"4eda3bf2-0d01-41dc-89aa-91375f84b6e9\") " pod="metallb-system/speaker-zhcsd" Mar 08 00:47:29.988949 master-0 kubenswrapper[23041]: I0308 00:47:29.988875 23041 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 08 00:47:29.989311 master-0 kubenswrapper[23041]: I0308 00:47:29.989276 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4eda3bf2-0d01-41dc-89aa-91375f84b6e9-metrics-certs\") pod \"speaker-zhcsd\" (UID: \"4eda3bf2-0d01-41dc-89aa-91375f84b6e9\") " pod="metallb-system/speaker-zhcsd" Mar 08 
00:47:29.993279 master-0 kubenswrapper[23041]: I0308 00:47:29.992262 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/74c0b2e9-7baa-4f33-b250-4c1284eada74-metrics-certs\") pod \"controller-86ddb6bd46-mpsmp\" (UID: \"74c0b2e9-7baa-4f33-b250-4c1284eada74\") " pod="metallb-system/controller-86ddb6bd46-mpsmp" Mar 08 00:47:30.002155 master-0 kubenswrapper[23041]: I0308 00:47:30.002101 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74c0b2e9-7baa-4f33-b250-4c1284eada74-cert\") pod \"controller-86ddb6bd46-mpsmp\" (UID: \"74c0b2e9-7baa-4f33-b250-4c1284eada74\") " pod="metallb-system/controller-86ddb6bd46-mpsmp" Mar 08 00:47:30.004924 master-0 kubenswrapper[23041]: I0308 00:47:30.004876 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvd2h\" (UniqueName: \"kubernetes.io/projected/74c0b2e9-7baa-4f33-b250-4c1284eada74-kube-api-access-xvd2h\") pod \"controller-86ddb6bd46-mpsmp\" (UID: \"74c0b2e9-7baa-4f33-b250-4c1284eada74\") " pod="metallb-system/controller-86ddb6bd46-mpsmp" Mar 08 00:47:30.007567 master-0 kubenswrapper[23041]: I0308 00:47:30.007532 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsxfj\" (UniqueName: \"kubernetes.io/projected/4eda3bf2-0d01-41dc-89aa-91375f84b6e9-kube-api-access-tsxfj\") pod \"speaker-zhcsd\" (UID: \"4eda3bf2-0d01-41dc-89aa-91375f84b6e9\") " pod="metallb-system/speaker-zhcsd" Mar 08 00:47:30.100154 master-0 kubenswrapper[23041]: I0308 00:47:30.099911 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-vb6dz" Mar 08 00:47:30.172244 master-0 kubenswrapper[23041]: I0308 00:47:30.172147 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-mpsmp" Mar 08 00:47:30.289849 master-0 kubenswrapper[23041]: I0308 00:47:30.289767 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a9da3ea4-66bd-4ea1-8e24-26958bded14c-cert\") pod \"frr-k8s-webhook-server-7f989f654f-njhxq\" (UID: \"a9da3ea4-66bd-4ea1-8e24-26958bded14c\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-njhxq" Mar 08 00:47:30.294715 master-0 kubenswrapper[23041]: I0308 00:47:30.294647 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a9da3ea4-66bd-4ea1-8e24-26958bded14c-cert\") pod \"frr-k8s-webhook-server-7f989f654f-njhxq\" (UID: \"a9da3ea4-66bd-4ea1-8e24-26958bded14c\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-njhxq" Mar 08 00:47:30.475765 master-0 kubenswrapper[23041]: I0308 00:47:30.475576 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-njhxq" Mar 08 00:47:30.493618 master-0 kubenswrapper[23041]: I0308 00:47:30.493520 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4eda3bf2-0d01-41dc-89aa-91375f84b6e9-memberlist\") pod \"speaker-zhcsd\" (UID: \"4eda3bf2-0d01-41dc-89aa-91375f84b6e9\") " pod="metallb-system/speaker-zhcsd" Mar 08 00:47:30.493892 master-0 kubenswrapper[23041]: E0308 00:47:30.493733 23041 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 08 00:47:30.493892 master-0 kubenswrapper[23041]: E0308 00:47:30.493838 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4eda3bf2-0d01-41dc-89aa-91375f84b6e9-memberlist podName:4eda3bf2-0d01-41dc-89aa-91375f84b6e9 nodeName:}" failed. 
No retries permitted until 2026-03-08 00:47:31.493811912 +0000 UTC m=+956.966648466 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/4eda3bf2-0d01-41dc-89aa-91375f84b6e9-memberlist") pod "speaker-zhcsd" (UID: "4eda3bf2-0d01-41dc-89aa-91375f84b6e9") : secret "metallb-memberlist" not found Mar 08 00:47:31.090015 master-0 kubenswrapper[23041]: I0308 00:47:31.085903 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vb6dz" event={"ID":"70492151-82d8-4a5a-bd83-4971d2fec18a","Type":"ContainerStarted","Data":"9a684406245dc6fc971848810dc7fb5caf431762dd18e32b987ab7bc901244a3"} Mar 08 00:47:31.140039 master-0 kubenswrapper[23041]: I0308 00:47:31.132414 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-mpsmp"] Mar 08 00:47:31.153133 master-0 kubenswrapper[23041]: W0308 00:47:31.152865 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74c0b2e9_7baa_4f33_b250_4c1284eada74.slice/crio-0308887e3bfd8968c196ce9ef5872117e2848f8632b7d704caf9e7f1a0eda5e8 WatchSource:0}: Error finding container 0308887e3bfd8968c196ce9ef5872117e2848f8632b7d704caf9e7f1a0eda5e8: Status 404 returned error can't find the container with id 0308887e3bfd8968c196ce9ef5872117e2848f8632b7d704caf9e7f1a0eda5e8 Mar 08 00:47:31.436732 master-0 kubenswrapper[23041]: W0308 00:47:31.436646 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9da3ea4_66bd_4ea1_8e24_26958bded14c.slice/crio-1fb1b4864862fd9f238882cc100e47b71331052c22cad091e8428473a03d1b93 WatchSource:0}: Error finding container 1fb1b4864862fd9f238882cc100e47b71331052c22cad091e8428473a03d1b93: Status 404 returned error can't find the container with id 1fb1b4864862fd9f238882cc100e47b71331052c22cad091e8428473a03d1b93 Mar 08 00:47:31.439389 master-0 
kubenswrapper[23041]: I0308 00:47:31.439331 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-njhxq"] Mar 08 00:47:31.585951 master-0 kubenswrapper[23041]: I0308 00:47:31.585429 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4eda3bf2-0d01-41dc-89aa-91375f84b6e9-memberlist\") pod \"speaker-zhcsd\" (UID: \"4eda3bf2-0d01-41dc-89aa-91375f84b6e9\") " pod="metallb-system/speaker-zhcsd" Mar 08 00:47:31.606609 master-0 kubenswrapper[23041]: I0308 00:47:31.606554 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4eda3bf2-0d01-41dc-89aa-91375f84b6e9-memberlist\") pod \"speaker-zhcsd\" (UID: \"4eda3bf2-0d01-41dc-89aa-91375f84b6e9\") " pod="metallb-system/speaker-zhcsd" Mar 08 00:47:31.611285 master-0 kubenswrapper[23041]: I0308 00:47:31.609731 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-xln25"] Mar 08 00:47:31.613009 master-0 kubenswrapper[23041]: I0308 00:47:31.612442 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-xln25" Mar 08 00:47:31.634900 master-0 kubenswrapper[23041]: I0308 00:47:31.634838 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-zhcsd" Mar 08 00:47:31.635138 master-0 kubenswrapper[23041]: I0308 00:47:31.634954 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-v5hhx"] Mar 08 00:47:31.640589 master-0 kubenswrapper[23041]: I0308 00:47:31.640524 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-v5hhx" Mar 08 00:47:31.656277 master-0 kubenswrapper[23041]: I0308 00:47:31.655065 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 08 00:47:31.661407 master-0 kubenswrapper[23041]: I0308 00:47:31.660292 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-v5hhx"] Mar 08 00:47:31.678625 master-0 kubenswrapper[23041]: I0308 00:47:31.675579 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-d7nd4"] Mar 08 00:47:31.678625 master-0 kubenswrapper[23041]: I0308 00:47:31.676966 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-d7nd4" Mar 08 00:47:31.688730 master-0 kubenswrapper[23041]: I0308 00:47:31.687104 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldtjn\" (UniqueName: \"kubernetes.io/projected/455c5b8f-0d89-4e55-9785-7285656f7cfe-kube-api-access-ldtjn\") pod \"nmstate-webhook-786f45cff4-v5hhx\" (UID: \"455c5b8f-0d89-4e55-9785-7285656f7cfe\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-v5hhx" Mar 08 00:47:31.688730 master-0 kubenswrapper[23041]: I0308 00:47:31.687243 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx2z4\" (UniqueName: \"kubernetes.io/projected/3faca2f4-386d-446c-9f19-8534d892941c-kube-api-access-qx2z4\") pod \"nmstate-metrics-69594cc75-xln25\" (UID: \"3faca2f4-386d-446c-9f19-8534d892941c\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-xln25" Mar 08 00:47:31.688730 master-0 kubenswrapper[23041]: I0308 00:47:31.687298 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: 
\"kubernetes.io/secret/455c5b8f-0d89-4e55-9785-7285656f7cfe-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-v5hhx\" (UID: \"455c5b8f-0d89-4e55-9785-7285656f7cfe\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-v5hhx" Mar 08 00:47:31.698662 master-0 kubenswrapper[23041]: I0308 00:47:31.698276 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-xln25"] Mar 08 00:47:31.799420 master-0 kubenswrapper[23041]: I0308 00:47:31.797931 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7e7d451d-96a5-4aaf-b44f-c6959ae142ea-nmstate-lock\") pod \"nmstate-handler-d7nd4\" (UID: \"7e7d451d-96a5-4aaf-b44f-c6959ae142ea\") " pod="openshift-nmstate/nmstate-handler-d7nd4" Mar 08 00:47:31.799420 master-0 kubenswrapper[23041]: I0308 00:47:31.798010 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx2z4\" (UniqueName: \"kubernetes.io/projected/3faca2f4-386d-446c-9f19-8534d892941c-kube-api-access-qx2z4\") pod \"nmstate-metrics-69594cc75-xln25\" (UID: \"3faca2f4-386d-446c-9f19-8534d892941c\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-xln25" Mar 08 00:47:31.799420 master-0 kubenswrapper[23041]: I0308 00:47:31.798061 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28hb5\" (UniqueName: \"kubernetes.io/projected/7e7d451d-96a5-4aaf-b44f-c6959ae142ea-kube-api-access-28hb5\") pod \"nmstate-handler-d7nd4\" (UID: \"7e7d451d-96a5-4aaf-b44f-c6959ae142ea\") " pod="openshift-nmstate/nmstate-handler-d7nd4" Mar 08 00:47:31.799420 master-0 kubenswrapper[23041]: I0308 00:47:31.798109 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/455c5b8f-0d89-4e55-9785-7285656f7cfe-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-v5hhx\" (UID: 
\"455c5b8f-0d89-4e55-9785-7285656f7cfe\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-v5hhx" Mar 08 00:47:31.799420 master-0 kubenswrapper[23041]: I0308 00:47:31.798153 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7e7d451d-96a5-4aaf-b44f-c6959ae142ea-dbus-socket\") pod \"nmstate-handler-d7nd4\" (UID: \"7e7d451d-96a5-4aaf-b44f-c6959ae142ea\") " pod="openshift-nmstate/nmstate-handler-d7nd4" Mar 08 00:47:31.799420 master-0 kubenswrapper[23041]: I0308 00:47:31.798220 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7e7d451d-96a5-4aaf-b44f-c6959ae142ea-ovs-socket\") pod \"nmstate-handler-d7nd4\" (UID: \"7e7d451d-96a5-4aaf-b44f-c6959ae142ea\") " pod="openshift-nmstate/nmstate-handler-d7nd4" Mar 08 00:47:31.799420 master-0 kubenswrapper[23041]: I0308 00:47:31.798245 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldtjn\" (UniqueName: \"kubernetes.io/projected/455c5b8f-0d89-4e55-9785-7285656f7cfe-kube-api-access-ldtjn\") pod \"nmstate-webhook-786f45cff4-v5hhx\" (UID: \"455c5b8f-0d89-4e55-9785-7285656f7cfe\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-v5hhx" Mar 08 00:47:31.816937 master-0 kubenswrapper[23041]: I0308 00:47:31.814810 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/455c5b8f-0d89-4e55-9785-7285656f7cfe-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-v5hhx\" (UID: \"455c5b8f-0d89-4e55-9785-7285656f7cfe\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-v5hhx" Mar 08 00:47:31.824774 master-0 kubenswrapper[23041]: I0308 00:47:31.824678 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5bcf4"] Mar 08 00:47:31.826645 master-0 
kubenswrapper[23041]: I0308 00:47:31.826603 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5bcf4" Mar 08 00:47:31.832561 master-0 kubenswrapper[23041]: I0308 00:47:31.831103 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 08 00:47:31.846942 master-0 kubenswrapper[23041]: I0308 00:47:31.833039 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 08 00:47:31.846942 master-0 kubenswrapper[23041]: I0308 00:47:31.833905 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx2z4\" (UniqueName: \"kubernetes.io/projected/3faca2f4-386d-446c-9f19-8534d892941c-kube-api-access-qx2z4\") pod \"nmstate-metrics-69594cc75-xln25\" (UID: \"3faca2f4-386d-446c-9f19-8534d892941c\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-xln25" Mar 08 00:47:31.872154 master-0 kubenswrapper[23041]: I0308 00:47:31.872095 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldtjn\" (UniqueName: \"kubernetes.io/projected/455c5b8f-0d89-4e55-9785-7285656f7cfe-kube-api-access-ldtjn\") pod \"nmstate-webhook-786f45cff4-v5hhx\" (UID: \"455c5b8f-0d89-4e55-9785-7285656f7cfe\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-v5hhx" Mar 08 00:47:31.872537 master-0 kubenswrapper[23041]: I0308 00:47:31.872175 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5bcf4"] Mar 08 00:47:31.912953 master-0 kubenswrapper[23041]: I0308 00:47:31.909109 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b4cfd42-515f-4390-9e98-42361d3f96d9-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-5bcf4\" (UID: \"2b4cfd42-515f-4390-9e98-42361d3f96d9\") " 
pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5bcf4" Mar 08 00:47:31.912953 master-0 kubenswrapper[23041]: I0308 00:47:31.909197 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7e7d451d-96a5-4aaf-b44f-c6959ae142ea-dbus-socket\") pod \"nmstate-handler-d7nd4\" (UID: \"7e7d451d-96a5-4aaf-b44f-c6959ae142ea\") " pod="openshift-nmstate/nmstate-handler-d7nd4" Mar 08 00:47:31.912953 master-0 kubenswrapper[23041]: I0308 00:47:31.909251 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz6dc\" (UniqueName: \"kubernetes.io/projected/2b4cfd42-515f-4390-9e98-42361d3f96d9-kube-api-access-cz6dc\") pod \"nmstate-console-plugin-5dcbbd79cf-5bcf4\" (UID: \"2b4cfd42-515f-4390-9e98-42361d3f96d9\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5bcf4" Mar 08 00:47:31.912953 master-0 kubenswrapper[23041]: I0308 00:47:31.909324 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7e7d451d-96a5-4aaf-b44f-c6959ae142ea-ovs-socket\") pod \"nmstate-handler-d7nd4\" (UID: \"7e7d451d-96a5-4aaf-b44f-c6959ae142ea\") " pod="openshift-nmstate/nmstate-handler-d7nd4" Mar 08 00:47:31.912953 master-0 kubenswrapper[23041]: I0308 00:47:31.909424 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7e7d451d-96a5-4aaf-b44f-c6959ae142ea-nmstate-lock\") pod \"nmstate-handler-d7nd4\" (UID: \"7e7d451d-96a5-4aaf-b44f-c6959ae142ea\") " pod="openshift-nmstate/nmstate-handler-d7nd4" Mar 08 00:47:31.912953 master-0 kubenswrapper[23041]: I0308 00:47:31.909484 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28hb5\" (UniqueName: \"kubernetes.io/projected/7e7d451d-96a5-4aaf-b44f-c6959ae142ea-kube-api-access-28hb5\") pod 
\"nmstate-handler-d7nd4\" (UID: \"7e7d451d-96a5-4aaf-b44f-c6959ae142ea\") " pod="openshift-nmstate/nmstate-handler-d7nd4" Mar 08 00:47:31.912953 master-0 kubenswrapper[23041]: I0308 00:47:31.909517 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2b4cfd42-515f-4390-9e98-42361d3f96d9-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-5bcf4\" (UID: \"2b4cfd42-515f-4390-9e98-42361d3f96d9\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5bcf4" Mar 08 00:47:31.912953 master-0 kubenswrapper[23041]: I0308 00:47:31.909724 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7e7d451d-96a5-4aaf-b44f-c6959ae142ea-dbus-socket\") pod \"nmstate-handler-d7nd4\" (UID: \"7e7d451d-96a5-4aaf-b44f-c6959ae142ea\") " pod="openshift-nmstate/nmstate-handler-d7nd4" Mar 08 00:47:31.912953 master-0 kubenswrapper[23041]: I0308 00:47:31.910106 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7e7d451d-96a5-4aaf-b44f-c6959ae142ea-ovs-socket\") pod \"nmstate-handler-d7nd4\" (UID: \"7e7d451d-96a5-4aaf-b44f-c6959ae142ea\") " pod="openshift-nmstate/nmstate-handler-d7nd4" Mar 08 00:47:31.912953 master-0 kubenswrapper[23041]: I0308 00:47:31.910665 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7e7d451d-96a5-4aaf-b44f-c6959ae142ea-nmstate-lock\") pod \"nmstate-handler-d7nd4\" (UID: \"7e7d451d-96a5-4aaf-b44f-c6959ae142ea\") " pod="openshift-nmstate/nmstate-handler-d7nd4" Mar 08 00:47:31.956135 master-0 kubenswrapper[23041]: I0308 00:47:31.956080 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28hb5\" (UniqueName: \"kubernetes.io/projected/7e7d451d-96a5-4aaf-b44f-c6959ae142ea-kube-api-access-28hb5\") pod 
\"nmstate-handler-d7nd4\" (UID: \"7e7d451d-96a5-4aaf-b44f-c6959ae142ea\") " pod="openshift-nmstate/nmstate-handler-d7nd4" Mar 08 00:47:31.985797 master-0 kubenswrapper[23041]: I0308 00:47:31.985731 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-xln25" Mar 08 00:47:32.002616 master-0 kubenswrapper[23041]: I0308 00:47:31.994880 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-v5hhx" Mar 08 00:47:32.015288 master-0 kubenswrapper[23041]: I0308 00:47:32.013995 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz6dc\" (UniqueName: \"kubernetes.io/projected/2b4cfd42-515f-4390-9e98-42361d3f96d9-kube-api-access-cz6dc\") pod \"nmstate-console-plugin-5dcbbd79cf-5bcf4\" (UID: \"2b4cfd42-515f-4390-9e98-42361d3f96d9\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5bcf4" Mar 08 00:47:32.015288 master-0 kubenswrapper[23041]: I0308 00:47:32.014153 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2b4cfd42-515f-4390-9e98-42361d3f96d9-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-5bcf4\" (UID: \"2b4cfd42-515f-4390-9e98-42361d3f96d9\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5bcf4" Mar 08 00:47:32.015288 master-0 kubenswrapper[23041]: I0308 00:47:32.014180 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b4cfd42-515f-4390-9e98-42361d3f96d9-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-5bcf4\" (UID: \"2b4cfd42-515f-4390-9e98-42361d3f96d9\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5bcf4" Mar 08 00:47:32.023020 master-0 kubenswrapper[23041]: I0308 00:47:32.018404 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2b4cfd42-515f-4390-9e98-42361d3f96d9-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-5bcf4\" (UID: \"2b4cfd42-515f-4390-9e98-42361d3f96d9\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5bcf4" Mar 08 00:47:32.023020 master-0 kubenswrapper[23041]: I0308 00:47:32.018785 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-d7nd4" Mar 08 00:47:32.027351 master-0 kubenswrapper[23041]: I0308 00:47:32.027218 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b4cfd42-515f-4390-9e98-42361d3f96d9-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-5bcf4\" (UID: \"2b4cfd42-515f-4390-9e98-42361d3f96d9\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5bcf4" Mar 08 00:47:32.046807 master-0 kubenswrapper[23041]: I0308 00:47:32.046718 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-cb5f6487-gmcnf"] Mar 08 00:47:32.051220 master-0 kubenswrapper[23041]: I0308 00:47:32.048855 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-cb5f6487-gmcnf" Mar 08 00:47:32.054348 master-0 kubenswrapper[23041]: I0308 00:47:32.054302 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz6dc\" (UniqueName: \"kubernetes.io/projected/2b4cfd42-515f-4390-9e98-42361d3f96d9-kube-api-access-cz6dc\") pod \"nmstate-console-plugin-5dcbbd79cf-5bcf4\" (UID: \"2b4cfd42-515f-4390-9e98-42361d3f96d9\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5bcf4" Mar 08 00:47:32.060708 master-0 kubenswrapper[23041]: I0308 00:47:32.059714 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cb5f6487-gmcnf"] Mar 08 00:47:32.115810 master-0 kubenswrapper[23041]: I0308 00:47:32.115671 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b93005f2-add9-4e9e-b005-383337ae1a05-console-oauth-config\") pod \"console-cb5f6487-gmcnf\" (UID: \"b93005f2-add9-4e9e-b005-383337ae1a05\") " pod="openshift-console/console-cb5f6487-gmcnf" Mar 08 00:47:32.115810 master-0 kubenswrapper[23041]: I0308 00:47:32.115725 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b93005f2-add9-4e9e-b005-383337ae1a05-oauth-serving-cert\") pod \"console-cb5f6487-gmcnf\" (UID: \"b93005f2-add9-4e9e-b005-383337ae1a05\") " pod="openshift-console/console-cb5f6487-gmcnf" Mar 08 00:47:32.115810 master-0 kubenswrapper[23041]: I0308 00:47:32.115751 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b93005f2-add9-4e9e-b005-383337ae1a05-trusted-ca-bundle\") pod \"console-cb5f6487-gmcnf\" (UID: \"b93005f2-add9-4e9e-b005-383337ae1a05\") " pod="openshift-console/console-cb5f6487-gmcnf" Mar 08 00:47:32.115810 master-0 
kubenswrapper[23041]: I0308 00:47:32.115768 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b93005f2-add9-4e9e-b005-383337ae1a05-console-config\") pod \"console-cb5f6487-gmcnf\" (UID: \"b93005f2-add9-4e9e-b005-383337ae1a05\") " pod="openshift-console/console-cb5f6487-gmcnf" Mar 08 00:47:32.115810 master-0 kubenswrapper[23041]: I0308 00:47:32.115801 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjnmf\" (UniqueName: \"kubernetes.io/projected/b93005f2-add9-4e9e-b005-383337ae1a05-kube-api-access-qjnmf\") pod \"console-cb5f6487-gmcnf\" (UID: \"b93005f2-add9-4e9e-b005-383337ae1a05\") " pod="openshift-console/console-cb5f6487-gmcnf" Mar 08 00:47:32.119976 master-0 kubenswrapper[23041]: I0308 00:47:32.115829 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b93005f2-add9-4e9e-b005-383337ae1a05-service-ca\") pod \"console-cb5f6487-gmcnf\" (UID: \"b93005f2-add9-4e9e-b005-383337ae1a05\") " pod="openshift-console/console-cb5f6487-gmcnf" Mar 08 00:47:32.119976 master-0 kubenswrapper[23041]: I0308 00:47:32.115846 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b93005f2-add9-4e9e-b005-383337ae1a05-console-serving-cert\") pod \"console-cb5f6487-gmcnf\" (UID: \"b93005f2-add9-4e9e-b005-383337ae1a05\") " pod="openshift-console/console-cb5f6487-gmcnf" Mar 08 00:47:32.142395 master-0 kubenswrapper[23041]: I0308 00:47:32.142242 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zhcsd" event={"ID":"4eda3bf2-0d01-41dc-89aa-91375f84b6e9","Type":"ContainerStarted","Data":"803cf90384ae73766e63ce9d15d4e0a02ce1a3dc819935e2c319beeb5cbb52a5"} Mar 08 00:47:32.146640 
master-0 kubenswrapper[23041]: I0308 00:47:32.146570 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-njhxq" event={"ID":"a9da3ea4-66bd-4ea1-8e24-26958bded14c","Type":"ContainerStarted","Data":"1fb1b4864862fd9f238882cc100e47b71331052c22cad091e8428473a03d1b93"} Mar 08 00:47:32.149685 master-0 kubenswrapper[23041]: I0308 00:47:32.149621 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-mpsmp" event={"ID":"74c0b2e9-7baa-4f33-b250-4c1284eada74","Type":"ContainerStarted","Data":"bdfffd2401f6e5112bb158833076db507c84061fa27284427a3ee2577d4207ee"} Mar 08 00:47:32.149886 master-0 kubenswrapper[23041]: I0308 00:47:32.149702 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-mpsmp" event={"ID":"74c0b2e9-7baa-4f33-b250-4c1284eada74","Type":"ContainerStarted","Data":"0308887e3bfd8968c196ce9ef5872117e2848f8632b7d704caf9e7f1a0eda5e8"} Mar 08 00:47:32.192750 master-0 kubenswrapper[23041]: I0308 00:47:32.192683 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5bcf4" Mar 08 00:47:32.223981 master-0 kubenswrapper[23041]: I0308 00:47:32.223814 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b93005f2-add9-4e9e-b005-383337ae1a05-service-ca\") pod \"console-cb5f6487-gmcnf\" (UID: \"b93005f2-add9-4e9e-b005-383337ae1a05\") " pod="openshift-console/console-cb5f6487-gmcnf" Mar 08 00:47:32.223981 master-0 kubenswrapper[23041]: I0308 00:47:32.223870 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b93005f2-add9-4e9e-b005-383337ae1a05-console-serving-cert\") pod \"console-cb5f6487-gmcnf\" (UID: \"b93005f2-add9-4e9e-b005-383337ae1a05\") " pod="openshift-console/console-cb5f6487-gmcnf" Mar 08 00:47:32.225319 master-0 kubenswrapper[23041]: I0308 00:47:32.224480 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b93005f2-add9-4e9e-b005-383337ae1a05-console-oauth-config\") pod \"console-cb5f6487-gmcnf\" (UID: \"b93005f2-add9-4e9e-b005-383337ae1a05\") " pod="openshift-console/console-cb5f6487-gmcnf" Mar 08 00:47:32.225319 master-0 kubenswrapper[23041]: I0308 00:47:32.224511 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b93005f2-add9-4e9e-b005-383337ae1a05-oauth-serving-cert\") pod \"console-cb5f6487-gmcnf\" (UID: \"b93005f2-add9-4e9e-b005-383337ae1a05\") " pod="openshift-console/console-cb5f6487-gmcnf" Mar 08 00:47:32.225319 master-0 kubenswrapper[23041]: I0308 00:47:32.224533 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b93005f2-add9-4e9e-b005-383337ae1a05-trusted-ca-bundle\") pod 
\"console-cb5f6487-gmcnf\" (UID: \"b93005f2-add9-4e9e-b005-383337ae1a05\") " pod="openshift-console/console-cb5f6487-gmcnf" Mar 08 00:47:32.225319 master-0 kubenswrapper[23041]: I0308 00:47:32.224551 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b93005f2-add9-4e9e-b005-383337ae1a05-console-config\") pod \"console-cb5f6487-gmcnf\" (UID: \"b93005f2-add9-4e9e-b005-383337ae1a05\") " pod="openshift-console/console-cb5f6487-gmcnf" Mar 08 00:47:32.225319 master-0 kubenswrapper[23041]: I0308 00:47:32.224582 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjnmf\" (UniqueName: \"kubernetes.io/projected/b93005f2-add9-4e9e-b005-383337ae1a05-kube-api-access-qjnmf\") pod \"console-cb5f6487-gmcnf\" (UID: \"b93005f2-add9-4e9e-b005-383337ae1a05\") " pod="openshift-console/console-cb5f6487-gmcnf" Mar 08 00:47:32.226288 master-0 kubenswrapper[23041]: I0308 00:47:32.225528 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b93005f2-add9-4e9e-b005-383337ae1a05-service-ca\") pod \"console-cb5f6487-gmcnf\" (UID: \"b93005f2-add9-4e9e-b005-383337ae1a05\") " pod="openshift-console/console-cb5f6487-gmcnf" Mar 08 00:47:32.227332 master-0 kubenswrapper[23041]: I0308 00:47:32.225869 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b93005f2-add9-4e9e-b005-383337ae1a05-oauth-serving-cert\") pod \"console-cb5f6487-gmcnf\" (UID: \"b93005f2-add9-4e9e-b005-383337ae1a05\") " pod="openshift-console/console-cb5f6487-gmcnf" Mar 08 00:47:32.227391 master-0 kubenswrapper[23041]: I0308 00:47:32.226178 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b93005f2-add9-4e9e-b005-383337ae1a05-trusted-ca-bundle\") pod 
\"console-cb5f6487-gmcnf\" (UID: \"b93005f2-add9-4e9e-b005-383337ae1a05\") " pod="openshift-console/console-cb5f6487-gmcnf" Mar 08 00:47:32.227391 master-0 kubenswrapper[23041]: I0308 00:47:32.226819 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b93005f2-add9-4e9e-b005-383337ae1a05-console-config\") pod \"console-cb5f6487-gmcnf\" (UID: \"b93005f2-add9-4e9e-b005-383337ae1a05\") " pod="openshift-console/console-cb5f6487-gmcnf" Mar 08 00:47:32.234984 master-0 kubenswrapper[23041]: I0308 00:47:32.234933 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b93005f2-add9-4e9e-b005-383337ae1a05-console-serving-cert\") pod \"console-cb5f6487-gmcnf\" (UID: \"b93005f2-add9-4e9e-b005-383337ae1a05\") " pod="openshift-console/console-cb5f6487-gmcnf" Mar 08 00:47:32.247316 master-0 kubenswrapper[23041]: I0308 00:47:32.246440 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b93005f2-add9-4e9e-b005-383337ae1a05-console-oauth-config\") pod \"console-cb5f6487-gmcnf\" (UID: \"b93005f2-add9-4e9e-b005-383337ae1a05\") " pod="openshift-console/console-cb5f6487-gmcnf" Mar 08 00:47:32.267811 master-0 kubenswrapper[23041]: I0308 00:47:32.266700 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjnmf\" (UniqueName: \"kubernetes.io/projected/b93005f2-add9-4e9e-b005-383337ae1a05-kube-api-access-qjnmf\") pod \"console-cb5f6487-gmcnf\" (UID: \"b93005f2-add9-4e9e-b005-383337ae1a05\") " pod="openshift-console/console-cb5f6487-gmcnf" Mar 08 00:47:32.446873 master-0 kubenswrapper[23041]: I0308 00:47:32.446001 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-cb5f6487-gmcnf" Mar 08 00:47:32.608184 master-0 kubenswrapper[23041]: I0308 00:47:32.608123 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-xln25"] Mar 08 00:47:32.794373 master-0 kubenswrapper[23041]: I0308 00:47:32.794320 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-v5hhx"] Mar 08 00:47:32.874844 master-0 kubenswrapper[23041]: I0308 00:47:32.874787 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5bcf4"] Mar 08 00:47:32.875122 master-0 kubenswrapper[23041]: W0308 00:47:32.874842 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b4cfd42_515f_4390_9e98_42361d3f96d9.slice/crio-71d2dc2b301d66709515dc2175eb89c62bb3e6acfd42f959d16bc056f10ccb82 WatchSource:0}: Error finding container 71d2dc2b301d66709515dc2175eb89c62bb3e6acfd42f959d16bc056f10ccb82: Status 404 returned error can't find the container with id 71d2dc2b301d66709515dc2175eb89c62bb3e6acfd42f959d16bc056f10ccb82 Mar 08 00:47:33.023473 master-0 kubenswrapper[23041]: W0308 00:47:33.023376 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb93005f2_add9_4e9e_b005_383337ae1a05.slice/crio-278c4a6b9eb877b4195ccc9810042ddcdf0650d6df14d19fb07e984237f9736c WatchSource:0}: Error finding container 278c4a6b9eb877b4195ccc9810042ddcdf0650d6df14d19fb07e984237f9736c: Status 404 returned error can't find the container with id 278c4a6b9eb877b4195ccc9810042ddcdf0650d6df14d19fb07e984237f9736c Mar 08 00:47:33.025369 master-0 kubenswrapper[23041]: I0308 00:47:33.025325 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cb5f6487-gmcnf"] Mar 08 00:47:33.162802 master-0 kubenswrapper[23041]: I0308 
00:47:33.162739 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5bcf4" event={"ID":"2b4cfd42-515f-4390-9e98-42361d3f96d9","Type":"ContainerStarted","Data":"71d2dc2b301d66709515dc2175eb89c62bb3e6acfd42f959d16bc056f10ccb82"} Mar 08 00:47:33.175582 master-0 kubenswrapper[23041]: I0308 00:47:33.165667 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zhcsd" event={"ID":"4eda3bf2-0d01-41dc-89aa-91375f84b6e9","Type":"ContainerStarted","Data":"a209cfeb2787422cb3ea1b5d3a6e393af45a98a688ba8b227207c60e56f5f433"} Mar 08 00:47:33.175582 master-0 kubenswrapper[23041]: I0308 00:47:33.167394 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-d7nd4" event={"ID":"7e7d451d-96a5-4aaf-b44f-c6959ae142ea","Type":"ContainerStarted","Data":"1e8f18c088591808eb9aad9f87de095ecbb6e95e1de0872fe65d0607bbb9592f"} Mar 08 00:47:33.175582 master-0 kubenswrapper[23041]: I0308 00:47:33.170763 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-mpsmp" event={"ID":"74c0b2e9-7baa-4f33-b250-4c1284eada74","Type":"ContainerStarted","Data":"e30584932e9f6b98da4adcbece611618aab3d92a4c13a069dc7739a65f16fd85"} Mar 08 00:47:33.175582 master-0 kubenswrapper[23041]: I0308 00:47:33.172322 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-mpsmp" Mar 08 00:47:33.175582 master-0 kubenswrapper[23041]: I0308 00:47:33.173844 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-v5hhx" event={"ID":"455c5b8f-0d89-4e55-9785-7285656f7cfe","Type":"ContainerStarted","Data":"56348f021b95f99d20b1ba98e79131eac65f366e676e2d5ce8c8cc4080aeafab"} Mar 08 00:47:33.180308 master-0 kubenswrapper[23041]: I0308 00:47:33.180256 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-xln25" 
event={"ID":"3faca2f4-386d-446c-9f19-8534d892941c","Type":"ContainerStarted","Data":"54d9766a6da11a9779d161ebddadca14cde0a0f1e1fb08b6c59aa2c063e19d82"} Mar 08 00:47:33.183903 master-0 kubenswrapper[23041]: I0308 00:47:33.183855 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cb5f6487-gmcnf" event={"ID":"b93005f2-add9-4e9e-b005-383337ae1a05","Type":"ContainerStarted","Data":"278c4a6b9eb877b4195ccc9810042ddcdf0650d6df14d19fb07e984237f9736c"} Mar 08 00:47:33.200334 master-0 kubenswrapper[23041]: I0308 00:47:33.200250 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-mpsmp" podStartSLOduration=2.8611332320000002 podStartE2EDuration="4.200226578s" podCreationTimestamp="2026-03-08 00:47:29 +0000 UTC" firstStartedPulling="2026-03-08 00:47:31.378721333 +0000 UTC m=+956.851557897" lastFinishedPulling="2026-03-08 00:47:32.717814689 +0000 UTC m=+958.190651243" observedRunningTime="2026-03-08 00:47:33.196724538 +0000 UTC m=+958.669561102" watchObservedRunningTime="2026-03-08 00:47:33.200226578 +0000 UTC m=+958.673063132" Mar 08 00:47:34.220311 master-0 kubenswrapper[23041]: I0308 00:47:34.220221 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cb5f6487-gmcnf" event={"ID":"b93005f2-add9-4e9e-b005-383337ae1a05","Type":"ContainerStarted","Data":"a287210b9b288bc26bfd90e173321ad5a47080cb9f5ba8f7dd472665a8c587ec"} Mar 08 00:47:34.226775 master-0 kubenswrapper[23041]: I0308 00:47:34.226649 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zhcsd" event={"ID":"4eda3bf2-0d01-41dc-89aa-91375f84b6e9","Type":"ContainerStarted","Data":"34e585271a4ad9225db23072c9cf681b649c190555327fc167d477e53938fe9f"} Mar 08 00:47:34.227295 master-0 kubenswrapper[23041]: I0308 00:47:34.227252 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-zhcsd" Mar 08 00:47:34.251760 master-0 
kubenswrapper[23041]: I0308 00:47:34.251520 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-cb5f6487-gmcnf" podStartSLOduration=3.251438958 podStartE2EDuration="3.251438958s" podCreationTimestamp="2026-03-08 00:47:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:47:34.24714227 +0000 UTC m=+959.719978844" watchObservedRunningTime="2026-03-08 00:47:34.251438958 +0000 UTC m=+959.724275512" Mar 08 00:47:34.293920 master-0 kubenswrapper[23041]: I0308 00:47:34.292725 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-zhcsd" podStartSLOduration=4.2481669029999996 podStartE2EDuration="5.292698451s" podCreationTimestamp="2026-03-08 00:47:29 +0000 UTC" firstStartedPulling="2026-03-08 00:47:32.264638268 +0000 UTC m=+957.737474822" lastFinishedPulling="2026-03-08 00:47:33.309169806 +0000 UTC m=+958.782006370" observedRunningTime="2026-03-08 00:47:34.267103396 +0000 UTC m=+959.739939960" watchObservedRunningTime="2026-03-08 00:47:34.292698451 +0000 UTC m=+959.765535015" Mar 08 00:47:40.298654 master-0 kubenswrapper[23041]: I0308 00:47:40.298560 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-njhxq" event={"ID":"a9da3ea4-66bd-4ea1-8e24-26958bded14c","Type":"ContainerStarted","Data":"60d22af82455f26b5314c5f096d5aea8474231d3278fae1b909d90994fd6eaaa"} Mar 08 00:47:40.299369 master-0 kubenswrapper[23041]: I0308 00:47:40.299348 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-njhxq" Mar 08 00:47:40.300887 master-0 kubenswrapper[23041]: I0308 00:47:40.300866 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-d7nd4" 
event={"ID":"7e7d451d-96a5-4aaf-b44f-c6959ae142ea","Type":"ContainerStarted","Data":"52f51588fde1861db5d393b2195265e9ca27cd5f487d3691ee0fd48af69c44a1"} Mar 08 00:47:40.301003 master-0 kubenswrapper[23041]: I0308 00:47:40.300988 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-d7nd4" Mar 08 00:47:40.302788 master-0 kubenswrapper[23041]: I0308 00:47:40.302760 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-v5hhx" event={"ID":"455c5b8f-0d89-4e55-9785-7285656f7cfe","Type":"ContainerStarted","Data":"4ac494ae6f4e83cdb26ed7525c7915f31e65d735898af8927619134ec5a8501a"} Mar 08 00:47:40.303063 master-0 kubenswrapper[23041]: I0308 00:47:40.303017 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-786f45cff4-v5hhx" Mar 08 00:47:40.307223 master-0 kubenswrapper[23041]: I0308 00:47:40.304564 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-xln25" event={"ID":"3faca2f4-386d-446c-9f19-8534d892941c","Type":"ContainerStarted","Data":"aebf2d4b72c4e3d7eeca408c33a53ff0d22527ac2eb86e72b5e1e025dc8f5d22"} Mar 08 00:47:40.307223 master-0 kubenswrapper[23041]: I0308 00:47:40.304588 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-xln25" event={"ID":"3faca2f4-386d-446c-9f19-8534d892941c","Type":"ContainerStarted","Data":"dade7cca2ef1dc9f3338ac7cdec5663547263726a7bc21231383df2027fce3e5"} Mar 08 00:47:40.307223 master-0 kubenswrapper[23041]: I0308 00:47:40.306472 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5bcf4" event={"ID":"2b4cfd42-515f-4390-9e98-42361d3f96d9","Type":"ContainerStarted","Data":"7222eb68326ca5ca8cd5f9d3829176fa5b33fc40ba468a2df1dc8ec404323f85"} Mar 08 00:47:40.311344 master-0 kubenswrapper[23041]: I0308 00:47:40.308376 23041 
generic.go:334] "Generic (PLEG): container finished" podID="70492151-82d8-4a5a-bd83-4971d2fec18a" containerID="7e4c9e46adc634e851ff05e2f867142315d7c66713f341e87f100cbcc93d7173" exitCode=0 Mar 08 00:47:40.311344 master-0 kubenswrapper[23041]: I0308 00:47:40.308408 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vb6dz" event={"ID":"70492151-82d8-4a5a-bd83-4971d2fec18a","Type":"ContainerDied","Data":"7e4c9e46adc634e851ff05e2f867142315d7c66713f341e87f100cbcc93d7173"} Mar 08 00:47:40.368608 master-0 kubenswrapper[23041]: I0308 00:47:40.368482 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-d7nd4" podStartSLOduration=1.815209334 podStartE2EDuration="9.368459254s" podCreationTimestamp="2026-03-08 00:47:31 +0000 UTC" firstStartedPulling="2026-03-08 00:47:32.190451534 +0000 UTC m=+957.663288088" lastFinishedPulling="2026-03-08 00:47:39.743701444 +0000 UTC m=+965.216538008" observedRunningTime="2026-03-08 00:47:40.363769716 +0000 UTC m=+965.836606280" watchObservedRunningTime="2026-03-08 00:47:40.368459254 +0000 UTC m=+965.841295808" Mar 08 00:47:40.373237 master-0 kubenswrapper[23041]: I0308 00:47:40.373174 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-njhxq" podStartSLOduration=3.053460125 podStartE2EDuration="11.373160431s" podCreationTimestamp="2026-03-08 00:47:29 +0000 UTC" firstStartedPulling="2026-03-08 00:47:31.441023646 +0000 UTC m=+956.913860200" lastFinishedPulling="2026-03-08 00:47:39.760723952 +0000 UTC m=+965.233560506" observedRunningTime="2026-03-08 00:47:40.335442979 +0000 UTC m=+965.808279533" watchObservedRunningTime="2026-03-08 00:47:40.373160431 +0000 UTC m=+965.845996985" Mar 08 00:47:40.394361 master-0 kubenswrapper[23041]: I0308 00:47:40.394260 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-webhook-786f45cff4-v5hhx" podStartSLOduration=2.475462924 podStartE2EDuration="9.394238482s" podCreationTimestamp="2026-03-08 00:47:31 +0000 UTC" firstStartedPulling="2026-03-08 00:47:32.791477092 +0000 UTC m=+958.264313646" lastFinishedPulling="2026-03-08 00:47:39.71025265 +0000 UTC m=+965.183089204" observedRunningTime="2026-03-08 00:47:40.386808033 +0000 UTC m=+965.859644597" watchObservedRunningTime="2026-03-08 00:47:40.394238482 +0000 UTC m=+965.867075036" Mar 08 00:47:40.514142 master-0 kubenswrapper[23041]: I0308 00:47:40.513455 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-69594cc75-xln25" podStartSLOduration=2.364124252 podStartE2EDuration="9.513421155s" podCreationTimestamp="2026-03-08 00:47:31 +0000 UTC" firstStartedPulling="2026-03-08 00:47:32.614384197 +0000 UTC m=+958.087220751" lastFinishedPulling="2026-03-08 00:47:39.7636811 +0000 UTC m=+965.236517654" observedRunningTime="2026-03-08 00:47:40.454138061 +0000 UTC m=+965.926974635" watchObservedRunningTime="2026-03-08 00:47:40.513421155 +0000 UTC m=+965.986257719" Mar 08 00:47:40.526748 master-0 kubenswrapper[23041]: I0308 00:47:40.526328 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5bcf4" podStartSLOduration=2.6880851310000002 podStartE2EDuration="9.526303249s" podCreationTimestamp="2026-03-08 00:47:31 +0000 UTC" firstStartedPulling="2026-03-08 00:47:32.87853691 +0000 UTC m=+958.351373484" lastFinishedPulling="2026-03-08 00:47:39.716755048 +0000 UTC m=+965.189591602" observedRunningTime="2026-03-08 00:47:40.476497621 +0000 UTC m=+965.949334175" watchObservedRunningTime="2026-03-08 00:47:40.526303249 +0000 UTC m=+965.999139813" Mar 08 00:47:41.326747 master-0 kubenswrapper[23041]: I0308 00:47:41.326680 23041 generic.go:334] "Generic (PLEG): container finished" podID="70492151-82d8-4a5a-bd83-4971d2fec18a" 
containerID="2c98c7a72dfbe9e425c50dbd3e250e46eddc42dc3da302551c861b4e437da9cb" exitCode=0 Mar 08 00:47:41.327498 master-0 kubenswrapper[23041]: I0308 00:47:41.327450 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vb6dz" event={"ID":"70492151-82d8-4a5a-bd83-4971d2fec18a","Type":"ContainerDied","Data":"2c98c7a72dfbe9e425c50dbd3e250e46eddc42dc3da302551c861b4e437da9cb"} Mar 08 00:47:42.346316 master-0 kubenswrapper[23041]: I0308 00:47:42.346243 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vb6dz" event={"ID":"70492151-82d8-4a5a-bd83-4971d2fec18a","Type":"ContainerDied","Data":"8dcba0e86c4202965df4c59e979a4d715bf7514cc037328411f6b6e9f2374ad6"} Mar 08 00:47:42.347046 master-0 kubenswrapper[23041]: I0308 00:47:42.346192 23041 generic.go:334] "Generic (PLEG): container finished" podID="70492151-82d8-4a5a-bd83-4971d2fec18a" containerID="8dcba0e86c4202965df4c59e979a4d715bf7514cc037328411f6b6e9f2374ad6" exitCode=0 Mar 08 00:47:43.387142 master-0 kubenswrapper[23041]: I0308 00:47:43.386892 23041 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-controller-manager-86db79fc85-g44m9" podUID="91aedad8-165a-49f5-9aa2-e87be58d353e" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.127:8080/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 00:47:43.432482 master-0 kubenswrapper[23041]: I0308 00:47:43.432289 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-cb5f6487-gmcnf" Mar 08 00:47:43.432482 master-0 kubenswrapper[23041]: I0308 00:47:43.432383 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-cb5f6487-gmcnf" Mar 08 00:47:43.432482 master-0 kubenswrapper[23041]: I0308 00:47:43.432395 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-console/console-cb5f6487-gmcnf" Mar 08 00:47:43.436904 master-0 kubenswrapper[23041]: I0308 00:47:43.436723 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-cb5f6487-gmcnf" Mar 08 00:47:43.518167 master-0 kubenswrapper[23041]: I0308 00:47:43.518087 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c45bf598-vngbg"] Mar 08 00:47:44.404297 master-0 kubenswrapper[23041]: I0308 00:47:44.403398 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vb6dz" event={"ID":"70492151-82d8-4a5a-bd83-4971d2fec18a","Type":"ContainerStarted","Data":"7b9172d07c51a11fd9893e54c4e8e47fd987c10a4852d2b4a97b0809fe3b6542"} Mar 08 00:47:44.404297 master-0 kubenswrapper[23041]: I0308 00:47:44.403467 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vb6dz" event={"ID":"70492151-82d8-4a5a-bd83-4971d2fec18a","Type":"ContainerStarted","Data":"8c8e4f0a0a50083571b161ca46b9410d42ef1515f5e83a8770628e2f68b8f243"} Mar 08 00:47:44.404297 master-0 kubenswrapper[23041]: I0308 00:47:44.403478 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vb6dz" event={"ID":"70492151-82d8-4a5a-bd83-4971d2fec18a","Type":"ContainerStarted","Data":"7a743e600866266a20ab8fd9b8499961e562387dceb4a52d59611d1a53c130f5"} Mar 08 00:47:44.404297 master-0 kubenswrapper[23041]: I0308 00:47:44.403487 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vb6dz" event={"ID":"70492151-82d8-4a5a-bd83-4971d2fec18a","Type":"ContainerStarted","Data":"7ed153532437cc8271124fe544f7d2b507df167f75c93716d8dd7f9b4d79d591"} Mar 08 00:47:45.437352 master-0 kubenswrapper[23041]: I0308 00:47:45.437169 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vb6dz" 
event={"ID":"70492151-82d8-4a5a-bd83-4971d2fec18a","Type":"ContainerStarted","Data":"e166468ebb2eef5e645b3f2751aa789b9599e89b15ece1cb3453ad1e2bad6b20"} Mar 08 00:47:45.437352 master-0 kubenswrapper[23041]: I0308 00:47:45.437284 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vb6dz" event={"ID":"70492151-82d8-4a5a-bd83-4971d2fec18a","Type":"ContainerStarted","Data":"832d59e52e49dbba5a097498359c676e4cebbbfe42e992fa75e96ea9ca5b2cb5"} Mar 08 00:47:45.472485 master-0 kubenswrapper[23041]: I0308 00:47:45.472359 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-vb6dz" podStartSLOduration=6.95375209 podStartE2EDuration="16.472328849s" podCreationTimestamp="2026-03-08 00:47:29 +0000 UTC" firstStartedPulling="2026-03-08 00:47:30.240508126 +0000 UTC m=+955.713344680" lastFinishedPulling="2026-03-08 00:47:39.759084885 +0000 UTC m=+965.231921439" observedRunningTime="2026-03-08 00:47:45.470421225 +0000 UTC m=+970.943257789" watchObservedRunningTime="2026-03-08 00:47:45.472328849 +0000 UTC m=+970.945165413" Mar 08 00:47:46.446110 master-0 kubenswrapper[23041]: I0308 00:47:46.446038 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-vb6dz" Mar 08 00:47:47.040517 master-0 kubenswrapper[23041]: I0308 00:47:47.040468 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-d7nd4" Mar 08 00:47:50.100808 master-0 kubenswrapper[23041]: I0308 00:47:50.100694 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-vb6dz" Mar 08 00:47:50.145008 master-0 kubenswrapper[23041]: I0308 00:47:50.144934 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-vb6dz" Mar 08 00:47:50.176779 master-0 kubenswrapper[23041]: I0308 00:47:50.176717 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/controller-86ddb6bd46-mpsmp" Mar 08 00:47:50.484859 master-0 kubenswrapper[23041]: I0308 00:47:50.484644 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-njhxq" Mar 08 00:47:51.639845 master-0 kubenswrapper[23041]: I0308 00:47:51.639771 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-zhcsd" Mar 08 00:47:52.004971 master-0 kubenswrapper[23041]: I0308 00:47:52.004900 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-786f45cff4-v5hhx" Mar 08 00:47:57.318410 master-0 kubenswrapper[23041]: I0308 00:47:57.318326 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-storage/vg-manager-l6zlx"] Mar 08 00:47:57.319806 master-0 kubenswrapper[23041]: I0308 00:47:57.319777 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-storage/vg-manager-l6zlx" Mar 08 00:47:57.325142 master-0 kubenswrapper[23041]: I0308 00:47:57.325107 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"vg-manager-metrics-cert" Mar 08 00:47:57.336014 master-0 kubenswrapper[23041]: I0308 00:47:57.335958 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/vg-manager-l6zlx"] Mar 08 00:47:57.450467 master-0 kubenswrapper[23041]: I0308 00:47:57.450380 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/25ac32de-5490-48f3-bccb-f5ce7bfae58c-lvmd-config\") pod \"vg-manager-l6zlx\" (UID: \"25ac32de-5490-48f3-bccb-f5ce7bfae58c\") " pod="openshift-storage/vg-manager-l6zlx" Mar 08 00:47:57.450467 master-0 kubenswrapper[23041]: I0308 00:47:57.450441 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"file-lock-dir\" (UniqueName: 
\"kubernetes.io/host-path/25ac32de-5490-48f3-bccb-f5ce7bfae58c-file-lock-dir\") pod \"vg-manager-l6zlx\" (UID: \"25ac32de-5490-48f3-bccb-f5ce7bfae58c\") " pod="openshift-storage/vg-manager-l6zlx" Mar 08 00:47:57.450467 master-0 kubenswrapper[23041]: I0308 00:47:57.450465 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/25ac32de-5490-48f3-bccb-f5ce7bfae58c-pod-volumes-dir\") pod \"vg-manager-l6zlx\" (UID: \"25ac32de-5490-48f3-bccb-f5ce7bfae58c\") " pod="openshift-storage/vg-manager-l6zlx" Mar 08 00:47:57.450855 master-0 kubenswrapper[23041]: I0308 00:47:57.450501 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/25ac32de-5490-48f3-bccb-f5ce7bfae58c-metrics-cert\") pod \"vg-manager-l6zlx\" (UID: \"25ac32de-5490-48f3-bccb-f5ce7bfae58c\") " pod="openshift-storage/vg-manager-l6zlx" Mar 08 00:47:57.450855 master-0 kubenswrapper[23041]: I0308 00:47:57.450521 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/25ac32de-5490-48f3-bccb-f5ce7bfae58c-device-dir\") pod \"vg-manager-l6zlx\" (UID: \"25ac32de-5490-48f3-bccb-f5ce7bfae58c\") " pod="openshift-storage/vg-manager-l6zlx" Mar 08 00:47:57.450855 master-0 kubenswrapper[23041]: I0308 00:47:57.450627 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/25ac32de-5490-48f3-bccb-f5ce7bfae58c-registration-dir\") pod \"vg-manager-l6zlx\" (UID: \"25ac32de-5490-48f3-bccb-f5ce7bfae58c\") " pod="openshift-storage/vg-manager-l6zlx" Mar 08 00:47:57.450855 master-0 kubenswrapper[23041]: I0308 00:47:57.450705 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" 
(UniqueName: \"kubernetes.io/host-path/25ac32de-5490-48f3-bccb-f5ce7bfae58c-sys\") pod \"vg-manager-l6zlx\" (UID: \"25ac32de-5490-48f3-bccb-f5ce7bfae58c\") " pod="openshift-storage/vg-manager-l6zlx" Mar 08 00:47:57.450855 master-0 kubenswrapper[23041]: I0308 00:47:57.450803 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/25ac32de-5490-48f3-bccb-f5ce7bfae58c-csi-plugin-dir\") pod \"vg-manager-l6zlx\" (UID: \"25ac32de-5490-48f3-bccb-f5ce7bfae58c\") " pod="openshift-storage/vg-manager-l6zlx" Mar 08 00:47:57.451023 master-0 kubenswrapper[23041]: I0308 00:47:57.450905 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t89st\" (UniqueName: \"kubernetes.io/projected/25ac32de-5490-48f3-bccb-f5ce7bfae58c-kube-api-access-t89st\") pod \"vg-manager-l6zlx\" (UID: \"25ac32de-5490-48f3-bccb-f5ce7bfae58c\") " pod="openshift-storage/vg-manager-l6zlx" Mar 08 00:47:57.451023 master-0 kubenswrapper[23041]: I0308 00:47:57.450997 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/25ac32de-5490-48f3-bccb-f5ce7bfae58c-run-udev\") pod \"vg-manager-l6zlx\" (UID: \"25ac32de-5490-48f3-bccb-f5ce7bfae58c\") " pod="openshift-storage/vg-manager-l6zlx" Mar 08 00:47:57.451125 master-0 kubenswrapper[23041]: I0308 00:47:57.451102 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/25ac32de-5490-48f3-bccb-f5ce7bfae58c-node-plugin-dir\") pod \"vg-manager-l6zlx\" (UID: \"25ac32de-5490-48f3-bccb-f5ce7bfae58c\") " pod="openshift-storage/vg-manager-l6zlx" Mar 08 00:47:57.553363 master-0 kubenswrapper[23041]: I0308 00:47:57.553308 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/25ac32de-5490-48f3-bccb-f5ce7bfae58c-csi-plugin-dir\") pod \"vg-manager-l6zlx\" (UID: \"25ac32de-5490-48f3-bccb-f5ce7bfae58c\") " pod="openshift-storage/vg-manager-l6zlx" Mar 08 00:47:57.553662 master-0 kubenswrapper[23041]: I0308 00:47:57.553646 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t89st\" (UniqueName: \"kubernetes.io/projected/25ac32de-5490-48f3-bccb-f5ce7bfae58c-kube-api-access-t89st\") pod \"vg-manager-l6zlx\" (UID: \"25ac32de-5490-48f3-bccb-f5ce7bfae58c\") " pod="openshift-storage/vg-manager-l6zlx" Mar 08 00:47:57.553975 master-0 kubenswrapper[23041]: I0308 00:47:57.553916 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/25ac32de-5490-48f3-bccb-f5ce7bfae58c-run-udev\") pod \"vg-manager-l6zlx\" (UID: \"25ac32de-5490-48f3-bccb-f5ce7bfae58c\") " pod="openshift-storage/vg-manager-l6zlx" Mar 08 00:47:57.554082 master-0 kubenswrapper[23041]: I0308 00:47:57.554046 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/25ac32de-5490-48f3-bccb-f5ce7bfae58c-csi-plugin-dir\") pod \"vg-manager-l6zlx\" (UID: \"25ac32de-5490-48f3-bccb-f5ce7bfae58c\") " pod="openshift-storage/vg-manager-l6zlx" Mar 08 00:47:57.554135 master-0 kubenswrapper[23041]: I0308 00:47:57.554100 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/25ac32de-5490-48f3-bccb-f5ce7bfae58c-run-udev\") pod \"vg-manager-l6zlx\" (UID: \"25ac32de-5490-48f3-bccb-f5ce7bfae58c\") " pod="openshift-storage/vg-manager-l6zlx" Mar 08 00:47:57.554182 master-0 kubenswrapper[23041]: I0308 00:47:57.554139 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-plugin-dir\" (UniqueName: 
\"kubernetes.io/host-path/25ac32de-5490-48f3-bccb-f5ce7bfae58c-node-plugin-dir\") pod \"vg-manager-l6zlx\" (UID: \"25ac32de-5490-48f3-bccb-f5ce7bfae58c\") " pod="openshift-storage/vg-manager-l6zlx" Mar 08 00:47:57.554300 master-0 kubenswrapper[23041]: I0308 00:47:57.554279 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/25ac32de-5490-48f3-bccb-f5ce7bfae58c-lvmd-config\") pod \"vg-manager-l6zlx\" (UID: \"25ac32de-5490-48f3-bccb-f5ce7bfae58c\") " pod="openshift-storage/vg-manager-l6zlx" Mar 08 00:47:57.554352 master-0 kubenswrapper[23041]: I0308 00:47:57.554338 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/25ac32de-5490-48f3-bccb-f5ce7bfae58c-file-lock-dir\") pod \"vg-manager-l6zlx\" (UID: \"25ac32de-5490-48f3-bccb-f5ce7bfae58c\") " pod="openshift-storage/vg-manager-l6zlx" Mar 08 00:47:57.554406 master-0 kubenswrapper[23041]: I0308 00:47:57.554387 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/25ac32de-5490-48f3-bccb-f5ce7bfae58c-node-plugin-dir\") pod \"vg-manager-l6zlx\" (UID: \"25ac32de-5490-48f3-bccb-f5ce7bfae58c\") " pod="openshift-storage/vg-manager-l6zlx" Mar 08 00:47:57.554406 master-0 kubenswrapper[23041]: I0308 00:47:57.554393 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/25ac32de-5490-48f3-bccb-f5ce7bfae58c-pod-volumes-dir\") pod \"vg-manager-l6zlx\" (UID: \"25ac32de-5490-48f3-bccb-f5ce7bfae58c\") " pod="openshift-storage/vg-manager-l6zlx" Mar 08 00:47:57.554491 master-0 kubenswrapper[23041]: I0308 00:47:57.554432 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/25ac32de-5490-48f3-bccb-f5ce7bfae58c-lvmd-config\") pod 
\"vg-manager-l6zlx\" (UID: \"25ac32de-5490-48f3-bccb-f5ce7bfae58c\") " pod="openshift-storage/vg-manager-l6zlx" Mar 08 00:47:57.554524 master-0 kubenswrapper[23041]: I0308 00:47:57.554493 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/25ac32de-5490-48f3-bccb-f5ce7bfae58c-metrics-cert\") pod \"vg-manager-l6zlx\" (UID: \"25ac32de-5490-48f3-bccb-f5ce7bfae58c\") " pod="openshift-storage/vg-manager-l6zlx" Mar 08 00:47:57.554524 master-0 kubenswrapper[23041]: I0308 00:47:57.554436 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/25ac32de-5490-48f3-bccb-f5ce7bfae58c-pod-volumes-dir\") pod \"vg-manager-l6zlx\" (UID: \"25ac32de-5490-48f3-bccb-f5ce7bfae58c\") " pod="openshift-storage/vg-manager-l6zlx" Mar 08 00:47:57.554583 master-0 kubenswrapper[23041]: I0308 00:47:57.554516 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/25ac32de-5490-48f3-bccb-f5ce7bfae58c-device-dir\") pod \"vg-manager-l6zlx\" (UID: \"25ac32de-5490-48f3-bccb-f5ce7bfae58c\") " pod="openshift-storage/vg-manager-l6zlx" Mar 08 00:47:57.554615 master-0 kubenswrapper[23041]: I0308 00:47:57.554602 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/25ac32de-5490-48f3-bccb-f5ce7bfae58c-registration-dir\") pod \"vg-manager-l6zlx\" (UID: \"25ac32de-5490-48f3-bccb-f5ce7bfae58c\") " pod="openshift-storage/vg-manager-l6zlx" Mar 08 00:47:57.554651 master-0 kubenswrapper[23041]: I0308 00:47:57.554637 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/25ac32de-5490-48f3-bccb-f5ce7bfae58c-sys\") pod \"vg-manager-l6zlx\" (UID: \"25ac32de-5490-48f3-bccb-f5ce7bfae58c\") " 
pod="openshift-storage/vg-manager-l6zlx" Mar 08 00:47:57.554762 master-0 kubenswrapper[23041]: I0308 00:47:57.554716 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/25ac32de-5490-48f3-bccb-f5ce7bfae58c-registration-dir\") pod \"vg-manager-l6zlx\" (UID: \"25ac32de-5490-48f3-bccb-f5ce7bfae58c\") " pod="openshift-storage/vg-manager-l6zlx" Mar 08 00:47:57.554762 master-0 kubenswrapper[23041]: I0308 00:47:57.554720 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/25ac32de-5490-48f3-bccb-f5ce7bfae58c-file-lock-dir\") pod \"vg-manager-l6zlx\" (UID: \"25ac32de-5490-48f3-bccb-f5ce7bfae58c\") " pod="openshift-storage/vg-manager-l6zlx" Mar 08 00:47:57.554843 master-0 kubenswrapper[23041]: I0308 00:47:57.554761 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/25ac32de-5490-48f3-bccb-f5ce7bfae58c-sys\") pod \"vg-manager-l6zlx\" (UID: \"25ac32de-5490-48f3-bccb-f5ce7bfae58c\") " pod="openshift-storage/vg-manager-l6zlx" Mar 08 00:47:57.554843 master-0 kubenswrapper[23041]: I0308 00:47:57.554788 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/25ac32de-5490-48f3-bccb-f5ce7bfae58c-device-dir\") pod \"vg-manager-l6zlx\" (UID: \"25ac32de-5490-48f3-bccb-f5ce7bfae58c\") " pod="openshift-storage/vg-manager-l6zlx" Mar 08 00:47:57.559098 master-0 kubenswrapper[23041]: I0308 00:47:57.559041 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/25ac32de-5490-48f3-bccb-f5ce7bfae58c-metrics-cert\") pod \"vg-manager-l6zlx\" (UID: \"25ac32de-5490-48f3-bccb-f5ce7bfae58c\") " pod="openshift-storage/vg-manager-l6zlx" Mar 08 00:47:57.575154 master-0 kubenswrapper[23041]: I0308 00:47:57.575071 23041 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t89st\" (UniqueName: \"kubernetes.io/projected/25ac32de-5490-48f3-bccb-f5ce7bfae58c-kube-api-access-t89st\") pod \"vg-manager-l6zlx\" (UID: \"25ac32de-5490-48f3-bccb-f5ce7bfae58c\") " pod="openshift-storage/vg-manager-l6zlx" Mar 08 00:47:57.638081 master-0 kubenswrapper[23041]: I0308 00:47:57.638011 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-storage/vg-manager-l6zlx" Mar 08 00:47:58.081511 master-0 kubenswrapper[23041]: W0308 00:47:58.081441 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25ac32de_5490_48f3_bccb_f5ce7bfae58c.slice/crio-a69a0d6b6bf724aa39d6bbe0a36b789188bed9b9a330efd749f959c0fc6f589f WatchSource:0}: Error finding container a69a0d6b6bf724aa39d6bbe0a36b789188bed9b9a330efd749f959c0fc6f589f: Status 404 returned error can't find the container with id a69a0d6b6bf724aa39d6bbe0a36b789188bed9b9a330efd749f959c0fc6f589f Mar 08 00:47:58.086174 master-0 kubenswrapper[23041]: I0308 00:47:58.086079 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/vg-manager-l6zlx"] Mar 08 00:47:58.565059 master-0 kubenswrapper[23041]: I0308 00:47:58.564997 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-l6zlx" event={"ID":"25ac32de-5490-48f3-bccb-f5ce7bfae58c","Type":"ContainerStarted","Data":"406d7abeb145d0db8d0aeea0e275f5c5351b56113dfbb04c75021c3e14807b41"} Mar 08 00:47:58.565059 master-0 kubenswrapper[23041]: I0308 00:47:58.565067 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-l6zlx" event={"ID":"25ac32de-5490-48f3-bccb-f5ce7bfae58c","Type":"ContainerStarted","Data":"a69a0d6b6bf724aa39d6bbe0a36b789188bed9b9a330efd749f959c0fc6f589f"} Mar 08 00:47:58.585979 master-0 kubenswrapper[23041]: I0308 00:47:58.585907 23041 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-storage/vg-manager-l6zlx" podStartSLOduration=1.5858878189999999 podStartE2EDuration="1.585887819s" podCreationTimestamp="2026-03-08 00:47:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:47:58.582741308 +0000 UTC m=+984.055577872" watchObservedRunningTime="2026-03-08 00:47:58.585887819 +0000 UTC m=+984.058724363" Mar 08 00:48:00.103779 master-0 kubenswrapper[23041]: I0308 00:48:00.103595 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-vb6dz" Mar 08 00:48:00.584159 master-0 kubenswrapper[23041]: I0308 00:48:00.584104 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-l6zlx_25ac32de-5490-48f3-bccb-f5ce7bfae58c/vg-manager/0.log" Mar 08 00:48:00.584159 master-0 kubenswrapper[23041]: I0308 00:48:00.584164 23041 generic.go:334] "Generic (PLEG): container finished" podID="25ac32de-5490-48f3-bccb-f5ce7bfae58c" containerID="406d7abeb145d0db8d0aeea0e275f5c5351b56113dfbb04c75021c3e14807b41" exitCode=1 Mar 08 00:48:00.584448 master-0 kubenswrapper[23041]: I0308 00:48:00.584195 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-l6zlx" event={"ID":"25ac32de-5490-48f3-bccb-f5ce7bfae58c","Type":"ContainerDied","Data":"406d7abeb145d0db8d0aeea0e275f5c5351b56113dfbb04c75021c3e14807b41"} Mar 08 00:48:00.584848 master-0 kubenswrapper[23041]: I0308 00:48:00.584820 23041 scope.go:117] "RemoveContainer" containerID="406d7abeb145d0db8d0aeea0e275f5c5351b56113dfbb04c75021c3e14807b41" Mar 08 00:48:00.908104 master-0 kubenswrapper[23041]: I0308 00:48:00.908004 23041 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/topolvm.io-reg.sock" Mar 08 00:48:00.953436 master-0 kubenswrapper[23041]: I0308 00:48:00.953052 23041 
reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/topolvm.io-reg.sock","Timestamp":"2026-03-08T00:48:00.908035048Z","Handler":null,"Name":""} Mar 08 00:48:00.955116 master-0 kubenswrapper[23041]: I0308 00:48:00.955075 23041 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: topolvm.io endpoint: /var/lib/kubelet/plugins/topolvm.io/node/csi-topolvm.sock versions: 1.0.0 Mar 08 00:48:00.955116 master-0 kubenswrapper[23041]: I0308 00:48:00.955116 23041 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: topolvm.io at endpoint: /var/lib/kubelet/plugins/topolvm.io/node/csi-topolvm.sock Mar 08 00:48:01.594637 master-0 kubenswrapper[23041]: I0308 00:48:01.594583 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-l6zlx_25ac32de-5490-48f3-bccb-f5ce7bfae58c/vg-manager/0.log" Mar 08 00:48:01.595130 master-0 kubenswrapper[23041]: I0308 00:48:01.594660 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-l6zlx" event={"ID":"25ac32de-5490-48f3-bccb-f5ce7bfae58c","Type":"ContainerStarted","Data":"527b2481d40ca6d35ac137c4178d3f04c1237aee2e967bbd54e1fed986437c01"} Mar 08 00:48:04.047227 master-0 kubenswrapper[23041]: I0308 00:48:04.046277 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-xvcfv"] Mar 08 00:48:04.054030 master-0 kubenswrapper[23041]: I0308 00:48:04.052633 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-xvcfv" Mar 08 00:48:04.057630 master-0 kubenswrapper[23041]: I0308 00:48:04.056301 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 08 00:48:04.057630 master-0 kubenswrapper[23041]: I0308 00:48:04.056504 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 08 00:48:04.060144 master-0 kubenswrapper[23041]: I0308 00:48:04.060097 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-xvcfv"] Mar 08 00:48:04.104827 master-0 kubenswrapper[23041]: I0308 00:48:04.104767 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fzcr\" (UniqueName: \"kubernetes.io/projected/0cdc2143-7c47-4ee3-bab0-f0c60f3bf68f-kube-api-access-7fzcr\") pod \"openstack-operator-index-xvcfv\" (UID: \"0cdc2143-7c47-4ee3-bab0-f0c60f3bf68f\") " pod="openstack-operators/openstack-operator-index-xvcfv" Mar 08 00:48:04.206959 master-0 kubenswrapper[23041]: I0308 00:48:04.206882 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fzcr\" (UniqueName: \"kubernetes.io/projected/0cdc2143-7c47-4ee3-bab0-f0c60f3bf68f-kube-api-access-7fzcr\") pod \"openstack-operator-index-xvcfv\" (UID: \"0cdc2143-7c47-4ee3-bab0-f0c60f3bf68f\") " pod="openstack-operators/openstack-operator-index-xvcfv" Mar 08 00:48:04.235368 master-0 kubenswrapper[23041]: I0308 00:48:04.235309 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fzcr\" (UniqueName: \"kubernetes.io/projected/0cdc2143-7c47-4ee3-bab0-f0c60f3bf68f-kube-api-access-7fzcr\") pod \"openstack-operator-index-xvcfv\" (UID: \"0cdc2143-7c47-4ee3-bab0-f0c60f3bf68f\") " pod="openstack-operators/openstack-operator-index-xvcfv" Mar 08 00:48:04.402830 master-0 
kubenswrapper[23041]: I0308 00:48:04.402698 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-xvcfv" Mar 08 00:48:04.833673 master-0 kubenswrapper[23041]: I0308 00:48:04.829416 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-xvcfv"] Mar 08 00:48:04.834548 master-0 kubenswrapper[23041]: W0308 00:48:04.834503 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cdc2143_7c47_4ee3_bab0_f0c60f3bf68f.slice/crio-5d281221076dcd439173076be9113d097f792f83466e5e90a7ecbb0ddc8b18b0 WatchSource:0}: Error finding container 5d281221076dcd439173076be9113d097f792f83466e5e90a7ecbb0ddc8b18b0: Status 404 returned error can't find the container with id 5d281221076dcd439173076be9113d097f792f83466e5e90a7ecbb0ddc8b18b0 Mar 08 00:48:04.836537 master-0 kubenswrapper[23041]: I0308 00:48:04.836509 23041 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 00:48:05.638660 master-0 kubenswrapper[23041]: I0308 00:48:05.638593 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-xvcfv" event={"ID":"0cdc2143-7c47-4ee3-bab0-f0c60f3bf68f","Type":"ContainerStarted","Data":"5d281221076dcd439173076be9113d097f792f83466e5e90a7ecbb0ddc8b18b0"} Mar 08 00:48:07.639103 master-0 kubenswrapper[23041]: I0308 00:48:07.639047 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-storage/vg-manager-l6zlx" Mar 08 00:48:07.643496 master-0 kubenswrapper[23041]: I0308 00:48:07.643461 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-storage/vg-manager-l6zlx" Mar 08 00:48:07.666694 master-0 kubenswrapper[23041]: I0308 00:48:07.666663 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-storage/vg-manager-l6zlx" Mar 08 00:48:07.667735 master-0 kubenswrapper[23041]: I0308 00:48:07.667679 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-storage/vg-manager-l6zlx" Mar 08 00:48:08.581702 master-0 kubenswrapper[23041]: I0308 00:48:08.581626 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" containerID="cri-o://31c1ce47271da149372933b5669e24882a32895d1eabd2caa8d72dcabb6291e0" gracePeriod=15 Mar 08 00:48:09.435000 master-0 kubenswrapper[23041]: I0308 00:48:09.434846 23041 patch_prober.go:28] interesting pod/console-c45bf598-vngbg container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body= Mar 08 00:48:09.435000 master-0 kubenswrapper[23041]: I0308 00:48:09.434921 23041 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-c45bf598-vngbg" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" Mar 08 00:48:10.692177 master-0 kubenswrapper[23041]: I0308 00:48:10.691942 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c45bf598-vngbg_4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac/console/1.log" Mar 08 00:48:10.787551 master-0 kubenswrapper[23041]: I0308 00:48:10.787443 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c45bf598-vngbg_4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac/console/0.log" Mar 08 00:48:10.787551 master-0 kubenswrapper[23041]: I0308 00:48:10.787548 23041 generic.go:334] "Generic (PLEG): container finished" podID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" 
containerID="31c1ce47271da149372933b5669e24882a32895d1eabd2caa8d72dcabb6291e0" exitCode=2 Mar 08 00:48:10.788148 master-0 kubenswrapper[23041]: I0308 00:48:10.787664 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c45bf598-vngbg" event={"ID":"4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac","Type":"ContainerDied","Data":"31c1ce47271da149372933b5669e24882a32895d1eabd2caa8d72dcabb6291e0"} Mar 08 00:48:10.788148 master-0 kubenswrapper[23041]: I0308 00:48:10.787815 23041 scope.go:117] "RemoveContainer" containerID="54cfef26a9a74f2e4d1e1e3bc7b1f428fedbd1ac36e2015bd2fca2afb1817c24" Mar 08 00:48:12.478123 master-0 kubenswrapper[23041]: I0308 00:48:12.478084 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c45bf598-vngbg_4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac/console/1.log" Mar 08 00:48:12.480000 master-0 kubenswrapper[23041]: I0308 00:48:12.479954 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c45bf598-vngbg" Mar 08 00:48:12.592104 master-0 kubenswrapper[23041]: I0308 00:48:12.591954 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-oauth-serving-cert\") pod \"4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac\" (UID: \"4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac\") " Mar 08 00:48:12.592104 master-0 kubenswrapper[23041]: I0308 00:48:12.592004 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9gm8\" (UniqueName: \"kubernetes.io/projected/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-kube-api-access-c9gm8\") pod \"4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac\" (UID: \"4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac\") " Mar 08 00:48:12.592104 master-0 kubenswrapper[23041]: I0308 00:48:12.592040 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-console-serving-cert\") pod \"4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac\" (UID: \"4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac\") " Mar 08 00:48:12.592104 master-0 kubenswrapper[23041]: I0308 00:48:12.592101 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-console-config\") pod \"4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac\" (UID: \"4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac\") " Mar 08 00:48:12.592104 master-0 kubenswrapper[23041]: I0308 00:48:12.592120 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-console-oauth-config\") pod \"4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac\" (UID: \"4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac\") " Mar 08 00:48:12.593277 master-0 kubenswrapper[23041]: I0308 00:48:12.592608 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-service-ca\") pod \"4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac\" (UID: \"4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac\") " Mar 08 00:48:12.593277 master-0 kubenswrapper[23041]: I0308 00:48:12.592748 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" (UID: "4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:48:12.593277 master-0 kubenswrapper[23041]: I0308 00:48:12.592965 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-console-config" (OuterVolumeSpecName: "console-config") pod "4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" (UID: "4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:48:12.593277 master-0 kubenswrapper[23041]: I0308 00:48:12.593017 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-trusted-ca-bundle\") pod \"4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac\" (UID: \"4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac\") " Mar 08 00:48:12.593277 master-0 kubenswrapper[23041]: I0308 00:48:12.593054 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-service-ca" (OuterVolumeSpecName: "service-ca") pod "4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" (UID: "4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:48:12.593558 master-0 kubenswrapper[23041]: I0308 00:48:12.593445 23041 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-console-config\") on node \"master-0\" DevicePath \"\"" Mar 08 00:48:12.593558 master-0 kubenswrapper[23041]: I0308 00:48:12.593460 23041 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 00:48:12.593558 master-0 kubenswrapper[23041]: I0308 00:48:12.593472 23041 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 00:48:12.593558 master-0 kubenswrapper[23041]: I0308 00:48:12.593541 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" (UID: "4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:48:12.596286 master-0 kubenswrapper[23041]: I0308 00:48:12.596250 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" (UID: "4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:48:12.599778 master-0 kubenswrapper[23041]: I0308 00:48:12.599700 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" (UID: "4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:48:12.601960 master-0 kubenswrapper[23041]: I0308 00:48:12.601830 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-kube-api-access-c9gm8" (OuterVolumeSpecName: "kube-api-access-c9gm8") pod "4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" (UID: "4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac"). InnerVolumeSpecName "kube-api-access-c9gm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:48:12.695217 master-0 kubenswrapper[23041]: I0308 00:48:12.695145 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9gm8\" (UniqueName: \"kubernetes.io/projected/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-kube-api-access-c9gm8\") on node \"master-0\" DevicePath \"\"" Mar 08 00:48:12.695217 master-0 kubenswrapper[23041]: I0308 00:48:12.695196 23041 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 00:48:12.695385 master-0 kubenswrapper[23041]: I0308 00:48:12.695228 23041 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Mar 08 00:48:12.695385 master-0 kubenswrapper[23041]: I0308 00:48:12.695238 23041 reconciler_common.go:293] 
"Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 00:48:12.803233 master-0 kubenswrapper[23041]: I0308 00:48:12.803130 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-xvcfv" event={"ID":"0cdc2143-7c47-4ee3-bab0-f0c60f3bf68f","Type":"ContainerStarted","Data":"e90c85128a35394beb7b1099f221af207a9bd1cacb36b335f81bcdf0c83b90a1"} Mar 08 00:48:12.805261 master-0 kubenswrapper[23041]: I0308 00:48:12.805213 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c45bf598-vngbg_4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac/console/1.log" Mar 08 00:48:12.805261 master-0 kubenswrapper[23041]: I0308 00:48:12.805252 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c45bf598-vngbg" event={"ID":"4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac","Type":"ContainerDied","Data":"ece44a4b47794be7785e88f5603d60806f3a9b959a0c0021450bc2009700cb87"} Mar 08 00:48:12.805261 master-0 kubenswrapper[23041]: I0308 00:48:12.805275 23041 scope.go:117] "RemoveContainer" containerID="31c1ce47271da149372933b5669e24882a32895d1eabd2caa8d72dcabb6291e0" Mar 08 00:48:12.805474 master-0 kubenswrapper[23041]: I0308 00:48:12.805343 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c45bf598-vngbg" Mar 08 00:48:12.833452 master-0 kubenswrapper[23041]: I0308 00:48:12.833358 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-xvcfv" podStartSLOduration=1.203688792 podStartE2EDuration="8.833338097s" podCreationTimestamp="2026-03-08 00:48:04 +0000 UTC" firstStartedPulling="2026-03-08 00:48:04.836463865 +0000 UTC m=+990.309300419" lastFinishedPulling="2026-03-08 00:48:12.46611315 +0000 UTC m=+997.938949724" observedRunningTime="2026-03-08 00:48:12.825753214 +0000 UTC m=+998.298589788" watchObservedRunningTime="2026-03-08 00:48:12.833338097 +0000 UTC m=+998.306174651" Mar 08 00:48:12.882430 master-0 kubenswrapper[23041]: I0308 00:48:12.882351 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c45bf598-vngbg"] Mar 08 00:48:12.891173 master-0 kubenswrapper[23041]: I0308 00:48:12.891099 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-c45bf598-vngbg"] Mar 08 00:48:14.403687 master-0 kubenswrapper[23041]: I0308 00:48:14.403567 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-xvcfv" Mar 08 00:48:14.403687 master-0 kubenswrapper[23041]: I0308 00:48:14.403695 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-xvcfv" Mar 08 00:48:14.433460 master-0 kubenswrapper[23041]: I0308 00:48:14.433415 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-xvcfv" Mar 08 00:48:14.821806 master-0 kubenswrapper[23041]: I0308 00:48:14.821734 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" path="/var/lib/kubelet/pods/4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac/volumes" Mar 08 00:48:24.436296 master-0 
kubenswrapper[23041]: I0308 00:48:24.435270 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-xvcfv" Mar 08 00:48:29.620935 master-0 kubenswrapper[23041]: I0308 00:48:29.620853 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/63c289b49d1df002e9410bfc78c42c1a81fdac5ac0156ab656e2a123e5sbm84"] Mar 08 00:48:29.621560 master-0 kubenswrapper[23041]: E0308 00:48:29.621339 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" Mar 08 00:48:29.621560 master-0 kubenswrapper[23041]: I0308 00:48:29.621360 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" Mar 08 00:48:29.621560 master-0 kubenswrapper[23041]: E0308 00:48:29.621401 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" Mar 08 00:48:29.621560 master-0 kubenswrapper[23041]: I0308 00:48:29.621410 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" Mar 08 00:48:29.621781 master-0 kubenswrapper[23041]: I0308 00:48:29.621674 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" Mar 08 00:48:29.621781 master-0 kubenswrapper[23041]: I0308 00:48:29.621699 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c3dba51-1f0c-4cd0-8280-58b1a50bb0ac" containerName="console" Mar 08 00:48:29.623748 master-0 kubenswrapper[23041]: I0308 00:48:29.623711 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/63c289b49d1df002e9410bfc78c42c1a81fdac5ac0156ab656e2a123e5sbm84" Mar 08 00:48:29.671338 master-0 kubenswrapper[23041]: I0308 00:48:29.631866 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/63c289b49d1df002e9410bfc78c42c1a81fdac5ac0156ab656e2a123e5sbm84"] Mar 08 00:48:29.691121 master-0 kubenswrapper[23041]: I0308 00:48:29.690321 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6c8e383-4b5a-43e5-a570-c96081738d33-bundle\") pod \"63c289b49d1df002e9410bfc78c42c1a81fdac5ac0156ab656e2a123e5sbm84\" (UID: \"c6c8e383-4b5a-43e5-a570-c96081738d33\") " pod="openstack-operators/63c289b49d1df002e9410bfc78c42c1a81fdac5ac0156ab656e2a123e5sbm84" Mar 08 00:48:29.691121 master-0 kubenswrapper[23041]: I0308 00:48:29.690471 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6c8e383-4b5a-43e5-a570-c96081738d33-util\") pod \"63c289b49d1df002e9410bfc78c42c1a81fdac5ac0156ab656e2a123e5sbm84\" (UID: \"c6c8e383-4b5a-43e5-a570-c96081738d33\") " pod="openstack-operators/63c289b49d1df002e9410bfc78c42c1a81fdac5ac0156ab656e2a123e5sbm84" Mar 08 00:48:29.691121 master-0 kubenswrapper[23041]: I0308 00:48:29.690508 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdjdf\" (UniqueName: \"kubernetes.io/projected/c6c8e383-4b5a-43e5-a570-c96081738d33-kube-api-access-mdjdf\") pod \"63c289b49d1df002e9410bfc78c42c1a81fdac5ac0156ab656e2a123e5sbm84\" (UID: \"c6c8e383-4b5a-43e5-a570-c96081738d33\") " pod="openstack-operators/63c289b49d1df002e9410bfc78c42c1a81fdac5ac0156ab656e2a123e5sbm84" Mar 08 00:48:29.792336 master-0 kubenswrapper[23041]: I0308 00:48:29.792229 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" 
(UniqueName: \"kubernetes.io/empty-dir/c6c8e383-4b5a-43e5-a570-c96081738d33-bundle\") pod \"63c289b49d1df002e9410bfc78c42c1a81fdac5ac0156ab656e2a123e5sbm84\" (UID: \"c6c8e383-4b5a-43e5-a570-c96081738d33\") " pod="openstack-operators/63c289b49d1df002e9410bfc78c42c1a81fdac5ac0156ab656e2a123e5sbm84" Mar 08 00:48:29.792664 master-0 kubenswrapper[23041]: I0308 00:48:29.792467 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6c8e383-4b5a-43e5-a570-c96081738d33-util\") pod \"63c289b49d1df002e9410bfc78c42c1a81fdac5ac0156ab656e2a123e5sbm84\" (UID: \"c6c8e383-4b5a-43e5-a570-c96081738d33\") " pod="openstack-operators/63c289b49d1df002e9410bfc78c42c1a81fdac5ac0156ab656e2a123e5sbm84" Mar 08 00:48:29.792664 master-0 kubenswrapper[23041]: I0308 00:48:29.792507 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdjdf\" (UniqueName: \"kubernetes.io/projected/c6c8e383-4b5a-43e5-a570-c96081738d33-kube-api-access-mdjdf\") pod \"63c289b49d1df002e9410bfc78c42c1a81fdac5ac0156ab656e2a123e5sbm84\" (UID: \"c6c8e383-4b5a-43e5-a570-c96081738d33\") " pod="openstack-operators/63c289b49d1df002e9410bfc78c42c1a81fdac5ac0156ab656e2a123e5sbm84" Mar 08 00:48:29.793020 master-0 kubenswrapper[23041]: I0308 00:48:29.792982 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6c8e383-4b5a-43e5-a570-c96081738d33-util\") pod \"63c289b49d1df002e9410bfc78c42c1a81fdac5ac0156ab656e2a123e5sbm84\" (UID: \"c6c8e383-4b5a-43e5-a570-c96081738d33\") " pod="openstack-operators/63c289b49d1df002e9410bfc78c42c1a81fdac5ac0156ab656e2a123e5sbm84" Mar 08 00:48:29.793179 master-0 kubenswrapper[23041]: I0308 00:48:29.793111 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6c8e383-4b5a-43e5-a570-c96081738d33-bundle\") pod 
\"63c289b49d1df002e9410bfc78c42c1a81fdac5ac0156ab656e2a123e5sbm84\" (UID: \"c6c8e383-4b5a-43e5-a570-c96081738d33\") " pod="openstack-operators/63c289b49d1df002e9410bfc78c42c1a81fdac5ac0156ab656e2a123e5sbm84" Mar 08 00:48:29.808543 master-0 kubenswrapper[23041]: I0308 00:48:29.808476 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdjdf\" (UniqueName: \"kubernetes.io/projected/c6c8e383-4b5a-43e5-a570-c96081738d33-kube-api-access-mdjdf\") pod \"63c289b49d1df002e9410bfc78c42c1a81fdac5ac0156ab656e2a123e5sbm84\" (UID: \"c6c8e383-4b5a-43e5-a570-c96081738d33\") " pod="openstack-operators/63c289b49d1df002e9410bfc78c42c1a81fdac5ac0156ab656e2a123e5sbm84" Mar 08 00:48:29.992114 master-0 kubenswrapper[23041]: I0308 00:48:29.991973 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/63c289b49d1df002e9410bfc78c42c1a81fdac5ac0156ab656e2a123e5sbm84" Mar 08 00:48:30.438791 master-0 kubenswrapper[23041]: I0308 00:48:30.438747 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/63c289b49d1df002e9410bfc78c42c1a81fdac5ac0156ab656e2a123e5sbm84"] Mar 08 00:48:30.960632 master-0 kubenswrapper[23041]: I0308 00:48:30.960542 23041 generic.go:334] "Generic (PLEG): container finished" podID="c6c8e383-4b5a-43e5-a570-c96081738d33" containerID="3ade1f502001824f9860cc1478818151260aab5366046ed127431b3bc96bd8f2" exitCode=0 Mar 08 00:48:30.961242 master-0 kubenswrapper[23041]: I0308 00:48:30.960669 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/63c289b49d1df002e9410bfc78c42c1a81fdac5ac0156ab656e2a123e5sbm84" event={"ID":"c6c8e383-4b5a-43e5-a570-c96081738d33","Type":"ContainerDied","Data":"3ade1f502001824f9860cc1478818151260aab5366046ed127431b3bc96bd8f2"} Mar 08 00:48:30.961242 master-0 kubenswrapper[23041]: I0308 00:48:30.960766 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/63c289b49d1df002e9410bfc78c42c1a81fdac5ac0156ab656e2a123e5sbm84" event={"ID":"c6c8e383-4b5a-43e5-a570-c96081738d33","Type":"ContainerStarted","Data":"82fd0a3928607d1ad84fccd18e3c8ca5804fed9211e70a48c438f9bac92699c9"} Mar 08 00:48:31.973481 master-0 kubenswrapper[23041]: I0308 00:48:31.973396 23041 generic.go:334] "Generic (PLEG): container finished" podID="c6c8e383-4b5a-43e5-a570-c96081738d33" containerID="3e10d6b4b89286eb3912968352db36f494679926b6e4eca2e7ff0f3c164c0cfc" exitCode=0 Mar 08 00:48:31.974274 master-0 kubenswrapper[23041]: I0308 00:48:31.973682 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/63c289b49d1df002e9410bfc78c42c1a81fdac5ac0156ab656e2a123e5sbm84" event={"ID":"c6c8e383-4b5a-43e5-a570-c96081738d33","Type":"ContainerDied","Data":"3e10d6b4b89286eb3912968352db36f494679926b6e4eca2e7ff0f3c164c0cfc"} Mar 08 00:48:32.984863 master-0 kubenswrapper[23041]: I0308 00:48:32.984796 23041 generic.go:334] "Generic (PLEG): container finished" podID="c6c8e383-4b5a-43e5-a570-c96081738d33" containerID="006c7847f1b3f3909f18f7a43c90f49056da86441e857d6085ed943f5320e7aa" exitCode=0 Mar 08 00:48:32.985600 master-0 kubenswrapper[23041]: I0308 00:48:32.984862 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/63c289b49d1df002e9410bfc78c42c1a81fdac5ac0156ab656e2a123e5sbm84" event={"ID":"c6c8e383-4b5a-43e5-a570-c96081738d33","Type":"ContainerDied","Data":"006c7847f1b3f3909f18f7a43c90f49056da86441e857d6085ed943f5320e7aa"} Mar 08 00:48:34.388992 master-0 kubenswrapper[23041]: I0308 00:48:34.388938 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/63c289b49d1df002e9410bfc78c42c1a81fdac5ac0156ab656e2a123e5sbm84" Mar 08 00:48:34.442910 master-0 kubenswrapper[23041]: I0308 00:48:34.442847 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6c8e383-4b5a-43e5-a570-c96081738d33-bundle\") pod \"c6c8e383-4b5a-43e5-a570-c96081738d33\" (UID: \"c6c8e383-4b5a-43e5-a570-c96081738d33\") " Mar 08 00:48:34.443156 master-0 kubenswrapper[23041]: I0308 00:48:34.442951 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6c8e383-4b5a-43e5-a570-c96081738d33-util\") pod \"c6c8e383-4b5a-43e5-a570-c96081738d33\" (UID: \"c6c8e383-4b5a-43e5-a570-c96081738d33\") " Mar 08 00:48:34.443156 master-0 kubenswrapper[23041]: I0308 00:48:34.443077 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdjdf\" (UniqueName: \"kubernetes.io/projected/c6c8e383-4b5a-43e5-a570-c96081738d33-kube-api-access-mdjdf\") pod \"c6c8e383-4b5a-43e5-a570-c96081738d33\" (UID: \"c6c8e383-4b5a-43e5-a570-c96081738d33\") " Mar 08 00:48:34.443583 master-0 kubenswrapper[23041]: I0308 00:48:34.443546 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6c8e383-4b5a-43e5-a570-c96081738d33-bundle" (OuterVolumeSpecName: "bundle") pod "c6c8e383-4b5a-43e5-a570-c96081738d33" (UID: "c6c8e383-4b5a-43e5-a570-c96081738d33"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:48:34.446531 master-0 kubenswrapper[23041]: I0308 00:48:34.446499 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6c8e383-4b5a-43e5-a570-c96081738d33-kube-api-access-mdjdf" (OuterVolumeSpecName: "kube-api-access-mdjdf") pod "c6c8e383-4b5a-43e5-a570-c96081738d33" (UID: "c6c8e383-4b5a-43e5-a570-c96081738d33"). InnerVolumeSpecName "kube-api-access-mdjdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:48:34.457509 master-0 kubenswrapper[23041]: I0308 00:48:34.457455 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6c8e383-4b5a-43e5-a570-c96081738d33-util" (OuterVolumeSpecName: "util") pod "c6c8e383-4b5a-43e5-a570-c96081738d33" (UID: "c6c8e383-4b5a-43e5-a570-c96081738d33"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:48:34.544906 master-0 kubenswrapper[23041]: I0308 00:48:34.544814 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdjdf\" (UniqueName: \"kubernetes.io/projected/c6c8e383-4b5a-43e5-a570-c96081738d33-kube-api-access-mdjdf\") on node \"master-0\" DevicePath \"\"" Mar 08 00:48:34.544906 master-0 kubenswrapper[23041]: I0308 00:48:34.544876 23041 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6c8e383-4b5a-43e5-a570-c96081738d33-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 00:48:34.544906 master-0 kubenswrapper[23041]: I0308 00:48:34.544886 23041 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6c8e383-4b5a-43e5-a570-c96081738d33-util\") on node \"master-0\" DevicePath \"\"" Mar 08 00:48:35.016431 master-0 kubenswrapper[23041]: I0308 00:48:35.016352 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/63c289b49d1df002e9410bfc78c42c1a81fdac5ac0156ab656e2a123e5sbm84" event={"ID":"c6c8e383-4b5a-43e5-a570-c96081738d33","Type":"ContainerDied","Data":"82fd0a3928607d1ad84fccd18e3c8ca5804fed9211e70a48c438f9bac92699c9"} Mar 08 00:48:35.016431 master-0 kubenswrapper[23041]: I0308 00:48:35.016408 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82fd0a3928607d1ad84fccd18e3c8ca5804fed9211e70a48c438f9bac92699c9" Mar 08 00:48:35.016431 master-0 kubenswrapper[23041]: I0308 00:48:35.016422 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/63c289b49d1df002e9410bfc78c42c1a81fdac5ac0156ab656e2a123e5sbm84" Mar 08 00:48:42.293222 master-0 kubenswrapper[23041]: I0308 00:48:42.289399 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-5748c74587-hx2qk"] Mar 08 00:48:42.293222 master-0 kubenswrapper[23041]: E0308 00:48:42.289826 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6c8e383-4b5a-43e5-a570-c96081738d33" containerName="extract" Mar 08 00:48:42.293222 master-0 kubenswrapper[23041]: I0308 00:48:42.289843 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6c8e383-4b5a-43e5-a570-c96081738d33" containerName="extract" Mar 08 00:48:42.293222 master-0 kubenswrapper[23041]: E0308 00:48:42.289872 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6c8e383-4b5a-43e5-a570-c96081738d33" containerName="util" Mar 08 00:48:42.293222 master-0 kubenswrapper[23041]: I0308 00:48:42.289879 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6c8e383-4b5a-43e5-a570-c96081738d33" containerName="util" Mar 08 00:48:42.293222 master-0 kubenswrapper[23041]: E0308 00:48:42.289915 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6c8e383-4b5a-43e5-a570-c96081738d33" containerName="pull" Mar 08 00:48:42.293222 master-0 
kubenswrapper[23041]: I0308 00:48:42.289926 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6c8e383-4b5a-43e5-a570-c96081738d33" containerName="pull" Mar 08 00:48:42.293222 master-0 kubenswrapper[23041]: I0308 00:48:42.290154 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6c8e383-4b5a-43e5-a570-c96081738d33" containerName="extract" Mar 08 00:48:42.293222 master-0 kubenswrapper[23041]: I0308 00:48:42.290850 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-5748c74587-hx2qk" Mar 08 00:48:42.318222 master-0 kubenswrapper[23041]: I0308 00:48:42.318094 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-5748c74587-hx2qk"] Mar 08 00:48:42.390225 master-0 kubenswrapper[23041]: I0308 00:48:42.389362 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74f7r\" (UniqueName: \"kubernetes.io/projected/eb5971f7-ac5e-419e-85fe-c3258db80f7b-kube-api-access-74f7r\") pod \"openstack-operator-controller-init-5748c74587-hx2qk\" (UID: \"eb5971f7-ac5e-419e-85fe-c3258db80f7b\") " pod="openstack-operators/openstack-operator-controller-init-5748c74587-hx2qk" Mar 08 00:48:42.502521 master-0 kubenswrapper[23041]: I0308 00:48:42.501963 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74f7r\" (UniqueName: \"kubernetes.io/projected/eb5971f7-ac5e-419e-85fe-c3258db80f7b-kube-api-access-74f7r\") pod \"openstack-operator-controller-init-5748c74587-hx2qk\" (UID: \"eb5971f7-ac5e-419e-85fe-c3258db80f7b\") " pod="openstack-operators/openstack-operator-controller-init-5748c74587-hx2qk" Mar 08 00:48:42.537978 master-0 kubenswrapper[23041]: I0308 00:48:42.537918 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74f7r\" (UniqueName: 
\"kubernetes.io/projected/eb5971f7-ac5e-419e-85fe-c3258db80f7b-kube-api-access-74f7r\") pod \"openstack-operator-controller-init-5748c74587-hx2qk\" (UID: \"eb5971f7-ac5e-419e-85fe-c3258db80f7b\") " pod="openstack-operators/openstack-operator-controller-init-5748c74587-hx2qk" Mar 08 00:48:42.670293 master-0 kubenswrapper[23041]: I0308 00:48:42.669451 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-5748c74587-hx2qk" Mar 08 00:48:43.170889 master-0 kubenswrapper[23041]: I0308 00:48:43.170436 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-5748c74587-hx2qk"] Mar 08 00:48:44.096036 master-0 kubenswrapper[23041]: I0308 00:48:44.095975 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-5748c74587-hx2qk" event={"ID":"eb5971f7-ac5e-419e-85fe-c3258db80f7b","Type":"ContainerStarted","Data":"a4d2422e17c014daa5e1ce2476d6aaa427c10d2fd18c464bd8450a24927cf6ee"} Mar 08 00:48:49.148432 master-0 kubenswrapper[23041]: I0308 00:48:49.148360 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-5748c74587-hx2qk" event={"ID":"eb5971f7-ac5e-419e-85fe-c3258db80f7b","Type":"ContainerStarted","Data":"01c58fb496a942ad587d4dbbd0175239bb6cc18ee35e3bf691f3147338690d72"} Mar 08 00:48:49.149122 master-0 kubenswrapper[23041]: I0308 00:48:49.148486 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-5748c74587-hx2qk" Mar 08 00:48:49.202813 master-0 kubenswrapper[23041]: I0308 00:48:49.202722 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-5748c74587-hx2qk" podStartSLOduration=2.187521937 podStartE2EDuration="7.202701433s" podCreationTimestamp="2026-03-08 
00:48:42 +0000 UTC" firstStartedPulling="2026-03-08 00:48:43.171015519 +0000 UTC m=+1028.643852073" lastFinishedPulling="2026-03-08 00:48:48.186195015 +0000 UTC m=+1033.659031569" observedRunningTime="2026-03-08 00:48:49.20132156 +0000 UTC m=+1034.674158124" watchObservedRunningTime="2026-03-08 00:48:49.202701433 +0000 UTC m=+1034.675538007" Mar 08 00:49:02.675799 master-0 kubenswrapper[23041]: I0308 00:49:02.675663 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-5748c74587-hx2qk" Mar 08 00:49:23.181107 master-0 kubenswrapper[23041]: I0308 00:49:23.180279 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-j9sh4"] Mar 08 00:49:23.181953 master-0 kubenswrapper[23041]: I0308 00:49:23.181560 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-j9sh4" Mar 08 00:49:23.186474 master-0 kubenswrapper[23041]: I0308 00:49:23.186395 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-rrvfw"] Mar 08 00:49:23.189303 master-0 kubenswrapper[23041]: I0308 00:49:23.189266 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-rrvfw" Mar 08 00:49:23.226843 master-0 kubenswrapper[23041]: I0308 00:49:23.225831 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-rrvfw"] Mar 08 00:49:23.245172 master-0 kubenswrapper[23041]: I0308 00:49:23.243771 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-5hmg6"] Mar 08 00:49:23.253412 master-0 kubenswrapper[23041]: I0308 00:49:23.247835 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-5hmg6" Mar 08 00:49:23.279793 master-0 kubenswrapper[23041]: I0308 00:49:23.279107 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-j9sh4"] Mar 08 00:49:23.299885 master-0 kubenswrapper[23041]: I0308 00:49:23.298422 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4ss2\" (UniqueName: \"kubernetes.io/projected/084c05b0-52bc-45a5-99b6-cf3a01714e49-kube-api-access-n4ss2\") pod \"cinder-operator-controller-manager-55d77d7b5c-rrvfw\" (UID: \"084c05b0-52bc-45a5-99b6-cf3a01714e49\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-rrvfw" Mar 08 00:49:23.299885 master-0 kubenswrapper[23041]: I0308 00:49:23.298691 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcjmc\" (UniqueName: \"kubernetes.io/projected/faa673d5-160f-478d-9200-147f13e70e65-kube-api-access-mcjmc\") pod \"barbican-operator-controller-manager-6db6876945-j9sh4\" (UID: \"faa673d5-160f-478d-9200-147f13e70e65\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-j9sh4" Mar 08 00:49:23.364456 master-0 
kubenswrapper[23041]: I0308 00:49:23.363911 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-5hmg6"] Mar 08 00:49:23.403764 master-0 kubenswrapper[23041]: I0308 00:49:23.403466 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2p9d\" (UniqueName: \"kubernetes.io/projected/814a072f-7dc0-4158-b22f-3cf285afbb10-kube-api-access-d2p9d\") pod \"designate-operator-controller-manager-5d87c9d997-5hmg6\" (UID: \"814a072f-7dc0-4158-b22f-3cf285afbb10\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-5hmg6" Mar 08 00:49:23.403764 master-0 kubenswrapper[23041]: I0308 00:49:23.403571 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4ss2\" (UniqueName: \"kubernetes.io/projected/084c05b0-52bc-45a5-99b6-cf3a01714e49-kube-api-access-n4ss2\") pod \"cinder-operator-controller-manager-55d77d7b5c-rrvfw\" (UID: \"084c05b0-52bc-45a5-99b6-cf3a01714e49\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-rrvfw" Mar 08 00:49:23.403764 master-0 kubenswrapper[23041]: I0308 00:49:23.403624 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcjmc\" (UniqueName: \"kubernetes.io/projected/faa673d5-160f-478d-9200-147f13e70e65-kube-api-access-mcjmc\") pod \"barbican-operator-controller-manager-6db6876945-j9sh4\" (UID: \"faa673d5-160f-478d-9200-147f13e70e65\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-j9sh4" Mar 08 00:49:23.419396 master-0 kubenswrapper[23041]: I0308 00:49:23.419295 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-9tzwz"] Mar 08 00:49:23.428271 master-0 kubenswrapper[23041]: I0308 00:49:23.426814 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9tzwz" Mar 08 00:49:23.445935 master-0 kubenswrapper[23041]: I0308 00:49:23.445753 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcjmc\" (UniqueName: \"kubernetes.io/projected/faa673d5-160f-478d-9200-147f13e70e65-kube-api-access-mcjmc\") pod \"barbican-operator-controller-manager-6db6876945-j9sh4\" (UID: \"faa673d5-160f-478d-9200-147f13e70e65\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-j9sh4" Mar 08 00:49:23.451944 master-0 kubenswrapper[23041]: I0308 00:49:23.448658 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4ss2\" (UniqueName: \"kubernetes.io/projected/084c05b0-52bc-45a5-99b6-cf3a01714e49-kube-api-access-n4ss2\") pod \"cinder-operator-controller-manager-55d77d7b5c-rrvfw\" (UID: \"084c05b0-52bc-45a5-99b6-cf3a01714e49\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-rrvfw" Mar 08 00:49:23.457769 master-0 kubenswrapper[23041]: I0308 00:49:23.457711 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-9tzwz"] Mar 08 00:49:23.492521 master-0 kubenswrapper[23041]: I0308 00:49:23.492194 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-8wjbv"] Mar 08 00:49:23.499256 master-0 kubenswrapper[23041]: I0308 00:49:23.493614 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-8wjbv" Mar 08 00:49:23.519493 master-0 kubenswrapper[23041]: I0308 00:49:23.519188 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-rrvfw" Mar 08 00:49:23.525659 master-0 kubenswrapper[23041]: I0308 00:49:23.523645 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28nqh\" (UniqueName: \"kubernetes.io/projected/7a2905e2-df5b-4d3e-811a-f0517a0f7ef8-kube-api-access-28nqh\") pod \"glance-operator-controller-manager-64db6967f8-9tzwz\" (UID: \"7a2905e2-df5b-4d3e-811a-f0517a0f7ef8\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9tzwz" Mar 08 00:49:23.525659 master-0 kubenswrapper[23041]: I0308 00:49:23.523875 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2p9d\" (UniqueName: \"kubernetes.io/projected/814a072f-7dc0-4158-b22f-3cf285afbb10-kube-api-access-d2p9d\") pod \"designate-operator-controller-manager-5d87c9d997-5hmg6\" (UID: \"814a072f-7dc0-4158-b22f-3cf285afbb10\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-5hmg6" Mar 08 00:49:23.525659 master-0 kubenswrapper[23041]: I0308 00:49:23.524302 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfhgb\" (UniqueName: \"kubernetes.io/projected/bbbe59bf-ba15-4378-bc11-5883dd1cd33a-kube-api-access-hfhgb\") pod \"heat-operator-controller-manager-cf99c678f-8wjbv\" (UID: \"bbbe59bf-ba15-4378-bc11-5883dd1cd33a\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-8wjbv" Mar 08 00:49:23.541468 master-0 kubenswrapper[23041]: I0308 00:49:23.530339 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-j9sh4" Mar 08 00:49:23.588676 master-0 kubenswrapper[23041]: I0308 00:49:23.588619 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2p9d\" (UniqueName: \"kubernetes.io/projected/814a072f-7dc0-4158-b22f-3cf285afbb10-kube-api-access-d2p9d\") pod \"designate-operator-controller-manager-5d87c9d997-5hmg6\" (UID: \"814a072f-7dc0-4158-b22f-3cf285afbb10\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-5hmg6" Mar 08 00:49:23.588676 master-0 kubenswrapper[23041]: I0308 00:49:23.588628 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-8wjbv"] Mar 08 00:49:23.612266 master-0 kubenswrapper[23041]: I0308 00:49:23.611333 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-wq5pz"] Mar 08 00:49:23.616304 master-0 kubenswrapper[23041]: I0308 00:49:23.616227 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-b8c8d7cc8-bhcdj"] Mar 08 00:49:23.617620 master-0 kubenswrapper[23041]: I0308 00:49:23.617588 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-bhcdj" Mar 08 00:49:23.617947 master-0 kubenswrapper[23041]: I0308 00:49:23.617585 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-wq5pz" Mar 08 00:49:23.628861 master-0 kubenswrapper[23041]: I0308 00:49:23.627782 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 08 00:49:23.633782 master-0 kubenswrapper[23041]: I0308 00:49:23.633712 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28nqh\" (UniqueName: \"kubernetes.io/projected/7a2905e2-df5b-4d3e-811a-f0517a0f7ef8-kube-api-access-28nqh\") pod \"glance-operator-controller-manager-64db6967f8-9tzwz\" (UID: \"7a2905e2-df5b-4d3e-811a-f0517a0f7ef8\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9tzwz" Mar 08 00:49:23.634045 master-0 kubenswrapper[23041]: I0308 00:49:23.633995 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfhgb\" (UniqueName: \"kubernetes.io/projected/bbbe59bf-ba15-4378-bc11-5883dd1cd33a-kube-api-access-hfhgb\") pod \"heat-operator-controller-manager-cf99c678f-8wjbv\" (UID: \"bbbe59bf-ba15-4378-bc11-5883dd1cd33a\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-8wjbv" Mar 08 00:49:23.659322 master-0 kubenswrapper[23041]: I0308 00:49:23.658742 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-wq5pz"] Mar 08 00:49:23.689413 master-0 kubenswrapper[23041]: I0308 00:49:23.689070 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28nqh\" (UniqueName: \"kubernetes.io/projected/7a2905e2-df5b-4d3e-811a-f0517a0f7ef8-kube-api-access-28nqh\") pod \"glance-operator-controller-manager-64db6967f8-9tzwz\" (UID: \"7a2905e2-df5b-4d3e-811a-f0517a0f7ef8\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9tzwz" Mar 08 00:49:23.695802 master-0 kubenswrapper[23041]: I0308 00:49:23.695700 
23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfhgb\" (UniqueName: \"kubernetes.io/projected/bbbe59bf-ba15-4378-bc11-5883dd1cd33a-kube-api-access-hfhgb\") pod \"heat-operator-controller-manager-cf99c678f-8wjbv\" (UID: \"bbbe59bf-ba15-4378-bc11-5883dd1cd33a\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-8wjbv" Mar 08 00:49:23.712624 master-0 kubenswrapper[23041]: I0308 00:49:23.712297 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-b8c8d7cc8-bhcdj"] Mar 08 00:49:23.738179 master-0 kubenswrapper[23041]: I0308 00:49:23.738067 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mlpg\" (UniqueName: \"kubernetes.io/projected/1436364c-fa7f-4858-a6cf-c635fe431773-kube-api-access-7mlpg\") pod \"infra-operator-controller-manager-b8c8d7cc8-bhcdj\" (UID: \"1436364c-fa7f-4858-a6cf-c635fe431773\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-bhcdj" Mar 08 00:49:23.738179 master-0 kubenswrapper[23041]: I0308 00:49:23.738132 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1436364c-fa7f-4858-a6cf-c635fe431773-cert\") pod \"infra-operator-controller-manager-b8c8d7cc8-bhcdj\" (UID: \"1436364c-fa7f-4858-a6cf-c635fe431773\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-bhcdj" Mar 08 00:49:23.753877 master-0 kubenswrapper[23041]: I0308 00:49:23.738302 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj2z4\" (UniqueName: \"kubernetes.io/projected/aaf0aa97-41c4-488a-b5a7-1b72b777a70c-kube-api-access-fj2z4\") pod \"horizon-operator-controller-manager-78bc7f9bd9-wq5pz\" (UID: \"aaf0aa97-41c4-488a-b5a7-1b72b777a70c\") " 
pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-wq5pz" Mar 08 00:49:23.774955 master-0 kubenswrapper[23041]: I0308 00:49:23.774905 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-686765764-jhdvn"] Mar 08 00:49:23.781001 master-0 kubenswrapper[23041]: I0308 00:49:23.780964 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-686765764-jhdvn" Mar 08 00:49:23.825843 master-0 kubenswrapper[23041]: I0308 00:49:23.818475 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9tzwz" Mar 08 00:49:23.829120 master-0 kubenswrapper[23041]: I0308 00:49:23.829065 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-jrqk2"] Mar 08 00:49:23.830916 master-0 kubenswrapper[23041]: I0308 00:49:23.830864 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-jrqk2" Mar 08 00:49:23.882285 master-0 kubenswrapper[23041]: I0308 00:49:23.882097 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mlpg\" (UniqueName: \"kubernetes.io/projected/1436364c-fa7f-4858-a6cf-c635fe431773-kube-api-access-7mlpg\") pod \"infra-operator-controller-manager-b8c8d7cc8-bhcdj\" (UID: \"1436364c-fa7f-4858-a6cf-c635fe431773\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-bhcdj" Mar 08 00:49:23.882285 master-0 kubenswrapper[23041]: I0308 00:49:23.882193 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtkvj\" (UniqueName: \"kubernetes.io/projected/759c181c-c143-4285-8530-0936f249b97d-kube-api-access-dtkvj\") pod \"ironic-operator-controller-manager-686765764-jhdvn\" (UID: \"759c181c-c143-4285-8530-0936f249b97d\") " pod="openstack-operators/ironic-operator-controller-manager-686765764-jhdvn" Mar 08 00:49:23.882611 master-0 kubenswrapper[23041]: I0308 00:49:23.882320 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1436364c-fa7f-4858-a6cf-c635fe431773-cert\") pod \"infra-operator-controller-manager-b8c8d7cc8-bhcdj\" (UID: \"1436364c-fa7f-4858-a6cf-c635fe431773\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-bhcdj" Mar 08 00:49:23.882611 master-0 kubenswrapper[23041]: E0308 00:49:23.882451 23041 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 08 00:49:23.882611 master-0 kubenswrapper[23041]: E0308 00:49:23.882530 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1436364c-fa7f-4858-a6cf-c635fe431773-cert podName:1436364c-fa7f-4858-a6cf-c635fe431773 nodeName:}" failed. 
No retries permitted until 2026-03-08 00:49:24.382502825 +0000 UTC m=+1069.855339379 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1436364c-fa7f-4858-a6cf-c635fe431773-cert") pod "infra-operator-controller-manager-b8c8d7cc8-bhcdj" (UID: "1436364c-fa7f-4858-a6cf-c635fe431773") : secret "infra-operator-webhook-server-cert" not found Mar 08 00:49:23.882791 master-0 kubenswrapper[23041]: I0308 00:49:23.882700 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj2z4\" (UniqueName: \"kubernetes.io/projected/aaf0aa97-41c4-488a-b5a7-1b72b777a70c-kube-api-access-fj2z4\") pod \"horizon-operator-controller-manager-78bc7f9bd9-wq5pz\" (UID: \"aaf0aa97-41c4-488a-b5a7-1b72b777a70c\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-wq5pz" Mar 08 00:49:23.886198 master-0 kubenswrapper[23041]: I0308 00:49:23.885680 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-5hmg6"
Mar 08 00:49:23.901453 master-0 kubenswrapper[23041]: I0308 00:49:23.900883 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-686765764-jhdvn"]
Mar 08 00:49:23.942855 master-0 kubenswrapper[23041]: I0308 00:49:23.942765 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-jrqk2"]
Mar 08 00:49:23.945816 master-0 kubenswrapper[23041]: I0308 00:49:23.945422 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj2z4\" (UniqueName: \"kubernetes.io/projected/aaf0aa97-41c4-488a-b5a7-1b72b777a70c-kube-api-access-fj2z4\") pod \"horizon-operator-controller-manager-78bc7f9bd9-wq5pz\" (UID: \"aaf0aa97-41c4-488a-b5a7-1b72b777a70c\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-wq5pz"
Mar 08 00:49:23.954092 master-0 kubenswrapper[23041]: I0308 00:49:23.953989 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mlpg\" (UniqueName: \"kubernetes.io/projected/1436364c-fa7f-4858-a6cf-c635fe431773-kube-api-access-7mlpg\") pod \"infra-operator-controller-manager-b8c8d7cc8-bhcdj\" (UID: \"1436364c-fa7f-4858-a6cf-c635fe431773\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-bhcdj"
Mar 08 00:49:23.968729 master-0 kubenswrapper[23041]: I0308 00:49:23.968674 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-8wjbv"
Mar 08 00:49:23.983823 master-0 kubenswrapper[23041]: I0308 00:49:23.981274 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-xwlf2"]
Mar 08 00:49:23.983823 master-0 kubenswrapper[23041]: I0308 00:49:23.982530 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-xwlf2"
Mar 08 00:49:23.983823 master-0 kubenswrapper[23041]: I0308 00:49:23.983722 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtkvj\" (UniqueName: \"kubernetes.io/projected/759c181c-c143-4285-8530-0936f249b97d-kube-api-access-dtkvj\") pod \"ironic-operator-controller-manager-686765764-jhdvn\" (UID: \"759c181c-c143-4285-8530-0936f249b97d\") " pod="openstack-operators/ironic-operator-controller-manager-686765764-jhdvn"
Mar 08 00:49:23.983823 master-0 kubenswrapper[23041]: I0308 00:49:23.983804 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvgjz\" (UniqueName: \"kubernetes.io/projected/70b24627-00b3-4e46-b5a9-630ae001f9d6-kube-api-access-dvgjz\") pod \"keystone-operator-controller-manager-7c789f89c6-jrqk2\" (UID: \"70b24627-00b3-4e46-b5a9-630ae001f9d6\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-jrqk2"
Mar 08 00:49:24.012597 master-0 kubenswrapper[23041]: I0308 00:49:24.008025 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-l4w8t"]
Mar 08 00:49:24.012597 master-0 kubenswrapper[23041]: I0308 00:49:24.011910 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-l4w8t"
Mar 08 00:49:24.068238 master-0 kubenswrapper[23041]: I0308 00:49:24.042285 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtkvj\" (UniqueName: \"kubernetes.io/projected/759c181c-c143-4285-8530-0936f249b97d-kube-api-access-dtkvj\") pod \"ironic-operator-controller-manager-686765764-jhdvn\" (UID: \"759c181c-c143-4285-8530-0936f249b97d\") " pod="openstack-operators/ironic-operator-controller-manager-686765764-jhdvn"
Mar 08 00:49:24.068238 master-0 kubenswrapper[23041]: I0308 00:49:24.043293 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-xwlf2"]
Mar 08 00:49:24.068238 master-0 kubenswrapper[23041]: I0308 00:49:24.058514 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-l4w8t"]
Mar 08 00:49:24.087541 master-0 kubenswrapper[23041]: I0308 00:49:24.087419 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvgjz\" (UniqueName: \"kubernetes.io/projected/70b24627-00b3-4e46-b5a9-630ae001f9d6-kube-api-access-dvgjz\") pod \"keystone-operator-controller-manager-7c789f89c6-jrqk2\" (UID: \"70b24627-00b3-4e46-b5a9-630ae001f9d6\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-jrqk2"
Mar 08 00:49:24.087541 master-0 kubenswrapper[23041]: I0308 00:49:24.087538 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z74fc\" (UniqueName: \"kubernetes.io/projected/493e7565-9522-49f3-aa7d-53f3808c241c-kube-api-access-z74fc\") pod \"manila-operator-controller-manager-67d996989d-xwlf2\" (UID: \"493e7565-9522-49f3-aa7d-53f3808c241c\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-xwlf2"
Mar 08 00:49:24.087719 master-0 kubenswrapper[23041]: I0308 00:49:24.087590 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4csf\" (UniqueName: \"kubernetes.io/projected/a3ec7a88-e9b2-4d0c-be5f-9ee71a9baec9-kube-api-access-l4csf\") pod \"mariadb-operator-controller-manager-7b6bfb6475-l4w8t\" (UID: \"a3ec7a88-e9b2-4d0c-be5f-9ee71a9baec9\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-l4w8t"
Mar 08 00:49:24.094683 master-0 kubenswrapper[23041]: I0308 00:49:24.094282 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-b6ldp"]
Mar 08 00:49:24.095819 master-0 kubenswrapper[23041]: I0308 00:49:24.095506 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54688575f-b6ldp"
Mar 08 00:49:24.099366 master-0 kubenswrapper[23041]: I0308 00:49:24.099261 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-wq5pz"
Mar 08 00:49:24.148546 master-0 kubenswrapper[23041]: I0308 00:49:24.145254 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvgjz\" (UniqueName: \"kubernetes.io/projected/70b24627-00b3-4e46-b5a9-630ae001f9d6-kube-api-access-dvgjz\") pod \"keystone-operator-controller-manager-7c789f89c6-jrqk2\" (UID: \"70b24627-00b3-4e46-b5a9-630ae001f9d6\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-jrqk2"
Mar 08 00:49:24.148546 master-0 kubenswrapper[23041]: I0308 00:49:24.146122 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-686765764-jhdvn"
Mar 08 00:49:24.177867 master-0 kubenswrapper[23041]: I0308 00:49:24.177173 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-jrqk2"
Mar 08 00:49:24.193724 master-0 kubenswrapper[23041]: I0308 00:49:24.191107 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z74fc\" (UniqueName: \"kubernetes.io/projected/493e7565-9522-49f3-aa7d-53f3808c241c-kube-api-access-z74fc\") pod \"manila-operator-controller-manager-67d996989d-xwlf2\" (UID: \"493e7565-9522-49f3-aa7d-53f3808c241c\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-xwlf2"
Mar 08 00:49:24.193724 master-0 kubenswrapper[23041]: I0308 00:49:24.191521 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4csf\" (UniqueName: \"kubernetes.io/projected/a3ec7a88-e9b2-4d0c-be5f-9ee71a9baec9-kube-api-access-l4csf\") pod \"mariadb-operator-controller-manager-7b6bfb6475-l4w8t\" (UID: \"a3ec7a88-e9b2-4d0c-be5f-9ee71a9baec9\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-l4w8t"
Mar 08 00:49:24.193724 master-0 kubenswrapper[23041]: I0308 00:49:24.191617 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmxcr\" (UniqueName: \"kubernetes.io/projected/d7ac0595-0fbf-47e2-b64e-cf2e0a17a61b-kube-api-access-vmxcr\") pod \"neutron-operator-controller-manager-54688575f-b6ldp\" (UID: \"d7ac0595-0fbf-47e2-b64e-cf2e0a17a61b\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-b6ldp"
Mar 08 00:49:24.228597 master-0 kubenswrapper[23041]: I0308 00:49:24.228541 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-b6ldp"]
Mar 08 00:49:24.233883 master-0 kubenswrapper[23041]: I0308 00:49:24.233821 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-vbkgr"]
Mar 08 00:49:24.235746 master-0 kubenswrapper[23041]: I0308 00:49:24.235708 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-vbkgr"
Mar 08 00:49:24.245730 master-0 kubenswrapper[23041]: I0308 00:49:24.245698 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z74fc\" (UniqueName: \"kubernetes.io/projected/493e7565-9522-49f3-aa7d-53f3808c241c-kube-api-access-z74fc\") pod \"manila-operator-controller-manager-67d996989d-xwlf2\" (UID: \"493e7565-9522-49f3-aa7d-53f3808c241c\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-xwlf2"
Mar 08 00:49:24.246907 master-0 kubenswrapper[23041]: I0308 00:49:24.246612 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4csf\" (UniqueName: \"kubernetes.io/projected/a3ec7a88-e9b2-4d0c-be5f-9ee71a9baec9-kube-api-access-l4csf\") pod \"mariadb-operator-controller-manager-7b6bfb6475-l4w8t\" (UID: \"a3ec7a88-e9b2-4d0c-be5f-9ee71a9baec9\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-l4w8t"
Mar 08 00:49:24.254648 master-0 kubenswrapper[23041]: I0308 00:49:24.254612 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-xgs7b"]
Mar 08 00:49:24.260054 master-0 kubenswrapper[23041]: I0308 00:49:24.256195 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-xgs7b"
Mar 08 00:49:24.265064 master-0 kubenswrapper[23041]: I0308 00:49:24.265003 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-vbkgr"]
Mar 08 00:49:24.294112 master-0 kubenswrapper[23041]: I0308 00:49:24.294049 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmxcr\" (UniqueName: \"kubernetes.io/projected/d7ac0595-0fbf-47e2-b64e-cf2e0a17a61b-kube-api-access-vmxcr\") pod \"neutron-operator-controller-manager-54688575f-b6ldp\" (UID: \"d7ac0595-0fbf-47e2-b64e-cf2e0a17a61b\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-b6ldp"
Mar 08 00:49:24.350908 master-0 kubenswrapper[23041]: I0308 00:49:24.346031 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-xwlf2"
Mar 08 00:49:24.353364 master-0 kubenswrapper[23041]: I0308 00:49:24.351513 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-xgs7b"]
Mar 08 00:49:24.362436 master-0 kubenswrapper[23041]: I0308 00:49:24.362313 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmxcr\" (UniqueName: \"kubernetes.io/projected/d7ac0595-0fbf-47e2-b64e-cf2e0a17a61b-kube-api-access-vmxcr\") pod \"neutron-operator-controller-manager-54688575f-b6ldp\" (UID: \"d7ac0595-0fbf-47e2-b64e-cf2e0a17a61b\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-b6ldp"
Mar 08 00:49:24.382185 master-0 kubenswrapper[23041]: I0308 00:49:24.380048 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr22lk"]
Mar 08 00:49:24.382185 master-0 kubenswrapper[23041]: I0308 00:49:24.381684 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr22lk"
Mar 08 00:49:24.384049 master-0 kubenswrapper[23041]: I0308 00:49:24.383991 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-l4w8t"
Mar 08 00:49:24.386091 master-0 kubenswrapper[23041]: I0308 00:49:24.386062 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Mar 08 00:49:24.386246 master-0 kubenswrapper[23041]: I0308 00:49:24.386221 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-nkfd6"]
Mar 08 00:49:24.390529 master-0 kubenswrapper[23041]: I0308 00:49:24.387714 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-nkfd6"
Mar 08 00:49:24.406076 master-0 kubenswrapper[23041]: I0308 00:49:24.402249 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjqcg\" (UniqueName: \"kubernetes.io/projected/3840c315-d22c-4609-8f86-f905cb6895e9-kube-api-access-wjqcg\") pod \"octavia-operator-controller-manager-5d86c7ddb7-xgs7b\" (UID: \"3840c315-d22c-4609-8f86-f905cb6895e9\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-xgs7b"
Mar 08 00:49:24.406076 master-0 kubenswrapper[23041]: I0308 00:49:24.402338 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n52r5\" (UniqueName: \"kubernetes.io/projected/a75721b7-d12b-4337-a9f2-1c68ed0225b1-kube-api-access-n52r5\") pod \"nova-operator-controller-manager-74b6b5dc96-vbkgr\" (UID: \"a75721b7-d12b-4337-a9f2-1c68ed0225b1\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-vbkgr"
Mar 08 00:49:24.406076 master-0 kubenswrapper[23041]: I0308 00:49:24.402412 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1436364c-fa7f-4858-a6cf-c635fe431773-cert\") pod \"infra-operator-controller-manager-b8c8d7cc8-bhcdj\" (UID: \"1436364c-fa7f-4858-a6cf-c635fe431773\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-bhcdj"
Mar 08 00:49:24.406076 master-0 kubenswrapper[23041]: E0308 00:49:24.402539 23041 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 08 00:49:24.406076 master-0 kubenswrapper[23041]: E0308 00:49:24.402596 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1436364c-fa7f-4858-a6cf-c635fe431773-cert podName:1436364c-fa7f-4858-a6cf-c635fe431773 nodeName:}" failed. No retries permitted until 2026-03-08 00:49:25.402577586 +0000 UTC m=+1070.875414140 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1436364c-fa7f-4858-a6cf-c635fe431773-cert") pod "infra-operator-controller-manager-b8c8d7cc8-bhcdj" (UID: "1436364c-fa7f-4858-a6cf-c635fe431773") : secret "infra-operator-webhook-server-cert" not found
Mar 08 00:49:24.406076 master-0 kubenswrapper[23041]: I0308 00:49:24.405868 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr22lk"]
Mar 08 00:49:24.426828 master-0 kubenswrapper[23041]: W0308 00:49:24.426431 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfaa673d5_160f_478d_9200_147f13e70e65.slice/crio-303a5479efcc5adbe0edaabbecb2795ca400895d3cd0745f7c9dea32d3a941b6 WatchSource:0}: Error finding container 303a5479efcc5adbe0edaabbecb2795ca400895d3cd0745f7c9dea32d3a941b6: Status 404 returned error can't find the container with id 303a5479efcc5adbe0edaabbecb2795ca400895d3cd0745f7c9dea32d3a941b6
Mar 08 00:49:24.441508 master-0 kubenswrapper[23041]: I0308 00:49:24.433324 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54688575f-b6ldp"
Mar 08 00:49:24.441508 master-0 kubenswrapper[23041]: I0308 00:49:24.433652 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-nkfd6"]
Mar 08 00:49:24.463686 master-0 kubenswrapper[23041]: I0308 00:49:24.451036 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-qlhb7"]
Mar 08 00:49:24.463686 master-0 kubenswrapper[23041]: I0308 00:49:24.454824 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-qlhb7"
Mar 08 00:49:24.496967 master-0 kubenswrapper[23041]: I0308 00:49:24.496905 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-qlhb7"]
Mar 08 00:49:24.504057 master-0 kubenswrapper[23041]: I0308 00:49:24.504002 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjqcg\" (UniqueName: \"kubernetes.io/projected/3840c315-d22c-4609-8f86-f905cb6895e9-kube-api-access-wjqcg\") pod \"octavia-operator-controller-manager-5d86c7ddb7-xgs7b\" (UID: \"3840c315-d22c-4609-8f86-f905cb6895e9\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-xgs7b"
Mar 08 00:49:24.504633 master-0 kubenswrapper[23041]: I0308 00:49:24.504574 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n52r5\" (UniqueName: \"kubernetes.io/projected/a75721b7-d12b-4337-a9f2-1c68ed0225b1-kube-api-access-n52r5\") pod \"nova-operator-controller-manager-74b6b5dc96-vbkgr\" (UID: \"a75721b7-d12b-4337-a9f2-1c68ed0225b1\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-vbkgr"
Mar 08 00:49:24.511482 master-0 kubenswrapper[23041]: I0308 00:49:24.504755 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njplq\" (UniqueName: \"kubernetes.io/projected/add7710b-ad19-4be8-b7fe-77e7107961d3-kube-api-access-njplq\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cr22lk\" (UID: \"add7710b-ad19-4be8-b7fe-77e7107961d3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr22lk"
Mar 08 00:49:24.511482 master-0 kubenswrapper[23041]: I0308 00:49:24.504789 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/add7710b-ad19-4be8-b7fe-77e7107961d3-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cr22lk\" (UID: \"add7710b-ad19-4be8-b7fe-77e7107961d3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr22lk"
Mar 08 00:49:24.511482 master-0 kubenswrapper[23041]: I0308 00:49:24.504841 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65qvv\" (UniqueName: \"kubernetes.io/projected/72ae432b-fa05-4df8-a7f4-43c7ea765691-kube-api-access-65qvv\") pod \"ovn-operator-controller-manager-75684d597f-nkfd6\" (UID: \"72ae432b-fa05-4df8-a7f4-43c7ea765691\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-nkfd6"
Mar 08 00:49:24.511482 master-0 kubenswrapper[23041]: I0308 00:49:24.504954 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-n6mwk"]
Mar 08 00:49:24.511482 master-0 kubenswrapper[23041]: I0308 00:49:24.506276 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-n6mwk"
Mar 08 00:49:24.513799 master-0 kubenswrapper[23041]: I0308 00:49:24.513710 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-sqmsb"]
Mar 08 00:49:24.516304 master-0 kubenswrapper[23041]: I0308 00:49:24.516200 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-sqmsb"
Mar 08 00:49:24.520488 master-0 kubenswrapper[23041]: I0308 00:49:24.520429 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-n6mwk"]
Mar 08 00:49:24.526840 master-0 kubenswrapper[23041]: I0308 00:49:24.526794 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-sqmsb"]
Mar 08 00:49:24.534801 master-0 kubenswrapper[23041]: I0308 00:49:24.534482 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n52r5\" (UniqueName: \"kubernetes.io/projected/a75721b7-d12b-4337-a9f2-1c68ed0225b1-kube-api-access-n52r5\") pod \"nova-operator-controller-manager-74b6b5dc96-vbkgr\" (UID: \"a75721b7-d12b-4337-a9f2-1c68ed0225b1\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-vbkgr"
Mar 08 00:49:24.536774 master-0 kubenswrapper[23041]: I0308 00:49:24.535291 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjqcg\" (UniqueName: \"kubernetes.io/projected/3840c315-d22c-4609-8f86-f905cb6895e9-kube-api-access-wjqcg\") pod \"octavia-operator-controller-manager-5d86c7ddb7-xgs7b\" (UID: \"3840c315-d22c-4609-8f86-f905cb6895e9\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-xgs7b"
Mar 08 00:49:24.536774 master-0 kubenswrapper[23041]: I0308 00:49:24.535336 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-56p6p"]
Mar 08 00:49:24.537213 master-0 kubenswrapper[23041]: I0308 00:49:24.537060 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-56p6p"
Mar 08 00:49:24.563344 master-0 kubenswrapper[23041]: I0308 00:49:24.560248 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-56p6p"]
Mar 08 00:49:24.584507 master-0 kubenswrapper[23041]: I0308 00:49:24.582765 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-zvtv4"]
Mar 08 00:49:24.585230 master-0 kubenswrapper[23041]: I0308 00:49:24.584747 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-zvtv4"
Mar 08 00:49:24.638648 master-0 kubenswrapper[23041]: I0308 00:49:24.595531 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-zvtv4"]
Mar 08 00:49:24.638648 master-0 kubenswrapper[23041]: I0308 00:49:24.601598 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-vbkgr"
Mar 08 00:49:24.638648 master-0 kubenswrapper[23041]: I0308 00:49:24.615331 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4ntr\" (UniqueName: \"kubernetes.io/projected/7abab89c-1b33-4b90-896e-ae546a94572d-kube-api-access-z4ntr\") pod \"telemetry-operator-controller-manager-5fdb694969-sqmsb\" (UID: \"7abab89c-1b33-4b90-896e-ae546a94572d\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-sqmsb"
Mar 08 00:49:24.638648 master-0 kubenswrapper[23041]: I0308 00:49:24.615396 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhq8t\" (UniqueName: \"kubernetes.io/projected/11fd0e3e-abd4-4c9a-b48b-b1876062d035-kube-api-access-fhq8t\") pod \"placement-operator-controller-manager-648564c9fc-qlhb7\" (UID: \"11fd0e3e-abd4-4c9a-b48b-b1876062d035\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-qlhb7"
Mar 08 00:49:24.638648 master-0 kubenswrapper[23041]: I0308 00:49:24.615432 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghb28\" (UniqueName: \"kubernetes.io/projected/c19bbc8b-95aa-4387-b3ce-7e0730c7f6b7-kube-api-access-ghb28\") pod \"test-operator-controller-manager-55b5ff4dbb-56p6p\" (UID: \"c19bbc8b-95aa-4387-b3ce-7e0730c7f6b7\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-56p6p"
Mar 08 00:49:24.638648 master-0 kubenswrapper[23041]: I0308 00:49:24.615463 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njplq\" (UniqueName: \"kubernetes.io/projected/add7710b-ad19-4be8-b7fe-77e7107961d3-kube-api-access-njplq\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cr22lk\" (UID: \"add7710b-ad19-4be8-b7fe-77e7107961d3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr22lk"
Mar 08 00:49:24.638648 master-0 kubenswrapper[23041]: I0308 00:49:24.615484 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/add7710b-ad19-4be8-b7fe-77e7107961d3-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cr22lk\" (UID: \"add7710b-ad19-4be8-b7fe-77e7107961d3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr22lk"
Mar 08 00:49:24.638648 master-0 kubenswrapper[23041]: I0308 00:49:24.615517 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl59w\" (UniqueName: \"kubernetes.io/projected/146c7192-f416-4f39-8e9b-3b14da8545f7-kube-api-access-jl59w\") pod \"swift-operator-controller-manager-9b9ff9f4d-n6mwk\" (UID: \"146c7192-f416-4f39-8e9b-3b14da8545f7\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-n6mwk"
Mar 08 00:49:24.638648 master-0 kubenswrapper[23041]: I0308 00:49:24.615554 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65qvv\" (UniqueName: \"kubernetes.io/projected/72ae432b-fa05-4df8-a7f4-43c7ea765691-kube-api-access-65qvv\") pod \"ovn-operator-controller-manager-75684d597f-nkfd6\" (UID: \"72ae432b-fa05-4df8-a7f4-43c7ea765691\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-nkfd6"
Mar 08 00:49:24.638648 master-0 kubenswrapper[23041]: E0308 00:49:24.616090 23041 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 08 00:49:24.638648 master-0 kubenswrapper[23041]: E0308 00:49:24.616132 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/add7710b-ad19-4be8-b7fe-77e7107961d3-cert podName:add7710b-ad19-4be8-b7fe-77e7107961d3 nodeName:}" failed. No retries permitted until 2026-03-08 00:49:25.116117313 +0000 UTC m=+1070.588953867 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/add7710b-ad19-4be8-b7fe-77e7107961d3-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cr22lk" (UID: "add7710b-ad19-4be8-b7fe-77e7107961d3") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 08 00:49:24.638648 master-0 kubenswrapper[23041]: I0308 00:49:24.623772 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-j9sh4" event={"ID":"faa673d5-160f-478d-9200-147f13e70e65","Type":"ContainerStarted","Data":"303a5479efcc5adbe0edaabbecb2795ca400895d3cd0745f7c9dea32d3a941b6"}
Mar 08 00:49:24.638648 master-0 kubenswrapper[23041]: I0308 00:49:24.627269 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-85db8c7646-pflgk"]
Mar 08 00:49:24.638648 master-0 kubenswrapper[23041]: I0308 00:49:24.629062 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-85db8c7646-pflgk"
Mar 08 00:49:24.638648 master-0 kubenswrapper[23041]: I0308 00:49:24.632921 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Mar 08 00:49:24.638648 master-0 kubenswrapper[23041]: I0308 00:49:24.633113 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert"
Mar 08 00:49:24.641539 master-0 kubenswrapper[23041]: I0308 00:49:24.641511 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65qvv\" (UniqueName: \"kubernetes.io/projected/72ae432b-fa05-4df8-a7f4-43c7ea765691-kube-api-access-65qvv\") pod \"ovn-operator-controller-manager-75684d597f-nkfd6\" (UID: \"72ae432b-fa05-4df8-a7f4-43c7ea765691\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-nkfd6"
Mar 08 00:49:24.641905 master-0 kubenswrapper[23041]: I0308 00:49:24.641796 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-xgs7b"
Mar 08 00:49:24.655312 master-0 kubenswrapper[23041]: I0308 00:49:24.655248 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njplq\" (UniqueName: \"kubernetes.io/projected/add7710b-ad19-4be8-b7fe-77e7107961d3-kube-api-access-njplq\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cr22lk\" (UID: \"add7710b-ad19-4be8-b7fe-77e7107961d3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr22lk"
Mar 08 00:49:24.655425 master-0 kubenswrapper[23041]: I0308 00:49:24.655353 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-85db8c7646-pflgk"]
Mar 08 00:49:24.703848 master-0 kubenswrapper[23041]: I0308 00:49:24.703105 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6fnf7"]
Mar 08 00:49:24.707117 master-0 kubenswrapper[23041]: I0308 00:49:24.707078 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6fnf7"
Mar 08 00:49:24.718019 master-0 kubenswrapper[23041]: I0308 00:49:24.717330 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4ntr\" (UniqueName: \"kubernetes.io/projected/7abab89c-1b33-4b90-896e-ae546a94572d-kube-api-access-z4ntr\") pod \"telemetry-operator-controller-manager-5fdb694969-sqmsb\" (UID: \"7abab89c-1b33-4b90-896e-ae546a94572d\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-sqmsb"
Mar 08 00:49:24.718019 master-0 kubenswrapper[23041]: I0308 00:49:24.717389 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhq8t\" (UniqueName: \"kubernetes.io/projected/11fd0e3e-abd4-4c9a-b48b-b1876062d035-kube-api-access-fhq8t\") pod \"placement-operator-controller-manager-648564c9fc-qlhb7\" (UID: \"11fd0e3e-abd4-4c9a-b48b-b1876062d035\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-qlhb7"
Mar 08 00:49:24.718019 master-0 kubenswrapper[23041]: I0308 00:49:24.717414 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gblzv\" (UniqueName: \"kubernetes.io/projected/8a999920-adc2-499a-a349-4d0405263ee2-kube-api-access-gblzv\") pod \"watcher-operator-controller-manager-bccc79885-zvtv4\" (UID: \"8a999920-adc2-499a-a349-4d0405263ee2\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-zvtv4"
Mar 08 00:49:24.718019 master-0 kubenswrapper[23041]: I0308 00:49:24.717450 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghb28\" (UniqueName: \"kubernetes.io/projected/c19bbc8b-95aa-4387-b3ce-7e0730c7f6b7-kube-api-access-ghb28\") pod \"test-operator-controller-manager-55b5ff4dbb-56p6p\" (UID: \"c19bbc8b-95aa-4387-b3ce-7e0730c7f6b7\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-56p6p"
Mar 08 00:49:24.718019 master-0 kubenswrapper[23041]: I0308 00:49:24.717483 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7vrq\" (UniqueName: \"kubernetes.io/projected/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-kube-api-access-p7vrq\") pod \"openstack-operator-controller-manager-85db8c7646-pflgk\" (UID: \"d3e9d18b-4046-43ab-9b61-efe1207ccaf7\") " pod="openstack-operators/openstack-operator-controller-manager-85db8c7646-pflgk"
Mar 08 00:49:24.718019 master-0 kubenswrapper[23041]: I0308 00:49:24.717529 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jl59w\" (UniqueName: \"kubernetes.io/projected/146c7192-f416-4f39-8e9b-3b14da8545f7-kube-api-access-jl59w\") pod \"swift-operator-controller-manager-9b9ff9f4d-n6mwk\" (UID: \"146c7192-f416-4f39-8e9b-3b14da8545f7\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-n6mwk"
Mar 08 00:49:24.718019 master-0 kubenswrapper[23041]: I0308 00:49:24.717566 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-webhook-certs\") pod \"openstack-operator-controller-manager-85db8c7646-pflgk\" (UID: \"d3e9d18b-4046-43ab-9b61-efe1207ccaf7\") " pod="openstack-operators/openstack-operator-controller-manager-85db8c7646-pflgk"
Mar 08 00:49:24.718019 master-0 kubenswrapper[23041]: I0308 00:49:24.717586 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-metrics-certs\") pod \"openstack-operator-controller-manager-85db8c7646-pflgk\" (UID: \"d3e9d18b-4046-43ab-9b61-efe1207ccaf7\") " pod="openstack-operators/openstack-operator-controller-manager-85db8c7646-pflgk"
Mar 08 00:49:24.747842 master-0 kubenswrapper[23041]: I0308 00:49:24.747769 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4ntr\" (UniqueName: \"kubernetes.io/projected/7abab89c-1b33-4b90-896e-ae546a94572d-kube-api-access-z4ntr\") pod \"telemetry-operator-controller-manager-5fdb694969-sqmsb\" (UID: \"7abab89c-1b33-4b90-896e-ae546a94572d\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-sqmsb"
Mar 08 00:49:24.766228 master-0 kubenswrapper[23041]: I0308 00:49:24.760112 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhq8t\" (UniqueName: \"kubernetes.io/projected/11fd0e3e-abd4-4c9a-b48b-b1876062d035-kube-api-access-fhq8t\") pod \"placement-operator-controller-manager-648564c9fc-qlhb7\" (UID: \"11fd0e3e-abd4-4c9a-b48b-b1876062d035\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-qlhb7"
Mar 08 00:49:24.766228 master-0 kubenswrapper[23041]: I0308 00:49:24.760228 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl59w\" (UniqueName: \"kubernetes.io/projected/146c7192-f416-4f39-8e9b-3b14da8545f7-kube-api-access-jl59w\") pod \"swift-operator-controller-manager-9b9ff9f4d-n6mwk\" (UID: \"146c7192-f416-4f39-8e9b-3b14da8545f7\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-n6mwk"
Mar 08 00:49:24.766228 master-0 kubenswrapper[23041]: I0308 00:49:24.760856 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghb28\" (UniqueName: \"kubernetes.io/projected/c19bbc8b-95aa-4387-b3ce-7e0730c7f6b7-kube-api-access-ghb28\") pod \"test-operator-controller-manager-55b5ff4dbb-56p6p\" (UID: \"c19bbc8b-95aa-4387-b3ce-7e0730c7f6b7\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-56p6p"
Mar 08 00:49:24.781325 master-0 kubenswrapper[23041]: I0308 00:49:24.781275 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-qlhb7"
Mar 08 00:49:24.793266 master-0 kubenswrapper[23041]: I0308 00:49:24.788002 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6fnf7"]
Mar 08 00:49:24.808069 master-0 kubenswrapper[23041]: I0308 00:49:24.808028 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-n6mwk"
Mar 08 00:49:24.821537 master-0 kubenswrapper[23041]: I0308 00:49:24.821365 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8xck\" (UniqueName: \"kubernetes.io/projected/c365bf07-c22a-4abe-9af1-4fafbfb7659f-kube-api-access-g8xck\") pod \"rabbitmq-cluster-operator-manager-668c99d594-6fnf7\" (UID: \"c365bf07-c22a-4abe-9af1-4fafbfb7659f\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6fnf7"
Mar 08 00:49:24.821537 master-0 kubenswrapper[23041]: I0308 00:49:24.821460 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gblzv\" (UniqueName: \"kubernetes.io/projected/8a999920-adc2-499a-a349-4d0405263ee2-kube-api-access-gblzv\") pod \"watcher-operator-controller-manager-bccc79885-zvtv4\" (UID: \"8a999920-adc2-499a-a349-4d0405263ee2\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-zvtv4"
Mar 08 00:49:24.825022 master-0 kubenswrapper[23041]: I0308 00:49:24.821863 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7vrq\" (UniqueName: \"kubernetes.io/projected/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-kube-api-access-p7vrq\") pod \"openstack-operator-controller-manager-85db8c7646-pflgk\" (UID: \"d3e9d18b-4046-43ab-9b61-efe1207ccaf7\") " pod="openstack-operators/openstack-operator-controller-manager-85db8c7646-pflgk"
Mar 08 00:49:24.825022 master-0 kubenswrapper[23041]: I0308 00:49:24.822006 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-webhook-certs\") pod \"openstack-operator-controller-manager-85db8c7646-pflgk\" (UID: \"d3e9d18b-4046-43ab-9b61-efe1207ccaf7\") " pod="openstack-operators/openstack-operator-controller-manager-85db8c7646-pflgk"
Mar 08 00:49:24.825022 master-0 kubenswrapper[23041]: I0308 00:49:24.822035 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-metrics-certs\") pod \"openstack-operator-controller-manager-85db8c7646-pflgk\" (UID: \"d3e9d18b-4046-43ab-9b61-efe1207ccaf7\") " pod="openstack-operators/openstack-operator-controller-manager-85db8c7646-pflgk"
Mar 08 00:49:24.825022 master-0 kubenswrapper[23041]: E0308 00:49:24.822193 23041 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 08 00:49:24.825022 master-0 kubenswrapper[23041]: E0308 00:49:24.822278 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-webhook-certs podName:d3e9d18b-4046-43ab-9b61-efe1207ccaf7 nodeName:}" failed. No retries permitted until 2026-03-08 00:49:25.322256155 +0000 UTC m=+1070.795092709 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-webhook-certs") pod "openstack-operator-controller-manager-85db8c7646-pflgk" (UID: "d3e9d18b-4046-43ab-9b61-efe1207ccaf7") : secret "webhook-server-cert" not found Mar 08 00:49:24.825022 master-0 kubenswrapper[23041]: E0308 00:49:24.822420 23041 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 08 00:49:24.825022 master-0 kubenswrapper[23041]: E0308 00:49:24.822485 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-metrics-certs podName:d3e9d18b-4046-43ab-9b61-efe1207ccaf7 nodeName:}" failed. No retries permitted until 2026-03-08 00:49:25.32246178 +0000 UTC m=+1070.795298334 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-metrics-certs") pod "openstack-operator-controller-manager-85db8c7646-pflgk" (UID: "d3e9d18b-4046-43ab-9b61-efe1207ccaf7") : secret "metrics-server-cert" not found Mar 08 00:49:24.825022 master-0 kubenswrapper[23041]: I0308 00:49:24.824974 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-sqmsb"
Mar 08 00:49:24.863623 master-0 kubenswrapper[23041]: I0308 00:49:24.853561 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7vrq\" (UniqueName: \"kubernetes.io/projected/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-kube-api-access-p7vrq\") pod \"openstack-operator-controller-manager-85db8c7646-pflgk\" (UID: \"d3e9d18b-4046-43ab-9b61-efe1207ccaf7\") " pod="openstack-operators/openstack-operator-controller-manager-85db8c7646-pflgk"
Mar 08 00:49:24.878342 master-0 kubenswrapper[23041]: I0308 00:49:24.869669 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gblzv\" (UniqueName: \"kubernetes.io/projected/8a999920-adc2-499a-a349-4d0405263ee2-kube-api-access-gblzv\") pod \"watcher-operator-controller-manager-bccc79885-zvtv4\" (UID: \"8a999920-adc2-499a-a349-4d0405263ee2\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-zvtv4"
Mar 08 00:49:24.878342 master-0 kubenswrapper[23041]: I0308 00:49:24.869896 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-56p6p"
Mar 08 00:49:24.925189 master-0 kubenswrapper[23041]: I0308 00:49:24.925112 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-nkfd6"
Mar 08 00:49:24.930390 master-0 kubenswrapper[23041]: I0308 00:49:24.930330 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8xck\" (UniqueName: \"kubernetes.io/projected/c365bf07-c22a-4abe-9af1-4fafbfb7659f-kube-api-access-g8xck\") pod \"rabbitmq-cluster-operator-manager-668c99d594-6fnf7\" (UID: \"c365bf07-c22a-4abe-9af1-4fafbfb7659f\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6fnf7"
Mar 08 00:49:24.972568 master-0 kubenswrapper[23041]: I0308 00:49:24.972373 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8xck\" (UniqueName: \"kubernetes.io/projected/c365bf07-c22a-4abe-9af1-4fafbfb7659f-kube-api-access-g8xck\") pod \"rabbitmq-cluster-operator-manager-668c99d594-6fnf7\" (UID: \"c365bf07-c22a-4abe-9af1-4fafbfb7659f\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6fnf7"
Mar 08 00:49:24.980068 master-0 kubenswrapper[23041]: I0308 00:49:24.977679 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-j9sh4"]
Mar 08 00:49:25.053758 master-0 kubenswrapper[23041]: I0308 00:49:25.051828 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-zvtv4"
Mar 08 00:49:25.075125 master-0 kubenswrapper[23041]: I0308 00:49:25.073409 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6fnf7"
Mar 08 00:49:25.155935 master-0 kubenswrapper[23041]: I0308 00:49:25.153888 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/add7710b-ad19-4be8-b7fe-77e7107961d3-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cr22lk\" (UID: \"add7710b-ad19-4be8-b7fe-77e7107961d3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr22lk"
Mar 08 00:49:25.155935 master-0 kubenswrapper[23041]: E0308 00:49:25.154126 23041 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 08 00:49:25.155935 master-0 kubenswrapper[23041]: E0308 00:49:25.154176 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/add7710b-ad19-4be8-b7fe-77e7107961d3-cert podName:add7710b-ad19-4be8-b7fe-77e7107961d3 nodeName:}" failed. No retries permitted until 2026-03-08 00:49:26.154159942 +0000 UTC m=+1071.626996496 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/add7710b-ad19-4be8-b7fe-77e7107961d3-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cr22lk" (UID: "add7710b-ad19-4be8-b7fe-77e7107961d3") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 08 00:49:25.176065 master-0 kubenswrapper[23041]: I0308 00:49:25.175869 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-rrvfw"]
Mar 08 00:49:25.314870 master-0 kubenswrapper[23041]: I0308 00:49:25.314732 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-5hmg6"]
Mar 08 00:49:25.322774 master-0 kubenswrapper[23041]: I0308 00:49:25.322707 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-9tzwz"]
Mar 08 00:49:25.362044 master-0 kubenswrapper[23041]: I0308 00:49:25.361981 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-webhook-certs\") pod \"openstack-operator-controller-manager-85db8c7646-pflgk\" (UID: \"d3e9d18b-4046-43ab-9b61-efe1207ccaf7\") " pod="openstack-operators/openstack-operator-controller-manager-85db8c7646-pflgk"
Mar 08 00:49:25.362044 master-0 kubenswrapper[23041]: I0308 00:49:25.362043 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-metrics-certs\") pod \"openstack-operator-controller-manager-85db8c7646-pflgk\" (UID: \"d3e9d18b-4046-43ab-9b61-efe1207ccaf7\") " pod="openstack-operators/openstack-operator-controller-manager-85db8c7646-pflgk"
Mar 08 00:49:25.362379 master-0 kubenswrapper[23041]: E0308 00:49:25.362274 23041 secret.go:189] Couldn't get secret
openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 08 00:49:25.362379 master-0 kubenswrapper[23041]: E0308 00:49:25.362323 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-metrics-certs podName:d3e9d18b-4046-43ab-9b61-efe1207ccaf7 nodeName:}" failed. No retries permitted until 2026-03-08 00:49:26.362310092 +0000 UTC m=+1071.835146646 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-metrics-certs") pod "openstack-operator-controller-manager-85db8c7646-pflgk" (UID: "d3e9d18b-4046-43ab-9b61-efe1207ccaf7") : secret "metrics-server-cert" not found
Mar 08 00:49:25.363274 master-0 kubenswrapper[23041]: E0308 00:49:25.363223 23041 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 08 00:49:25.363274 master-0 kubenswrapper[23041]: E0308 00:49:25.363265 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-webhook-certs podName:d3e9d18b-4046-43ab-9b61-efe1207ccaf7 nodeName:}" failed. No retries permitted until 2026-03-08 00:49:26.363256774 +0000 UTC m=+1071.836093328 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-webhook-certs") pod "openstack-operator-controller-manager-85db8c7646-pflgk" (UID: "d3e9d18b-4046-43ab-9b61-efe1207ccaf7") : secret "webhook-server-cert" not found
Mar 08 00:49:25.466330 master-0 kubenswrapper[23041]: I0308 00:49:25.464162 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1436364c-fa7f-4858-a6cf-c635fe431773-cert\") pod \"infra-operator-controller-manager-b8c8d7cc8-bhcdj\" (UID: \"1436364c-fa7f-4858-a6cf-c635fe431773\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-bhcdj"
Mar 08 00:49:25.466330 master-0 kubenswrapper[23041]: E0308 00:49:25.464435 23041 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 08 00:49:25.466330 master-0 kubenswrapper[23041]: E0308 00:49:25.464492 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1436364c-fa7f-4858-a6cf-c635fe431773-cert podName:1436364c-fa7f-4858-a6cf-c635fe431773 nodeName:}" failed. No retries permitted until 2026-03-08 00:49:27.464469376 +0000 UTC m=+1072.937305930 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1436364c-fa7f-4858-a6cf-c635fe431773-cert") pod "infra-operator-controller-manager-b8c8d7cc8-bhcdj" (UID: "1436364c-fa7f-4858-a6cf-c635fe431773") : secret "infra-operator-webhook-server-cert" not found
Mar 08 00:49:25.678305 master-0 kubenswrapper[23041]: I0308 00:49:25.674706 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-rrvfw" event={"ID":"084c05b0-52bc-45a5-99b6-cf3a01714e49","Type":"ContainerStarted","Data":"c30ac517afa068c103f95f4140e6bea4c7c38c2d4d2fa4a304aca78a3afc2a78"}
Mar 08 00:49:25.683225 master-0 kubenswrapper[23041]: I0308 00:49:25.681789 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9tzwz" event={"ID":"7a2905e2-df5b-4d3e-811a-f0517a0f7ef8","Type":"ContainerStarted","Data":"583f82f66336a8af71e4953cbf42b2a9ac879106a72b427c0e976f455d0813ec"}
Mar 08 00:49:25.683225 master-0 kubenswrapper[23041]: I0308 00:49:25.683140 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-5hmg6" event={"ID":"814a072f-7dc0-4158-b22f-3cf285afbb10","Type":"ContainerStarted","Data":"556138124a493e691128d578629cec6b1ed55861bb18368938a1a11eab49418e"}
Mar 08 00:49:25.685041 master-0 kubenswrapper[23041]: I0308 00:49:25.684462 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-wq5pz"]
Mar 08 00:49:25.777288 master-0 kubenswrapper[23041]: I0308 00:49:25.774364 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-8wjbv"]
Mar 08 00:49:26.121402 master-0 kubenswrapper[23041]: I0308 00:49:26.121296 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-l4w8t"]
Mar 08 00:49:26.177135 master-0 kubenswrapper[23041]: I0308 00:49:26.171990 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-b6ldp"]
Mar 08 00:49:26.222233 master-0 kubenswrapper[23041]: I0308 00:49:26.216885 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/add7710b-ad19-4be8-b7fe-77e7107961d3-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cr22lk\" (UID: \"add7710b-ad19-4be8-b7fe-77e7107961d3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr22lk"
Mar 08 00:49:26.222233 master-0 kubenswrapper[23041]: E0308 00:49:26.217157 23041 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 08 00:49:26.222233 master-0 kubenswrapper[23041]: E0308 00:49:26.217224 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/add7710b-ad19-4be8-b7fe-77e7107961d3-cert podName:add7710b-ad19-4be8-b7fe-77e7107961d3 nodeName:}" failed. No retries permitted until 2026-03-08 00:49:28.217194488 +0000 UTC m=+1073.690031042 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/add7710b-ad19-4be8-b7fe-77e7107961d3-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cr22lk" (UID: "add7710b-ad19-4be8-b7fe-77e7107961d3") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 08 00:49:26.222602 master-0 kubenswrapper[23041]: I0308 00:49:26.222438 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-jrqk2"]
Mar 08 00:49:26.314228 master-0 kubenswrapper[23041]: I0308 00:49:26.309128 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-xwlf2"]
Mar 08 00:49:26.320705 master-0 kubenswrapper[23041]: I0308 00:49:26.319796 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-686765764-jhdvn"]
Mar 08 00:49:26.422231 master-0 kubenswrapper[23041]: I0308 00:49:26.420280 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-webhook-certs\") pod \"openstack-operator-controller-manager-85db8c7646-pflgk\" (UID: \"d3e9d18b-4046-43ab-9b61-efe1207ccaf7\") " pod="openstack-operators/openstack-operator-controller-manager-85db8c7646-pflgk"
Mar 08 00:49:26.422231 master-0 kubenswrapper[23041]: I0308 00:49:26.420414 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-metrics-certs\") pod \"openstack-operator-controller-manager-85db8c7646-pflgk\" (UID: \"d3e9d18b-4046-43ab-9b61-efe1207ccaf7\") " pod="openstack-operators/openstack-operator-controller-manager-85db8c7646-pflgk"
Mar 08 00:49:26.422231 master-0 kubenswrapper[23041]: E0308 00:49:26.420476 23041 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 08 00:49:26.422231 master-0 kubenswrapper[23041]: E0308 00:49:26.420657 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-webhook-certs podName:d3e9d18b-4046-43ab-9b61-efe1207ccaf7 nodeName:}" failed. No retries permitted until 2026-03-08 00:49:28.420619455 +0000 UTC m=+1073.893456069 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-webhook-certs") pod "openstack-operator-controller-manager-85db8c7646-pflgk" (UID: "d3e9d18b-4046-43ab-9b61-efe1207ccaf7") : secret "webhook-server-cert" not found
Mar 08 00:49:26.422231 master-0 kubenswrapper[23041]: E0308 00:49:26.420707 23041 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 08 00:49:26.422231 master-0 kubenswrapper[23041]: E0308 00:49:26.421088 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-metrics-certs podName:d3e9d18b-4046-43ab-9b61-efe1207ccaf7 nodeName:}" failed. No retries permitted until 2026-03-08 00:49:28.42083777 +0000 UTC m=+1073.893674524 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-metrics-certs") pod "openstack-operator-controller-manager-85db8c7646-pflgk" (UID: "d3e9d18b-4046-43ab-9b61-efe1207ccaf7") : secret "metrics-server-cert" not found
Mar 08 00:49:26.713356 master-0 kubenswrapper[23041]: I0308 00:49:26.712294 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54688575f-b6ldp" event={"ID":"d7ac0595-0fbf-47e2-b64e-cf2e0a17a61b","Type":"ContainerStarted","Data":"f79565a3ce719feb6706bd97a921cae9d37d3b8d940bce331df8dfe18b8ae48c"}
Mar 08 00:49:26.714502 master-0 kubenswrapper[23041]: I0308 00:49:26.714241 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-xwlf2" event={"ID":"493e7565-9522-49f3-aa7d-53f3808c241c","Type":"ContainerStarted","Data":"e982800f842f4af8499af5c7e98fa21a502bd18cfbf9777c6b11694001687295"}
Mar 08 00:49:26.716868 master-0 kubenswrapper[23041]: I0308 00:49:26.716770 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-8wjbv" event={"ID":"bbbe59bf-ba15-4378-bc11-5883dd1cd33a","Type":"ContainerStarted","Data":"12ebe1650f1e2227a77f353113cc1799ce9e79a6a8bd5d763d46c3911b1a8858"}
Mar 08 00:49:26.733955 master-0 kubenswrapper[23041]: I0308 00:49:26.733857 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-686765764-jhdvn" event={"ID":"759c181c-c143-4285-8530-0936f249b97d","Type":"ContainerStarted","Data":"58fee9f19639ef9655388a9c855a686ea91ef938a3f53051d5e35e84b95594da"}
Mar 08 00:49:26.736594 master-0 kubenswrapper[23041]: I0308 00:49:26.736478 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-l4w8t" event={"ID":"a3ec7a88-e9b2-4d0c-be5f-9ee71a9baec9","Type":"ContainerStarted","Data":"22924b2614cc3b4d3d4193ea1d9f120340f97d7393d2b983545cea9c4406e8cf"}
Mar 08 00:49:26.742603 master-0 kubenswrapper[23041]: I0308 00:49:26.742479 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-wq5pz" event={"ID":"aaf0aa97-41c4-488a-b5a7-1b72b777a70c","Type":"ContainerStarted","Data":"fa3f22ceaa84f99aa891d28326b2877dd4310fae120fd103e6eee6f23868dbf1"}
Mar 08 00:49:26.745725 master-0 kubenswrapper[23041]: I0308 00:49:26.745667 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-jrqk2" event={"ID":"70b24627-00b3-4e46-b5a9-630ae001f9d6","Type":"ContainerStarted","Data":"aaf0ede54ccc4cd952a84279b8180c6791196bb104cf1d121c7838b048a3e698"}
Mar 08 00:49:27.056351 master-0 kubenswrapper[23041]: I0308 00:49:27.053433 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-qlhb7"]
Mar 08 00:49:27.088367 master-0 kubenswrapper[23041]: I0308 00:49:27.086060 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-vbkgr"]
Mar 08 00:49:27.126358 master-0 kubenswrapper[23041]: I0308 00:49:27.123573 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-n6mwk"]
Mar 08 00:49:27.136498 master-0 kubenswrapper[23041]: I0308 00:49:27.136380 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-56p6p"]
Mar 08 00:49:27.148480 master-0 kubenswrapper[23041]: I0308 00:49:27.148414 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-sqmsb"]
Mar 08 00:49:27.158107 master-0 kubenswrapper[23041]: I0308
00:49:27.158048 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-xgs7b"]
Mar 08 00:49:28.139505 master-0 kubenswrapper[23041]: I0308 00:49:27.275537 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6fnf7"]
Mar 08 00:49:28.139505 master-0 kubenswrapper[23041]: I0308 00:49:27.276776 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-nkfd6"]
Mar 08 00:49:28.139505 master-0 kubenswrapper[23041]: I0308 00:49:27.287064 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-zvtv4"]
Mar 08 00:49:28.139505 master-0 kubenswrapper[23041]: I0308 00:49:27.554053 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1436364c-fa7f-4858-a6cf-c635fe431773-cert\") pod \"infra-operator-controller-manager-b8c8d7cc8-bhcdj\" (UID: \"1436364c-fa7f-4858-a6cf-c635fe431773\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-bhcdj"
Mar 08 00:49:28.139505 master-0 kubenswrapper[23041]: E0308 00:49:27.554407 23041 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 08 00:49:28.139505 master-0 kubenswrapper[23041]: E0308 00:49:27.554462 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1436364c-fa7f-4858-a6cf-c635fe431773-cert podName:1436364c-fa7f-4858-a6cf-c635fe431773 nodeName:}" failed. No retries permitted until 2026-03-08 00:49:31.554444869 +0000 UTC m=+1077.027281423 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1436364c-fa7f-4858-a6cf-c635fe431773-cert") pod "infra-operator-controller-manager-b8c8d7cc8-bhcdj" (UID: "1436364c-fa7f-4858-a6cf-c635fe431773") : secret "infra-operator-webhook-server-cert" not found
Mar 08 00:49:28.139505 master-0 kubenswrapper[23041]: I0308 00:49:27.754854 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-xgs7b" event={"ID":"3840c315-d22c-4609-8f86-f905cb6895e9","Type":"ContainerStarted","Data":"8af1f874534513939cdefb1414bc07d549a4ac323a56e09b3ef679c7cf2dec98"}
Mar 08 00:49:28.139505 master-0 kubenswrapper[23041]: I0308 00:49:27.755767 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-zvtv4" event={"ID":"8a999920-adc2-499a-a349-4d0405263ee2","Type":"ContainerStarted","Data":"71ff8cca86dcf79d38549c304f4e82e7df6243438cfbd2a954dca19e5519688b"}
Mar 08 00:49:28.139505 master-0 kubenswrapper[23041]: I0308 00:49:27.756936 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-sqmsb" event={"ID":"7abab89c-1b33-4b90-896e-ae546a94572d","Type":"ContainerStarted","Data":"0ab3ac9f4a2fbb867672351f82779c930996a9b96223590730e043ee2cf9f1cf"}
Mar 08 00:49:28.139505 master-0 kubenswrapper[23041]: I0308 00:49:27.757717 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-qlhb7" event={"ID":"11fd0e3e-abd4-4c9a-b48b-b1876062d035","Type":"ContainerStarted","Data":"c729de1292f704772f383925bffafc48c58917feae149c70eb7ecd8dbfab267a"}
Mar 08 00:49:28.139505 master-0 kubenswrapper[23041]: I0308 00:49:27.758385 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-n6mwk" event={"ID":"146c7192-f416-4f39-8e9b-3b14da8545f7","Type":"ContainerStarted","Data":"34a4faf0256c88ccb3d84a1852a65281f0d297d46f76cc778ce6c2fe2f9a017d"}
Mar 08 00:49:28.139505 master-0 kubenswrapper[23041]: I0308 00:49:27.759056 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6fnf7" event={"ID":"c365bf07-c22a-4abe-9af1-4fafbfb7659f","Type":"ContainerStarted","Data":"57f462389a59dab47ba791939f2ff761b96d861d0d1cbb7b5e7abe0d3a95726c"}
Mar 08 00:49:28.139505 master-0 kubenswrapper[23041]: I0308 00:49:27.759733 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-56p6p" event={"ID":"c19bbc8b-95aa-4387-b3ce-7e0730c7f6b7","Type":"ContainerStarted","Data":"0f262feed9a3d2cfeadee8fb4a59685ea16536805166736e91a29fdeaee30158"}
Mar 08 00:49:28.139505 master-0 kubenswrapper[23041]: I0308 00:49:27.760399 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-nkfd6" event={"ID":"72ae432b-fa05-4df8-a7f4-43c7ea765691","Type":"ContainerStarted","Data":"52614c0a4a9fbe87650030049bd9b1f36d20e4077b1bf08f11e6aaeee8c0d7d5"}
Mar 08 00:49:28.139505 master-0 kubenswrapper[23041]: I0308 00:49:27.761136 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-vbkgr" event={"ID":"a75721b7-d12b-4337-a9f2-1c68ed0225b1","Type":"ContainerStarted","Data":"5f0de1855e5f97d730d06b039f0361ee899120bbb00f9cebb5561e1df8efca36"}
Mar 08 00:49:28.275454 master-0 kubenswrapper[23041]: I0308 00:49:28.272229 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/add7710b-ad19-4be8-b7fe-77e7107961d3-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cr22lk\" (UID: \"add7710b-ad19-4be8-b7fe-77e7107961d3\") "
pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr22lk"
Mar 08 00:49:28.275454 master-0 kubenswrapper[23041]: E0308 00:49:28.272576 23041 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 08 00:49:28.275454 master-0 kubenswrapper[23041]: E0308 00:49:28.272665 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/add7710b-ad19-4be8-b7fe-77e7107961d3-cert podName:add7710b-ad19-4be8-b7fe-77e7107961d3 nodeName:}" failed. No retries permitted until 2026-03-08 00:49:32.272634259 +0000 UTC m=+1077.745470813 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/add7710b-ad19-4be8-b7fe-77e7107961d3-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cr22lk" (UID: "add7710b-ad19-4be8-b7fe-77e7107961d3") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 08 00:49:28.493083 master-0 kubenswrapper[23041]: I0308 00:49:28.492912 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-webhook-certs\") pod \"openstack-operator-controller-manager-85db8c7646-pflgk\" (UID: \"d3e9d18b-4046-43ab-9b61-efe1207ccaf7\") " pod="openstack-operators/openstack-operator-controller-manager-85db8c7646-pflgk"
Mar 08 00:49:28.493308 master-0 kubenswrapper[23041]: E0308 00:49:28.493158 23041 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 08 00:49:28.493448 master-0 kubenswrapper[23041]: E0308 00:49:28.493329 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-webhook-certs podName:d3e9d18b-4046-43ab-9b61-efe1207ccaf7 nodeName:}" failed. No retries permitted until 2026-03-08 00:49:32.493296657 +0000 UTC m=+1077.966133201 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-webhook-certs") pod "openstack-operator-controller-manager-85db8c7646-pflgk" (UID: "d3e9d18b-4046-43ab-9b61-efe1207ccaf7") : secret "webhook-server-cert" not found
Mar 08 00:49:28.496232 master-0 kubenswrapper[23041]: I0308 00:49:28.493777 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-metrics-certs\") pod \"openstack-operator-controller-manager-85db8c7646-pflgk\" (UID: \"d3e9d18b-4046-43ab-9b61-efe1207ccaf7\") " pod="openstack-operators/openstack-operator-controller-manager-85db8c7646-pflgk"
Mar 08 00:49:28.496232 master-0 kubenswrapper[23041]: E0308 00:49:28.494581 23041 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 08 00:49:28.496232 master-0 kubenswrapper[23041]: E0308 00:49:28.494626 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-metrics-certs podName:d3e9d18b-4046-43ab-9b61-efe1207ccaf7 nodeName:}" failed. No retries permitted until 2026-03-08 00:49:32.494611148 +0000 UTC m=+1077.967447702 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-metrics-certs") pod "openstack-operator-controller-manager-85db8c7646-pflgk" (UID: "d3e9d18b-4046-43ab-9b61-efe1207ccaf7") : secret "metrics-server-cert" not found
Mar 08 00:49:32.005676 master-0 kubenswrapper[23041]: I0308 00:49:32.004700 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1436364c-fa7f-4858-a6cf-c635fe431773-cert\") pod \"infra-operator-controller-manager-b8c8d7cc8-bhcdj\" (UID: \"1436364c-fa7f-4858-a6cf-c635fe431773\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-bhcdj"
Mar 08 00:49:32.005676 master-0 kubenswrapper[23041]: E0308 00:49:32.005110 23041 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 08 00:49:32.005676 master-0 kubenswrapper[23041]: E0308 00:49:32.005177 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1436364c-fa7f-4858-a6cf-c635fe431773-cert podName:1436364c-fa7f-4858-a6cf-c635fe431773 nodeName:}" failed. No retries permitted until 2026-03-08 00:49:40.005156751 +0000 UTC m=+1085.477993305 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1436364c-fa7f-4858-a6cf-c635fe431773-cert") pod "infra-operator-controller-manager-b8c8d7cc8-bhcdj" (UID: "1436364c-fa7f-4858-a6cf-c635fe431773") : secret "infra-operator-webhook-server-cert" not found Mar 08 00:49:32.320001 master-0 kubenswrapper[23041]: I0308 00:49:32.319363 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/add7710b-ad19-4be8-b7fe-77e7107961d3-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cr22lk\" (UID: \"add7710b-ad19-4be8-b7fe-77e7107961d3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr22lk" Mar 08 00:49:32.320001 master-0 kubenswrapper[23041]: E0308 00:49:32.319566 23041 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 00:49:32.320001 master-0 kubenswrapper[23041]: E0308 00:49:32.319619 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/add7710b-ad19-4be8-b7fe-77e7107961d3-cert podName:add7710b-ad19-4be8-b7fe-77e7107961d3 nodeName:}" failed. No retries permitted until 2026-03-08 00:49:40.319602357 +0000 UTC m=+1085.792438911 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/add7710b-ad19-4be8-b7fe-77e7107961d3-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cr22lk" (UID: "add7710b-ad19-4be8-b7fe-77e7107961d3") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 00:49:32.523453 master-0 kubenswrapper[23041]: I0308 00:49:32.523390 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-webhook-certs\") pod \"openstack-operator-controller-manager-85db8c7646-pflgk\" (UID: \"d3e9d18b-4046-43ab-9b61-efe1207ccaf7\") " pod="openstack-operators/openstack-operator-controller-manager-85db8c7646-pflgk" Mar 08 00:49:32.523453 master-0 kubenswrapper[23041]: I0308 00:49:32.523458 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-metrics-certs\") pod \"openstack-operator-controller-manager-85db8c7646-pflgk\" (UID: \"d3e9d18b-4046-43ab-9b61-efe1207ccaf7\") " pod="openstack-operators/openstack-operator-controller-manager-85db8c7646-pflgk" Mar 08 00:49:32.523742 master-0 kubenswrapper[23041]: E0308 00:49:32.523688 23041 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 08 00:49:32.523742 master-0 kubenswrapper[23041]: E0308 00:49:32.523740 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-webhook-certs podName:d3e9d18b-4046-43ab-9b61-efe1207ccaf7 nodeName:}" failed. No retries permitted until 2026-03-08 00:49:40.52372531 +0000 UTC m=+1085.996561864 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-webhook-certs") pod "openstack-operator-controller-manager-85db8c7646-pflgk" (UID: "d3e9d18b-4046-43ab-9b61-efe1207ccaf7") : secret "webhook-server-cert" not found Mar 08 00:49:32.523856 master-0 kubenswrapper[23041]: E0308 00:49:32.523842 23041 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 08 00:49:32.523893 master-0 kubenswrapper[23041]: E0308 00:49:32.523865 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-metrics-certs podName:d3e9d18b-4046-43ab-9b61-efe1207ccaf7 nodeName:}" failed. No retries permitted until 2026-03-08 00:49:40.523857393 +0000 UTC m=+1085.996693947 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-metrics-certs") pod "openstack-operator-controller-manager-85db8c7646-pflgk" (UID: "d3e9d18b-4046-43ab-9b61-efe1207ccaf7") : secret "metrics-server-cert" not found Mar 08 00:49:40.057606 master-0 kubenswrapper[23041]: I0308 00:49:40.057524 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1436364c-fa7f-4858-a6cf-c635fe431773-cert\") pod \"infra-operator-controller-manager-b8c8d7cc8-bhcdj\" (UID: \"1436364c-fa7f-4858-a6cf-c635fe431773\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-bhcdj" Mar 08 00:49:40.061068 master-0 kubenswrapper[23041]: I0308 00:49:40.061022 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1436364c-fa7f-4858-a6cf-c635fe431773-cert\") pod \"infra-operator-controller-manager-b8c8d7cc8-bhcdj\" (UID: \"1436364c-fa7f-4858-a6cf-c635fe431773\") " 
pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-bhcdj" Mar 08 00:49:40.211307 master-0 kubenswrapper[23041]: I0308 00:49:40.210255 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-bhcdj" Mar 08 00:49:40.364055 master-0 kubenswrapper[23041]: I0308 00:49:40.363933 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/add7710b-ad19-4be8-b7fe-77e7107961d3-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cr22lk\" (UID: \"add7710b-ad19-4be8-b7fe-77e7107961d3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr22lk" Mar 08 00:49:40.364326 master-0 kubenswrapper[23041]: E0308 00:49:40.364264 23041 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 00:49:40.364446 master-0 kubenswrapper[23041]: E0308 00:49:40.364402 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/add7710b-ad19-4be8-b7fe-77e7107961d3-cert podName:add7710b-ad19-4be8-b7fe-77e7107961d3 nodeName:}" failed. No retries permitted until 2026-03-08 00:49:56.364382562 +0000 UTC m=+1101.837219136 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/add7710b-ad19-4be8-b7fe-77e7107961d3-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cr22lk" (UID: "add7710b-ad19-4be8-b7fe-77e7107961d3") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 00:49:40.567756 master-0 kubenswrapper[23041]: I0308 00:49:40.567675 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-webhook-certs\") pod \"openstack-operator-controller-manager-85db8c7646-pflgk\" (UID: \"d3e9d18b-4046-43ab-9b61-efe1207ccaf7\") " pod="openstack-operators/openstack-operator-controller-manager-85db8c7646-pflgk" Mar 08 00:49:40.567756 master-0 kubenswrapper[23041]: I0308 00:49:40.567739 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-metrics-certs\") pod \"openstack-operator-controller-manager-85db8c7646-pflgk\" (UID: \"d3e9d18b-4046-43ab-9b61-efe1207ccaf7\") " pod="openstack-operators/openstack-operator-controller-manager-85db8c7646-pflgk" Mar 08 00:49:40.567980 master-0 kubenswrapper[23041]: E0308 00:49:40.567870 23041 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 08 00:49:40.567980 master-0 kubenswrapper[23041]: E0308 00:49:40.567935 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-webhook-certs podName:d3e9d18b-4046-43ab-9b61-efe1207ccaf7 nodeName:}" failed. No retries permitted until 2026-03-08 00:49:56.567916821 +0000 UTC m=+1102.040753375 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-webhook-certs") pod "openstack-operator-controller-manager-85db8c7646-pflgk" (UID: "d3e9d18b-4046-43ab-9b61-efe1207ccaf7") : secret "webhook-server-cert" not found Mar 08 00:49:40.567980 master-0 kubenswrapper[23041]: E0308 00:49:40.567936 23041 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 08 00:49:40.568092 master-0 kubenswrapper[23041]: E0308 00:49:40.567984 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-metrics-certs podName:d3e9d18b-4046-43ab-9b61-efe1207ccaf7 nodeName:}" failed. No retries permitted until 2026-03-08 00:49:56.567969932 +0000 UTC m=+1102.040806496 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-metrics-certs") pod "openstack-operator-controller-manager-85db8c7646-pflgk" (UID: "d3e9d18b-4046-43ab-9b61-efe1207ccaf7") : secret "metrics-server-cert" not found Mar 08 00:49:53.509753 master-0 kubenswrapper[23041]: I0308 00:49:53.509706 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-b8c8d7cc8-bhcdj"] Mar 08 00:49:54.626142 master-0 kubenswrapper[23041]: I0308 00:49:54.623316 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-zvtv4" event={"ID":"8a999920-adc2-499a-a349-4d0405263ee2","Type":"ContainerStarted","Data":"c780231305043ce0fe06495de4c252af010fa65907d45254963ade3fa9dc934a"} Mar 08 00:49:54.626142 master-0 kubenswrapper[23041]: I0308 00:49:54.624827 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-zvtv4" Mar 08 00:49:54.626763 
master-0 kubenswrapper[23041]: I0308 00:49:54.626371 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-8wjbv" event={"ID":"bbbe59bf-ba15-4378-bc11-5883dd1cd33a","Type":"ContainerStarted","Data":"5ac52b7e0ea2547d08439e303b0818b1e5f559237b266d419270ce628ef82124"} Mar 08 00:49:54.628230 master-0 kubenswrapper[23041]: I0308 00:49:54.626990 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-8wjbv" Mar 08 00:49:54.637896 master-0 kubenswrapper[23041]: I0308 00:49:54.637832 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-j9sh4" event={"ID":"faa673d5-160f-478d-9200-147f13e70e65","Type":"ContainerStarted","Data":"89baf39057b6fe45af4e9f125cd438f7f3bfa5501e30aa4319af6e1248ae37df"} Mar 08 00:49:54.640233 master-0 kubenswrapper[23041]: I0308 00:49:54.638774 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-j9sh4" Mar 08 00:49:54.658241 master-0 kubenswrapper[23041]: I0308 00:49:54.657375 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-nkfd6" event={"ID":"72ae432b-fa05-4df8-a7f4-43c7ea765691","Type":"ContainerStarted","Data":"5a9cf6a6018fd1467f2538a61d3af108e49b2fef672df230428744f072d97052"} Mar 08 00:49:54.658478 master-0 kubenswrapper[23041]: I0308 00:49:54.658301 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-nkfd6" Mar 08 00:49:54.721574 master-0 kubenswrapper[23041]: I0308 00:49:54.702744 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-zvtv4" podStartSLOduration=5.792501044 
podStartE2EDuration="31.702719061s" podCreationTimestamp="2026-03-08 00:49:23 +0000 UTC" firstStartedPulling="2026-03-08 00:49:27.204165943 +0000 UTC m=+1072.677002497" lastFinishedPulling="2026-03-08 00:49:53.11438397 +0000 UTC m=+1098.587220514" observedRunningTime="2026-03-08 00:49:54.689600478 +0000 UTC m=+1100.162437032" watchObservedRunningTime="2026-03-08 00:49:54.702719061 +0000 UTC m=+1100.175555615" Mar 08 00:49:54.733240 master-0 kubenswrapper[23041]: I0308 00:49:54.732496 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-vbkgr" event={"ID":"a75721b7-d12b-4337-a9f2-1c68ed0225b1","Type":"ContainerStarted","Data":"aa7c00fe54ac0419a3afc294734ddfb7fbb71be4b69b5c75786a97af2c12fe65"} Mar 08 00:49:54.745228 master-0 kubenswrapper[23041]: I0308 00:49:54.734179 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-vbkgr" Mar 08 00:49:54.755311 master-0 kubenswrapper[23041]: I0308 00:49:54.755218 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-j9sh4" podStartSLOduration=3.6221065660000002 podStartE2EDuration="31.755171331s" podCreationTimestamp="2026-03-08 00:49:23 +0000 UTC" firstStartedPulling="2026-03-08 00:49:24.441336279 +0000 UTC m=+1069.914172833" lastFinishedPulling="2026-03-08 00:49:52.574401044 +0000 UTC m=+1098.047237598" observedRunningTime="2026-03-08 00:49:54.749039125 +0000 UTC m=+1100.221875679" watchObservedRunningTime="2026-03-08 00:49:54.755171331 +0000 UTC m=+1100.228007885" Mar 08 00:49:54.776255 master-0 kubenswrapper[23041]: I0308 00:49:54.763529 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-686765764-jhdvn" 
event={"ID":"759c181c-c143-4285-8530-0936f249b97d","Type":"ContainerStarted","Data":"3a04ff25f7cbc78a6f65fa04de60404642f14494644c32107f3348ac7fcb6fb5"} Mar 08 00:49:54.776255 master-0 kubenswrapper[23041]: I0308 00:49:54.764620 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-686765764-jhdvn" Mar 08 00:49:54.776255 master-0 kubenswrapper[23041]: I0308 00:49:54.766002 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-wq5pz" event={"ID":"aaf0aa97-41c4-488a-b5a7-1b72b777a70c","Type":"ContainerStarted","Data":"e27ac91c9c9ffe9d0fdddd5456766aa8d4004b13f3644da433003f4a0ae158af"} Mar 08 00:49:54.776255 master-0 kubenswrapper[23041]: I0308 00:49:54.768459 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-wq5pz" Mar 08 00:49:54.802256 master-0 kubenswrapper[23041]: I0308 00:49:54.801908 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-xwlf2" event={"ID":"493e7565-9522-49f3-aa7d-53f3808c241c","Type":"ContainerStarted","Data":"7fddb81dacb09cd088ffbe3db75db31eaed10412e12d0c415f78b7c385c76f78"} Mar 08 00:49:54.802605 master-0 kubenswrapper[23041]: I0308 00:49:54.802539 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-67d996989d-xwlf2" Mar 08 00:49:54.854578 master-0 kubenswrapper[23041]: I0308 00:49:54.853894 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-sqmsb" event={"ID":"7abab89c-1b33-4b90-896e-ae546a94572d","Type":"ContainerStarted","Data":"be5eb9f1520c64564bed8ae078153740c7897151946da662c098235d0f891a68"} Mar 08 00:49:54.854783 master-0 kubenswrapper[23041]: I0308 00:49:54.854675 23041 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-sqmsb" Mar 08 00:49:54.860221 master-0 kubenswrapper[23041]: I0308 00:49:54.857832 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-qlhb7" event={"ID":"11fd0e3e-abd4-4c9a-b48b-b1876062d035","Type":"ContainerStarted","Data":"15e0d9eb3ea9bb1ed03004b73195c6e36ac6ed0586f345bb8809394a8f2ffa31"} Mar 08 00:49:54.860221 master-0 kubenswrapper[23041]: I0308 00:49:54.858399 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-qlhb7" Mar 08 00:49:54.872231 master-0 kubenswrapper[23041]: I0308 00:49:54.871139 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-bhcdj" event={"ID":"1436364c-fa7f-4858-a6cf-c635fe431773","Type":"ContainerStarted","Data":"70c8ddd8c793bdbcfdaa20dfddddb6e3789dea043b91c54b82646efb3ff8f69a"} Mar 08 00:49:54.878297 master-0 kubenswrapper[23041]: I0308 00:49:54.874003 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-8wjbv" podStartSLOduration=4.575699015 podStartE2EDuration="31.873991052s" podCreationTimestamp="2026-03-08 00:49:23 +0000 UTC" firstStartedPulling="2026-03-08 00:49:25.817061366 +0000 UTC m=+1071.289897920" lastFinishedPulling="2026-03-08 00:49:53.115353403 +0000 UTC m=+1098.588189957" observedRunningTime="2026-03-08 00:49:54.870889898 +0000 UTC m=+1100.343726452" watchObservedRunningTime="2026-03-08 00:49:54.873991052 +0000 UTC m=+1100.346827606" Mar 08 00:49:54.896236 master-0 kubenswrapper[23041]: I0308 00:49:54.891481 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-jrqk2" 
event={"ID":"70b24627-00b3-4e46-b5a9-630ae001f9d6","Type":"ContainerStarted","Data":"786780d3d8ef695a88456ddc3435ed8852aba31709b9397c43bce358a99365fb"} Mar 08 00:49:54.896236 master-0 kubenswrapper[23041]: I0308 00:49:54.892299 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-jrqk2" Mar 08 00:49:54.915380 master-0 kubenswrapper[23041]: I0308 00:49:54.915309 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-xgs7b" event={"ID":"3840c315-d22c-4609-8f86-f905cb6895e9","Type":"ContainerStarted","Data":"4218c0d1df499d39bdc7ff3b2c3d6a891f831402a5b5799e89ee0e14a9271a2a"} Mar 08 00:49:54.916503 master-0 kubenswrapper[23041]: I0308 00:49:54.916484 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-xgs7b" Mar 08 00:49:54.927279 master-0 kubenswrapper[23041]: I0308 00:49:54.926806 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-5hmg6" event={"ID":"814a072f-7dc0-4158-b22f-3cf285afbb10","Type":"ContainerStarted","Data":"fdbbe130d7f34b70a382e4e7d5b8d6a4637a14b96f605ed83e261258ca30ba69"} Mar 08 00:49:54.929226 master-0 kubenswrapper[23041]: I0308 00:49:54.927597 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-5hmg6" Mar 08 00:49:54.936231 master-0 kubenswrapper[23041]: I0308 00:49:54.935399 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6fnf7" event={"ID":"c365bf07-c22a-4abe-9af1-4fafbfb7659f","Type":"ContainerStarted","Data":"96d76e6a05baf171962eeca221a0bba8386bdcbf5c10dbd6bc54cd077984935e"} Mar 08 00:49:54.949236 master-0 kubenswrapper[23041]: I0308 00:49:54.947431 
23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-rrvfw" event={"ID":"084c05b0-52bc-45a5-99b6-cf3a01714e49","Type":"ContainerStarted","Data":"a3fc4c51efdcee81d5bfede49c5e489d38ce99a1fe3a7b60805e7eaa4eb9ec67"} Mar 08 00:49:54.949236 master-0 kubenswrapper[23041]: I0308 00:49:54.948080 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-rrvfw" Mar 08 00:49:54.954231 master-0 kubenswrapper[23041]: I0308 00:49:54.954169 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-n6mwk" event={"ID":"146c7192-f416-4f39-8e9b-3b14da8545f7","Type":"ContainerStarted","Data":"5f00081db2343ca1c1f1a788943b4d9bfa470334c1e60eecf23702bd83896735"} Mar 08 00:49:54.959216 master-0 kubenswrapper[23041]: I0308 00:49:54.955933 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-nkfd6" podStartSLOduration=5.99503623 podStartE2EDuration="31.955917654s" podCreationTimestamp="2026-03-08 00:49:23 +0000 UTC" firstStartedPulling="2026-03-08 00:49:27.138645562 +0000 UTC m=+1072.611482116" lastFinishedPulling="2026-03-08 00:49:53.099526986 +0000 UTC m=+1098.572363540" observedRunningTime="2026-03-08 00:49:54.951756424 +0000 UTC m=+1100.424592998" watchObservedRunningTime="2026-03-08 00:49:54.955917654 +0000 UTC m=+1100.428754208" Mar 08 00:49:54.986240 master-0 kubenswrapper[23041]: I0308 00:49:54.982508 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54688575f-b6ldp" event={"ID":"d7ac0595-0fbf-47e2-b64e-cf2e0a17a61b","Type":"ContainerStarted","Data":"04ca4054c8d3b85f5a587d5985f680794fb94b150f759e5ce2b50026c3af2845"} Mar 08 00:49:54.986240 master-0 kubenswrapper[23041]: I0308 00:49:54.983921 23041 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-n6mwk" Mar 08 00:49:55.018653 master-0 kubenswrapper[23041]: I0308 00:49:55.014222 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-56p6p" event={"ID":"c19bbc8b-95aa-4387-b3ce-7e0730c7f6b7","Type":"ContainerStarted","Data":"1472566732dd0773682179ab3f40c0b94b0d0d77893babdf420085f90732b240"} Mar 08 00:49:55.018653 master-0 kubenswrapper[23041]: I0308 00:49:55.014269 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-56p6p" Mar 08 00:49:55.026242 master-0 kubenswrapper[23041]: I0308 00:49:55.025841 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-l4w8t" event={"ID":"a3ec7a88-e9b2-4d0c-be5f-9ee71a9baec9","Type":"ContainerStarted","Data":"1927a500eddcdfb42f4a5c5f68ea0225b11814e72967d15c377a89fe52ba7bc6"} Mar 08 00:49:55.027236 master-0 kubenswrapper[23041]: I0308 00:49:55.026661 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-l4w8t" Mar 08 00:49:55.036079 master-0 kubenswrapper[23041]: I0308 00:49:55.035710 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-jrqk2" podStartSLOduration=5.133947644 podStartE2EDuration="32.035695534s" podCreationTimestamp="2026-03-08 00:49:23 +0000 UTC" firstStartedPulling="2026-03-08 00:49:26.196648989 +0000 UTC m=+1071.669485543" lastFinishedPulling="2026-03-08 00:49:53.098396879 +0000 UTC m=+1098.571233433" observedRunningTime="2026-03-08 00:49:55.035692624 +0000 UTC m=+1100.508529168" watchObservedRunningTime="2026-03-08 00:49:55.035695534 +0000 UTC m=+1100.508532088" Mar 08 
00:49:55.041234 master-0 kubenswrapper[23041]: I0308 00:49:55.040273 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9tzwz" event={"ID":"7a2905e2-df5b-4d3e-811a-f0517a0f7ef8","Type":"ContainerStarted","Data":"efb6358db965a9c1d85cf8b758978240f3457e98ed1146b6d871ec078f1cb6bf"} Mar 08 00:49:55.041234 master-0 kubenswrapper[23041]: I0308 00:49:55.041065 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9tzwz" Mar 08 00:49:55.099247 master-0 kubenswrapper[23041]: I0308 00:49:55.093602 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6fnf7" podStartSLOduration=4.969541742 podStartE2EDuration="31.093573103s" podCreationTimestamp="2026-03-08 00:49:24 +0000 UTC" firstStartedPulling="2026-03-08 00:49:27.139877261 +0000 UTC m=+1072.612713815" lastFinishedPulling="2026-03-08 00:49:53.263908622 +0000 UTC m=+1098.736745176" observedRunningTime="2026-03-08 00:49:55.062214956 +0000 UTC m=+1100.535051510" watchObservedRunningTime="2026-03-08 00:49:55.093573103 +0000 UTC m=+1100.566409657" Mar 08 00:49:55.144293 master-0 kubenswrapper[23041]: I0308 00:49:55.115156 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-sqmsb" podStartSLOduration=6.02022392 podStartE2EDuration="32.115138137s" podCreationTimestamp="2026-03-08 00:49:23 +0000 UTC" firstStartedPulling="2026-03-08 00:49:27.096553019 +0000 UTC m=+1072.569389573" lastFinishedPulling="2026-03-08 00:49:53.191467236 +0000 UTC m=+1098.664303790" observedRunningTime="2026-03-08 00:49:55.114266606 +0000 UTC m=+1100.587103160" watchObservedRunningTime="2026-03-08 00:49:55.115138137 +0000 UTC m=+1100.587974691" Mar 08 00:49:55.425297 master-0 kubenswrapper[23041]: I0308 
00:49:55.424502 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-rrvfw" podStartSLOduration=4.047589793 podStartE2EDuration="32.424478837s" podCreationTimestamp="2026-03-08 00:49:23 +0000 UTC" firstStartedPulling="2026-03-08 00:49:24.703866094 +0000 UTC m=+1070.176702648" lastFinishedPulling="2026-03-08 00:49:53.080755138 +0000 UTC m=+1098.553591692" observedRunningTime="2026-03-08 00:49:55.163561971 +0000 UTC m=+1100.636398525" watchObservedRunningTime="2026-03-08 00:49:55.424478837 +0000 UTC m=+1100.897315391" Mar 08 00:49:55.425607 master-0 kubenswrapper[23041]: I0308 00:49:55.425550 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-wq5pz" podStartSLOduration=5.142178632 podStartE2EDuration="32.425543123s" podCreationTimestamp="2026-03-08 00:49:23 +0000 UTC" firstStartedPulling="2026-03-08 00:49:25.796347873 +0000 UTC m=+1071.269184427" lastFinishedPulling="2026-03-08 00:49:53.079712364 +0000 UTC m=+1098.552548918" observedRunningTime="2026-03-08 00:49:55.421051785 +0000 UTC m=+1100.893888349" watchObservedRunningTime="2026-03-08 00:49:55.425543123 +0000 UTC m=+1100.898379677" Mar 08 00:49:55.477231 master-0 kubenswrapper[23041]: I0308 00:49:55.476794 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-67d996989d-xwlf2" podStartSLOduration=5.48673007 podStartE2EDuration="32.476767943s" podCreationTimestamp="2026-03-08 00:49:23 +0000 UTC" firstStartedPulling="2026-03-08 00:49:26.178966978 +0000 UTC m=+1071.651803532" lastFinishedPulling="2026-03-08 00:49:53.169004861 +0000 UTC m=+1098.641841405" observedRunningTime="2026-03-08 00:49:55.455023735 +0000 UTC m=+1100.927860289" watchObservedRunningTime="2026-03-08 00:49:55.476767943 +0000 UTC m=+1100.949604497" Mar 08 00:49:55.509038 master-0 
kubenswrapper[23041]: I0308 00:49:55.508963 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-5hmg6" podStartSLOduration=4.518126233 podStartE2EDuration="32.50894862s" podCreationTimestamp="2026-03-08 00:49:23 +0000 UTC" firstStartedPulling="2026-03-08 00:49:25.088853626 +0000 UTC m=+1070.561690180" lastFinishedPulling="2026-03-08 00:49:53.079676023 +0000 UTC m=+1098.552512567" observedRunningTime="2026-03-08 00:49:55.507633888 +0000 UTC m=+1100.980470442" watchObservedRunningTime="2026-03-08 00:49:55.50894862 +0000 UTC m=+1100.981785174" Mar 08 00:49:55.545925 master-0 kubenswrapper[23041]: I0308 00:49:55.543008 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-xgs7b" podStartSLOduration=6.596992632 podStartE2EDuration="32.542991551s" podCreationTimestamp="2026-03-08 00:49:23 +0000 UTC" firstStartedPulling="2026-03-08 00:49:27.15241908 +0000 UTC m=+1072.625255634" lastFinishedPulling="2026-03-08 00:49:53.098418009 +0000 UTC m=+1098.571254553" observedRunningTime="2026-03-08 00:49:55.540974503 +0000 UTC m=+1101.013811077" watchObservedRunningTime="2026-03-08 00:49:55.542991551 +0000 UTC m=+1101.015828105" Mar 08 00:49:55.576242 master-0 kubenswrapper[23041]: I0308 00:49:55.575222 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-qlhb7" podStartSLOduration=6.514440665 podStartE2EDuration="32.575188118s" podCreationTimestamp="2026-03-08 00:49:23 +0000 UTC" firstStartedPulling="2026-03-08 00:49:27.060359587 +0000 UTC m=+1072.533196141" lastFinishedPulling="2026-03-08 00:49:53.12110704 +0000 UTC m=+1098.593943594" observedRunningTime="2026-03-08 00:49:55.572996206 +0000 UTC m=+1101.045832760" watchObservedRunningTime="2026-03-08 00:49:55.575188118 +0000 UTC m=+1101.048024672" Mar 08 
00:49:55.622227 master-0 kubenswrapper[23041]: I0308 00:49:55.617187 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-vbkgr" podStartSLOduration=6.522362984 podStartE2EDuration="32.617170208s" podCreationTimestamp="2026-03-08 00:49:23 +0000 UTC" firstStartedPulling="2026-03-08 00:49:27.096753344 +0000 UTC m=+1072.569589898" lastFinishedPulling="2026-03-08 00:49:53.191560568 +0000 UTC m=+1098.664397122" observedRunningTime="2026-03-08 00:49:55.610436398 +0000 UTC m=+1101.083272962" watchObservedRunningTime="2026-03-08 00:49:55.617170208 +0000 UTC m=+1101.090006762" Mar 08 00:49:55.706136 master-0 kubenswrapper[23041]: I0308 00:49:55.702560 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-686765764-jhdvn" podStartSLOduration=5.703617697 podStartE2EDuration="32.702538392s" podCreationTimestamp="2026-03-08 00:49:23 +0000 UTC" firstStartedPulling="2026-03-08 00:49:26.191423724 +0000 UTC m=+1071.664260278" lastFinishedPulling="2026-03-08 00:49:53.190344429 +0000 UTC m=+1098.663180973" observedRunningTime="2026-03-08 00:49:55.671599945 +0000 UTC m=+1101.144436499" watchObservedRunningTime="2026-03-08 00:49:55.702538392 +0000 UTC m=+1101.175374946" Mar 08 00:49:55.735245 master-0 kubenswrapper[23041]: I0308 00:49:55.731751 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-n6mwk" podStartSLOduration=6.645440426 podStartE2EDuration="32.731724507s" podCreationTimestamp="2026-03-08 00:49:23 +0000 UTC" firstStartedPulling="2026-03-08 00:49:27.08272231 +0000 UTC m=+1072.555558864" lastFinishedPulling="2026-03-08 00:49:53.169006391 +0000 UTC m=+1098.641842945" observedRunningTime="2026-03-08 00:49:55.719576698 +0000 UTC m=+1101.192413252" watchObservedRunningTime="2026-03-08 00:49:55.731724507 +0000 UTC 
m=+1101.204561061" Mar 08 00:49:55.763325 master-0 kubenswrapper[23041]: I0308 00:49:55.762679 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-56p6p" podStartSLOduration=6.7199966920000005 podStartE2EDuration="32.762660784s" podCreationTimestamp="2026-03-08 00:49:23 +0000 UTC" firstStartedPulling="2026-03-08 00:49:27.071235836 +0000 UTC m=+1072.544072390" lastFinishedPulling="2026-03-08 00:49:53.113899928 +0000 UTC m=+1098.586736482" observedRunningTime="2026-03-08 00:49:55.751624552 +0000 UTC m=+1101.224461106" watchObservedRunningTime="2026-03-08 00:49:55.762660784 +0000 UTC m=+1101.235497338" Mar 08 00:49:55.791219 master-0 kubenswrapper[23041]: I0308 00:49:55.787033 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-l4w8t" podStartSLOduration=5.836919934 podStartE2EDuration="32.787013955s" podCreationTimestamp="2026-03-08 00:49:23 +0000 UTC" firstStartedPulling="2026-03-08 00:49:26.171662774 +0000 UTC m=+1071.644499328" lastFinishedPulling="2026-03-08 00:49:53.121756795 +0000 UTC m=+1098.594593349" observedRunningTime="2026-03-08 00:49:55.784231358 +0000 UTC m=+1101.257067922" watchObservedRunningTime="2026-03-08 00:49:55.787013955 +0000 UTC m=+1101.259850529" Mar 08 00:49:55.823307 master-0 kubenswrapper[23041]: I0308 00:49:55.813352 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-54688575f-b6ldp" podStartSLOduration=5.853633611 podStartE2EDuration="32.813335312s" podCreationTimestamp="2026-03-08 00:49:23 +0000 UTC" firstStartedPulling="2026-03-08 00:49:26.155182801 +0000 UTC m=+1071.628019355" lastFinishedPulling="2026-03-08 00:49:53.114884502 +0000 UTC m=+1098.587721056" observedRunningTime="2026-03-08 00:49:55.809604093 +0000 UTC m=+1101.282440647" watchObservedRunningTime="2026-03-08 
00:49:55.813335312 +0000 UTC m=+1101.286171866" Mar 08 00:49:55.843218 master-0 kubenswrapper[23041]: I0308 00:49:55.842866 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9tzwz" podStartSLOduration=4.883880458 podStartE2EDuration="32.842842105s" podCreationTimestamp="2026-03-08 00:49:23 +0000 UTC" firstStartedPulling="2026-03-08 00:49:25.120698305 +0000 UTC m=+1070.593534859" lastFinishedPulling="2026-03-08 00:49:53.079659932 +0000 UTC m=+1098.552496506" observedRunningTime="2026-03-08 00:49:55.832099939 +0000 UTC m=+1101.304936503" watchObservedRunningTime="2026-03-08 00:49:55.842842105 +0000 UTC m=+1101.315678659" Mar 08 00:49:56.056103 master-0 kubenswrapper[23041]: I0308 00:49:56.056018 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-54688575f-b6ldp" Mar 08 00:49:56.440284 master-0 kubenswrapper[23041]: I0308 00:49:56.430002 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/add7710b-ad19-4be8-b7fe-77e7107961d3-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cr22lk\" (UID: \"add7710b-ad19-4be8-b7fe-77e7107961d3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr22lk" Mar 08 00:49:56.440284 master-0 kubenswrapper[23041]: I0308 00:49:56.434244 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/add7710b-ad19-4be8-b7fe-77e7107961d3-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cr22lk\" (UID: \"add7710b-ad19-4be8-b7fe-77e7107961d3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr22lk" Mar 08 00:49:56.535418 master-0 kubenswrapper[23041]: I0308 00:49:56.535357 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr22lk" Mar 08 00:49:56.633652 master-0 kubenswrapper[23041]: I0308 00:49:56.633487 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-webhook-certs\") pod \"openstack-operator-controller-manager-85db8c7646-pflgk\" (UID: \"d3e9d18b-4046-43ab-9b61-efe1207ccaf7\") " pod="openstack-operators/openstack-operator-controller-manager-85db8c7646-pflgk" Mar 08 00:49:56.633652 master-0 kubenswrapper[23041]: I0308 00:49:56.633540 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-metrics-certs\") pod \"openstack-operator-controller-manager-85db8c7646-pflgk\" (UID: \"d3e9d18b-4046-43ab-9b61-efe1207ccaf7\") " pod="openstack-operators/openstack-operator-controller-manager-85db8c7646-pflgk" Mar 08 00:49:56.638857 master-0 kubenswrapper[23041]: I0308 00:49:56.638811 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-webhook-certs\") pod \"openstack-operator-controller-manager-85db8c7646-pflgk\" (UID: \"d3e9d18b-4046-43ab-9b61-efe1207ccaf7\") " pod="openstack-operators/openstack-operator-controller-manager-85db8c7646-pflgk" Mar 08 00:49:56.642360 master-0 kubenswrapper[23041]: I0308 00:49:56.642305 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d3e9d18b-4046-43ab-9b61-efe1207ccaf7-metrics-certs\") pod \"openstack-operator-controller-manager-85db8c7646-pflgk\" (UID: \"d3e9d18b-4046-43ab-9b61-efe1207ccaf7\") " pod="openstack-operators/openstack-operator-controller-manager-85db8c7646-pflgk" Mar 08 00:49:56.860221 master-0 kubenswrapper[23041]: I0308 
00:49:56.860171 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-85db8c7646-pflgk" Mar 08 00:49:56.975235 master-0 kubenswrapper[23041]: I0308 00:49:56.974247 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr22lk"] Mar 08 00:49:57.347948 master-0 kubenswrapper[23041]: W0308 00:49:57.347876 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadd7710b_ad19_4be8_b7fe_77e7107961d3.slice/crio-dd90702b581972a461a0e7184fc148c4a99eb3bef33719d9e62299cbb121fe82 WatchSource:0}: Error finding container dd90702b581972a461a0e7184fc148c4a99eb3bef33719d9e62299cbb121fe82: Status 404 returned error can't find the container with id dd90702b581972a461a0e7184fc148c4a99eb3bef33719d9e62299cbb121fe82 Mar 08 00:49:57.835236 master-0 kubenswrapper[23041]: I0308 00:49:57.832022 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-85db8c7646-pflgk"] Mar 08 00:49:57.838753 master-0 kubenswrapper[23041]: W0308 00:49:57.835742 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3e9d18b_4046_43ab_9b61_efe1207ccaf7.slice/crio-1e1080abac355cc17f736749ce5301e7d0611dd4d63ecf3bd4b84d8266a25121 WatchSource:0}: Error finding container 1e1080abac355cc17f736749ce5301e7d0611dd4d63ecf3bd4b84d8266a25121: Status 404 returned error can't find the container with id 1e1080abac355cc17f736749ce5301e7d0611dd4d63ecf3bd4b84d8266a25121 Mar 08 00:49:58.069671 master-0 kubenswrapper[23041]: I0308 00:49:58.069597 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr22lk" 
event={"ID":"add7710b-ad19-4be8-b7fe-77e7107961d3","Type":"ContainerStarted","Data":"dd90702b581972a461a0e7184fc148c4a99eb3bef33719d9e62299cbb121fe82"} Mar 08 00:49:58.073039 master-0 kubenswrapper[23041]: I0308 00:49:58.072229 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-85db8c7646-pflgk" event={"ID":"d3e9d18b-4046-43ab-9b61-efe1207ccaf7","Type":"ContainerStarted","Data":"0b564c6f21fbd5045f1e4e579b330a14d22ef90abb3cdde18bbf15676b2a7b4c"} Mar 08 00:49:58.073039 master-0 kubenswrapper[23041]: I0308 00:49:58.072280 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-85db8c7646-pflgk" event={"ID":"d3e9d18b-4046-43ab-9b61-efe1207ccaf7","Type":"ContainerStarted","Data":"1e1080abac355cc17f736749ce5301e7d0611dd4d63ecf3bd4b84d8266a25121"} Mar 08 00:49:58.073335 master-0 kubenswrapper[23041]: I0308 00:49:58.072972 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-85db8c7646-pflgk" Mar 08 00:49:58.075039 master-0 kubenswrapper[23041]: I0308 00:49:58.073732 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-bhcdj" event={"ID":"1436364c-fa7f-4858-a6cf-c635fe431773","Type":"ContainerStarted","Data":"9d14a25da02c8992ee43099cf5522589c080ace45932f09eccb22a6aa978ccd1"} Mar 08 00:49:58.075039 master-0 kubenswrapper[23041]: I0308 00:49:58.073959 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-bhcdj" Mar 08 00:49:58.110449 master-0 kubenswrapper[23041]: I0308 00:49:58.110261 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-85db8c7646-pflgk" podStartSLOduration=34.110242085 podStartE2EDuration="34.110242085s" 
podCreationTimestamp="2026-03-08 00:49:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:49:58.098908996 +0000 UTC m=+1103.571745570" watchObservedRunningTime="2026-03-08 00:49:58.110242085 +0000 UTC m=+1103.583078639" Mar 08 00:49:58.126057 master-0 kubenswrapper[23041]: I0308 00:49:58.125856 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-bhcdj" podStartSLOduration=31.296230966 podStartE2EDuration="35.125836347s" podCreationTimestamp="2026-03-08 00:49:23 +0000 UTC" firstStartedPulling="2026-03-08 00:49:53.58156507 +0000 UTC m=+1099.054401634" lastFinishedPulling="2026-03-08 00:49:57.411170461 +0000 UTC m=+1102.884007015" observedRunningTime="2026-03-08 00:49:58.120663733 +0000 UTC m=+1103.593500307" watchObservedRunningTime="2026-03-08 00:49:58.125836347 +0000 UTC m=+1103.598672901" Mar 08 00:50:00.093768 master-0 kubenswrapper[23041]: I0308 00:50:00.093715 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr22lk" event={"ID":"add7710b-ad19-4be8-b7fe-77e7107961d3","Type":"ContainerStarted","Data":"55f5e6ecee2b05eea88f3329a09201c7758d6890159d7643505217c6c0390cdc"} Mar 08 00:50:00.094287 master-0 kubenswrapper[23041]: I0308 00:50:00.093883 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr22lk" Mar 08 00:50:00.125927 master-0 kubenswrapper[23041]: I0308 00:50:00.125818 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr22lk" podStartSLOduration=35.126590865 podStartE2EDuration="37.125795606s" podCreationTimestamp="2026-03-08 00:49:23 +0000 UTC" firstStartedPulling="2026-03-08 
00:49:57.350585387 +0000 UTC m=+1102.823421941" lastFinishedPulling="2026-03-08 00:49:59.349790128 +0000 UTC m=+1104.822626682" observedRunningTime="2026-03-08 00:50:00.123164543 +0000 UTC m=+1105.596001117" watchObservedRunningTime="2026-03-08 00:50:00.125795606 +0000 UTC m=+1105.598632170" Mar 08 00:50:03.536582 master-0 kubenswrapper[23041]: I0308 00:50:03.536493 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-j9sh4" Mar 08 00:50:03.539512 master-0 kubenswrapper[23041]: I0308 00:50:03.539464 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-rrvfw" Mar 08 00:50:03.821877 master-0 kubenswrapper[23041]: I0308 00:50:03.821693 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9tzwz" Mar 08 00:50:03.889674 master-0 kubenswrapper[23041]: I0308 00:50:03.889616 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-5hmg6" Mar 08 00:50:03.991099 master-0 kubenswrapper[23041]: I0308 00:50:03.991051 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-8wjbv" Mar 08 00:50:04.102242 master-0 kubenswrapper[23041]: I0308 00:50:04.102082 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-wq5pz" Mar 08 00:50:04.150392 master-0 kubenswrapper[23041]: I0308 00:50:04.150320 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-686765764-jhdvn" Mar 08 00:50:04.183237 master-0 kubenswrapper[23041]: I0308 00:50:04.182286 23041 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-jrqk2" Mar 08 00:50:04.351022 master-0 kubenswrapper[23041]: I0308 00:50:04.350958 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-67d996989d-xwlf2" Mar 08 00:50:04.387671 master-0 kubenswrapper[23041]: I0308 00:50:04.387548 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-l4w8t" Mar 08 00:50:04.437974 master-0 kubenswrapper[23041]: I0308 00:50:04.437479 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-54688575f-b6ldp" Mar 08 00:50:04.605104 master-0 kubenswrapper[23041]: I0308 00:50:04.605062 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-vbkgr" Mar 08 00:50:04.647483 master-0 kubenswrapper[23041]: I0308 00:50:04.647293 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-xgs7b" Mar 08 00:50:04.784154 master-0 kubenswrapper[23041]: I0308 00:50:04.784101 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-qlhb7" Mar 08 00:50:04.822852 master-0 kubenswrapper[23041]: I0308 00:50:04.822779 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-n6mwk" Mar 08 00:50:04.829371 master-0 kubenswrapper[23041]: I0308 00:50:04.828686 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-sqmsb" Mar 08 00:50:04.882276 master-0 
kubenswrapper[23041]: I0308 00:50:04.880197 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-56p6p" Mar 08 00:50:04.929483 master-0 kubenswrapper[23041]: I0308 00:50:04.929363 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-nkfd6" Mar 08 00:50:05.059522 master-0 kubenswrapper[23041]: I0308 00:50:05.059464 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-zvtv4" Mar 08 00:50:06.542467 master-0 kubenswrapper[23041]: I0308 00:50:06.542406 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr22lk" Mar 08 00:50:06.867441 master-0 kubenswrapper[23041]: I0308 00:50:06.867297 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-85db8c7646-pflgk" Mar 08 00:50:10.221691 master-0 kubenswrapper[23041]: I0308 00:50:10.221038 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-bhcdj" Mar 08 00:50:50.380531 master-0 kubenswrapper[23041]: I0308 00:50:50.380435 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55994974c5-pnrh6"] Mar 08 00:50:50.384932 master-0 kubenswrapper[23041]: I0308 00:50:50.384861 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55994974c5-pnrh6" Mar 08 00:50:50.391610 master-0 kubenswrapper[23041]: I0308 00:50:50.389604 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 08 00:50:50.391610 master-0 kubenswrapper[23041]: I0308 00:50:50.389864 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 08 00:50:50.391610 master-0 kubenswrapper[23041]: I0308 00:50:50.389981 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 08 00:50:50.426944 master-0 kubenswrapper[23041]: I0308 00:50:50.426828 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55994974c5-pnrh6"] Mar 08 00:50:50.492656 master-0 kubenswrapper[23041]: I0308 00:50:50.492569 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cd4b2be-f3d5-4eec-8061-2055f3c4b001-config\") pod \"dnsmasq-dns-55994974c5-pnrh6\" (UID: \"3cd4b2be-f3d5-4eec-8061-2055f3c4b001\") " pod="openstack/dnsmasq-dns-55994974c5-pnrh6" Mar 08 00:50:50.492656 master-0 kubenswrapper[23041]: I0308 00:50:50.492647 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dltz\" (UniqueName: \"kubernetes.io/projected/3cd4b2be-f3d5-4eec-8061-2055f3c4b001-kube-api-access-4dltz\") pod \"dnsmasq-dns-55994974c5-pnrh6\" (UID: \"3cd4b2be-f3d5-4eec-8061-2055f3c4b001\") " pod="openstack/dnsmasq-dns-55994974c5-pnrh6" Mar 08 00:50:50.501177 master-0 kubenswrapper[23041]: I0308 00:50:50.500224 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d859fb5df-rqt5m"] Mar 08 00:50:50.506978 master-0 kubenswrapper[23041]: I0308 00:50:50.503763 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d859fb5df-rqt5m" Mar 08 00:50:50.506978 master-0 kubenswrapper[23041]: I0308 00:50:50.506593 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 08 00:50:50.514984 master-0 kubenswrapper[23041]: I0308 00:50:50.514938 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d859fb5df-rqt5m"] Mar 08 00:50:50.594644 master-0 kubenswrapper[23041]: I0308 00:50:50.594560 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cd4b2be-f3d5-4eec-8061-2055f3c4b001-config\") pod \"dnsmasq-dns-55994974c5-pnrh6\" (UID: \"3cd4b2be-f3d5-4eec-8061-2055f3c4b001\") " pod="openstack/dnsmasq-dns-55994974c5-pnrh6" Mar 08 00:50:50.595002 master-0 kubenswrapper[23041]: I0308 00:50:50.594690 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dltz\" (UniqueName: \"kubernetes.io/projected/3cd4b2be-f3d5-4eec-8061-2055f3c4b001-kube-api-access-4dltz\") pod \"dnsmasq-dns-55994974c5-pnrh6\" (UID: \"3cd4b2be-f3d5-4eec-8061-2055f3c4b001\") " pod="openstack/dnsmasq-dns-55994974c5-pnrh6" Mar 08 00:50:50.595002 master-0 kubenswrapper[23041]: I0308 00:50:50.594931 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3ab1df0-49b3-4091-89df-9a708f833de7-config\") pod \"dnsmasq-dns-5d859fb5df-rqt5m\" (UID: \"d3ab1df0-49b3-4091-89df-9a708f833de7\") " pod="openstack/dnsmasq-dns-5d859fb5df-rqt5m" Mar 08 00:50:50.595116 master-0 kubenswrapper[23041]: I0308 00:50:50.595047 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3ab1df0-49b3-4091-89df-9a708f833de7-dns-svc\") pod \"dnsmasq-dns-5d859fb5df-rqt5m\" (UID: \"d3ab1df0-49b3-4091-89df-9a708f833de7\") " 
pod="openstack/dnsmasq-dns-5d859fb5df-rqt5m" Mar 08 00:50:50.595168 master-0 kubenswrapper[23041]: I0308 00:50:50.595110 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wrb5\" (UniqueName: \"kubernetes.io/projected/d3ab1df0-49b3-4091-89df-9a708f833de7-kube-api-access-7wrb5\") pod \"dnsmasq-dns-5d859fb5df-rqt5m\" (UID: \"d3ab1df0-49b3-4091-89df-9a708f833de7\") " pod="openstack/dnsmasq-dns-5d859fb5df-rqt5m" Mar 08 00:50:50.595560 master-0 kubenswrapper[23041]: I0308 00:50:50.595440 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cd4b2be-f3d5-4eec-8061-2055f3c4b001-config\") pod \"dnsmasq-dns-55994974c5-pnrh6\" (UID: \"3cd4b2be-f3d5-4eec-8061-2055f3c4b001\") " pod="openstack/dnsmasq-dns-55994974c5-pnrh6" Mar 08 00:50:50.619417 master-0 kubenswrapper[23041]: I0308 00:50:50.619299 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dltz\" (UniqueName: \"kubernetes.io/projected/3cd4b2be-f3d5-4eec-8061-2055f3c4b001-kube-api-access-4dltz\") pod \"dnsmasq-dns-55994974c5-pnrh6\" (UID: \"3cd4b2be-f3d5-4eec-8061-2055f3c4b001\") " pod="openstack/dnsmasq-dns-55994974c5-pnrh6" Mar 08 00:50:50.697586 master-0 kubenswrapper[23041]: I0308 00:50:50.697441 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3ab1df0-49b3-4091-89df-9a708f833de7-config\") pod \"dnsmasq-dns-5d859fb5df-rqt5m\" (UID: \"d3ab1df0-49b3-4091-89df-9a708f833de7\") " pod="openstack/dnsmasq-dns-5d859fb5df-rqt5m" Mar 08 00:50:50.697586 master-0 kubenswrapper[23041]: I0308 00:50:50.697554 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3ab1df0-49b3-4091-89df-9a708f833de7-dns-svc\") pod \"dnsmasq-dns-5d859fb5df-rqt5m\" (UID: \"d3ab1df0-49b3-4091-89df-9a708f833de7\") " 
pod="openstack/dnsmasq-dns-5d859fb5df-rqt5m" Mar 08 00:50:50.697850 master-0 kubenswrapper[23041]: I0308 00:50:50.697642 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wrb5\" (UniqueName: \"kubernetes.io/projected/d3ab1df0-49b3-4091-89df-9a708f833de7-kube-api-access-7wrb5\") pod \"dnsmasq-dns-5d859fb5df-rqt5m\" (UID: \"d3ab1df0-49b3-4091-89df-9a708f833de7\") " pod="openstack/dnsmasq-dns-5d859fb5df-rqt5m" Mar 08 00:50:50.698622 master-0 kubenswrapper[23041]: I0308 00:50:50.698592 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3ab1df0-49b3-4091-89df-9a708f833de7-dns-svc\") pod \"dnsmasq-dns-5d859fb5df-rqt5m\" (UID: \"d3ab1df0-49b3-4091-89df-9a708f833de7\") " pod="openstack/dnsmasq-dns-5d859fb5df-rqt5m" Mar 08 00:50:50.698946 master-0 kubenswrapper[23041]: I0308 00:50:50.698876 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3ab1df0-49b3-4091-89df-9a708f833de7-config\") pod \"dnsmasq-dns-5d859fb5df-rqt5m\" (UID: \"d3ab1df0-49b3-4091-89df-9a708f833de7\") " pod="openstack/dnsmasq-dns-5d859fb5df-rqt5m" Mar 08 00:50:50.719186 master-0 kubenswrapper[23041]: I0308 00:50:50.719116 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wrb5\" (UniqueName: \"kubernetes.io/projected/d3ab1df0-49b3-4091-89df-9a708f833de7-kube-api-access-7wrb5\") pod \"dnsmasq-dns-5d859fb5df-rqt5m\" (UID: \"d3ab1df0-49b3-4091-89df-9a708f833de7\") " pod="openstack/dnsmasq-dns-5d859fb5df-rqt5m" Mar 08 00:50:50.746913 master-0 kubenswrapper[23041]: I0308 00:50:50.746832 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55994974c5-pnrh6" Mar 08 00:50:50.821260 master-0 kubenswrapper[23041]: I0308 00:50:50.820584 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d859fb5df-rqt5m" Mar 08 00:50:51.224515 master-0 kubenswrapper[23041]: I0308 00:50:51.224438 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55994974c5-pnrh6"] Mar 08 00:50:51.243314 master-0 kubenswrapper[23041]: W0308 00:50:51.243155 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3cd4b2be_f3d5_4eec_8061_2055f3c4b001.slice/crio-739ad4863156a7e2adee37681cc0f51b19625992c73df294cdca14343bb6a83e WatchSource:0}: Error finding container 739ad4863156a7e2adee37681cc0f51b19625992c73df294cdca14343bb6a83e: Status 404 returned error can't find the container with id 739ad4863156a7e2adee37681cc0f51b19625992c73df294cdca14343bb6a83e Mar 08 00:50:51.430172 master-0 kubenswrapper[23041]: I0308 00:50:51.430029 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d859fb5df-rqt5m"] Mar 08 00:50:51.436358 master-0 kubenswrapper[23041]: W0308 00:50:51.431970 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3ab1df0_49b3_4091_89df_9a708f833de7.slice/crio-7c5f15e7bae7b35f01623b67b96ce7d180d8186b87a774d116ea827da6c4a487 WatchSource:0}: Error finding container 7c5f15e7bae7b35f01623b67b96ce7d180d8186b87a774d116ea827da6c4a487: Status 404 returned error can't find the container with id 7c5f15e7bae7b35f01623b67b96ce7d180d8186b87a774d116ea827da6c4a487 Mar 08 00:50:51.669950 master-0 kubenswrapper[23041]: I0308 00:50:51.669880 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55994974c5-pnrh6" event={"ID":"3cd4b2be-f3d5-4eec-8061-2055f3c4b001","Type":"ContainerStarted","Data":"739ad4863156a7e2adee37681cc0f51b19625992c73df294cdca14343bb6a83e"} Mar 08 00:50:51.670834 master-0 kubenswrapper[23041]: I0308 00:50:51.670795 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5d859fb5df-rqt5m" event={"ID":"d3ab1df0-49b3-4091-89df-9a708f833de7","Type":"ContainerStarted","Data":"7c5f15e7bae7b35f01623b67b96ce7d180d8186b87a774d116ea827da6c4a487"} Mar 08 00:50:53.677560 master-0 kubenswrapper[23041]: I0308 00:50:53.677358 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55994974c5-pnrh6"] Mar 08 00:50:53.733881 master-0 kubenswrapper[23041]: I0308 00:50:53.733829 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6779d95cff-xxcrz"] Mar 08 00:50:53.735583 master-0 kubenswrapper[23041]: I0308 00:50:53.735548 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6779d95cff-xxcrz" Mar 08 00:50:53.757923 master-0 kubenswrapper[23041]: I0308 00:50:53.756647 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6779d95cff-xxcrz"] Mar 08 00:50:53.816587 master-0 kubenswrapper[23041]: I0308 00:50:53.816277 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3de682e4-56f1-4c92-870b-795df978e02a-config\") pod \"dnsmasq-dns-6779d95cff-xxcrz\" (UID: \"3de682e4-56f1-4c92-870b-795df978e02a\") " pod="openstack/dnsmasq-dns-6779d95cff-xxcrz" Mar 08 00:50:53.821979 master-0 kubenswrapper[23041]: I0308 00:50:53.820228 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3de682e4-56f1-4c92-870b-795df978e02a-dns-svc\") pod \"dnsmasq-dns-6779d95cff-xxcrz\" (UID: \"3de682e4-56f1-4c92-870b-795df978e02a\") " pod="openstack/dnsmasq-dns-6779d95cff-xxcrz" Mar 08 00:50:53.821979 master-0 kubenswrapper[23041]: I0308 00:50:53.820365 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zg78\" (UniqueName: 
\"kubernetes.io/projected/3de682e4-56f1-4c92-870b-795df978e02a-kube-api-access-8zg78\") pod \"dnsmasq-dns-6779d95cff-xxcrz\" (UID: \"3de682e4-56f1-4c92-870b-795df978e02a\") " pod="openstack/dnsmasq-dns-6779d95cff-xxcrz" Mar 08 00:50:53.922647 master-0 kubenswrapper[23041]: I0308 00:50:53.922580 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3de682e4-56f1-4c92-870b-795df978e02a-dns-svc\") pod \"dnsmasq-dns-6779d95cff-xxcrz\" (UID: \"3de682e4-56f1-4c92-870b-795df978e02a\") " pod="openstack/dnsmasq-dns-6779d95cff-xxcrz" Mar 08 00:50:53.922647 master-0 kubenswrapper[23041]: I0308 00:50:53.922654 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zg78\" (UniqueName: \"kubernetes.io/projected/3de682e4-56f1-4c92-870b-795df978e02a-kube-api-access-8zg78\") pod \"dnsmasq-dns-6779d95cff-xxcrz\" (UID: \"3de682e4-56f1-4c92-870b-795df978e02a\") " pod="openstack/dnsmasq-dns-6779d95cff-xxcrz" Mar 08 00:50:53.922903 master-0 kubenswrapper[23041]: I0308 00:50:53.922703 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3de682e4-56f1-4c92-870b-795df978e02a-config\") pod \"dnsmasq-dns-6779d95cff-xxcrz\" (UID: \"3de682e4-56f1-4c92-870b-795df978e02a\") " pod="openstack/dnsmasq-dns-6779d95cff-xxcrz" Mar 08 00:50:53.924096 master-0 kubenswrapper[23041]: I0308 00:50:53.924062 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3de682e4-56f1-4c92-870b-795df978e02a-config\") pod \"dnsmasq-dns-6779d95cff-xxcrz\" (UID: \"3de682e4-56f1-4c92-870b-795df978e02a\") " pod="openstack/dnsmasq-dns-6779d95cff-xxcrz" Mar 08 00:50:53.926431 master-0 kubenswrapper[23041]: I0308 00:50:53.926381 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/3de682e4-56f1-4c92-870b-795df978e02a-dns-svc\") pod \"dnsmasq-dns-6779d95cff-xxcrz\" (UID: \"3de682e4-56f1-4c92-870b-795df978e02a\") " pod="openstack/dnsmasq-dns-6779d95cff-xxcrz" Mar 08 00:50:53.968418 master-0 kubenswrapper[23041]: I0308 00:50:53.967975 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zg78\" (UniqueName: \"kubernetes.io/projected/3de682e4-56f1-4c92-870b-795df978e02a-kube-api-access-8zg78\") pod \"dnsmasq-dns-6779d95cff-xxcrz\" (UID: \"3de682e4-56f1-4c92-870b-795df978e02a\") " pod="openstack/dnsmasq-dns-6779d95cff-xxcrz" Mar 08 00:50:54.093828 master-0 kubenswrapper[23041]: I0308 00:50:54.093279 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d859fb5df-rqt5m"] Mar 08 00:50:54.110314 master-0 kubenswrapper[23041]: I0308 00:50:54.109971 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f75dd7cd9-k7sq4"] Mar 08 00:50:54.124905 master-0 kubenswrapper[23041]: I0308 00:50:54.123519 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f75dd7cd9-k7sq4" Mar 08 00:50:54.134412 master-0 kubenswrapper[23041]: I0308 00:50:54.134345 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f75dd7cd9-k7sq4"] Mar 08 00:50:54.171875 master-0 kubenswrapper[23041]: I0308 00:50:54.171269 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6779d95cff-xxcrz" Mar 08 00:50:54.239529 master-0 kubenswrapper[23041]: I0308 00:50:54.234575 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4107c15-d018-41f3-ba8d-52fec2bf1397-dns-svc\") pod \"dnsmasq-dns-6f75dd7cd9-k7sq4\" (UID: \"c4107c15-d018-41f3-ba8d-52fec2bf1397\") " pod="openstack/dnsmasq-dns-6f75dd7cd9-k7sq4" Mar 08 00:50:54.239529 master-0 kubenswrapper[23041]: I0308 00:50:54.234883 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4107c15-d018-41f3-ba8d-52fec2bf1397-config\") pod \"dnsmasq-dns-6f75dd7cd9-k7sq4\" (UID: \"c4107c15-d018-41f3-ba8d-52fec2bf1397\") " pod="openstack/dnsmasq-dns-6f75dd7cd9-k7sq4" Mar 08 00:50:54.239529 master-0 kubenswrapper[23041]: I0308 00:50:54.235029 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzbhs\" (UniqueName: \"kubernetes.io/projected/c4107c15-d018-41f3-ba8d-52fec2bf1397-kube-api-access-qzbhs\") pod \"dnsmasq-dns-6f75dd7cd9-k7sq4\" (UID: \"c4107c15-d018-41f3-ba8d-52fec2bf1397\") " pod="openstack/dnsmasq-dns-6f75dd7cd9-k7sq4" Mar 08 00:50:54.340935 master-0 kubenswrapper[23041]: I0308 00:50:54.339945 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4107c15-d018-41f3-ba8d-52fec2bf1397-dns-svc\") pod \"dnsmasq-dns-6f75dd7cd9-k7sq4\" (UID: \"c4107c15-d018-41f3-ba8d-52fec2bf1397\") " pod="openstack/dnsmasq-dns-6f75dd7cd9-k7sq4" Mar 08 00:50:54.340935 master-0 kubenswrapper[23041]: I0308 00:50:54.338106 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4107c15-d018-41f3-ba8d-52fec2bf1397-dns-svc\") pod \"dnsmasq-dns-6f75dd7cd9-k7sq4\" 
(UID: \"c4107c15-d018-41f3-ba8d-52fec2bf1397\") " pod="openstack/dnsmasq-dns-6f75dd7cd9-k7sq4" Mar 08 00:50:54.340935 master-0 kubenswrapper[23041]: I0308 00:50:54.340686 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4107c15-d018-41f3-ba8d-52fec2bf1397-config\") pod \"dnsmasq-dns-6f75dd7cd9-k7sq4\" (UID: \"c4107c15-d018-41f3-ba8d-52fec2bf1397\") " pod="openstack/dnsmasq-dns-6f75dd7cd9-k7sq4" Mar 08 00:50:54.341952 master-0 kubenswrapper[23041]: I0308 00:50:54.341824 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4107c15-d018-41f3-ba8d-52fec2bf1397-config\") pod \"dnsmasq-dns-6f75dd7cd9-k7sq4\" (UID: \"c4107c15-d018-41f3-ba8d-52fec2bf1397\") " pod="openstack/dnsmasq-dns-6f75dd7cd9-k7sq4" Mar 08 00:50:54.342970 master-0 kubenswrapper[23041]: I0308 00:50:54.342557 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzbhs\" (UniqueName: \"kubernetes.io/projected/c4107c15-d018-41f3-ba8d-52fec2bf1397-kube-api-access-qzbhs\") pod \"dnsmasq-dns-6f75dd7cd9-k7sq4\" (UID: \"c4107c15-d018-41f3-ba8d-52fec2bf1397\") " pod="openstack/dnsmasq-dns-6f75dd7cd9-k7sq4" Mar 08 00:50:54.389256 master-0 kubenswrapper[23041]: I0308 00:50:54.388346 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzbhs\" (UniqueName: \"kubernetes.io/projected/c4107c15-d018-41f3-ba8d-52fec2bf1397-kube-api-access-qzbhs\") pod \"dnsmasq-dns-6f75dd7cd9-k7sq4\" (UID: \"c4107c15-d018-41f3-ba8d-52fec2bf1397\") " pod="openstack/dnsmasq-dns-6f75dd7cd9-k7sq4" Mar 08 00:50:54.518557 master-0 kubenswrapper[23041]: I0308 00:50:54.518492 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f75dd7cd9-k7sq4" Mar 08 00:50:54.847228 master-0 kubenswrapper[23041]: I0308 00:50:54.846819 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6779d95cff-xxcrz"] Mar 08 00:50:55.100929 master-0 kubenswrapper[23041]: I0308 00:50:55.100793 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f75dd7cd9-k7sq4"] Mar 08 00:50:55.122899 master-0 kubenswrapper[23041]: W0308 00:50:55.122834 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4107c15_d018_41f3_ba8d_52fec2bf1397.slice/crio-2aa60f68c2cb565899f7b39a893e2a1f663df64e08515cd54b1f23734028b1b4 WatchSource:0}: Error finding container 2aa60f68c2cb565899f7b39a893e2a1f663df64e08515cd54b1f23734028b1b4: Status 404 returned error can't find the container with id 2aa60f68c2cb565899f7b39a893e2a1f663df64e08515cd54b1f23734028b1b4 Mar 08 00:50:55.798747 master-0 kubenswrapper[23041]: I0308 00:50:55.798660 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f75dd7cd9-k7sq4" event={"ID":"c4107c15-d018-41f3-ba8d-52fec2bf1397","Type":"ContainerStarted","Data":"2aa60f68c2cb565899f7b39a893e2a1f663df64e08515cd54b1f23734028b1b4"} Mar 08 00:50:55.829086 master-0 kubenswrapper[23041]: I0308 00:50:55.828591 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6779d95cff-xxcrz" event={"ID":"3de682e4-56f1-4c92-870b-795df978e02a","Type":"ContainerStarted","Data":"758161ac7ced28ce447e5853dce7077d40cbdbb06265d411f049d2f17ae29ee5"} Mar 08 00:50:59.627159 master-0 kubenswrapper[23041]: I0308 00:50:59.627046 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 08 00:50:59.634011 master-0 kubenswrapper[23041]: I0308 00:50:59.633401 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 08 00:50:59.641223 master-0 kubenswrapper[23041]: I0308 00:50:59.640156 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7bd54f8-222a-4144-bfbd-629ba5ad7916-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d7bd54f8-222a-4144-bfbd-629ba5ad7916\") " pod="openstack/memcached-0" Mar 08 00:50:59.641223 master-0 kubenswrapper[23041]: I0308 00:50:59.640231 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d7bd54f8-222a-4144-bfbd-629ba5ad7916-config-data\") pod \"memcached-0\" (UID: \"d7bd54f8-222a-4144-bfbd-629ba5ad7916\") " pod="openstack/memcached-0" Mar 08 00:50:59.641223 master-0 kubenswrapper[23041]: I0308 00:50:59.640317 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfhdf\" (UniqueName: \"kubernetes.io/projected/d7bd54f8-222a-4144-bfbd-629ba5ad7916-kube-api-access-wfhdf\") pod \"memcached-0\" (UID: \"d7bd54f8-222a-4144-bfbd-629ba5ad7916\") " pod="openstack/memcached-0" Mar 08 00:50:59.641223 master-0 kubenswrapper[23041]: I0308 00:50:59.640335 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d7bd54f8-222a-4144-bfbd-629ba5ad7916-kolla-config\") pod \"memcached-0\" (UID: \"d7bd54f8-222a-4144-bfbd-629ba5ad7916\") " pod="openstack/memcached-0" Mar 08 00:50:59.641223 master-0 kubenswrapper[23041]: I0308 00:50:59.640386 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7bd54f8-222a-4144-bfbd-629ba5ad7916-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d7bd54f8-222a-4144-bfbd-629ba5ad7916\") " 
pod="openstack/memcached-0" Mar 08 00:50:59.667884 master-0 kubenswrapper[23041]: I0308 00:50:59.667835 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 08 00:50:59.668090 master-0 kubenswrapper[23041]: I0308 00:50:59.667939 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 08 00:50:59.689440 master-0 kubenswrapper[23041]: I0308 00:50:59.677490 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 08 00:50:59.690716 master-0 kubenswrapper[23041]: I0308 00:50:59.687108 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 08 00:50:59.742610 master-0 kubenswrapper[23041]: I0308 00:50:59.742554 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfhdf\" (UniqueName: \"kubernetes.io/projected/d7bd54f8-222a-4144-bfbd-629ba5ad7916-kube-api-access-wfhdf\") pod \"memcached-0\" (UID: \"d7bd54f8-222a-4144-bfbd-629ba5ad7916\") " pod="openstack/memcached-0" Mar 08 00:50:59.742610 master-0 kubenswrapper[23041]: I0308 00:50:59.742609 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d7bd54f8-222a-4144-bfbd-629ba5ad7916-kolla-config\") pod \"memcached-0\" (UID: \"d7bd54f8-222a-4144-bfbd-629ba5ad7916\") " pod="openstack/memcached-0" Mar 08 00:50:59.742854 master-0 kubenswrapper[23041]: I0308 00:50:59.742666 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7bd54f8-222a-4144-bfbd-629ba5ad7916-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d7bd54f8-222a-4144-bfbd-629ba5ad7916\") " pod="openstack/memcached-0" Mar 08 00:50:59.742854 master-0 kubenswrapper[23041]: I0308 00:50:59.742723 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7bd54f8-222a-4144-bfbd-629ba5ad7916-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d7bd54f8-222a-4144-bfbd-629ba5ad7916\") " pod="openstack/memcached-0" Mar 08 00:50:59.742854 master-0 kubenswrapper[23041]: I0308 00:50:59.742744 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d7bd54f8-222a-4144-bfbd-629ba5ad7916-config-data\") pod \"memcached-0\" (UID: \"d7bd54f8-222a-4144-bfbd-629ba5ad7916\") " pod="openstack/memcached-0" Mar 08 00:50:59.744522 master-0 kubenswrapper[23041]: I0308 00:50:59.744482 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d7bd54f8-222a-4144-bfbd-629ba5ad7916-kolla-config\") pod \"memcached-0\" (UID: \"d7bd54f8-222a-4144-bfbd-629ba5ad7916\") " pod="openstack/memcached-0" Mar 08 00:50:59.745063 master-0 kubenswrapper[23041]: I0308 00:50:59.745037 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d7bd54f8-222a-4144-bfbd-629ba5ad7916-config-data\") pod \"memcached-0\" (UID: \"d7bd54f8-222a-4144-bfbd-629ba5ad7916\") " pod="openstack/memcached-0" Mar 08 00:50:59.751318 master-0 kubenswrapper[23041]: I0308 00:50:59.751041 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7bd54f8-222a-4144-bfbd-629ba5ad7916-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d7bd54f8-222a-4144-bfbd-629ba5ad7916\") " pod="openstack/memcached-0" Mar 08 00:50:59.767074 master-0 kubenswrapper[23041]: I0308 00:50:59.766870 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfhdf\" (UniqueName: \"kubernetes.io/projected/d7bd54f8-222a-4144-bfbd-629ba5ad7916-kube-api-access-wfhdf\") pod \"memcached-0\" (UID: 
\"d7bd54f8-222a-4144-bfbd-629ba5ad7916\") " pod="openstack/memcached-0" Mar 08 00:50:59.790862 master-0 kubenswrapper[23041]: I0308 00:50:59.790813 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7bd54f8-222a-4144-bfbd-629ba5ad7916-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d7bd54f8-222a-4144-bfbd-629ba5ad7916\") " pod="openstack/memcached-0" Mar 08 00:51:00.007721 master-0 kubenswrapper[23041]: I0308 00:51:00.007659 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 08 00:51:01.904229 master-0 kubenswrapper[23041]: I0308 00:51:01.900923 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 08 00:51:01.904229 master-0 kubenswrapper[23041]: I0308 00:51:01.903110 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 08 00:51:01.909211 master-0 kubenswrapper[23041]: I0308 00:51:01.909155 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 08 00:51:01.909607 master-0 kubenswrapper[23041]: I0308 00:51:01.909559 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 08 00:51:01.909988 master-0 kubenswrapper[23041]: I0308 00:51:01.909971 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 08 00:51:01.913230 master-0 kubenswrapper[23041]: I0308 00:51:01.913164 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 08 00:51:01.913899 master-0 kubenswrapper[23041]: I0308 00:51:01.913876 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 08 00:51:01.914020 master-0 kubenswrapper[23041]: I0308 00:51:01.914000 23041 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 08 00:51:01.924233 master-0 kubenswrapper[23041]: I0308 00:51:01.922012 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 08 00:51:02.099812 master-0 kubenswrapper[23041]: I0308 00:51:02.099747 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/973c5591-ef0e-4a00-9107-bf5c09b1782d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"973c5591-ef0e-4a00-9107-bf5c09b1782d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 00:51:02.100050 master-0 kubenswrapper[23041]: I0308 00:51:02.099885 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/973c5591-ef0e-4a00-9107-bf5c09b1782d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"973c5591-ef0e-4a00-9107-bf5c09b1782d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 00:51:02.100050 master-0 kubenswrapper[23041]: I0308 00:51:02.099932 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/973c5591-ef0e-4a00-9107-bf5c09b1782d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"973c5591-ef0e-4a00-9107-bf5c09b1782d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 00:51:02.100050 master-0 kubenswrapper[23041]: I0308 00:51:02.099958 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/973c5591-ef0e-4a00-9107-bf5c09b1782d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"973c5591-ef0e-4a00-9107-bf5c09b1782d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 00:51:02.100050 master-0 
kubenswrapper[23041]: I0308 00:51:02.100005 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/973c5591-ef0e-4a00-9107-bf5c09b1782d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"973c5591-ef0e-4a00-9107-bf5c09b1782d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 00:51:02.100315 master-0 kubenswrapper[23041]: I0308 00:51:02.100086 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq6hh\" (UniqueName: \"kubernetes.io/projected/973c5591-ef0e-4a00-9107-bf5c09b1782d-kube-api-access-vq6hh\") pod \"rabbitmq-cell1-server-0\" (UID: \"973c5591-ef0e-4a00-9107-bf5c09b1782d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 00:51:02.100315 master-0 kubenswrapper[23041]: I0308 00:51:02.100145 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a589decc-6872-4a81-90a7-55085fdbb47d\" (UniqueName: \"kubernetes.io/csi/topolvm.io^7a426619-d4a7-443c-bb92-5b7c4141c59e\") pod \"rabbitmq-cell1-server-0\" (UID: \"973c5591-ef0e-4a00-9107-bf5c09b1782d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 00:51:02.100315 master-0 kubenswrapper[23041]: I0308 00:51:02.100232 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/973c5591-ef0e-4a00-9107-bf5c09b1782d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"973c5591-ef0e-4a00-9107-bf5c09b1782d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 00:51:02.100315 master-0 kubenswrapper[23041]: I0308 00:51:02.100273 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/973c5591-ef0e-4a00-9107-bf5c09b1782d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"973c5591-ef0e-4a00-9107-bf5c09b1782d\") 
" pod="openstack/rabbitmq-cell1-server-0" Mar 08 00:51:02.100315 master-0 kubenswrapper[23041]: I0308 00:51:02.100295 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/973c5591-ef0e-4a00-9107-bf5c09b1782d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"973c5591-ef0e-4a00-9107-bf5c09b1782d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 00:51:02.100582 master-0 kubenswrapper[23041]: I0308 00:51:02.100372 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/973c5591-ef0e-4a00-9107-bf5c09b1782d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"973c5591-ef0e-4a00-9107-bf5c09b1782d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 00:51:02.207306 master-0 kubenswrapper[23041]: I0308 00:51:02.204241 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/973c5591-ef0e-4a00-9107-bf5c09b1782d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"973c5591-ef0e-4a00-9107-bf5c09b1782d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 00:51:02.207306 master-0 kubenswrapper[23041]: I0308 00:51:02.204410 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/973c5591-ef0e-4a00-9107-bf5c09b1782d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"973c5591-ef0e-4a00-9107-bf5c09b1782d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 00:51:02.207306 master-0 kubenswrapper[23041]: I0308 00:51:02.204442 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/973c5591-ef0e-4a00-9107-bf5c09b1782d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"973c5591-ef0e-4a00-9107-bf5c09b1782d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 00:51:02.207306 master-0 kubenswrapper[23041]: I0308 00:51:02.204471 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/973c5591-ef0e-4a00-9107-bf5c09b1782d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"973c5591-ef0e-4a00-9107-bf5c09b1782d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 00:51:02.207306 master-0 kubenswrapper[23041]: I0308 00:51:02.204541 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/973c5591-ef0e-4a00-9107-bf5c09b1782d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"973c5591-ef0e-4a00-9107-bf5c09b1782d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 00:51:02.207306 master-0 kubenswrapper[23041]: I0308 00:51:02.204612 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq6hh\" (UniqueName: \"kubernetes.io/projected/973c5591-ef0e-4a00-9107-bf5c09b1782d-kube-api-access-vq6hh\") pod \"rabbitmq-cell1-server-0\" (UID: \"973c5591-ef0e-4a00-9107-bf5c09b1782d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 00:51:02.207306 master-0 kubenswrapper[23041]: I0308 00:51:02.204644 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a589decc-6872-4a81-90a7-55085fdbb47d\" (UniqueName: \"kubernetes.io/csi/topolvm.io^7a426619-d4a7-443c-bb92-5b7c4141c59e\") pod \"rabbitmq-cell1-server-0\" (UID: \"973c5591-ef0e-4a00-9107-bf5c09b1782d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 00:51:02.207306 master-0 kubenswrapper[23041]: I0308 00:51:02.204702 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/973c5591-ef0e-4a00-9107-bf5c09b1782d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"973c5591-ef0e-4a00-9107-bf5c09b1782d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 00:51:02.207306 master-0 kubenswrapper[23041]: I0308 00:51:02.204735 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/973c5591-ef0e-4a00-9107-bf5c09b1782d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"973c5591-ef0e-4a00-9107-bf5c09b1782d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 00:51:02.207306 master-0 kubenswrapper[23041]: I0308 00:51:02.204771 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/973c5591-ef0e-4a00-9107-bf5c09b1782d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"973c5591-ef0e-4a00-9107-bf5c09b1782d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 00:51:02.207306 master-0 kubenswrapper[23041]: I0308 00:51:02.204830 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/973c5591-ef0e-4a00-9107-bf5c09b1782d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"973c5591-ef0e-4a00-9107-bf5c09b1782d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 00:51:02.207306 master-0 kubenswrapper[23041]: I0308 00:51:02.205529 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/973c5591-ef0e-4a00-9107-bf5c09b1782d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"973c5591-ef0e-4a00-9107-bf5c09b1782d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 00:51:02.207306 master-0 kubenswrapper[23041]: I0308 00:51:02.206417 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/973c5591-ef0e-4a00-9107-bf5c09b1782d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"973c5591-ef0e-4a00-9107-bf5c09b1782d\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 08 00:51:02.209875 master-0 kubenswrapper[23041]: I0308 00:51:02.208327 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/973c5591-ef0e-4a00-9107-bf5c09b1782d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"973c5591-ef0e-4a00-9107-bf5c09b1782d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 00:51:02.209875 master-0 kubenswrapper[23041]: I0308 00:51:02.208678 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/973c5591-ef0e-4a00-9107-bf5c09b1782d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"973c5591-ef0e-4a00-9107-bf5c09b1782d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 00:51:02.209875 master-0 kubenswrapper[23041]: I0308 00:51:02.209497 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/973c5591-ef0e-4a00-9107-bf5c09b1782d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"973c5591-ef0e-4a00-9107-bf5c09b1782d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 00:51:02.212334 master-0 kubenswrapper[23041]: I0308 00:51:02.211310 23041 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 08 00:51:02.212334 master-0 kubenswrapper[23041]: I0308 00:51:02.211371 23041 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a589decc-6872-4a81-90a7-55085fdbb47d\" (UniqueName: \"kubernetes.io/csi/topolvm.io^7a426619-d4a7-443c-bb92-5b7c4141c59e\") pod \"rabbitmq-cell1-server-0\" (UID: \"973c5591-ef0e-4a00-9107-bf5c09b1782d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/0752ebba389c623eb3f343d9be2782016044342dae3a475df79faa869d336788/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 08 00:51:02.215808 master-0 kubenswrapper[23041]: I0308 00:51:02.215342 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/973c5591-ef0e-4a00-9107-bf5c09b1782d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"973c5591-ef0e-4a00-9107-bf5c09b1782d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 00:51:02.217727 master-0 kubenswrapper[23041]: I0308 00:51:02.217685 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/973c5591-ef0e-4a00-9107-bf5c09b1782d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"973c5591-ef0e-4a00-9107-bf5c09b1782d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 00:51:02.230229 master-0 kubenswrapper[23041]: I0308 00:51:02.230160 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/973c5591-ef0e-4a00-9107-bf5c09b1782d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"973c5591-ef0e-4a00-9107-bf5c09b1782d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 00:51:02.230229 master-0 kubenswrapper[23041]: I0308 00:51:02.230190 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/973c5591-ef0e-4a00-9107-bf5c09b1782d-erlang-cookie-secret\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"973c5591-ef0e-4a00-9107-bf5c09b1782d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 00:51:02.233896 master-0 kubenswrapper[23041]: I0308 00:51:02.233864 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq6hh\" (UniqueName: \"kubernetes.io/projected/973c5591-ef0e-4a00-9107-bf5c09b1782d-kube-api-access-vq6hh\") pod \"rabbitmq-cell1-server-0\" (UID: \"973c5591-ef0e-4a00-9107-bf5c09b1782d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 00:51:02.310072 master-0 kubenswrapper[23041]: I0308 00:51:02.310001 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 08 00:51:02.323891 master-0 kubenswrapper[23041]: I0308 00:51:02.323837 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 08 00:51:02.337323 master-0 kubenswrapper[23041]: I0308 00:51:02.330440 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 08 00:51:02.337323 master-0 kubenswrapper[23041]: I0308 00:51:02.331346 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 08 00:51:02.337323 master-0 kubenswrapper[23041]: I0308 00:51:02.331688 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 08 00:51:02.337323 master-0 kubenswrapper[23041]: I0308 00:51:02.331763 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 08 00:51:02.337323 master-0 kubenswrapper[23041]: I0308 00:51:02.331781 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 08 00:51:02.337323 master-0 kubenswrapper[23041]: I0308 00:51:02.331868 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 08 00:51:02.345416 master-0 
kubenswrapper[23041]: I0308 00:51:02.345072 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 08 00:51:02.509601 master-0 kubenswrapper[23041]: I0308 00:51:02.509536 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e610ec98-66ae-412c-bab9-fab6413ef654-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e610ec98-66ae-412c-bab9-fab6413ef654\") " pod="openstack/rabbitmq-server-0" Mar 08 00:51:02.509799 master-0 kubenswrapper[23041]: I0308 00:51:02.509671 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e610ec98-66ae-412c-bab9-fab6413ef654-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e610ec98-66ae-412c-bab9-fab6413ef654\") " pod="openstack/rabbitmq-server-0" Mar 08 00:51:02.509799 master-0 kubenswrapper[23041]: I0308 00:51:02.509730 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e610ec98-66ae-412c-bab9-fab6413ef654-config-data\") pod \"rabbitmq-server-0\" (UID: \"e610ec98-66ae-412c-bab9-fab6413ef654\") " pod="openstack/rabbitmq-server-0" Mar 08 00:51:02.509799 master-0 kubenswrapper[23041]: I0308 00:51:02.509763 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e610ec98-66ae-412c-bab9-fab6413ef654-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e610ec98-66ae-412c-bab9-fab6413ef654\") " pod="openstack/rabbitmq-server-0" Mar 08 00:51:02.509799 master-0 kubenswrapper[23041]: I0308 00:51:02.509788 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-93c6cdb6-c88b-4ed7-afb8-83aa261a592c\" (UniqueName: 
\"kubernetes.io/csi/topolvm.io^889141d8-6951-4d8f-bb89-f1d43ea6a619\") pod \"rabbitmq-server-0\" (UID: \"e610ec98-66ae-412c-bab9-fab6413ef654\") " pod="openstack/rabbitmq-server-0" Mar 08 00:51:02.509936 master-0 kubenswrapper[23041]: I0308 00:51:02.509848 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5gkr\" (UniqueName: \"kubernetes.io/projected/e610ec98-66ae-412c-bab9-fab6413ef654-kube-api-access-s5gkr\") pod \"rabbitmq-server-0\" (UID: \"e610ec98-66ae-412c-bab9-fab6413ef654\") " pod="openstack/rabbitmq-server-0" Mar 08 00:51:02.510902 master-0 kubenswrapper[23041]: I0308 00:51:02.510839 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e610ec98-66ae-412c-bab9-fab6413ef654-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e610ec98-66ae-412c-bab9-fab6413ef654\") " pod="openstack/rabbitmq-server-0" Mar 08 00:51:02.510966 master-0 kubenswrapper[23041]: I0308 00:51:02.510905 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e610ec98-66ae-412c-bab9-fab6413ef654-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e610ec98-66ae-412c-bab9-fab6413ef654\") " pod="openstack/rabbitmq-server-0" Mar 08 00:51:02.510966 master-0 kubenswrapper[23041]: I0308 00:51:02.510955 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e610ec98-66ae-412c-bab9-fab6413ef654-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e610ec98-66ae-412c-bab9-fab6413ef654\") " pod="openstack/rabbitmq-server-0" Mar 08 00:51:02.511079 master-0 kubenswrapper[23041]: I0308 00:51:02.511031 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" 
(UniqueName: \"kubernetes.io/empty-dir/e610ec98-66ae-412c-bab9-fab6413ef654-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e610ec98-66ae-412c-bab9-fab6413ef654\") " pod="openstack/rabbitmq-server-0" Mar 08 00:51:02.511119 master-0 kubenswrapper[23041]: I0308 00:51:02.511107 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e610ec98-66ae-412c-bab9-fab6413ef654-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e610ec98-66ae-412c-bab9-fab6413ef654\") " pod="openstack/rabbitmq-server-0" Mar 08 00:51:02.615837 master-0 kubenswrapper[23041]: I0308 00:51:02.615790 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e610ec98-66ae-412c-bab9-fab6413ef654-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e610ec98-66ae-412c-bab9-fab6413ef654\") " pod="openstack/rabbitmq-server-0" Mar 08 00:51:02.616686 master-0 kubenswrapper[23041]: I0308 00:51:02.616648 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e610ec98-66ae-412c-bab9-fab6413ef654-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e610ec98-66ae-412c-bab9-fab6413ef654\") " pod="openstack/rabbitmq-server-0" Mar 08 00:51:02.616857 master-0 kubenswrapper[23041]: I0308 00:51:02.616838 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e610ec98-66ae-412c-bab9-fab6413ef654-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e610ec98-66ae-412c-bab9-fab6413ef654\") " pod="openstack/rabbitmq-server-0" Mar 08 00:51:02.617049 master-0 kubenswrapper[23041]: I0308 00:51:02.617032 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/e610ec98-66ae-412c-bab9-fab6413ef654-config-data\") pod \"rabbitmq-server-0\" (UID: \"e610ec98-66ae-412c-bab9-fab6413ef654\") " pod="openstack/rabbitmq-server-0" Mar 08 00:51:02.617572 master-0 kubenswrapper[23041]: I0308 00:51:02.617360 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e610ec98-66ae-412c-bab9-fab6413ef654-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e610ec98-66ae-412c-bab9-fab6413ef654\") " pod="openstack/rabbitmq-server-0" Mar 08 00:51:02.617751 master-0 kubenswrapper[23041]: I0308 00:51:02.617735 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e610ec98-66ae-412c-bab9-fab6413ef654-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e610ec98-66ae-412c-bab9-fab6413ef654\") " pod="openstack/rabbitmq-server-0" Mar 08 00:51:02.617865 master-0 kubenswrapper[23041]: I0308 00:51:02.617848 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-93c6cdb6-c88b-4ed7-afb8-83aa261a592c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^889141d8-6951-4d8f-bb89-f1d43ea6a619\") pod \"rabbitmq-server-0\" (UID: \"e610ec98-66ae-412c-bab9-fab6413ef654\") " pod="openstack/rabbitmq-server-0" Mar 08 00:51:02.618013 master-0 kubenswrapper[23041]: I0308 00:51:02.617996 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5gkr\" (UniqueName: \"kubernetes.io/projected/e610ec98-66ae-412c-bab9-fab6413ef654-kube-api-access-s5gkr\") pod \"rabbitmq-server-0\" (UID: \"e610ec98-66ae-412c-bab9-fab6413ef654\") " pod="openstack/rabbitmq-server-0" Mar 08 00:51:02.618133 master-0 kubenswrapper[23041]: I0308 00:51:02.618119 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/e610ec98-66ae-412c-bab9-fab6413ef654-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e610ec98-66ae-412c-bab9-fab6413ef654\") " pod="openstack/rabbitmq-server-0" Mar 08 00:51:02.620309 master-0 kubenswrapper[23041]: I0308 00:51:02.618428 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e610ec98-66ae-412c-bab9-fab6413ef654-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e610ec98-66ae-412c-bab9-fab6413ef654\") " pod="openstack/rabbitmq-server-0" Mar 08 00:51:02.620309 master-0 kubenswrapper[23041]: I0308 00:51:02.618522 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e610ec98-66ae-412c-bab9-fab6413ef654-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e610ec98-66ae-412c-bab9-fab6413ef654\") " pod="openstack/rabbitmq-server-0" Mar 08 00:51:02.620309 master-0 kubenswrapper[23041]: I0308 00:51:02.618618 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e610ec98-66ae-412c-bab9-fab6413ef654-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e610ec98-66ae-412c-bab9-fab6413ef654\") " pod="openstack/rabbitmq-server-0" Mar 08 00:51:02.620309 master-0 kubenswrapper[23041]: I0308 00:51:02.619026 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e610ec98-66ae-412c-bab9-fab6413ef654-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e610ec98-66ae-412c-bab9-fab6413ef654\") " pod="openstack/rabbitmq-server-0" Mar 08 00:51:02.620309 master-0 kubenswrapper[23041]: I0308 00:51:02.619073 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e610ec98-66ae-412c-bab9-fab6413ef654-server-conf\") pod \"rabbitmq-server-0\" 
(UID: \"e610ec98-66ae-412c-bab9-fab6413ef654\") " pod="openstack/rabbitmq-server-0" Mar 08 00:51:02.620309 master-0 kubenswrapper[23041]: I0308 00:51:02.619407 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e610ec98-66ae-412c-bab9-fab6413ef654-config-data\") pod \"rabbitmq-server-0\" (UID: \"e610ec98-66ae-412c-bab9-fab6413ef654\") " pod="openstack/rabbitmq-server-0" Mar 08 00:51:02.620309 master-0 kubenswrapper[23041]: I0308 00:51:02.619830 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e610ec98-66ae-412c-bab9-fab6413ef654-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e610ec98-66ae-412c-bab9-fab6413ef654\") " pod="openstack/rabbitmq-server-0" Mar 08 00:51:02.620996 master-0 kubenswrapper[23041]: I0308 00:51:02.620375 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e610ec98-66ae-412c-bab9-fab6413ef654-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e610ec98-66ae-412c-bab9-fab6413ef654\") " pod="openstack/rabbitmq-server-0" Mar 08 00:51:02.620996 master-0 kubenswrapper[23041]: I0308 00:51:02.620646 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e610ec98-66ae-412c-bab9-fab6413ef654-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e610ec98-66ae-412c-bab9-fab6413ef654\") " pod="openstack/rabbitmq-server-0" Mar 08 00:51:02.624595 master-0 kubenswrapper[23041]: I0308 00:51:02.623869 23041 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 08 00:51:02.624595 master-0 kubenswrapper[23041]: I0308 00:51:02.623912 23041 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-93c6cdb6-c88b-4ed7-afb8-83aa261a592c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^889141d8-6951-4d8f-bb89-f1d43ea6a619\") pod \"rabbitmq-server-0\" (UID: \"e610ec98-66ae-412c-bab9-fab6413ef654\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/22eeb9e8ed367fabf9059203c071753d86d2eebdf1b9a30cb753cd8ce3b17f33/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 08 00:51:02.624595 master-0 kubenswrapper[23041]: I0308 00:51:02.624351 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e610ec98-66ae-412c-bab9-fab6413ef654-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e610ec98-66ae-412c-bab9-fab6413ef654\") " pod="openstack/rabbitmq-server-0" Mar 08 00:51:02.638435 master-0 kubenswrapper[23041]: I0308 00:51:02.638114 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5gkr\" (UniqueName: \"kubernetes.io/projected/e610ec98-66ae-412c-bab9-fab6413ef654-kube-api-access-s5gkr\") pod \"rabbitmq-server-0\" (UID: \"e610ec98-66ae-412c-bab9-fab6413ef654\") " pod="openstack/rabbitmq-server-0" Mar 08 00:51:02.644821 master-0 kubenswrapper[23041]: I0308 00:51:02.644772 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e610ec98-66ae-412c-bab9-fab6413ef654-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e610ec98-66ae-412c-bab9-fab6413ef654\") " pod="openstack/rabbitmq-server-0" Mar 08 00:51:03.866026 master-0 kubenswrapper[23041]: I0308 00:51:03.864720 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-8nzj6"] Mar 08 00:51:03.866681 master-0 kubenswrapper[23041]: I0308 00:51:03.866115 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-8nzj6" Mar 08 00:51:03.873026 master-0 kubenswrapper[23041]: I0308 00:51:03.870722 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 08 00:51:03.873026 master-0 kubenswrapper[23041]: I0308 00:51:03.871074 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 08 00:51:03.906610 master-0 kubenswrapper[23041]: I0308 00:51:03.906550 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-2cgqz"] Mar 08 00:51:03.911101 master-0 kubenswrapper[23041]: I0308 00:51:03.908567 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-2cgqz" Mar 08 00:51:03.929664 master-0 kubenswrapper[23041]: I0308 00:51:03.919430 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-8nzj6"] Mar 08 00:51:03.974052 master-0 kubenswrapper[23041]: I0308 00:51:03.973997 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4aeedb5e-326b-4294-a7be-9569b908b49c-ovn-controller-tls-certs\") pod \"ovn-controller-8nzj6\" (UID: \"4aeedb5e-326b-4294-a7be-9569b908b49c\") " pod="openstack/ovn-controller-8nzj6" Mar 08 00:51:03.974253 master-0 kubenswrapper[23041]: I0308 00:51:03.974128 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4aeedb5e-326b-4294-a7be-9569b908b49c-scripts\") pod \"ovn-controller-8nzj6\" (UID: \"4aeedb5e-326b-4294-a7be-9569b908b49c\") " pod="openstack/ovn-controller-8nzj6" Mar 08 00:51:03.974472 master-0 kubenswrapper[23041]: I0308 00:51:03.974441 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4aeedb5e-326b-4294-a7be-9569b908b49c-combined-ca-bundle\") pod \"ovn-controller-8nzj6\" (UID: \"4aeedb5e-326b-4294-a7be-9569b908b49c\") " pod="openstack/ovn-controller-8nzj6" Mar 08 00:51:03.974630 master-0 kubenswrapper[23041]: I0308 00:51:03.974609 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4aeedb5e-326b-4294-a7be-9569b908b49c-var-run-ovn\") pod \"ovn-controller-8nzj6\" (UID: \"4aeedb5e-326b-4294-a7be-9569b908b49c\") " pod="openstack/ovn-controller-8nzj6" Mar 08 00:51:03.974705 master-0 kubenswrapper[23041]: I0308 00:51:03.974641 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4aeedb5e-326b-4294-a7be-9569b908b49c-var-run\") pod \"ovn-controller-8nzj6\" (UID: \"4aeedb5e-326b-4294-a7be-9569b908b49c\") " pod="openstack/ovn-controller-8nzj6" Mar 08 00:51:03.980100 master-0 kubenswrapper[23041]: I0308 00:51:03.974750 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4aeedb5e-326b-4294-a7be-9569b908b49c-var-log-ovn\") pod \"ovn-controller-8nzj6\" (UID: \"4aeedb5e-326b-4294-a7be-9569b908b49c\") " pod="openstack/ovn-controller-8nzj6" Mar 08 00:51:03.980100 master-0 kubenswrapper[23041]: I0308 00:51:03.974785 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8wnw\" (UniqueName: \"kubernetes.io/projected/4aeedb5e-326b-4294-a7be-9569b908b49c-kube-api-access-g8wnw\") pod \"ovn-controller-8nzj6\" (UID: \"4aeedb5e-326b-4294-a7be-9569b908b49c\") " pod="openstack/ovn-controller-8nzj6" Mar 08 00:51:04.026471 master-0 kubenswrapper[23041]: I0308 00:51:04.026396 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-2cgqz"] Mar 08 
00:51:04.034330 master-0 kubenswrapper[23041]: I0308 00:51:04.034245 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a589decc-6872-4a81-90a7-55085fdbb47d\" (UniqueName: \"kubernetes.io/csi/topolvm.io^7a426619-d4a7-443c-bb92-5b7c4141c59e\") pod \"rabbitmq-cell1-server-0\" (UID: \"973c5591-ef0e-4a00-9107-bf5c09b1782d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 00:51:04.059292 master-0 kubenswrapper[23041]: I0308 00:51:04.052239 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 08 00:51:04.096233 master-0 kubenswrapper[23041]: I0308 00:51:04.077429 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/465cdd3c-22eb-48eb-9d84-be612c4d7f7d-var-run\") pod \"ovn-controller-ovs-2cgqz\" (UID: \"465cdd3c-22eb-48eb-9d84-be612c4d7f7d\") " pod="openstack/ovn-controller-ovs-2cgqz" Mar 08 00:51:04.096233 master-0 kubenswrapper[23041]: I0308 00:51:04.077503 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4aeedb5e-326b-4294-a7be-9569b908b49c-var-run-ovn\") pod \"ovn-controller-8nzj6\" (UID: \"4aeedb5e-326b-4294-a7be-9569b908b49c\") " pod="openstack/ovn-controller-8nzj6" Mar 08 00:51:04.096233 master-0 kubenswrapper[23041]: I0308 00:51:04.077526 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4aeedb5e-326b-4294-a7be-9569b908b49c-var-run\") pod \"ovn-controller-8nzj6\" (UID: \"4aeedb5e-326b-4294-a7be-9569b908b49c\") " pod="openstack/ovn-controller-8nzj6" Mar 08 00:51:04.096233 master-0 kubenswrapper[23041]: I0308 00:51:04.077561 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/4aeedb5e-326b-4294-a7be-9569b908b49c-var-log-ovn\") pod \"ovn-controller-8nzj6\" (UID: \"4aeedb5e-326b-4294-a7be-9569b908b49c\") " pod="openstack/ovn-controller-8nzj6" Mar 08 00:51:04.096233 master-0 kubenswrapper[23041]: I0308 00:51:04.077581 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/465cdd3c-22eb-48eb-9d84-be612c4d7f7d-var-log\") pod \"ovn-controller-ovs-2cgqz\" (UID: \"465cdd3c-22eb-48eb-9d84-be612c4d7f7d\") " pod="openstack/ovn-controller-ovs-2cgqz" Mar 08 00:51:04.096233 master-0 kubenswrapper[23041]: I0308 00:51:04.077598 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8wnw\" (UniqueName: \"kubernetes.io/projected/4aeedb5e-326b-4294-a7be-9569b908b49c-kube-api-access-g8wnw\") pod \"ovn-controller-8nzj6\" (UID: \"4aeedb5e-326b-4294-a7be-9569b908b49c\") " pod="openstack/ovn-controller-8nzj6" Mar 08 00:51:04.096233 master-0 kubenswrapper[23041]: I0308 00:51:04.077656 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/465cdd3c-22eb-48eb-9d84-be612c4d7f7d-scripts\") pod \"ovn-controller-ovs-2cgqz\" (UID: \"465cdd3c-22eb-48eb-9d84-be612c4d7f7d\") " pod="openstack/ovn-controller-ovs-2cgqz" Mar 08 00:51:04.096233 master-0 kubenswrapper[23041]: I0308 00:51:04.077692 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4aeedb5e-326b-4294-a7be-9569b908b49c-ovn-controller-tls-certs\") pod \"ovn-controller-8nzj6\" (UID: \"4aeedb5e-326b-4294-a7be-9569b908b49c\") " pod="openstack/ovn-controller-8nzj6" Mar 08 00:51:04.096233 master-0 kubenswrapper[23041]: I0308 00:51:04.077709 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" 
(UniqueName: \"kubernetes.io/host-path/465cdd3c-22eb-48eb-9d84-be612c4d7f7d-var-lib\") pod \"ovn-controller-ovs-2cgqz\" (UID: \"465cdd3c-22eb-48eb-9d84-be612c4d7f7d\") " pod="openstack/ovn-controller-ovs-2cgqz" Mar 08 00:51:04.096233 master-0 kubenswrapper[23041]: I0308 00:51:04.077735 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4aeedb5e-326b-4294-a7be-9569b908b49c-scripts\") pod \"ovn-controller-8nzj6\" (UID: \"4aeedb5e-326b-4294-a7be-9569b908b49c\") " pod="openstack/ovn-controller-8nzj6" Mar 08 00:51:04.096233 master-0 kubenswrapper[23041]: I0308 00:51:04.077776 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4aeedb5e-326b-4294-a7be-9569b908b49c-combined-ca-bundle\") pod \"ovn-controller-8nzj6\" (UID: \"4aeedb5e-326b-4294-a7be-9569b908b49c\") " pod="openstack/ovn-controller-8nzj6" Mar 08 00:51:04.096233 master-0 kubenswrapper[23041]: I0308 00:51:04.077801 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/465cdd3c-22eb-48eb-9d84-be612c4d7f7d-etc-ovs\") pod \"ovn-controller-ovs-2cgqz\" (UID: \"465cdd3c-22eb-48eb-9d84-be612c4d7f7d\") " pod="openstack/ovn-controller-ovs-2cgqz" Mar 08 00:51:04.096233 master-0 kubenswrapper[23041]: I0308 00:51:04.077818 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm5qd\" (UniqueName: \"kubernetes.io/projected/465cdd3c-22eb-48eb-9d84-be612c4d7f7d-kube-api-access-lm5qd\") pod \"ovn-controller-ovs-2cgqz\" (UID: \"465cdd3c-22eb-48eb-9d84-be612c4d7f7d\") " pod="openstack/ovn-controller-ovs-2cgqz" Mar 08 00:51:04.096233 master-0 kubenswrapper[23041]: I0308 00:51:04.078480 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/4aeedb5e-326b-4294-a7be-9569b908b49c-var-run-ovn\") pod \"ovn-controller-8nzj6\" (UID: \"4aeedb5e-326b-4294-a7be-9569b908b49c\") " pod="openstack/ovn-controller-8nzj6" Mar 08 00:51:04.096233 master-0 kubenswrapper[23041]: I0308 00:51:04.078566 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4aeedb5e-326b-4294-a7be-9569b908b49c-var-run\") pod \"ovn-controller-8nzj6\" (UID: \"4aeedb5e-326b-4294-a7be-9569b908b49c\") " pod="openstack/ovn-controller-8nzj6" Mar 08 00:51:04.096233 master-0 kubenswrapper[23041]: I0308 00:51:04.078732 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4aeedb5e-326b-4294-a7be-9569b908b49c-var-log-ovn\") pod \"ovn-controller-8nzj6\" (UID: \"4aeedb5e-326b-4294-a7be-9569b908b49c\") " pod="openstack/ovn-controller-8nzj6" Mar 08 00:51:04.096233 master-0 kubenswrapper[23041]: I0308 00:51:04.084680 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4aeedb5e-326b-4294-a7be-9569b908b49c-scripts\") pod \"ovn-controller-8nzj6\" (UID: \"4aeedb5e-326b-4294-a7be-9569b908b49c\") " pod="openstack/ovn-controller-8nzj6" Mar 08 00:51:04.096233 master-0 kubenswrapper[23041]: I0308 00:51:04.067891 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 08 00:51:04.096233 master-0 kubenswrapper[23041]: I0308 00:51:04.086274 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4aeedb5e-326b-4294-a7be-9569b908b49c-ovn-controller-tls-certs\") pod \"ovn-controller-8nzj6\" (UID: \"4aeedb5e-326b-4294-a7be-9569b908b49c\") " pod="openstack/ovn-controller-8nzj6" Mar 08 00:51:04.122314 master-0 kubenswrapper[23041]: I0308 00:51:04.101309 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 08 00:51:04.122314 master-0 kubenswrapper[23041]: I0308 00:51:04.106193 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 08 00:51:04.122314 master-0 kubenswrapper[23041]: I0308 00:51:04.106938 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 08 00:51:04.122314 master-0 kubenswrapper[23041]: I0308 00:51:04.110307 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 08 00:51:04.122314 master-0 kubenswrapper[23041]: I0308 00:51:04.118189 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4aeedb5e-326b-4294-a7be-9569b908b49c-combined-ca-bundle\") pod \"ovn-controller-8nzj6\" (UID: \"4aeedb5e-326b-4294-a7be-9569b908b49c\") " pod="openstack/ovn-controller-8nzj6" Mar 08 00:51:04.122314 master-0 kubenswrapper[23041]: I0308 00:51:04.118436 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 08 00:51:04.155227 master-0 kubenswrapper[23041]: I0308 00:51:04.146808 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8wnw\" (UniqueName: \"kubernetes.io/projected/4aeedb5e-326b-4294-a7be-9569b908b49c-kube-api-access-g8wnw\") pod \"ovn-controller-8nzj6\" (UID: \"4aeedb5e-326b-4294-a7be-9569b908b49c\") " pod="openstack/ovn-controller-8nzj6" Mar 08 00:51:04.197227 master-0 kubenswrapper[23041]: I0308 00:51:04.185398 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/465cdd3c-22eb-48eb-9d84-be612c4d7f7d-etc-ovs\") pod \"ovn-controller-ovs-2cgqz\" (UID: \"465cdd3c-22eb-48eb-9d84-be612c4d7f7d\") " pod="openstack/ovn-controller-ovs-2cgqz" Mar 08 00:51:04.197227 master-0 kubenswrapper[23041]: I0308 
00:51:04.185476 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lm5qd\" (UniqueName: \"kubernetes.io/projected/465cdd3c-22eb-48eb-9d84-be612c4d7f7d-kube-api-access-lm5qd\") pod \"ovn-controller-ovs-2cgqz\" (UID: \"465cdd3c-22eb-48eb-9d84-be612c4d7f7d\") " pod="openstack/ovn-controller-ovs-2cgqz" Mar 08 00:51:04.197227 master-0 kubenswrapper[23041]: I0308 00:51:04.185551 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/465cdd3c-22eb-48eb-9d84-be612c4d7f7d-var-run\") pod \"ovn-controller-ovs-2cgqz\" (UID: \"465cdd3c-22eb-48eb-9d84-be612c4d7f7d\") " pod="openstack/ovn-controller-ovs-2cgqz" Mar 08 00:51:04.197227 master-0 kubenswrapper[23041]: I0308 00:51:04.185612 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/465cdd3c-22eb-48eb-9d84-be612c4d7f7d-var-log\") pod \"ovn-controller-ovs-2cgqz\" (UID: \"465cdd3c-22eb-48eb-9d84-be612c4d7f7d\") " pod="openstack/ovn-controller-ovs-2cgqz" Mar 08 00:51:04.197227 master-0 kubenswrapper[23041]: I0308 00:51:04.185679 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/465cdd3c-22eb-48eb-9d84-be612c4d7f7d-scripts\") pod \"ovn-controller-ovs-2cgqz\" (UID: \"465cdd3c-22eb-48eb-9d84-be612c4d7f7d\") " pod="openstack/ovn-controller-ovs-2cgqz" Mar 08 00:51:04.197227 master-0 kubenswrapper[23041]: I0308 00:51:04.185729 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/465cdd3c-22eb-48eb-9d84-be612c4d7f7d-var-lib\") pod \"ovn-controller-ovs-2cgqz\" (UID: \"465cdd3c-22eb-48eb-9d84-be612c4d7f7d\") " pod="openstack/ovn-controller-ovs-2cgqz" Mar 08 00:51:04.197227 master-0 kubenswrapper[23041]: I0308 00:51:04.186058 23041 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/465cdd3c-22eb-48eb-9d84-be612c4d7f7d-var-lib\") pod \"ovn-controller-ovs-2cgqz\" (UID: \"465cdd3c-22eb-48eb-9d84-be612c4d7f7d\") " pod="openstack/ovn-controller-ovs-2cgqz" Mar 08 00:51:04.197227 master-0 kubenswrapper[23041]: I0308 00:51:04.186222 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/465cdd3c-22eb-48eb-9d84-be612c4d7f7d-etc-ovs\") pod \"ovn-controller-ovs-2cgqz\" (UID: \"465cdd3c-22eb-48eb-9d84-be612c4d7f7d\") " pod="openstack/ovn-controller-ovs-2cgqz" Mar 08 00:51:04.197227 master-0 kubenswrapper[23041]: I0308 00:51:04.186329 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/465cdd3c-22eb-48eb-9d84-be612c4d7f7d-var-log\") pod \"ovn-controller-ovs-2cgqz\" (UID: \"465cdd3c-22eb-48eb-9d84-be612c4d7f7d\") " pod="openstack/ovn-controller-ovs-2cgqz" Mar 08 00:51:04.197227 master-0 kubenswrapper[23041]: I0308 00:51:04.186381 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/465cdd3c-22eb-48eb-9d84-be612c4d7f7d-var-run\") pod \"ovn-controller-ovs-2cgqz\" (UID: \"465cdd3c-22eb-48eb-9d84-be612c4d7f7d\") " pod="openstack/ovn-controller-ovs-2cgqz" Mar 08 00:51:04.197227 master-0 kubenswrapper[23041]: I0308 00:51:04.195903 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-8nzj6" Mar 08 00:51:04.201451 master-0 kubenswrapper[23041]: I0308 00:51:04.201310 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/465cdd3c-22eb-48eb-9d84-be612c4d7f7d-scripts\") pod \"ovn-controller-ovs-2cgqz\" (UID: \"465cdd3c-22eb-48eb-9d84-be612c4d7f7d\") " pod="openstack/ovn-controller-ovs-2cgqz" Mar 08 00:51:04.267300 master-0 kubenswrapper[23041]: I0308 00:51:04.263105 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm5qd\" (UniqueName: \"kubernetes.io/projected/465cdd3c-22eb-48eb-9d84-be612c4d7f7d-kube-api-access-lm5qd\") pod \"ovn-controller-ovs-2cgqz\" (UID: \"465cdd3c-22eb-48eb-9d84-be612c4d7f7d\") " pod="openstack/ovn-controller-ovs-2cgqz" Mar 08 00:51:04.293253 master-0 kubenswrapper[23041]: I0308 00:51:04.291313 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2a4deb4-466b-499d-a4c5-227ae1726fc9-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b2a4deb4-466b-499d-a4c5-227ae1726fc9\") " pod="openstack/openstack-galera-0" Mar 08 00:51:04.293253 master-0 kubenswrapper[23041]: I0308 00:51:04.291397 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkpjh\" (UniqueName: \"kubernetes.io/projected/b2a4deb4-466b-499d-a4c5-227ae1726fc9-kube-api-access-tkpjh\") pod \"openstack-galera-0\" (UID: \"b2a4deb4-466b-499d-a4c5-227ae1726fc9\") " pod="openstack/openstack-galera-0" Mar 08 00:51:04.293253 master-0 kubenswrapper[23041]: I0308 00:51:04.291416 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b2a4deb4-466b-499d-a4c5-227ae1726fc9-config-data-generated\") pod \"openstack-galera-0\" (UID: 
\"b2a4deb4-466b-499d-a4c5-227ae1726fc9\") " pod="openstack/openstack-galera-0" Mar 08 00:51:04.293253 master-0 kubenswrapper[23041]: I0308 00:51:04.291440 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2a4deb4-466b-499d-a4c5-227ae1726fc9-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b2a4deb4-466b-499d-a4c5-227ae1726fc9\") " pod="openstack/openstack-galera-0" Mar 08 00:51:04.293253 master-0 kubenswrapper[23041]: I0308 00:51:04.291459 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b2a4deb4-466b-499d-a4c5-227ae1726fc9-kolla-config\") pod \"openstack-galera-0\" (UID: \"b2a4deb4-466b-499d-a4c5-227ae1726fc9\") " pod="openstack/openstack-galera-0" Mar 08 00:51:04.293253 master-0 kubenswrapper[23041]: I0308 00:51:04.291474 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2a4deb4-466b-499d-a4c5-227ae1726fc9-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b2a4deb4-466b-499d-a4c5-227ae1726fc9\") " pod="openstack/openstack-galera-0" Mar 08 00:51:04.293253 master-0 kubenswrapper[23041]: I0308 00:51:04.291502 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ca0b5327-3d55-4f6b-9b59-09ca4533e9e1\" (UniqueName: \"kubernetes.io/csi/topolvm.io^bf982391-390b-4c43-836d-76ed7a53130b\") pod \"openstack-galera-0\" (UID: \"b2a4deb4-466b-499d-a4c5-227ae1726fc9\") " pod="openstack/openstack-galera-0" Mar 08 00:51:04.293253 master-0 kubenswrapper[23041]: I0308 00:51:04.291558 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/b2a4deb4-466b-499d-a4c5-227ae1726fc9-config-data-default\") pod \"openstack-galera-0\" (UID: \"b2a4deb4-466b-499d-a4c5-227ae1726fc9\") " pod="openstack/openstack-galera-0" Mar 08 00:51:04.293253 master-0 kubenswrapper[23041]: I0308 00:51:04.291902 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-2cgqz" Mar 08 00:51:04.392802 master-0 kubenswrapper[23041]: I0308 00:51:04.392645 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2a4deb4-466b-499d-a4c5-227ae1726fc9-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b2a4deb4-466b-499d-a4c5-227ae1726fc9\") " pod="openstack/openstack-galera-0" Mar 08 00:51:04.392802 master-0 kubenswrapper[23041]: I0308 00:51:04.392728 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkpjh\" (UniqueName: \"kubernetes.io/projected/b2a4deb4-466b-499d-a4c5-227ae1726fc9-kube-api-access-tkpjh\") pod \"openstack-galera-0\" (UID: \"b2a4deb4-466b-499d-a4c5-227ae1726fc9\") " pod="openstack/openstack-galera-0" Mar 08 00:51:04.392802 master-0 kubenswrapper[23041]: I0308 00:51:04.392749 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b2a4deb4-466b-499d-a4c5-227ae1726fc9-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b2a4deb4-466b-499d-a4c5-227ae1726fc9\") " pod="openstack/openstack-galera-0" Mar 08 00:51:04.392802 master-0 kubenswrapper[23041]: I0308 00:51:04.392771 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2a4deb4-466b-499d-a4c5-227ae1726fc9-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b2a4deb4-466b-499d-a4c5-227ae1726fc9\") " pod="openstack/openstack-galera-0" Mar 08 00:51:04.392802 master-0 
kubenswrapper[23041]: I0308 00:51:04.392790 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b2a4deb4-466b-499d-a4c5-227ae1726fc9-kolla-config\") pod \"openstack-galera-0\" (UID: \"b2a4deb4-466b-499d-a4c5-227ae1726fc9\") " pod="openstack/openstack-galera-0" Mar 08 00:51:04.392802 master-0 kubenswrapper[23041]: I0308 00:51:04.392807 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2a4deb4-466b-499d-a4c5-227ae1726fc9-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b2a4deb4-466b-499d-a4c5-227ae1726fc9\") " pod="openstack/openstack-galera-0" Mar 08 00:51:04.393282 master-0 kubenswrapper[23041]: I0308 00:51:04.392843 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ca0b5327-3d55-4f6b-9b59-09ca4533e9e1\" (UniqueName: \"kubernetes.io/csi/topolvm.io^bf982391-390b-4c43-836d-76ed7a53130b\") pod \"openstack-galera-0\" (UID: \"b2a4deb4-466b-499d-a4c5-227ae1726fc9\") " pod="openstack/openstack-galera-0" Mar 08 00:51:04.393282 master-0 kubenswrapper[23041]: I0308 00:51:04.392906 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b2a4deb4-466b-499d-a4c5-227ae1726fc9-config-data-default\") pod \"openstack-galera-0\" (UID: \"b2a4deb4-466b-499d-a4c5-227ae1726fc9\") " pod="openstack/openstack-galera-0" Mar 08 00:51:04.394603 master-0 kubenswrapper[23041]: I0308 00:51:04.393795 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b2a4deb4-466b-499d-a4c5-227ae1726fc9-config-data-default\") pod \"openstack-galera-0\" (UID: \"b2a4deb4-466b-499d-a4c5-227ae1726fc9\") " pod="openstack/openstack-galera-0" Mar 08 00:51:04.396655 master-0 kubenswrapper[23041]: I0308 00:51:04.396590 23041 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2a4deb4-466b-499d-a4c5-227ae1726fc9-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b2a4deb4-466b-499d-a4c5-227ae1726fc9\") " pod="openstack/openstack-galera-0" Mar 08 00:51:04.397083 master-0 kubenswrapper[23041]: I0308 00:51:04.397044 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b2a4deb4-466b-499d-a4c5-227ae1726fc9-kolla-config\") pod \"openstack-galera-0\" (UID: \"b2a4deb4-466b-499d-a4c5-227ae1726fc9\") " pod="openstack/openstack-galera-0" Mar 08 00:51:04.399167 master-0 kubenswrapper[23041]: I0308 00:51:04.399065 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2a4deb4-466b-499d-a4c5-227ae1726fc9-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b2a4deb4-466b-499d-a4c5-227ae1726fc9\") " pod="openstack/openstack-galera-0" Mar 08 00:51:04.399722 master-0 kubenswrapper[23041]: I0308 00:51:04.399657 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2a4deb4-466b-499d-a4c5-227ae1726fc9-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b2a4deb4-466b-499d-a4c5-227ae1726fc9\") " pod="openstack/openstack-galera-0" Mar 08 00:51:04.399859 master-0 kubenswrapper[23041]: I0308 00:51:04.399812 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b2a4deb4-466b-499d-a4c5-227ae1726fc9-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b2a4deb4-466b-499d-a4c5-227ae1726fc9\") " pod="openstack/openstack-galera-0" Mar 08 00:51:04.402874 master-0 kubenswrapper[23041]: I0308 00:51:04.401952 23041 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. 
Skipping MountDevice... Mar 08 00:51:04.402874 master-0 kubenswrapper[23041]: I0308 00:51:04.401985 23041 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ca0b5327-3d55-4f6b-9b59-09ca4533e9e1\" (UniqueName: \"kubernetes.io/csi/topolvm.io^bf982391-390b-4c43-836d-76ed7a53130b\") pod \"openstack-galera-0\" (UID: \"b2a4deb4-466b-499d-a4c5-227ae1726fc9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/a59d7a712efd556479f28c18bb473096730f4d6083b9a8fe8f984dc32c7a1e73/globalmount\"" pod="openstack/openstack-galera-0" Mar 08 00:51:04.416940 master-0 kubenswrapper[23041]: I0308 00:51:04.416883 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkpjh\" (UniqueName: \"kubernetes.io/projected/b2a4deb4-466b-499d-a4c5-227ae1726fc9-kube-api-access-tkpjh\") pod \"openstack-galera-0\" (UID: \"b2a4deb4-466b-499d-a4c5-227ae1726fc9\") " pod="openstack/openstack-galera-0" Mar 08 00:51:04.945194 master-0 kubenswrapper[23041]: I0308 00:51:04.943747 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 08 00:51:04.951314 master-0 kubenswrapper[23041]: I0308 00:51:04.951083 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 08 00:51:04.955762 master-0 kubenswrapper[23041]: I0308 00:51:04.954825 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 08 00:51:04.955762 master-0 kubenswrapper[23041]: I0308 00:51:04.955218 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 08 00:51:04.955762 master-0 kubenswrapper[23041]: I0308 00:51:04.955320 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 08 00:51:04.965456 master-0 kubenswrapper[23041]: I0308 00:51:04.965410 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 08 00:51:05.108503 master-0 kubenswrapper[23041]: I0308 00:51:05.108360 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d63ce8bf-b1c8-44a8-92f2-298f046dc138-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"d63ce8bf-b1c8-44a8-92f2-298f046dc138\") " pod="openstack/openstack-cell1-galera-0" Mar 08 00:51:05.108503 master-0 kubenswrapper[23041]: I0308 00:51:05.108425 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d63ce8bf-b1c8-44a8-92f2-298f046dc138-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"d63ce8bf-b1c8-44a8-92f2-298f046dc138\") " pod="openstack/openstack-cell1-galera-0" Mar 08 00:51:05.108503 master-0 kubenswrapper[23041]: I0308 00:51:05.108446 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvnnm\" (UniqueName: \"kubernetes.io/projected/d63ce8bf-b1c8-44a8-92f2-298f046dc138-kube-api-access-gvnnm\") pod \"openstack-cell1-galera-0\" (UID: 
\"d63ce8bf-b1c8-44a8-92f2-298f046dc138\") " pod="openstack/openstack-cell1-galera-0" Mar 08 00:51:05.108890 master-0 kubenswrapper[23041]: I0308 00:51:05.108815 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d63ce8bf-b1c8-44a8-92f2-298f046dc138-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"d63ce8bf-b1c8-44a8-92f2-298f046dc138\") " pod="openstack/openstack-cell1-galera-0" Mar 08 00:51:05.109177 master-0 kubenswrapper[23041]: I0308 00:51:05.109056 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d63ce8bf-b1c8-44a8-92f2-298f046dc138-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"d63ce8bf-b1c8-44a8-92f2-298f046dc138\") " pod="openstack/openstack-cell1-galera-0" Mar 08 00:51:05.109313 master-0 kubenswrapper[23041]: I0308 00:51:05.109277 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e4151f20-922d-4c86-a308-b1eab7265e73\" (UniqueName: \"kubernetes.io/csi/topolvm.io^ff2be1c1-091e-42ea-b889-0d0399ae45a9\") pod \"openstack-cell1-galera-0\" (UID: \"d63ce8bf-b1c8-44a8-92f2-298f046dc138\") " pod="openstack/openstack-cell1-galera-0" Mar 08 00:51:05.109362 master-0 kubenswrapper[23041]: I0308 00:51:05.109330 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d63ce8bf-b1c8-44a8-92f2-298f046dc138-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"d63ce8bf-b1c8-44a8-92f2-298f046dc138\") " pod="openstack/openstack-cell1-galera-0" Mar 08 00:51:05.109632 master-0 kubenswrapper[23041]: I0308 00:51:05.109602 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d63ce8bf-b1c8-44a8-92f2-298f046dc138-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d63ce8bf-b1c8-44a8-92f2-298f046dc138\") " pod="openstack/openstack-cell1-galera-0" Mar 08 00:51:05.212358 master-0 kubenswrapper[23041]: I0308 00:51:05.212282 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d63ce8bf-b1c8-44a8-92f2-298f046dc138-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d63ce8bf-b1c8-44a8-92f2-298f046dc138\") " pod="openstack/openstack-cell1-galera-0" Mar 08 00:51:05.212597 master-0 kubenswrapper[23041]: I0308 00:51:05.212412 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d63ce8bf-b1c8-44a8-92f2-298f046dc138-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"d63ce8bf-b1c8-44a8-92f2-298f046dc138\") " pod="openstack/openstack-cell1-galera-0" Mar 08 00:51:05.212597 master-0 kubenswrapper[23041]: I0308 00:51:05.212443 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d63ce8bf-b1c8-44a8-92f2-298f046dc138-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"d63ce8bf-b1c8-44a8-92f2-298f046dc138\") " pod="openstack/openstack-cell1-galera-0" Mar 08 00:51:05.212597 master-0 kubenswrapper[23041]: I0308 00:51:05.212463 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvnnm\" (UniqueName: \"kubernetes.io/projected/d63ce8bf-b1c8-44a8-92f2-298f046dc138-kube-api-access-gvnnm\") pod \"openstack-cell1-galera-0\" (UID: \"d63ce8bf-b1c8-44a8-92f2-298f046dc138\") " pod="openstack/openstack-cell1-galera-0" Mar 08 00:51:05.212597 master-0 kubenswrapper[23041]: I0308 00:51:05.212491 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/d63ce8bf-b1c8-44a8-92f2-298f046dc138-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"d63ce8bf-b1c8-44a8-92f2-298f046dc138\") " pod="openstack/openstack-cell1-galera-0" Mar 08 00:51:05.212597 master-0 kubenswrapper[23041]: I0308 00:51:05.212509 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d63ce8bf-b1c8-44a8-92f2-298f046dc138-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"d63ce8bf-b1c8-44a8-92f2-298f046dc138\") " pod="openstack/openstack-cell1-galera-0" Mar 08 00:51:05.212597 master-0 kubenswrapper[23041]: I0308 00:51:05.212550 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e4151f20-922d-4c86-a308-b1eab7265e73\" (UniqueName: \"kubernetes.io/csi/topolvm.io^ff2be1c1-091e-42ea-b889-0d0399ae45a9\") pod \"openstack-cell1-galera-0\" (UID: \"d63ce8bf-b1c8-44a8-92f2-298f046dc138\") " pod="openstack/openstack-cell1-galera-0" Mar 08 00:51:05.212597 master-0 kubenswrapper[23041]: I0308 00:51:05.212577 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d63ce8bf-b1c8-44a8-92f2-298f046dc138-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"d63ce8bf-b1c8-44a8-92f2-298f046dc138\") " pod="openstack/openstack-cell1-galera-0" Mar 08 00:51:05.213952 master-0 kubenswrapper[23041]: I0308 00:51:05.213921 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d63ce8bf-b1c8-44a8-92f2-298f046dc138-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"d63ce8bf-b1c8-44a8-92f2-298f046dc138\") " pod="openstack/openstack-cell1-galera-0" Mar 08 00:51:05.214592 master-0 kubenswrapper[23041]: I0308 00:51:05.214566 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d63ce8bf-b1c8-44a8-92f2-298f046dc138-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"d63ce8bf-b1c8-44a8-92f2-298f046dc138\") " pod="openstack/openstack-cell1-galera-0" Mar 08 00:51:05.216284 master-0 kubenswrapper[23041]: I0308 00:51:05.216168 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d63ce8bf-b1c8-44a8-92f2-298f046dc138-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"d63ce8bf-b1c8-44a8-92f2-298f046dc138\") " pod="openstack/openstack-cell1-galera-0" Mar 08 00:51:05.217100 master-0 kubenswrapper[23041]: I0308 00:51:05.217071 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d63ce8bf-b1c8-44a8-92f2-298f046dc138-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"d63ce8bf-b1c8-44a8-92f2-298f046dc138\") " pod="openstack/openstack-cell1-galera-0" Mar 08 00:51:05.217512 master-0 kubenswrapper[23041]: I0308 00:51:05.217475 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d63ce8bf-b1c8-44a8-92f2-298f046dc138-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"d63ce8bf-b1c8-44a8-92f2-298f046dc138\") " pod="openstack/openstack-cell1-galera-0" Mar 08 00:51:05.218238 master-0 kubenswrapper[23041]: I0308 00:51:05.218177 23041 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 08 00:51:05.218298 master-0 kubenswrapper[23041]: I0308 00:51:05.218241 23041 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e4151f20-922d-4c86-a308-b1eab7265e73\" (UniqueName: \"kubernetes.io/csi/topolvm.io^ff2be1c1-091e-42ea-b889-0d0399ae45a9\") pod \"openstack-cell1-galera-0\" (UID: \"d63ce8bf-b1c8-44a8-92f2-298f046dc138\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/4c3418ff86d7282893ff8086176694e4caab774f33deb492f2ed95c94c5b0fa2/globalmount\"" pod="openstack/openstack-cell1-galera-0" Mar 08 00:51:05.228101 master-0 kubenswrapper[23041]: I0308 00:51:05.228052 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d63ce8bf-b1c8-44a8-92f2-298f046dc138-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d63ce8bf-b1c8-44a8-92f2-298f046dc138\") " pod="openstack/openstack-cell1-galera-0" Mar 08 00:51:05.231345 master-0 kubenswrapper[23041]: I0308 00:51:05.231319 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvnnm\" (UniqueName: \"kubernetes.io/projected/d63ce8bf-b1c8-44a8-92f2-298f046dc138-kube-api-access-gvnnm\") pod \"openstack-cell1-galera-0\" (UID: \"d63ce8bf-b1c8-44a8-92f2-298f046dc138\") " pod="openstack/openstack-cell1-galera-0" Mar 08 00:51:05.466369 master-0 kubenswrapper[23041]: I0308 00:51:05.466317 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-93c6cdb6-c88b-4ed7-afb8-83aa261a592c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^889141d8-6951-4d8f-bb89-f1d43ea6a619\") pod \"rabbitmq-server-0\" (UID: \"e610ec98-66ae-412c-bab9-fab6413ef654\") " pod="openstack/rabbitmq-server-0" Mar 08 00:51:05.663919 master-0 kubenswrapper[23041]: I0308 00:51:05.663827 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 08 00:51:06.632587 master-0 kubenswrapper[23041]: I0308 00:51:06.632492 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ca0b5327-3d55-4f6b-9b59-09ca4533e9e1\" (UniqueName: \"kubernetes.io/csi/topolvm.io^bf982391-390b-4c43-836d-76ed7a53130b\") pod \"openstack-galera-0\" (UID: \"b2a4deb4-466b-499d-a4c5-227ae1726fc9\") " pod="openstack/openstack-galera-0" Mar 08 00:51:06.908424 master-0 kubenswrapper[23041]: I0308 00:51:06.908278 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 08 00:51:07.711294 master-0 kubenswrapper[23041]: I0308 00:51:07.710122 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e4151f20-922d-4c86-a308-b1eab7265e73\" (UniqueName: \"kubernetes.io/csi/topolvm.io^ff2be1c1-091e-42ea-b889-0d0399ae45a9\") pod \"openstack-cell1-galera-0\" (UID: \"d63ce8bf-b1c8-44a8-92f2-298f046dc138\") " pod="openstack/openstack-cell1-galera-0" Mar 08 00:51:07.721289 master-0 kubenswrapper[23041]: I0308 00:51:07.721225 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 08 00:51:07.869028 master-0 kubenswrapper[23041]: I0308 00:51:07.868932 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 08 00:51:07.871074 master-0 kubenswrapper[23041]: I0308 00:51:07.871034 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 08 00:51:07.882015 master-0 kubenswrapper[23041]: I0308 00:51:07.872662 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 08 00:51:07.882015 master-0 kubenswrapper[23041]: I0308 00:51:07.873149 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 08 00:51:07.882015 master-0 kubenswrapper[23041]: I0308 00:51:07.873292 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 08 00:51:07.882015 master-0 kubenswrapper[23041]: I0308 00:51:07.873405 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 08 00:51:07.882015 master-0 kubenswrapper[23041]: I0308 00:51:07.876573 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 08 00:51:07.985462 master-0 kubenswrapper[23041]: I0308 00:51:07.985374 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be385568-7616-43bb-b89a-6478d7c995a4-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"be385568-7616-43bb-b89a-6478d7c995a4\") " pod="openstack/ovsdbserver-sb-0" Mar 08 00:51:07.985462 master-0 kubenswrapper[23041]: I0308 00:51:07.985460 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/be385568-7616-43bb-b89a-6478d7c995a4-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"be385568-7616-43bb-b89a-6478d7c995a4\") " pod="openstack/ovsdbserver-sb-0" Mar 08 00:51:07.985751 master-0 kubenswrapper[23041]: I0308 00:51:07.985511 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/be385568-7616-43bb-b89a-6478d7c995a4-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"be385568-7616-43bb-b89a-6478d7c995a4\") " pod="openstack/ovsdbserver-sb-0" Mar 08 00:51:07.985751 master-0 kubenswrapper[23041]: I0308 00:51:07.985536 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a989983c-76c3-46c8-a8a4-81bcbb7ed698\" (UniqueName: \"kubernetes.io/csi/topolvm.io^28564532-52df-4765-b179-aad439a5f5da\") pod \"ovsdbserver-sb-0\" (UID: \"be385568-7616-43bb-b89a-6478d7c995a4\") " pod="openstack/ovsdbserver-sb-0" Mar 08 00:51:07.985751 master-0 kubenswrapper[23041]: I0308 00:51:07.985621 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/be385568-7616-43bb-b89a-6478d7c995a4-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"be385568-7616-43bb-b89a-6478d7c995a4\") " pod="openstack/ovsdbserver-sb-0" Mar 08 00:51:07.985894 master-0 kubenswrapper[23041]: I0308 00:51:07.985864 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6tqw\" (UniqueName: \"kubernetes.io/projected/be385568-7616-43bb-b89a-6478d7c995a4-kube-api-access-h6tqw\") pod \"ovsdbserver-sb-0\" (UID: \"be385568-7616-43bb-b89a-6478d7c995a4\") " pod="openstack/ovsdbserver-sb-0" Mar 08 00:51:07.985954 master-0 kubenswrapper[23041]: I0308 00:51:07.985921 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be385568-7616-43bb-b89a-6478d7c995a4-config\") pod \"ovsdbserver-sb-0\" (UID: \"be385568-7616-43bb-b89a-6478d7c995a4\") " pod="openstack/ovsdbserver-sb-0" Mar 08 00:51:07.985993 master-0 kubenswrapper[23041]: I0308 00:51:07.985973 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/be385568-7616-43bb-b89a-6478d7c995a4-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"be385568-7616-43bb-b89a-6478d7c995a4\") " pod="openstack/ovsdbserver-sb-0" Mar 08 00:51:08.088215 master-0 kubenswrapper[23041]: I0308 00:51:08.088127 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6tqw\" (UniqueName: \"kubernetes.io/projected/be385568-7616-43bb-b89a-6478d7c995a4-kube-api-access-h6tqw\") pod \"ovsdbserver-sb-0\" (UID: \"be385568-7616-43bb-b89a-6478d7c995a4\") " pod="openstack/ovsdbserver-sb-0" Mar 08 00:51:08.088215 master-0 kubenswrapper[23041]: I0308 00:51:08.088226 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be385568-7616-43bb-b89a-6478d7c995a4-config\") pod \"ovsdbserver-sb-0\" (UID: \"be385568-7616-43bb-b89a-6478d7c995a4\") " pod="openstack/ovsdbserver-sb-0" Mar 08 00:51:08.088483 master-0 kubenswrapper[23041]: I0308 00:51:08.088265 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/be385568-7616-43bb-b89a-6478d7c995a4-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"be385568-7616-43bb-b89a-6478d7c995a4\") " pod="openstack/ovsdbserver-sb-0" Mar 08 00:51:08.088483 master-0 kubenswrapper[23041]: I0308 00:51:08.088397 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be385568-7616-43bb-b89a-6478d7c995a4-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"be385568-7616-43bb-b89a-6478d7c995a4\") " pod="openstack/ovsdbserver-sb-0" Mar 08 00:51:08.088483 master-0 kubenswrapper[23041]: I0308 00:51:08.088432 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/be385568-7616-43bb-b89a-6478d7c995a4-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"be385568-7616-43bb-b89a-6478d7c995a4\") " pod="openstack/ovsdbserver-sb-0" Mar 08 00:51:08.088483 master-0 kubenswrapper[23041]: I0308 00:51:08.088480 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be385568-7616-43bb-b89a-6478d7c995a4-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"be385568-7616-43bb-b89a-6478d7c995a4\") " pod="openstack/ovsdbserver-sb-0" Mar 08 00:51:08.088602 master-0 kubenswrapper[23041]: I0308 00:51:08.088507 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a989983c-76c3-46c8-a8a4-81bcbb7ed698\" (UniqueName: \"kubernetes.io/csi/topolvm.io^28564532-52df-4765-b179-aad439a5f5da\") pod \"ovsdbserver-sb-0\" (UID: \"be385568-7616-43bb-b89a-6478d7c995a4\") " pod="openstack/ovsdbserver-sb-0" Mar 08 00:51:08.088602 master-0 kubenswrapper[23041]: I0308 00:51:08.088533 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/be385568-7616-43bb-b89a-6478d7c995a4-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"be385568-7616-43bb-b89a-6478d7c995a4\") " pod="openstack/ovsdbserver-sb-0" Mar 08 00:51:08.089258 master-0 kubenswrapper[23041]: I0308 00:51:08.089020 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/be385568-7616-43bb-b89a-6478d7c995a4-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"be385568-7616-43bb-b89a-6478d7c995a4\") " pod="openstack/ovsdbserver-sb-0" Mar 08 00:51:08.090486 master-0 kubenswrapper[23041]: I0308 00:51:08.090330 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be385568-7616-43bb-b89a-6478d7c995a4-config\") pod \"ovsdbserver-sb-0\" (UID: 
\"be385568-7616-43bb-b89a-6478d7c995a4\") " pod="openstack/ovsdbserver-sb-0" Mar 08 00:51:08.091703 master-0 kubenswrapper[23041]: I0308 00:51:08.091657 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be385568-7616-43bb-b89a-6478d7c995a4-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"be385568-7616-43bb-b89a-6478d7c995a4\") " pod="openstack/ovsdbserver-sb-0" Mar 08 00:51:08.092286 master-0 kubenswrapper[23041]: I0308 00:51:08.092183 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/be385568-7616-43bb-b89a-6478d7c995a4-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"be385568-7616-43bb-b89a-6478d7c995a4\") " pod="openstack/ovsdbserver-sb-0" Mar 08 00:51:08.097723 master-0 kubenswrapper[23041]: I0308 00:51:08.097691 23041 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 08 00:51:08.097809 master-0 kubenswrapper[23041]: I0308 00:51:08.097736 23041 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a989983c-76c3-46c8-a8a4-81bcbb7ed698\" (UniqueName: \"kubernetes.io/csi/topolvm.io^28564532-52df-4765-b179-aad439a5f5da\") pod \"ovsdbserver-sb-0\" (UID: \"be385568-7616-43bb-b89a-6478d7c995a4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/2fd8e256c88d344c6c050f9b6cd69d415e4b0886ff71dc8390ff4f6ce8f1b9e6/globalmount\"" pod="openstack/ovsdbserver-sb-0"
Mar 08 00:51:08.098254 master-0 kubenswrapper[23041]: I0308 00:51:08.098215 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be385568-7616-43bb-b89a-6478d7c995a4-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"be385568-7616-43bb-b89a-6478d7c995a4\") " pod="openstack/ovsdbserver-sb-0"
Mar 08 00:51:08.099015 master-0 kubenswrapper[23041]: I0308 00:51:08.098966 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/be385568-7616-43bb-b89a-6478d7c995a4-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"be385568-7616-43bb-b89a-6478d7c995a4\") " pod="openstack/ovsdbserver-sb-0"
Mar 08 00:51:08.107558 master-0 kubenswrapper[23041]: I0308 00:51:08.107522 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6tqw\" (UniqueName: \"kubernetes.io/projected/be385568-7616-43bb-b89a-6478d7c995a4-kube-api-access-h6tqw\") pod \"ovsdbserver-sb-0\" (UID: \"be385568-7616-43bb-b89a-6478d7c995a4\") " pod="openstack/ovsdbserver-sb-0"
Mar 08 00:51:08.478971 master-0 kubenswrapper[23041]: I0308 00:51:08.478922 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 08 00:51:08.480483 master-0 kubenswrapper[23041]: I0308 00:51:08.480458 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Mar 08 00:51:08.485988 master-0 kubenswrapper[23041]: I0308 00:51:08.485478 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Mar 08 00:51:08.486940 master-0 kubenswrapper[23041]: I0308 00:51:08.486907 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Mar 08 00:51:08.507857 master-0 kubenswrapper[23041]: I0308 00:51:08.507105 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Mar 08 00:51:08.509492 master-0 kubenswrapper[23041]: I0308 00:51:08.508932 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 08 00:51:08.612572 master-0 kubenswrapper[23041]: I0308 00:51:08.612516 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d7e56cd4-d944-40ae-b1e9-2ca3551c93f4-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d7e56cd4-d944-40ae-b1e9-2ca3551c93f4\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 00:51:08.612850 master-0 kubenswrapper[23041]: I0308 00:51:08.612754 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7e56cd4-d944-40ae-b1e9-2ca3551c93f4-config\") pod \"ovsdbserver-nb-0\" (UID: \"d7e56cd4-d944-40ae-b1e9-2ca3551c93f4\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 00:51:08.613226 master-0 kubenswrapper[23041]: I0308 00:51:08.613191 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7e56cd4-d944-40ae-b1e9-2ca3551c93f4-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d7e56cd4-d944-40ae-b1e9-2ca3551c93f4\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 00:51:08.613326 master-0 kubenswrapper[23041]: I0308 00:51:08.613286 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d7e56cd4-d944-40ae-b1e9-2ca3551c93f4-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d7e56cd4-d944-40ae-b1e9-2ca3551c93f4\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 00:51:08.613492 master-0 kubenswrapper[23041]: I0308 00:51:08.613466 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7e56cd4-d944-40ae-b1e9-2ca3551c93f4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d7e56cd4-d944-40ae-b1e9-2ca3551c93f4\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 00:51:08.613542 master-0 kubenswrapper[23041]: I0308 00:51:08.613503 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7e56cd4-d944-40ae-b1e9-2ca3551c93f4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d7e56cd4-d944-40ae-b1e9-2ca3551c93f4\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 00:51:08.613575 master-0 kubenswrapper[23041]: I0308 00:51:08.613543 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-aac4b125-e02e-425b-a5fe-e4b813eecbc8\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d2aed2aa-806d-4b86-a116-319a7017aa14\") pod \"ovsdbserver-nb-0\" (UID: \"d7e56cd4-d944-40ae-b1e9-2ca3551c93f4\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 00:51:08.613614 master-0 kubenswrapper[23041]: I0308 00:51:08.613571 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-774s5\" (UniqueName: \"kubernetes.io/projected/d7e56cd4-d944-40ae-b1e9-2ca3551c93f4-kube-api-access-774s5\") pod \"ovsdbserver-nb-0\" (UID: \"d7e56cd4-d944-40ae-b1e9-2ca3551c93f4\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 00:51:08.716310 master-0 kubenswrapper[23041]: I0308 00:51:08.716253 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d7e56cd4-d944-40ae-b1e9-2ca3551c93f4-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d7e56cd4-d944-40ae-b1e9-2ca3551c93f4\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 00:51:08.716840 master-0 kubenswrapper[23041]: I0308 00:51:08.716366 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7e56cd4-d944-40ae-b1e9-2ca3551c93f4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d7e56cd4-d944-40ae-b1e9-2ca3551c93f4\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 00:51:08.716840 master-0 kubenswrapper[23041]: I0308 00:51:08.716399 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7e56cd4-d944-40ae-b1e9-2ca3551c93f4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d7e56cd4-d944-40ae-b1e9-2ca3551c93f4\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 00:51:08.716840 master-0 kubenswrapper[23041]: I0308 00:51:08.716429 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-aac4b125-e02e-425b-a5fe-e4b813eecbc8\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d2aed2aa-806d-4b86-a116-319a7017aa14\") pod \"ovsdbserver-nb-0\" (UID: \"d7e56cd4-d944-40ae-b1e9-2ca3551c93f4\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 00:51:08.716840 master-0 kubenswrapper[23041]: I0308 00:51:08.716459 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-774s5\" (UniqueName: \"kubernetes.io/projected/d7e56cd4-d944-40ae-b1e9-2ca3551c93f4-kube-api-access-774s5\") pod \"ovsdbserver-nb-0\" (UID: \"d7e56cd4-d944-40ae-b1e9-2ca3551c93f4\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 00:51:08.716840 master-0 kubenswrapper[23041]: I0308 00:51:08.716545 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d7e56cd4-d944-40ae-b1e9-2ca3551c93f4-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d7e56cd4-d944-40ae-b1e9-2ca3551c93f4\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 00:51:08.716840 master-0 kubenswrapper[23041]: I0308 00:51:08.716612 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7e56cd4-d944-40ae-b1e9-2ca3551c93f4-config\") pod \"ovsdbserver-nb-0\" (UID: \"d7e56cd4-d944-40ae-b1e9-2ca3551c93f4\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 00:51:08.716840 master-0 kubenswrapper[23041]: I0308 00:51:08.716646 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d7e56cd4-d944-40ae-b1e9-2ca3551c93f4-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d7e56cd4-d944-40ae-b1e9-2ca3551c93f4\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 00:51:08.716840 master-0 kubenswrapper[23041]: I0308 00:51:08.716676 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7e56cd4-d944-40ae-b1e9-2ca3551c93f4-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d7e56cd4-d944-40ae-b1e9-2ca3551c93f4\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 00:51:08.718047 master-0 kubenswrapper[23041]: I0308 00:51:08.718011 23041 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 08 00:51:08.718543 master-0 kubenswrapper[23041]: I0308 00:51:08.718051 23041 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-aac4b125-e02e-425b-a5fe-e4b813eecbc8\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d2aed2aa-806d-4b86-a116-319a7017aa14\") pod \"ovsdbserver-nb-0\" (UID: \"d7e56cd4-d944-40ae-b1e9-2ca3551c93f4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/565aeff52b24bfd6f0e1b3d11148b9c13cb84d74c061052346fa574e3ab378b9/globalmount\"" pod="openstack/ovsdbserver-nb-0"
Mar 08 00:51:08.718707 master-0 kubenswrapper[23041]: I0308 00:51:08.718573 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d7e56cd4-d944-40ae-b1e9-2ca3551c93f4-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d7e56cd4-d944-40ae-b1e9-2ca3551c93f4\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 00:51:08.718789 master-0 kubenswrapper[23041]: I0308 00:51:08.718752 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7e56cd4-d944-40ae-b1e9-2ca3551c93f4-config\") pod \"ovsdbserver-nb-0\" (UID: \"d7e56cd4-d944-40ae-b1e9-2ca3551c93f4\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 00:51:08.724251 master-0 kubenswrapper[23041]: I0308 00:51:08.724187 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7e56cd4-d944-40ae-b1e9-2ca3551c93f4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d7e56cd4-d944-40ae-b1e9-2ca3551c93f4\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 00:51:08.726637 master-0 kubenswrapper[23041]: I0308 00:51:08.726600 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7e56cd4-d944-40ae-b1e9-2ca3551c93f4-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d7e56cd4-d944-40ae-b1e9-2ca3551c93f4\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 00:51:08.734753 master-0 kubenswrapper[23041]: I0308 00:51:08.734646 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7e56cd4-d944-40ae-b1e9-2ca3551c93f4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d7e56cd4-d944-40ae-b1e9-2ca3551c93f4\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 00:51:08.735976 master-0 kubenswrapper[23041]: I0308 00:51:08.735948 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-774s5\" (UniqueName: \"kubernetes.io/projected/d7e56cd4-d944-40ae-b1e9-2ca3551c93f4-kube-api-access-774s5\") pod \"ovsdbserver-nb-0\" (UID: \"d7e56cd4-d944-40ae-b1e9-2ca3551c93f4\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 00:51:09.463840 master-0 kubenswrapper[23041]: I0308 00:51:09.463792 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a989983c-76c3-46c8-a8a4-81bcbb7ed698\" (UniqueName: \"kubernetes.io/csi/topolvm.io^28564532-52df-4765-b179-aad439a5f5da\") pod \"ovsdbserver-sb-0\" (UID: \"be385568-7616-43bb-b89a-6478d7c995a4\") " pod="openstack/ovsdbserver-sb-0"
Mar 08 00:51:09.709430 master-0 kubenswrapper[23041]: I0308 00:51:09.709292 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Mar 08 00:51:10.814233 master-0 kubenswrapper[23041]: I0308 00:51:10.814171 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-aac4b125-e02e-425b-a5fe-e4b813eecbc8\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d2aed2aa-806d-4b86-a116-319a7017aa14\") pod \"ovsdbserver-nb-0\" (UID: \"d7e56cd4-d944-40ae-b1e9-2ca3551c93f4\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 00:51:10.920751 master-0 kubenswrapper[23041]: I0308 00:51:10.920656 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Mar 08 00:51:13.653237 master-0 kubenswrapper[23041]: I0308 00:51:13.652888 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 08 00:51:13.945105 master-0 kubenswrapper[23041]: W0308 00:51:13.942348 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode610ec98_66ae_412c_bab9_fab6413ef654.slice/crio-980a58ec79e43c81c8832deedf99f0f7d66c4e9d7e78d4733a9b12901c056d15 WatchSource:0}: Error finding container 980a58ec79e43c81c8832deedf99f0f7d66c4e9d7e78d4733a9b12901c056d15: Status 404 returned error can't find the container with id 980a58ec79e43c81c8832deedf99f0f7d66c4e9d7e78d4733a9b12901c056d15
Mar 08 00:51:14.122249 master-0 kubenswrapper[23041]: I0308 00:51:14.122182 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Mar 08 00:51:14.126139 master-0 kubenswrapper[23041]: W0308 00:51:14.126095 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7bd54f8_222a_4144_bfbd_629ba5ad7916.slice/crio-137dc622aa14131999edd1f074b0220571171409ee5d7c4698606979a1619483 WatchSource:0}: Error finding container 137dc622aa14131999edd1f074b0220571171409ee5d7c4698606979a1619483: Status 404 returned error can't find the container with id 137dc622aa14131999edd1f074b0220571171409ee5d7c4698606979a1619483
Mar 08 00:51:14.316692 master-0 kubenswrapper[23041]: I0308 00:51:14.316623 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 08 00:51:14.362405 master-0 kubenswrapper[23041]: I0308 00:51:14.362344 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-8nzj6"]
Mar 08 00:51:14.374145 master-0 kubenswrapper[23041]: I0308 00:51:14.372857 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 08 00:51:14.553589 master-0 kubenswrapper[23041]: W0308 00:51:14.553501 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd63ce8bf_b1c8_44a8_92f2_298f046dc138.slice/crio-83dcbab4f2f9351cc6eb17a8df0686f85316056aa3b543ffe60079c099282891 WatchSource:0}: Error finding container 83dcbab4f2f9351cc6eb17a8df0686f85316056aa3b543ffe60079c099282891: Status 404 returned error can't find the container with id 83dcbab4f2f9351cc6eb17a8df0686f85316056aa3b543ffe60079c099282891
Mar 08 00:51:14.569724 master-0 kubenswrapper[23041]: I0308 00:51:14.568353 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Mar 08 00:51:14.583067 master-0 kubenswrapper[23041]: I0308 00:51:14.583009 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 08 00:51:14.624420 master-0 kubenswrapper[23041]: I0308 00:51:14.624359 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-2cgqz"]
Mar 08 00:51:14.653895 master-0 kubenswrapper[23041]: I0308 00:51:14.653832 23041 generic.go:334] "Generic (PLEG): container finished" podID="3de682e4-56f1-4c92-870b-795df978e02a" containerID="c431462b70f7d269296ff672a8cfec4d36730a166f59986e03bfbf4861766b1f" exitCode=0
Mar 08 00:51:14.654829 master-0 kubenswrapper[23041]: I0308 00:51:14.654802 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6779d95cff-xxcrz" event={"ID":"3de682e4-56f1-4c92-870b-795df978e02a","Type":"ContainerDied","Data":"c431462b70f7d269296ff672a8cfec4d36730a166f59986e03bfbf4861766b1f"}
Mar 08 00:51:14.662794 master-0 kubenswrapper[23041]: I0308 00:51:14.658235 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e610ec98-66ae-412c-bab9-fab6413ef654","Type":"ContainerStarted","Data":"980a58ec79e43c81c8832deedf99f0f7d66c4e9d7e78d4733a9b12901c056d15"}
Mar 08 00:51:14.662794 master-0 kubenswrapper[23041]: I0308 00:51:14.661855 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"d7bd54f8-222a-4144-bfbd-629ba5ad7916","Type":"ContainerStarted","Data":"137dc622aa14131999edd1f074b0220571171409ee5d7c4698606979a1619483"}
Mar 08 00:51:14.668525 master-0 kubenswrapper[23041]: I0308 00:51:14.668472 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b2a4deb4-466b-499d-a4c5-227ae1726fc9","Type":"ContainerStarted","Data":"e440296332a920af9d7073d01e335d00d6535ca02c70d886497b20f43dff3230"}
Mar 08 00:51:14.674774 master-0 kubenswrapper[23041]: I0308 00:51:14.674486 23041 generic.go:334] "Generic (PLEG): container finished" podID="3cd4b2be-f3d5-4eec-8061-2055f3c4b001" containerID="13f1c47c54881c2111c25c0474dd97e515b537ba1f7c467e538ac4e4c4786cf3" exitCode=0
Mar 08 00:51:14.674974 master-0 kubenswrapper[23041]: I0308 00:51:14.674949 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55994974c5-pnrh6" event={"ID":"3cd4b2be-f3d5-4eec-8061-2055f3c4b001","Type":"ContainerDied","Data":"13f1c47c54881c2111c25c0474dd97e515b537ba1f7c467e538ac4e4c4786cf3"}
Mar 08 00:51:14.696176 master-0 kubenswrapper[23041]: I0308 00:51:14.696119 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"973c5591-ef0e-4a00-9107-bf5c09b1782d","Type":"ContainerStarted","Data":"dd519c444c5691da3bf81e0c30fe53f3f9f01c254ad26a0fb0ff0b1378c3482e"}
Mar 08 00:51:14.704182 master-0 kubenswrapper[23041]: I0308 00:51:14.703692 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2cgqz" event={"ID":"465cdd3c-22eb-48eb-9d84-be612c4d7f7d","Type":"ContainerStarted","Data":"420dbbce71eb50f337103027421b50f33e54761a5fd2b7a895182608cc5bf599"}
Mar 08 00:51:14.710790 master-0 kubenswrapper[23041]: I0308 00:51:14.710379 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d7e56cd4-d944-40ae-b1e9-2ca3551c93f4","Type":"ContainerStarted","Data":"d62a0141adb9191f2922a23af588c27b3bb0651bcfe514388a6a99625f47e17d"}
Mar 08 00:51:14.720876 master-0 kubenswrapper[23041]: I0308 00:51:14.720806 23041 generic.go:334] "Generic (PLEG): container finished" podID="d3ab1df0-49b3-4091-89df-9a708f833de7" containerID="1952d6955dda6b4efa2b664f18d70678762381df65e82265e0fe9c630828bb9e" exitCode=0
Mar 08 00:51:14.721073 master-0 kubenswrapper[23041]: I0308 00:51:14.720979 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d859fb5df-rqt5m" event={"ID":"d3ab1df0-49b3-4091-89df-9a708f833de7","Type":"ContainerDied","Data":"1952d6955dda6b4efa2b664f18d70678762381df65e82265e0fe9c630828bb9e"}
Mar 08 00:51:14.726102 master-0 kubenswrapper[23041]: I0308 00:51:14.725990 23041 generic.go:334] "Generic (PLEG): container finished" podID="c4107c15-d018-41f3-ba8d-52fec2bf1397" containerID="5babef6bc80766e64966245df29a902f1a6f8cd2d7a965f252c4a71b12f89d21" exitCode=0
Mar 08 00:51:14.726102 master-0 kubenswrapper[23041]: I0308 00:51:14.726073 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f75dd7cd9-k7sq4" event={"ID":"c4107c15-d018-41f3-ba8d-52fec2bf1397","Type":"ContainerDied","Data":"5babef6bc80766e64966245df29a902f1a6f8cd2d7a965f252c4a71b12f89d21"}
Mar 08 00:51:14.782019 master-0 kubenswrapper[23041]: I0308 00:51:14.781940 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8nzj6" event={"ID":"4aeedb5e-326b-4294-a7be-9569b908b49c","Type":"ContainerStarted","Data":"0da7e8e63b0c7c6e7b2064944d3927b72963f95c6f524c9d9ee48de6ddd2d4b9"}
Mar 08 00:51:14.782019 master-0 kubenswrapper[23041]: I0308 00:51:14.782014 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d63ce8bf-b1c8-44a8-92f2-298f046dc138","Type":"ContainerStarted","Data":"83dcbab4f2f9351cc6eb17a8df0686f85316056aa3b543ffe60079c099282891"}
Mar 08 00:51:15.343446 master-0 kubenswrapper[23041]: I0308 00:51:15.332977 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 08 00:51:15.371362 master-0 kubenswrapper[23041]: I0308 00:51:15.368027 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55994974c5-pnrh6"
Mar 08 00:51:15.522506 master-0 kubenswrapper[23041]: I0308 00:51:15.522431 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dltz\" (UniqueName: \"kubernetes.io/projected/3cd4b2be-f3d5-4eec-8061-2055f3c4b001-kube-api-access-4dltz\") pod \"3cd4b2be-f3d5-4eec-8061-2055f3c4b001\" (UID: \"3cd4b2be-f3d5-4eec-8061-2055f3c4b001\") "
Mar 08 00:51:15.522855 master-0 kubenswrapper[23041]: I0308 00:51:15.522808 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cd4b2be-f3d5-4eec-8061-2055f3c4b001-config\") pod \"3cd4b2be-f3d5-4eec-8061-2055f3c4b001\" (UID: \"3cd4b2be-f3d5-4eec-8061-2055f3c4b001\") "
Mar 08 00:51:15.552124 master-0 kubenswrapper[23041]: I0308 00:51:15.550601 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cd4b2be-f3d5-4eec-8061-2055f3c4b001-kube-api-access-4dltz" (OuterVolumeSpecName: "kube-api-access-4dltz") pod "3cd4b2be-f3d5-4eec-8061-2055f3c4b001" (UID: "3cd4b2be-f3d5-4eec-8061-2055f3c4b001"). InnerVolumeSpecName "kube-api-access-4dltz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:51:16.117483 master-0 kubenswrapper[23041]: I0308 00:51:15.631741 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dltz\" (UniqueName: \"kubernetes.io/projected/3cd4b2be-f3d5-4eec-8061-2055f3c4b001-kube-api-access-4dltz\") on node \"master-0\" DevicePath \"\""
Mar 08 00:51:16.117483 master-0 kubenswrapper[23041]: I0308 00:51:15.793110 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6779d95cff-xxcrz" event={"ID":"3de682e4-56f1-4c92-870b-795df978e02a","Type":"ContainerStarted","Data":"c6f7fde292134214998ad98e3312d104ed5f475c30933d22d6c0de4b27eb5b69"}
Mar 08 00:51:16.117483 master-0 kubenswrapper[23041]: I0308 00:51:15.794103 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6779d95cff-xxcrz"
Mar 08 00:51:16.117483 master-0 kubenswrapper[23041]: I0308 00:51:15.798989 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f75dd7cd9-k7sq4" event={"ID":"c4107c15-d018-41f3-ba8d-52fec2bf1397","Type":"ContainerStarted","Data":"43fb29ba9b85458fb594b814bc115f22ce4a72799e56b1d8895270734c06d9fa"}
Mar 08 00:51:16.117483 master-0 kubenswrapper[23041]: I0308 00:51:15.799523 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f75dd7cd9-k7sq4"
Mar 08 00:51:16.117483 master-0 kubenswrapper[23041]: I0308 00:51:15.801888 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"be385568-7616-43bb-b89a-6478d7c995a4","Type":"ContainerStarted","Data":"b72bd9502dd74abb29da3c4bb248d3634a913a0346640bfd52852269a6ed64e4"}
Mar 08 00:51:16.117483 master-0 kubenswrapper[23041]: I0308 00:51:15.803463 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55994974c5-pnrh6" event={"ID":"3cd4b2be-f3d5-4eec-8061-2055f3c4b001","Type":"ContainerDied","Data":"739ad4863156a7e2adee37681cc0f51b19625992c73df294cdca14343bb6a83e"}
Mar 08 00:51:16.117483 master-0 kubenswrapper[23041]: I0308 00:51:15.803489 23041 scope.go:117] "RemoveContainer" containerID="13f1c47c54881c2111c25c0474dd97e515b537ba1f7c467e538ac4e4c4786cf3"
Mar 08 00:51:16.117483 master-0 kubenswrapper[23041]: I0308 00:51:15.803581 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55994974c5-pnrh6"
Mar 08 00:51:16.176845 master-0 kubenswrapper[23041]: I0308 00:51:16.174032 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cd4b2be-f3d5-4eec-8061-2055f3c4b001-config" (OuterVolumeSpecName: "config") pod "3cd4b2be-f3d5-4eec-8061-2055f3c4b001" (UID: "3cd4b2be-f3d5-4eec-8061-2055f3c4b001"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:51:16.236387 master-0 kubenswrapper[23041]: I0308 00:51:16.235704 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6779d95cff-xxcrz" podStartSLOduration=4.759011049 podStartE2EDuration="23.235680562s" podCreationTimestamp="2026-03-08 00:50:53 +0000 UTC" firstStartedPulling="2026-03-08 00:50:54.823491804 +0000 UTC m=+1160.296328368" lastFinishedPulling="2026-03-08 00:51:13.300161327 +0000 UTC m=+1178.772997881" observedRunningTime="2026-03-08 00:51:16.208619482 +0000 UTC m=+1181.681456046" watchObservedRunningTime="2026-03-08 00:51:16.235680562 +0000 UTC m=+1181.708517106"
Mar 08 00:51:16.251416 master-0 kubenswrapper[23041]: I0308 00:51:16.251370 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d859fb5df-rqt5m"
Mar 08 00:51:16.260445 master-0 kubenswrapper[23041]: I0308 00:51:16.258322 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f75dd7cd9-k7sq4" podStartSLOduration=4.1099718 podStartE2EDuration="22.258298905s" podCreationTimestamp="2026-03-08 00:50:54 +0000 UTC" firstStartedPulling="2026-03-08 00:50:55.125094389 +0000 UTC m=+1160.597930943" lastFinishedPulling="2026-03-08 00:51:13.273421484 +0000 UTC m=+1178.746258048" observedRunningTime="2026-03-08 00:51:16.231564002 +0000 UTC m=+1181.704400786" watchObservedRunningTime="2026-03-08 00:51:16.258298905 +0000 UTC m=+1181.731135459"
Mar 08 00:51:16.280586 master-0 kubenswrapper[23041]: I0308 00:51:16.270287 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wrb5\" (UniqueName: \"kubernetes.io/projected/d3ab1df0-49b3-4091-89df-9a708f833de7-kube-api-access-7wrb5\") pod \"d3ab1df0-49b3-4091-89df-9a708f833de7\" (UID: \"d3ab1df0-49b3-4091-89df-9a708f833de7\") "
Mar 08 00:51:16.280586 master-0 kubenswrapper[23041]: I0308 00:51:16.270604 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3ab1df0-49b3-4091-89df-9a708f833de7-config\") pod \"d3ab1df0-49b3-4091-89df-9a708f833de7\" (UID: \"d3ab1df0-49b3-4091-89df-9a708f833de7\") "
Mar 08 00:51:16.280586 master-0 kubenswrapper[23041]: I0308 00:51:16.270706 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3ab1df0-49b3-4091-89df-9a708f833de7-dns-svc\") pod \"d3ab1df0-49b3-4091-89df-9a708f833de7\" (UID: \"d3ab1df0-49b3-4091-89df-9a708f833de7\") "
Mar 08 00:51:16.288312 master-0 kubenswrapper[23041]: I0308 00:51:16.288190 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3ab1df0-49b3-4091-89df-9a708f833de7-kube-api-access-7wrb5" (OuterVolumeSpecName: "kube-api-access-7wrb5") pod "d3ab1df0-49b3-4091-89df-9a708f833de7" (UID: "d3ab1df0-49b3-4091-89df-9a708f833de7"). InnerVolumeSpecName "kube-api-access-7wrb5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:51:16.304317 master-0 kubenswrapper[23041]: I0308 00:51:16.304162 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wrb5\" (UniqueName: \"kubernetes.io/projected/d3ab1df0-49b3-4091-89df-9a708f833de7-kube-api-access-7wrb5\") on node \"master-0\" DevicePath \"\""
Mar 08 00:51:16.304317 master-0 kubenswrapper[23041]: I0308 00:51:16.304311 23041 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cd4b2be-f3d5-4eec-8061-2055f3c4b001-config\") on node \"master-0\" DevicePath \"\""
Mar 08 00:51:16.327284 master-0 kubenswrapper[23041]: I0308 00:51:16.327229 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3ab1df0-49b3-4091-89df-9a708f833de7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d3ab1df0-49b3-4091-89df-9a708f833de7" (UID: "d3ab1df0-49b3-4091-89df-9a708f833de7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:51:16.334019 master-0 kubenswrapper[23041]: I0308 00:51:16.333929 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3ab1df0-49b3-4091-89df-9a708f833de7-config" (OuterVolumeSpecName: "config") pod "d3ab1df0-49b3-4091-89df-9a708f833de7" (UID: "d3ab1df0-49b3-4091-89df-9a708f833de7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:51:16.415837 master-0 kubenswrapper[23041]: I0308 00:51:16.415696 23041 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3ab1df0-49b3-4091-89df-9a708f833de7-config\") on node \"master-0\" DevicePath \"\""
Mar 08 00:51:16.415837 master-0 kubenswrapper[23041]: I0308 00:51:16.415746 23041 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3ab1df0-49b3-4091-89df-9a708f833de7-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 08 00:51:16.552870 master-0 kubenswrapper[23041]: I0308 00:51:16.524888 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55994974c5-pnrh6"]
Mar 08 00:51:16.600909 master-0 kubenswrapper[23041]: I0308 00:51:16.598690 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55994974c5-pnrh6"]
Mar 08 00:51:16.839215 master-0 kubenswrapper[23041]: I0308 00:51:16.837891 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d859fb5df-rqt5m"
Mar 08 00:51:16.860243 master-0 kubenswrapper[23041]: I0308 00:51:16.860145 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cd4b2be-f3d5-4eec-8061-2055f3c4b001" path="/var/lib/kubelet/pods/3cd4b2be-f3d5-4eec-8061-2055f3c4b001/volumes"
Mar 08 00:51:16.866859 master-0 kubenswrapper[23041]: I0308 00:51:16.866690 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d859fb5df-rqt5m" event={"ID":"d3ab1df0-49b3-4091-89df-9a708f833de7","Type":"ContainerDied","Data":"7c5f15e7bae7b35f01623b67b96ce7d180d8186b87a774d116ea827da6c4a487"}
Mar 08 00:51:16.866859 master-0 kubenswrapper[23041]: I0308 00:51:16.866783 23041 scope.go:117] "RemoveContainer" containerID="1952d6955dda6b4efa2b664f18d70678762381df65e82265e0fe9c630828bb9e"
Mar 08 00:51:16.962427 master-0 kubenswrapper[23041]: I0308 00:51:16.962349 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d859fb5df-rqt5m"]
Mar 08 00:51:17.020088 master-0 kubenswrapper[23041]: I0308 00:51:17.019651 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d859fb5df-rqt5m"]
Mar 08 00:51:18.826365 master-0 kubenswrapper[23041]: I0308 00:51:18.826307 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3ab1df0-49b3-4091-89df-9a708f833de7" path="/var/lib/kubelet/pods/d3ab1df0-49b3-4091-89df-9a708f833de7/volumes"
Mar 08 00:51:24.174483 master-0 kubenswrapper[23041]: I0308 00:51:24.174421 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6779d95cff-xxcrz"
Mar 08 00:51:24.523563 master-0 kubenswrapper[23041]: I0308 00:51:24.523363 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f75dd7cd9-k7sq4"
Mar 08 00:51:24.606165 master-0 kubenswrapper[23041]: I0308 00:51:24.606025 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6779d95cff-xxcrz"]
Mar 08 00:51:24.933286 master-0 kubenswrapper[23041]: I0308 00:51:24.932130 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d7e56cd4-d944-40ae-b1e9-2ca3551c93f4","Type":"ContainerStarted","Data":"cf37efdd11a5a9d0fb342cbd19ed77aa2c4ccd6e50212e5c4d2f93063adc1e6c"}
Mar 08 00:51:24.934975 master-0 kubenswrapper[23041]: I0308 00:51:24.934710 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8nzj6" event={"ID":"4aeedb5e-326b-4294-a7be-9569b908b49c","Type":"ContainerStarted","Data":"335350f12d015fce2b5c44d688a12d380a9fb139c16900008bb3cabb73ea0fad"}
Mar 08 00:51:24.935710 master-0 kubenswrapper[23041]: I0308 00:51:24.935620 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-8nzj6"
Mar 08 00:51:24.943328 master-0 kubenswrapper[23041]: I0308 00:51:24.943281 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"d7bd54f8-222a-4144-bfbd-629ba5ad7916","Type":"ContainerStarted","Data":"509756ad0b321e796d331e4795e76237b98871dd7e7f12dad471dcf5fcab0ce4"}
Mar 08 00:51:24.943597 master-0 kubenswrapper[23041]: I0308 00:51:24.943525 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Mar 08 00:51:24.946811 master-0 kubenswrapper[23041]: I0308 00:51:24.946760 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"be385568-7616-43bb-b89a-6478d7c995a4","Type":"ContainerStarted","Data":"36f01162b6315f08e2ba27e597bd365a97ee22546a011cb48b20a40a42c18eaf"}
Mar 08 00:51:24.952287 master-0 kubenswrapper[23041]: I0308 00:51:24.951985 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d63ce8bf-b1c8-44a8-92f2-298f046dc138","Type":"ContainerStarted","Data":"f0c47b170e74995a968b22646eac77ccb435cd019d758421001c12b127fc7f96"}
Mar 08 00:51:24.961264 master-0 kubenswrapper[23041]: I0308 00:51:24.960847 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-8nzj6" podStartSLOduration=12.766916075 podStartE2EDuration="21.960826008s" podCreationTimestamp="2026-03-08 00:51:03 +0000 UTC" firstStartedPulling="2026-03-08 00:51:14.382681658 +0000 UTC m=+1179.855518212" lastFinishedPulling="2026-03-08 00:51:23.576591591 +0000 UTC m=+1189.049428145" observedRunningTime="2026-03-08 00:51:24.960177063 +0000 UTC m=+1190.433013617" watchObservedRunningTime="2026-03-08 00:51:24.960826008 +0000 UTC m=+1190.433662562"
Mar 08 00:51:24.965536 master-0 kubenswrapper[23041]: I0308 00:51:24.965488 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b2a4deb4-466b-499d-a4c5-227ae1726fc9","Type":"ContainerStarted","Data":"1d5ca774c1202e929b763c30656834b22e983a8ab9418b7749d198b4840db018"}
Mar 08 00:51:24.976223 master-0 kubenswrapper[23041]: I0308 00:51:24.974366 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6779d95cff-xxcrz" podUID="3de682e4-56f1-4c92-870b-795df978e02a" containerName="dnsmasq-dns" containerID="cri-o://c6f7fde292134214998ad98e3312d104ed5f475c30933d22d6c0de4b27eb5b69" gracePeriod=10
Mar 08 00:51:24.976223 master-0 kubenswrapper[23041]: I0308 00:51:24.974472 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2cgqz" event={"ID":"465cdd3c-22eb-48eb-9d84-be612c4d7f7d","Type":"ContainerStarted","Data":"3b58efb0accb35f646da31f1a88629f3b3d4637c959d1d77a36a723f8cc30bb2"}
Mar 08 00:51:25.049716 master-0 kubenswrapper[23041]: I0308 00:51:25.049625 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=16.616186395 podStartE2EDuration="26.049607176s" podCreationTimestamp="2026-03-08 00:50:59 +0000 UTC" firstStartedPulling="2026-03-08 00:51:14.129086456 +0000 UTC m=+1179.601923010" lastFinishedPulling="2026-03-08 00:51:23.562507237 +0000 UTC m=+1189.035343791" observedRunningTime="2026-03-08 00:51:25.015949304 +0000 UTC m=+1190.488785858" watchObservedRunningTime="2026-03-08 00:51:25.049607176 +0000 UTC m=+1190.522443730"
Mar 08 00:51:25.749623 master-0 kubenswrapper[23041]: I0308 00:51:25.748437 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6779d95cff-xxcrz"
Mar 08 00:51:25.869227 master-0 kubenswrapper[23041]: I0308 00:51:25.868140 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3de682e4-56f1-4c92-870b-795df978e02a-config\") pod \"3de682e4-56f1-4c92-870b-795df978e02a\" (UID: \"3de682e4-56f1-4c92-870b-795df978e02a\") "
Mar 08 00:51:25.869227 master-0 kubenswrapper[23041]: I0308 00:51:25.868621 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3de682e4-56f1-4c92-870b-795df978e02a-dns-svc\") pod \"3de682e4-56f1-4c92-870b-795df978e02a\" (UID: \"3de682e4-56f1-4c92-870b-795df978e02a\") "
Mar 08 00:51:25.869227 master-0 kubenswrapper[23041]: I0308 00:51:25.868714 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zg78\" (UniqueName: \"kubernetes.io/projected/3de682e4-56f1-4c92-870b-795df978e02a-kube-api-access-8zg78\") pod \"3de682e4-56f1-4c92-870b-795df978e02a\" (UID: \"3de682e4-56f1-4c92-870b-795df978e02a\") "
Mar 08 00:51:25.873532 master-0 kubenswrapper[23041]: I0308 00:51:25.871915 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3de682e4-56f1-4c92-870b-795df978e02a-kube-api-access-8zg78" (OuterVolumeSpecName: "kube-api-access-8zg78") pod "3de682e4-56f1-4c92-870b-795df978e02a" (UID: "3de682e4-56f1-4c92-870b-795df978e02a"). InnerVolumeSpecName "kube-api-access-8zg78". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:51:25.907738 master-0 kubenswrapper[23041]: I0308 00:51:25.907614 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3de682e4-56f1-4c92-870b-795df978e02a-config" (OuterVolumeSpecName: "config") pod "3de682e4-56f1-4c92-870b-795df978e02a" (UID: "3de682e4-56f1-4c92-870b-795df978e02a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:51:25.935180 master-0 kubenswrapper[23041]: I0308 00:51:25.935122 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3de682e4-56f1-4c92-870b-795df978e02a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3de682e4-56f1-4c92-870b-795df978e02a" (UID: "3de682e4-56f1-4c92-870b-795df978e02a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:51:25.972796 master-0 kubenswrapper[23041]: I0308 00:51:25.972628 23041 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3de682e4-56f1-4c92-870b-795df978e02a-config\") on node \"master-0\" DevicePath \"\""
Mar 08 00:51:25.972796 master-0 kubenswrapper[23041]: I0308 00:51:25.972682 23041 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3de682e4-56f1-4c92-870b-795df978e02a-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 08 00:51:25.972796 master-0 kubenswrapper[23041]: I0308 00:51:25.972697 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zg78\" (UniqueName: \"kubernetes.io/projected/3de682e4-56f1-4c92-870b-795df978e02a-kube-api-access-8zg78\") on node \"master-0\" DevicePath \"\""
Mar 08 00:51:25.990168 master-0 kubenswrapper[23041]: I0308 00:51:25.990080 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0"
event={"ID":"973c5591-ef0e-4a00-9107-bf5c09b1782d","Type":"ContainerStarted","Data":"7f556a88094c105c61e23407179be8573353dc0602f0f7583b69c9ecf703b324"} Mar 08 00:51:25.993332 master-0 kubenswrapper[23041]: I0308 00:51:25.993287 23041 generic.go:334] "Generic (PLEG): container finished" podID="465cdd3c-22eb-48eb-9d84-be612c4d7f7d" containerID="3b58efb0accb35f646da31f1a88629f3b3d4637c959d1d77a36a723f8cc30bb2" exitCode=0 Mar 08 00:51:25.993526 master-0 kubenswrapper[23041]: I0308 00:51:25.993365 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2cgqz" event={"ID":"465cdd3c-22eb-48eb-9d84-be612c4d7f7d","Type":"ContainerDied","Data":"3b58efb0accb35f646da31f1a88629f3b3d4637c959d1d77a36a723f8cc30bb2"} Mar 08 00:51:25.997122 master-0 kubenswrapper[23041]: I0308 00:51:25.997050 23041 generic.go:334] "Generic (PLEG): container finished" podID="3de682e4-56f1-4c92-870b-795df978e02a" containerID="c6f7fde292134214998ad98e3312d104ed5f475c30933d22d6c0de4b27eb5b69" exitCode=0 Mar 08 00:51:25.997381 master-0 kubenswrapper[23041]: I0308 00:51:25.997120 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6779d95cff-xxcrz" event={"ID":"3de682e4-56f1-4c92-870b-795df978e02a","Type":"ContainerDied","Data":"c6f7fde292134214998ad98e3312d104ed5f475c30933d22d6c0de4b27eb5b69"} Mar 08 00:51:25.997381 master-0 kubenswrapper[23041]: I0308 00:51:25.997145 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6779d95cff-xxcrz" event={"ID":"3de682e4-56f1-4c92-870b-795df978e02a","Type":"ContainerDied","Data":"758161ac7ced28ce447e5853dce7077d40cbdbb06265d411f049d2f17ae29ee5"} Mar 08 00:51:25.997381 master-0 kubenswrapper[23041]: I0308 00:51:25.997163 23041 scope.go:117] "RemoveContainer" containerID="c6f7fde292134214998ad98e3312d104ed5f475c30933d22d6c0de4b27eb5b69" Mar 08 00:51:25.997381 master-0 kubenswrapper[23041]: I0308 00:51:25.997284 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6779d95cff-xxcrz" Mar 08 00:51:26.004082 master-0 kubenswrapper[23041]: I0308 00:51:26.003977 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e610ec98-66ae-412c-bab9-fab6413ef654","Type":"ContainerStarted","Data":"fa9227ea2884fa39ea153bc0c495216a912d5ade6b4f4c415d2cf56ba2bbbe26"} Mar 08 00:51:26.025395 master-0 kubenswrapper[23041]: I0308 00:51:26.025345 23041 scope.go:117] "RemoveContainer" containerID="c431462b70f7d269296ff672a8cfec4d36730a166f59986e03bfbf4861766b1f" Mar 08 00:51:26.099158 master-0 kubenswrapper[23041]: I0308 00:51:26.098816 23041 scope.go:117] "RemoveContainer" containerID="c6f7fde292134214998ad98e3312d104ed5f475c30933d22d6c0de4b27eb5b69" Mar 08 00:51:26.100041 master-0 kubenswrapper[23041]: E0308 00:51:26.099899 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6f7fde292134214998ad98e3312d104ed5f475c30933d22d6c0de4b27eb5b69\": container with ID starting with c6f7fde292134214998ad98e3312d104ed5f475c30933d22d6c0de4b27eb5b69 not found: ID does not exist" containerID="c6f7fde292134214998ad98e3312d104ed5f475c30933d22d6c0de4b27eb5b69" Mar 08 00:51:26.100041 master-0 kubenswrapper[23041]: I0308 00:51:26.099967 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6f7fde292134214998ad98e3312d104ed5f475c30933d22d6c0de4b27eb5b69"} err="failed to get container status \"c6f7fde292134214998ad98e3312d104ed5f475c30933d22d6c0de4b27eb5b69\": rpc error: code = NotFound desc = could not find container \"c6f7fde292134214998ad98e3312d104ed5f475c30933d22d6c0de4b27eb5b69\": container with ID starting with c6f7fde292134214998ad98e3312d104ed5f475c30933d22d6c0de4b27eb5b69 not found: ID does not exist" Mar 08 00:51:26.100041 master-0 kubenswrapper[23041]: I0308 00:51:26.100002 23041 scope.go:117] "RemoveContainer" 
containerID="c431462b70f7d269296ff672a8cfec4d36730a166f59986e03bfbf4861766b1f" Mar 08 00:51:26.101038 master-0 kubenswrapper[23041]: E0308 00:51:26.100627 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c431462b70f7d269296ff672a8cfec4d36730a166f59986e03bfbf4861766b1f\": container with ID starting with c431462b70f7d269296ff672a8cfec4d36730a166f59986e03bfbf4861766b1f not found: ID does not exist" containerID="c431462b70f7d269296ff672a8cfec4d36730a166f59986e03bfbf4861766b1f" Mar 08 00:51:26.101038 master-0 kubenswrapper[23041]: I0308 00:51:26.100650 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c431462b70f7d269296ff672a8cfec4d36730a166f59986e03bfbf4861766b1f"} err="failed to get container status \"c431462b70f7d269296ff672a8cfec4d36730a166f59986e03bfbf4861766b1f\": rpc error: code = NotFound desc = could not find container \"c431462b70f7d269296ff672a8cfec4d36730a166f59986e03bfbf4861766b1f\": container with ID starting with c431462b70f7d269296ff672a8cfec4d36730a166f59986e03bfbf4861766b1f not found: ID does not exist" Mar 08 00:51:26.182802 master-0 kubenswrapper[23041]: I0308 00:51:26.182588 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6779d95cff-xxcrz"] Mar 08 00:51:26.206383 master-0 kubenswrapper[23041]: I0308 00:51:26.206270 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6779d95cff-xxcrz"] Mar 08 00:51:26.831468 master-0 kubenswrapper[23041]: I0308 00:51:26.831339 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3de682e4-56f1-4c92-870b-795df978e02a" path="/var/lib/kubelet/pods/3de682e4-56f1-4c92-870b-795df978e02a/volumes" Mar 08 00:51:27.011371 master-0 kubenswrapper[23041]: I0308 00:51:27.011324 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2cgqz" 
event={"ID":"465cdd3c-22eb-48eb-9d84-be612c4d7f7d","Type":"ContainerStarted","Data":"2f822476a7414abf5e01c46c430e003acfa5846e08bf43714cb1dea5eca82201"} Mar 08 00:51:28.031879 master-0 kubenswrapper[23041]: I0308 00:51:28.031696 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2cgqz" event={"ID":"465cdd3c-22eb-48eb-9d84-be612c4d7f7d","Type":"ContainerStarted","Data":"9da651aa61ea494f8001515920138cc8beef3560c68682c9b337efff03c10755"} Mar 08 00:51:28.032544 master-0 kubenswrapper[23041]: I0308 00:51:28.032433 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-2cgqz" Mar 08 00:51:28.032544 master-0 kubenswrapper[23041]: I0308 00:51:28.032533 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-2cgqz" Mar 08 00:51:28.102312 master-0 kubenswrapper[23041]: I0308 00:51:28.098492 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-2cgqz" podStartSLOduration=16.186438488 podStartE2EDuration="25.096242264s" podCreationTimestamp="2026-03-08 00:51:03 +0000 UTC" firstStartedPulling="2026-03-08 00:51:14.634315622 +0000 UTC m=+1180.107152176" lastFinishedPulling="2026-03-08 00:51:23.544119398 +0000 UTC m=+1189.016955952" observedRunningTime="2026-03-08 00:51:28.092612105 +0000 UTC m=+1193.565448659" watchObservedRunningTime="2026-03-08 00:51:28.096242264 +0000 UTC m=+1193.569078828" Mar 08 00:51:30.010120 master-0 kubenswrapper[23041]: I0308 00:51:30.010040 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 08 00:51:31.077739 master-0 kubenswrapper[23041]: I0308 00:51:31.077635 23041 generic.go:334] "Generic (PLEG): container finished" podID="d63ce8bf-b1c8-44a8-92f2-298f046dc138" containerID="f0c47b170e74995a968b22646eac77ccb435cd019d758421001c12b127fc7f96" exitCode=0 Mar 08 00:51:31.078442 master-0 kubenswrapper[23041]: I0308 
00:51:31.077730 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d63ce8bf-b1c8-44a8-92f2-298f046dc138","Type":"ContainerDied","Data":"f0c47b170e74995a968b22646eac77ccb435cd019d758421001c12b127fc7f96"} Mar 08 00:51:31.081172 master-0 kubenswrapper[23041]: I0308 00:51:31.081115 23041 generic.go:334] "Generic (PLEG): container finished" podID="b2a4deb4-466b-499d-a4c5-227ae1726fc9" containerID="1d5ca774c1202e929b763c30656834b22e983a8ab9418b7749d198b4840db018" exitCode=0 Mar 08 00:51:31.081336 master-0 kubenswrapper[23041]: I0308 00:51:31.081193 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b2a4deb4-466b-499d-a4c5-227ae1726fc9","Type":"ContainerDied","Data":"1d5ca774c1202e929b763c30656834b22e983a8ab9418b7749d198b4840db018"} Mar 08 00:51:32.341956 master-0 kubenswrapper[23041]: I0308 00:51:32.340260 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-998757459-wdrgr"] Mar 08 00:51:32.341956 master-0 kubenswrapper[23041]: E0308 00:51:32.340739 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cd4b2be-f3d5-4eec-8061-2055f3c4b001" containerName="init" Mar 08 00:51:32.341956 master-0 kubenswrapper[23041]: I0308 00:51:32.340753 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cd4b2be-f3d5-4eec-8061-2055f3c4b001" containerName="init" Mar 08 00:51:32.341956 master-0 kubenswrapper[23041]: E0308 00:51:32.340782 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3ab1df0-49b3-4091-89df-9a708f833de7" containerName="init" Mar 08 00:51:32.341956 master-0 kubenswrapper[23041]: I0308 00:51:32.340791 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3ab1df0-49b3-4091-89df-9a708f833de7" containerName="init" Mar 08 00:51:32.341956 master-0 kubenswrapper[23041]: E0308 00:51:32.340806 23041 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3de682e4-56f1-4c92-870b-795df978e02a" containerName="dnsmasq-dns" Mar 08 00:51:32.341956 master-0 kubenswrapper[23041]: I0308 00:51:32.340812 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="3de682e4-56f1-4c92-870b-795df978e02a" containerName="dnsmasq-dns" Mar 08 00:51:32.341956 master-0 kubenswrapper[23041]: E0308 00:51:32.340859 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3de682e4-56f1-4c92-870b-795df978e02a" containerName="init" Mar 08 00:51:32.341956 master-0 kubenswrapper[23041]: I0308 00:51:32.340866 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="3de682e4-56f1-4c92-870b-795df978e02a" containerName="init" Mar 08 00:51:32.341956 master-0 kubenswrapper[23041]: I0308 00:51:32.341070 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cd4b2be-f3d5-4eec-8061-2055f3c4b001" containerName="init" Mar 08 00:51:32.341956 master-0 kubenswrapper[23041]: I0308 00:51:32.341093 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3ab1df0-49b3-4091-89df-9a708f833de7" containerName="init" Mar 08 00:51:32.341956 master-0 kubenswrapper[23041]: I0308 00:51:32.341102 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="3de682e4-56f1-4c92-870b-795df978e02a" containerName="dnsmasq-dns" Mar 08 00:51:32.343472 master-0 kubenswrapper[23041]: I0308 00:51:32.342174 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-998757459-wdrgr" Mar 08 00:51:32.369304 master-0 kubenswrapper[23041]: I0308 00:51:32.368464 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-998757459-wdrgr"] Mar 08 00:51:32.460763 master-0 kubenswrapper[23041]: I0308 00:51:32.460668 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8efffbf6-e0e9-4591-a140-e95ecb667ea0-dns-svc\") pod \"dnsmasq-dns-998757459-wdrgr\" (UID: \"8efffbf6-e0e9-4591-a140-e95ecb667ea0\") " pod="openstack/dnsmasq-dns-998757459-wdrgr" Mar 08 00:51:32.461264 master-0 kubenswrapper[23041]: I0308 00:51:32.460803 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8efffbf6-e0e9-4591-a140-e95ecb667ea0-config\") pod \"dnsmasq-dns-998757459-wdrgr\" (UID: \"8efffbf6-e0e9-4591-a140-e95ecb667ea0\") " pod="openstack/dnsmasq-dns-998757459-wdrgr" Mar 08 00:51:32.461264 master-0 kubenswrapper[23041]: I0308 00:51:32.460859 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjdwb\" (UniqueName: \"kubernetes.io/projected/8efffbf6-e0e9-4591-a140-e95ecb667ea0-kube-api-access-vjdwb\") pod \"dnsmasq-dns-998757459-wdrgr\" (UID: \"8efffbf6-e0e9-4591-a140-e95ecb667ea0\") " pod="openstack/dnsmasq-dns-998757459-wdrgr" Mar 08 00:51:32.562443 master-0 kubenswrapper[23041]: I0308 00:51:32.562382 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8efffbf6-e0e9-4591-a140-e95ecb667ea0-config\") pod \"dnsmasq-dns-998757459-wdrgr\" (UID: \"8efffbf6-e0e9-4591-a140-e95ecb667ea0\") " pod="openstack/dnsmasq-dns-998757459-wdrgr" Mar 08 00:51:32.562443 master-0 kubenswrapper[23041]: I0308 00:51:32.562439 23041 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vjdwb\" (UniqueName: \"kubernetes.io/projected/8efffbf6-e0e9-4591-a140-e95ecb667ea0-kube-api-access-vjdwb\") pod \"dnsmasq-dns-998757459-wdrgr\" (UID: \"8efffbf6-e0e9-4591-a140-e95ecb667ea0\") " pod="openstack/dnsmasq-dns-998757459-wdrgr" Mar 08 00:51:32.562690 master-0 kubenswrapper[23041]: I0308 00:51:32.562576 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8efffbf6-e0e9-4591-a140-e95ecb667ea0-dns-svc\") pod \"dnsmasq-dns-998757459-wdrgr\" (UID: \"8efffbf6-e0e9-4591-a140-e95ecb667ea0\") " pod="openstack/dnsmasq-dns-998757459-wdrgr" Mar 08 00:51:32.563818 master-0 kubenswrapper[23041]: I0308 00:51:32.563409 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8efffbf6-e0e9-4591-a140-e95ecb667ea0-dns-svc\") pod \"dnsmasq-dns-998757459-wdrgr\" (UID: \"8efffbf6-e0e9-4591-a140-e95ecb667ea0\") " pod="openstack/dnsmasq-dns-998757459-wdrgr" Mar 08 00:51:32.563963 master-0 kubenswrapper[23041]: I0308 00:51:32.563941 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8efffbf6-e0e9-4591-a140-e95ecb667ea0-config\") pod \"dnsmasq-dns-998757459-wdrgr\" (UID: \"8efffbf6-e0e9-4591-a140-e95ecb667ea0\") " pod="openstack/dnsmasq-dns-998757459-wdrgr" Mar 08 00:51:32.584059 master-0 kubenswrapper[23041]: I0308 00:51:32.584010 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjdwb\" (UniqueName: \"kubernetes.io/projected/8efffbf6-e0e9-4591-a140-e95ecb667ea0-kube-api-access-vjdwb\") pod \"dnsmasq-dns-998757459-wdrgr\" (UID: \"8efffbf6-e0e9-4591-a140-e95ecb667ea0\") " pod="openstack/dnsmasq-dns-998757459-wdrgr" Mar 08 00:51:32.737220 master-0 kubenswrapper[23041]: I0308 00:51:32.736883 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-998757459-wdrgr" Mar 08 00:51:33.140286 master-0 kubenswrapper[23041]: I0308 00:51:33.139697 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d7e56cd4-d944-40ae-b1e9-2ca3551c93f4","Type":"ContainerStarted","Data":"780fe29cd25e6a27039c12903b622acdf73ec41c288adcf60dbbe53aa435b1fb"} Mar 08 00:51:33.145711 master-0 kubenswrapper[23041]: I0308 00:51:33.143551 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"be385568-7616-43bb-b89a-6478d7c995a4","Type":"ContainerStarted","Data":"7756e3c34a45623339259dd07fe6aeb46aaaa1cbb82681083fd07e48c11dcbb0"} Mar 08 00:51:33.149769 master-0 kubenswrapper[23041]: I0308 00:51:33.149712 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d63ce8bf-b1c8-44a8-92f2-298f046dc138","Type":"ContainerStarted","Data":"8230500643f71581870cf1f7ac810a978e8fa10463bdf1e903d1db703d10d98e"} Mar 08 00:51:33.159683 master-0 kubenswrapper[23041]: I0308 00:51:33.159620 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b2a4deb4-466b-499d-a4c5-227ae1726fc9","Type":"ContainerStarted","Data":"8b9956cf197903e56ef2d9c1b87994e72dde5a254c05018bbab80c5eac5ecd10"} Mar 08 00:51:33.178943 master-0 kubenswrapper[23041]: I0308 00:51:33.177502 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=9.516618427000001 podStartE2EDuration="27.177479341s" podCreationTimestamp="2026-03-08 00:51:06 +0000 UTC" firstStartedPulling="2026-03-08 00:51:14.296929505 +0000 UTC m=+1179.769766059" lastFinishedPulling="2026-03-08 00:51:31.957790419 +0000 UTC m=+1197.430626973" observedRunningTime="2026-03-08 00:51:33.171269899 +0000 UTC m=+1198.644106453" watchObservedRunningTime="2026-03-08 00:51:33.177479341 +0000 UTC m=+1198.650315895" Mar 08 
00:51:33.207013 master-0 kubenswrapper[23041]: I0308 00:51:33.206928 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=26.213298467 podStartE2EDuration="35.206905719s" podCreationTimestamp="2026-03-08 00:50:58 +0000 UTC" firstStartedPulling="2026-03-08 00:51:14.559926636 +0000 UTC m=+1180.032763190" lastFinishedPulling="2026-03-08 00:51:23.553533888 +0000 UTC m=+1189.026370442" observedRunningTime="2026-03-08 00:51:33.198697259 +0000 UTC m=+1198.671533813" watchObservedRunningTime="2026-03-08 00:51:33.206905719 +0000 UTC m=+1198.679742273" Mar 08 00:51:33.222048 master-0 kubenswrapper[23041]: W0308 00:51:33.221990 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8efffbf6_e0e9_4591_a140_e95ecb667ea0.slice/crio-7a48ee10f77d39a03ce68cba39be491359f393bd62b7c38bb5e983f8c2f7cfb0 WatchSource:0}: Error finding container 7a48ee10f77d39a03ce68cba39be491359f393bd62b7c38bb5e983f8c2f7cfb0: Status 404 returned error can't find the container with id 7a48ee10f77d39a03ce68cba39be491359f393bd62b7c38bb5e983f8c2f7cfb0 Mar 08 00:51:33.223002 master-0 kubenswrapper[23041]: I0308 00:51:33.222651 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-998757459-wdrgr"] Mar 08 00:51:33.237053 master-0 kubenswrapper[23041]: I0308 00:51:33.236706 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=11.572462026 podStartE2EDuration="28.236685526s" podCreationTimestamp="2026-03-08 00:51:05 +0000 UTC" firstStartedPulling="2026-03-08 00:51:15.279510176 +0000 UTC m=+1180.752346730" lastFinishedPulling="2026-03-08 00:51:31.943733676 +0000 UTC m=+1197.416570230" observedRunningTime="2026-03-08 00:51:33.224092639 +0000 UTC m=+1198.696929203" watchObservedRunningTime="2026-03-08 00:51:33.236685526 +0000 UTC m=+1198.709522080" Mar 08 
00:51:33.256925 master-0 kubenswrapper[23041]: I0308 00:51:33.256047 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=30.189495566 podStartE2EDuration="39.256024828s" podCreationTimestamp="2026-03-08 00:50:54 +0000 UTC" firstStartedPulling="2026-03-08 00:51:14.544279384 +0000 UTC m=+1180.017115938" lastFinishedPulling="2026-03-08 00:51:23.610808646 +0000 UTC m=+1189.083645200" observedRunningTime="2026-03-08 00:51:33.24747996 +0000 UTC m=+1198.720316514" watchObservedRunningTime="2026-03-08 00:51:33.256024828 +0000 UTC m=+1198.728861382" Mar 08 00:51:33.709831 master-0 kubenswrapper[23041]: I0308 00:51:33.709693 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 08 00:51:33.756728 master-0 kubenswrapper[23041]: I0308 00:51:33.756675 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 08 00:51:34.171262 master-0 kubenswrapper[23041]: I0308 00:51:34.171188 23041 generic.go:334] "Generic (PLEG): container finished" podID="8efffbf6-e0e9-4591-a140-e95ecb667ea0" containerID="97c4b634f6daab3f1bf90f6e274a93af6f6d56694917d892a308cf1b487fd441" exitCode=0 Mar 08 00:51:34.171262 master-0 kubenswrapper[23041]: I0308 00:51:34.171234 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-998757459-wdrgr" event={"ID":"8efffbf6-e0e9-4591-a140-e95ecb667ea0","Type":"ContainerDied","Data":"97c4b634f6daab3f1bf90f6e274a93af6f6d56694917d892a308cf1b487fd441"} Mar 08 00:51:34.171572 master-0 kubenswrapper[23041]: I0308 00:51:34.171293 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-998757459-wdrgr" event={"ID":"8efffbf6-e0e9-4591-a140-e95ecb667ea0","Type":"ContainerStarted","Data":"7a48ee10f77d39a03ce68cba39be491359f393bd62b7c38bb5e983f8c2f7cfb0"} Mar 08 00:51:34.172221 master-0 kubenswrapper[23041]: I0308 00:51:34.171913 23041 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 08 00:51:34.300224 master-0 kubenswrapper[23041]: I0308 00:51:34.293902 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 08 00:51:34.337227 master-0 kubenswrapper[23041]: I0308 00:51:34.331602 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 08 00:51:34.337227 master-0 kubenswrapper[23041]: I0308 00:51:34.331772 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 08 00:51:34.379818 master-0 kubenswrapper[23041]: I0308 00:51:34.369449 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 08 00:51:34.379818 master-0 kubenswrapper[23041]: I0308 00:51:34.369650 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 08 00:51:34.379818 master-0 kubenswrapper[23041]: I0308 00:51:34.369759 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 08 00:51:34.485224 master-0 kubenswrapper[23041]: I0308 00:51:34.474345 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/724cd646-a717-4778-82a6-9471c70e13c5-lock\") pod \"swift-storage-0\" (UID: \"724cd646-a717-4778-82a6-9471c70e13c5\") " pod="openstack/swift-storage-0" Mar 08 00:51:34.485224 master-0 kubenswrapper[23041]: I0308 00:51:34.474454 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b569bfd0-b2b4-45a5-9a8b-96201e72fc57\" (UniqueName: \"kubernetes.io/csi/topolvm.io^465a85d8-7fdb-4f04-acda-9637f8941b8b\") pod \"swift-storage-0\" (UID: \"724cd646-a717-4778-82a6-9471c70e13c5\") " pod="openstack/swift-storage-0" Mar 08 00:51:34.485224 master-0 kubenswrapper[23041]: I0308 
00:51:34.474477 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/724cd646-a717-4778-82a6-9471c70e13c5-cache\") pod \"swift-storage-0\" (UID: \"724cd646-a717-4778-82a6-9471c70e13c5\") " pod="openstack/swift-storage-0" Mar 08 00:51:34.485224 master-0 kubenswrapper[23041]: I0308 00:51:34.474500 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/724cd646-a717-4778-82a6-9471c70e13c5-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"724cd646-a717-4778-82a6-9471c70e13c5\") " pod="openstack/swift-storage-0" Mar 08 00:51:34.485224 master-0 kubenswrapper[23041]: I0308 00:51:34.474546 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/724cd646-a717-4778-82a6-9471c70e13c5-etc-swift\") pod \"swift-storage-0\" (UID: \"724cd646-a717-4778-82a6-9471c70e13c5\") " pod="openstack/swift-storage-0" Mar 08 00:51:34.485224 master-0 kubenswrapper[23041]: I0308 00:51:34.474571 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jchbh\" (UniqueName: \"kubernetes.io/projected/724cd646-a717-4778-82a6-9471c70e13c5-kube-api-access-jchbh\") pod \"swift-storage-0\" (UID: \"724cd646-a717-4778-82a6-9471c70e13c5\") " pod="openstack/swift-storage-0" Mar 08 00:51:34.586861 master-0 kubenswrapper[23041]: I0308 00:51:34.584924 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/724cd646-a717-4778-82a6-9471c70e13c5-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"724cd646-a717-4778-82a6-9471c70e13c5\") " pod="openstack/swift-storage-0" Mar 08 00:51:34.586861 master-0 kubenswrapper[23041]: I0308 00:51:34.585025 23041 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/724cd646-a717-4778-82a6-9471c70e13c5-etc-swift\") pod \"swift-storage-0\" (UID: \"724cd646-a717-4778-82a6-9471c70e13c5\") " pod="openstack/swift-storage-0"
Mar 08 00:51:34.586861 master-0 kubenswrapper[23041]: I0308 00:51:34.585058 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jchbh\" (UniqueName: \"kubernetes.io/projected/724cd646-a717-4778-82a6-9471c70e13c5-kube-api-access-jchbh\") pod \"swift-storage-0\" (UID: \"724cd646-a717-4778-82a6-9471c70e13c5\") " pod="openstack/swift-storage-0"
Mar 08 00:51:34.586861 master-0 kubenswrapper[23041]: I0308 00:51:34.585155 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/724cd646-a717-4778-82a6-9471c70e13c5-lock\") pod \"swift-storage-0\" (UID: \"724cd646-a717-4778-82a6-9471c70e13c5\") " pod="openstack/swift-storage-0"
Mar 08 00:51:34.586861 master-0 kubenswrapper[23041]: I0308 00:51:34.585241 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b569bfd0-b2b4-45a5-9a8b-96201e72fc57\" (UniqueName: \"kubernetes.io/csi/topolvm.io^465a85d8-7fdb-4f04-acda-9637f8941b8b\") pod \"swift-storage-0\" (UID: \"724cd646-a717-4778-82a6-9471c70e13c5\") " pod="openstack/swift-storage-0"
Mar 08 00:51:34.586861 master-0 kubenswrapper[23041]: I0308 00:51:34.585259 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/724cd646-a717-4778-82a6-9471c70e13c5-cache\") pod \"swift-storage-0\" (UID: \"724cd646-a717-4778-82a6-9471c70e13c5\") " pod="openstack/swift-storage-0"
Mar 08 00:51:34.586861 master-0 kubenswrapper[23041]: E0308 00:51:34.586851 23041 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 08 00:51:34.586861 master-0 kubenswrapper[23041]: E0308 00:51:34.586876 23041 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 08 00:51:34.587495 master-0 kubenswrapper[23041]: E0308 00:51:34.586947 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/724cd646-a717-4778-82a6-9471c70e13c5-etc-swift podName:724cd646-a717-4778-82a6-9471c70e13c5 nodeName:}" failed. No retries permitted until 2026-03-08 00:51:35.086923195 +0000 UTC m=+1200.559759759 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/724cd646-a717-4778-82a6-9471c70e13c5-etc-swift") pod "swift-storage-0" (UID: "724cd646-a717-4778-82a6-9471c70e13c5") : configmap "swift-ring-files" not found
Mar 08 00:51:34.591242 master-0 kubenswrapper[23041]: I0308 00:51:34.587660 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/724cd646-a717-4778-82a6-9471c70e13c5-cache\") pod \"swift-storage-0\" (UID: \"724cd646-a717-4778-82a6-9471c70e13c5\") " pod="openstack/swift-storage-0"
Mar 08 00:51:34.591242 master-0 kubenswrapper[23041]: I0308 00:51:34.588792 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/724cd646-a717-4778-82a6-9471c70e13c5-lock\") pod \"swift-storage-0\" (UID: \"724cd646-a717-4778-82a6-9471c70e13c5\") " pod="openstack/swift-storage-0"
Mar 08 00:51:34.591242 master-0 kubenswrapper[23041]: I0308 00:51:34.588821 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Mar 08 00:51:34.591242 master-0 kubenswrapper[23041]: I0308 00:51:34.589588 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/724cd646-a717-4778-82a6-9471c70e13c5-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"724cd646-a717-4778-82a6-9471c70e13c5\") " pod="openstack/swift-storage-0"
Mar 08 00:51:34.591814 master-0 kubenswrapper[23041]: I0308 00:51:34.591545 23041 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 08 00:51:34.591814 master-0 kubenswrapper[23041]: I0308 00:51:34.591573 23041 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b569bfd0-b2b4-45a5-9a8b-96201e72fc57\" (UniqueName: \"kubernetes.io/csi/topolvm.io^465a85d8-7fdb-4f04-acda-9637f8941b8b\") pod \"swift-storage-0\" (UID: \"724cd646-a717-4778-82a6-9471c70e13c5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/af24eabedf5a88ab4373c7db9e85275354d68bcbe3c18478862dec28ea2a7aae/globalmount\"" pod="openstack/swift-storage-0"
Mar 08 00:51:34.678293 master-0 kubenswrapper[23041]: I0308 00:51:34.678217 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jchbh\" (UniqueName: \"kubernetes.io/projected/724cd646-a717-4778-82a6-9471c70e13c5-kube-api-access-jchbh\") pod \"swift-storage-0\" (UID: \"724cd646-a717-4778-82a6-9471c70e13c5\") " pod="openstack/swift-storage-0"
Mar 08 00:51:34.927030 master-0 kubenswrapper[23041]: I0308 00:51:34.926907 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Mar 08 00:51:34.994331 master-0 kubenswrapper[23041]: I0308 00:51:34.989410 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Mar 08 00:51:35.051929 master-0 kubenswrapper[23041]: I0308 00:51:35.050945 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-998757459-wdrgr"]
Mar 08 00:51:35.106870 master-0 kubenswrapper[23041]: I0308 00:51:35.106795 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/724cd646-a717-4778-82a6-9471c70e13c5-etc-swift\") pod \"swift-storage-0\" (UID: \"724cd646-a717-4778-82a6-9471c70e13c5\") " pod="openstack/swift-storage-0"
Mar 08 00:51:35.107103 master-0 kubenswrapper[23041]: E0308 00:51:35.107034 23041 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 08 00:51:35.107103 master-0 kubenswrapper[23041]: E0308 00:51:35.107051 23041 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 08 00:51:35.107103 master-0 kubenswrapper[23041]: E0308 00:51:35.107095 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/724cd646-a717-4778-82a6-9471c70e13c5-etc-swift podName:724cd646-a717-4778-82a6-9471c70e13c5 nodeName:}" failed. No retries permitted until 2026-03-08 00:51:36.107081214 +0000 UTC m=+1201.579917768 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/724cd646-a717-4778-82a6-9471c70e13c5-etc-swift") pod "swift-storage-0" (UID: "724cd646-a717-4778-82a6-9471c70e13c5") : configmap "swift-ring-files" not found
Mar 08 00:51:35.135664 master-0 kubenswrapper[23041]: I0308 00:51:35.132488 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-795f757f69-cvqcm"]
Mar 08 00:51:35.135664 master-0 kubenswrapper[23041]: I0308 00:51:35.134216 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-795f757f69-cvqcm"
Mar 08 00:51:35.144302 master-0 kubenswrapper[23041]: I0308 00:51:35.140760 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Mar 08 00:51:35.164229 master-0 kubenswrapper[23041]: I0308 00:51:35.159299 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-795f757f69-cvqcm"]
Mar 08 00:51:35.187721 master-0 kubenswrapper[23041]: I0308 00:51:35.184810 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-wtqbk"]
Mar 08 00:51:35.195219 master-0 kubenswrapper[23041]: I0308 00:51:35.194490 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-wtqbk"
Mar 08 00:51:35.203314 master-0 kubenswrapper[23041]: I0308 00:51:35.200490 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Mar 08 00:51:35.219141 master-0 kubenswrapper[23041]: I0308 00:51:35.212294 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/784b83fc-5e3f-49e4-b610-032444dec900-dns-svc\") pod \"dnsmasq-dns-795f757f69-cvqcm\" (UID: \"784b83fc-5e3f-49e4-b610-032444dec900\") " pod="openstack/dnsmasq-dns-795f757f69-cvqcm"
Mar 08 00:51:35.219141 master-0 kubenswrapper[23041]: I0308 00:51:35.216146 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-998757459-wdrgr" event={"ID":"8efffbf6-e0e9-4591-a140-e95ecb667ea0","Type":"ContainerStarted","Data":"0c37ea88548eae571547d4f35ba721b6718649d4e1bd806fe6406bcf6ab65e73"}
Mar 08 00:51:35.219141 master-0 kubenswrapper[23041]: I0308 00:51:35.216535 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-998757459-wdrgr"
Mar 08 00:51:35.219141 master-0 kubenswrapper[23041]: I0308 00:51:35.216827 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Mar 08 00:51:35.245226 master-0 kubenswrapper[23041]: I0308 00:51:35.219405 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/784b83fc-5e3f-49e4-b610-032444dec900-ovsdbserver-sb\") pod \"dnsmasq-dns-795f757f69-cvqcm\" (UID: \"784b83fc-5e3f-49e4-b610-032444dec900\") " pod="openstack/dnsmasq-dns-795f757f69-cvqcm"
Mar 08 00:51:35.245226 master-0 kubenswrapper[23041]: I0308 00:51:35.219516 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqgn7\" (UniqueName: \"kubernetes.io/projected/784b83fc-5e3f-49e4-b610-032444dec900-kube-api-access-jqgn7\") pod \"dnsmasq-dns-795f757f69-cvqcm\" (UID: \"784b83fc-5e3f-49e4-b610-032444dec900\") " pod="openstack/dnsmasq-dns-795f757f69-cvqcm"
Mar 08 00:51:35.245226 master-0 kubenswrapper[23041]: I0308 00:51:35.219585 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/784b83fc-5e3f-49e4-b610-032444dec900-config\") pod \"dnsmasq-dns-795f757f69-cvqcm\" (UID: \"784b83fc-5e3f-49e4-b610-032444dec900\") " pod="openstack/dnsmasq-dns-795f757f69-cvqcm"
Mar 08 00:51:35.254216 master-0 kubenswrapper[23041]: I0308 00:51:35.249464 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-wtqbk"]
Mar 08 00:51:35.279001 master-0 kubenswrapper[23041]: I0308 00:51:35.278296 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Mar 08 00:51:35.319474 master-0 kubenswrapper[23041]: I0308 00:51:35.319380 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-998757459-wdrgr" podStartSLOduration=3.319353777 podStartE2EDuration="3.319353777s" podCreationTimestamp="2026-03-08 00:51:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:51:35.247693197 +0000 UTC m=+1200.720529751" watchObservedRunningTime="2026-03-08 00:51:35.319353777 +0000 UTC m=+1200.792190341"
Mar 08 00:51:35.325784 master-0 kubenswrapper[23041]: I0308 00:51:35.325286 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/780b711c-875f-41d0-a4e3-c1afac254d8d-config\") pod \"ovn-controller-metrics-wtqbk\" (UID: \"780b711c-875f-41d0-a4e3-c1afac254d8d\") " pod="openstack/ovn-controller-metrics-wtqbk"
Mar 08 00:51:35.325784 master-0 kubenswrapper[23041]: I0308 00:51:35.325371 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/784b83fc-5e3f-49e4-b610-032444dec900-ovsdbserver-sb\") pod \"dnsmasq-dns-795f757f69-cvqcm\" (UID: \"784b83fc-5e3f-49e4-b610-032444dec900\") " pod="openstack/dnsmasq-dns-795f757f69-cvqcm"
Mar 08 00:51:35.325784 master-0 kubenswrapper[23041]: I0308 00:51:35.325431 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8gdv\" (UniqueName: \"kubernetes.io/projected/780b711c-875f-41d0-a4e3-c1afac254d8d-kube-api-access-q8gdv\") pod \"ovn-controller-metrics-wtqbk\" (UID: \"780b711c-875f-41d0-a4e3-c1afac254d8d\") " pod="openstack/ovn-controller-metrics-wtqbk"
Mar 08 00:51:35.325784 master-0 kubenswrapper[23041]: I0308 00:51:35.325457 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/780b711c-875f-41d0-a4e3-c1afac254d8d-ovs-rundir\") pod \"ovn-controller-metrics-wtqbk\" (UID: \"780b711c-875f-41d0-a4e3-c1afac254d8d\") " pod="openstack/ovn-controller-metrics-wtqbk"
Mar 08 00:51:35.325784 master-0 kubenswrapper[23041]: I0308 00:51:35.325482 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqgn7\" (UniqueName: \"kubernetes.io/projected/784b83fc-5e3f-49e4-b610-032444dec900-kube-api-access-jqgn7\") pod \"dnsmasq-dns-795f757f69-cvqcm\" (UID: \"784b83fc-5e3f-49e4-b610-032444dec900\") " pod="openstack/dnsmasq-dns-795f757f69-cvqcm"
Mar 08 00:51:35.325784 master-0 kubenswrapper[23041]: I0308 00:51:35.325528 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/784b83fc-5e3f-49e4-b610-032444dec900-config\") pod \"dnsmasq-dns-795f757f69-cvqcm\" (UID: \"784b83fc-5e3f-49e4-b610-032444dec900\") " pod="openstack/dnsmasq-dns-795f757f69-cvqcm"
Mar 08 00:51:35.325784 master-0 kubenswrapper[23041]: I0308 00:51:35.325572 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/780b711c-875f-41d0-a4e3-c1afac254d8d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-wtqbk\" (UID: \"780b711c-875f-41d0-a4e3-c1afac254d8d\") " pod="openstack/ovn-controller-metrics-wtqbk"
Mar 08 00:51:35.325784 master-0 kubenswrapper[23041]: I0308 00:51:35.325630 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/780b711c-875f-41d0-a4e3-c1afac254d8d-combined-ca-bundle\") pod \"ovn-controller-metrics-wtqbk\" (UID: \"780b711c-875f-41d0-a4e3-c1afac254d8d\") " pod="openstack/ovn-controller-metrics-wtqbk"
Mar 08 00:51:35.325784 master-0 kubenswrapper[23041]: I0308 00:51:35.325727 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/784b83fc-5e3f-49e4-b610-032444dec900-dns-svc\") pod \"dnsmasq-dns-795f757f69-cvqcm\" (UID: \"784b83fc-5e3f-49e4-b610-032444dec900\") " pod="openstack/dnsmasq-dns-795f757f69-cvqcm"
Mar 08 00:51:35.333920 master-0 kubenswrapper[23041]: I0308 00:51:35.333701 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/784b83fc-5e3f-49e4-b610-032444dec900-config\") pod \"dnsmasq-dns-795f757f69-cvqcm\" (UID: \"784b83fc-5e3f-49e4-b610-032444dec900\") " pod="openstack/dnsmasq-dns-795f757f69-cvqcm"
Mar 08 00:51:35.334875 master-0 kubenswrapper[23041]: I0308 00:51:35.334377 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/784b83fc-5e3f-49e4-b610-032444dec900-dns-svc\") pod \"dnsmasq-dns-795f757f69-cvqcm\" (UID: \"784b83fc-5e3f-49e4-b610-032444dec900\") " pod="openstack/dnsmasq-dns-795f757f69-cvqcm"
Mar 08 00:51:35.334875 master-0 kubenswrapper[23041]: I0308 00:51:35.334642 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/784b83fc-5e3f-49e4-b610-032444dec900-ovsdbserver-sb\") pod \"dnsmasq-dns-795f757f69-cvqcm\" (UID: \"784b83fc-5e3f-49e4-b610-032444dec900\") " pod="openstack/dnsmasq-dns-795f757f69-cvqcm"
Mar 08 00:51:35.338380 master-0 kubenswrapper[23041]: I0308 00:51:35.336244 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/780b711c-875f-41d0-a4e3-c1afac254d8d-ovn-rundir\") pod \"ovn-controller-metrics-wtqbk\" (UID: \"780b711c-875f-41d0-a4e3-c1afac254d8d\") " pod="openstack/ovn-controller-metrics-wtqbk"
Mar 08 00:51:35.379857 master-0 kubenswrapper[23041]: I0308 00:51:35.379431 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqgn7\" (UniqueName: \"kubernetes.io/projected/784b83fc-5e3f-49e4-b610-032444dec900-kube-api-access-jqgn7\") pod \"dnsmasq-dns-795f757f69-cvqcm\" (UID: \"784b83fc-5e3f-49e4-b610-032444dec900\") " pod="openstack/dnsmasq-dns-795f757f69-cvqcm"
Mar 08 00:51:35.440967 master-0 kubenswrapper[23041]: I0308 00:51:35.440482 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/780b711c-875f-41d0-a4e3-c1afac254d8d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-wtqbk\" (UID: \"780b711c-875f-41d0-a4e3-c1afac254d8d\") " pod="openstack/ovn-controller-metrics-wtqbk"
Mar 08 00:51:35.440967 master-0 kubenswrapper[23041]: I0308 00:51:35.440556 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/780b711c-875f-41d0-a4e3-c1afac254d8d-combined-ca-bundle\") pod \"ovn-controller-metrics-wtqbk\" (UID: \"780b711c-875f-41d0-a4e3-c1afac254d8d\") " pod="openstack/ovn-controller-metrics-wtqbk"
Mar 08 00:51:35.440967 master-0 kubenswrapper[23041]: I0308 00:51:35.440656 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/780b711c-875f-41d0-a4e3-c1afac254d8d-ovn-rundir\") pod \"ovn-controller-metrics-wtqbk\" (UID: \"780b711c-875f-41d0-a4e3-c1afac254d8d\") " pod="openstack/ovn-controller-metrics-wtqbk"
Mar 08 00:51:35.440967 master-0 kubenswrapper[23041]: I0308 00:51:35.440702 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/780b711c-875f-41d0-a4e3-c1afac254d8d-config\") pod \"ovn-controller-metrics-wtqbk\" (UID: \"780b711c-875f-41d0-a4e3-c1afac254d8d\") " pod="openstack/ovn-controller-metrics-wtqbk"
Mar 08 00:51:35.440967 master-0 kubenswrapper[23041]: I0308 00:51:35.440747 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/780b711c-875f-41d0-a4e3-c1afac254d8d-ovs-rundir\") pod \"ovn-controller-metrics-wtqbk\" (UID: \"780b711c-875f-41d0-a4e3-c1afac254d8d\") " pod="openstack/ovn-controller-metrics-wtqbk"
Mar 08 00:51:35.440967 master-0 kubenswrapper[23041]: I0308 00:51:35.440761 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8gdv\" (UniqueName: \"kubernetes.io/projected/780b711c-875f-41d0-a4e3-c1afac254d8d-kube-api-access-q8gdv\") pod \"ovn-controller-metrics-wtqbk\" (UID: \"780b711c-875f-41d0-a4e3-c1afac254d8d\") " pod="openstack/ovn-controller-metrics-wtqbk"
Mar 08 00:51:35.443416 master-0 kubenswrapper[23041]: I0308 00:51:35.441628 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/780b711c-875f-41d0-a4e3-c1afac254d8d-ovn-rundir\") pod \"ovn-controller-metrics-wtqbk\" (UID: \"780b711c-875f-41d0-a4e3-c1afac254d8d\") " pod="openstack/ovn-controller-metrics-wtqbk"
Mar 08 00:51:35.445143 master-0 kubenswrapper[23041]: I0308 00:51:35.444436 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/780b711c-875f-41d0-a4e3-c1afac254d8d-ovs-rundir\") pod \"ovn-controller-metrics-wtqbk\" (UID: \"780b711c-875f-41d0-a4e3-c1afac254d8d\") " pod="openstack/ovn-controller-metrics-wtqbk"
Mar 08 00:51:35.450840 master-0 kubenswrapper[23041]: I0308 00:51:35.449568 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Mar 08 00:51:35.451718 master-0 kubenswrapper[23041]: I0308 00:51:35.451644 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/780b711c-875f-41d0-a4e3-c1afac254d8d-combined-ca-bundle\") pod \"ovn-controller-metrics-wtqbk\" (UID: \"780b711c-875f-41d0-a4e3-c1afac254d8d\") " pod="openstack/ovn-controller-metrics-wtqbk"
Mar 08 00:51:35.458048 master-0 kubenswrapper[23041]: I0308 00:51:35.457993 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/780b711c-875f-41d0-a4e3-c1afac254d8d-config\") pod \"ovn-controller-metrics-wtqbk\" (UID: \"780b711c-875f-41d0-a4e3-c1afac254d8d\") " pod="openstack/ovn-controller-metrics-wtqbk"
Mar 08 00:51:35.470976 master-0 kubenswrapper[23041]: I0308 00:51:35.470733 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8gdv\" (UniqueName: \"kubernetes.io/projected/780b711c-875f-41d0-a4e3-c1afac254d8d-kube-api-access-q8gdv\") pod \"ovn-controller-metrics-wtqbk\" (UID: \"780b711c-875f-41d0-a4e3-c1afac254d8d\") " pod="openstack/ovn-controller-metrics-wtqbk"
Mar 08 00:51:35.473735 master-0 kubenswrapper[23041]: I0308 00:51:35.473680 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/780b711c-875f-41d0-a4e3-c1afac254d8d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-wtqbk\" (UID: \"780b711c-875f-41d0-a4e3-c1afac254d8d\") " pod="openstack/ovn-controller-metrics-wtqbk"
Mar 08 00:51:35.500464 master-0 kubenswrapper[23041]: I0308 00:51:35.500391 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-795f757f69-cvqcm"]
Mar 08 00:51:35.501274 master-0 kubenswrapper[23041]: I0308 00:51:35.500788 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-795f757f69-cvqcm"
Mar 08 00:51:35.526781 master-0 kubenswrapper[23041]: I0308 00:51:35.523634 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-wtqbk"
Mar 08 00:51:35.566694 master-0 kubenswrapper[23041]: I0308 00:51:35.565346 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b9cd4dcf7-dmhrm"]
Mar 08 00:51:35.567027 master-0 kubenswrapper[23041]: I0308 00:51:35.566977 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b9cd4dcf7-dmhrm"
Mar 08 00:51:35.585159 master-0 kubenswrapper[23041]: I0308 00:51:35.583589 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Mar 08 00:51:35.648654 master-0 kubenswrapper[23041]: I0308 00:51:35.648576 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b9cd4dcf7-dmhrm"]
Mar 08 00:51:35.651496 master-0 kubenswrapper[23041]: I0308 00:51:35.651469 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tn2j\" (UniqueName: \"kubernetes.io/projected/0afbd7e3-8d43-4de7-8016-5183747a3db1-kube-api-access-4tn2j\") pod \"dnsmasq-dns-6b9cd4dcf7-dmhrm\" (UID: \"0afbd7e3-8d43-4de7-8016-5183747a3db1\") " pod="openstack/dnsmasq-dns-6b9cd4dcf7-dmhrm"
Mar 08 00:51:35.651676 master-0 kubenswrapper[23041]: I0308 00:51:35.651659 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0afbd7e3-8d43-4de7-8016-5183747a3db1-ovsdbserver-nb\") pod \"dnsmasq-dns-6b9cd4dcf7-dmhrm\" (UID: \"0afbd7e3-8d43-4de7-8016-5183747a3db1\") " pod="openstack/dnsmasq-dns-6b9cd4dcf7-dmhrm"
Mar 08 00:51:35.651777 master-0 kubenswrapper[23041]: I0308 00:51:35.651764 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0afbd7e3-8d43-4de7-8016-5183747a3db1-config\") pod \"dnsmasq-dns-6b9cd4dcf7-dmhrm\" (UID: \"0afbd7e3-8d43-4de7-8016-5183747a3db1\") " pod="openstack/dnsmasq-dns-6b9cd4dcf7-dmhrm"
Mar 08 00:51:35.651909 master-0 kubenswrapper[23041]: I0308 00:51:35.651874 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0afbd7e3-8d43-4de7-8016-5183747a3db1-ovsdbserver-sb\") pod \"dnsmasq-dns-6b9cd4dcf7-dmhrm\" (UID: \"0afbd7e3-8d43-4de7-8016-5183747a3db1\") " pod="openstack/dnsmasq-dns-6b9cd4dcf7-dmhrm"
Mar 08 00:51:35.651997 master-0 kubenswrapper[23041]: I0308 00:51:35.651984 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0afbd7e3-8d43-4de7-8016-5183747a3db1-dns-svc\") pod \"dnsmasq-dns-6b9cd4dcf7-dmhrm\" (UID: \"0afbd7e3-8d43-4de7-8016-5183747a3db1\") " pod="openstack/dnsmasq-dns-6b9cd4dcf7-dmhrm"
Mar 08 00:51:35.760039 master-0 kubenswrapper[23041]: I0308 00:51:35.759355 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0afbd7e3-8d43-4de7-8016-5183747a3db1-dns-svc\") pod \"dnsmasq-dns-6b9cd4dcf7-dmhrm\" (UID: \"0afbd7e3-8d43-4de7-8016-5183747a3db1\") " pod="openstack/dnsmasq-dns-6b9cd4dcf7-dmhrm"
Mar 08 00:51:35.760039 master-0 kubenswrapper[23041]: I0308 00:51:35.759737 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tn2j\" (UniqueName: \"kubernetes.io/projected/0afbd7e3-8d43-4de7-8016-5183747a3db1-kube-api-access-4tn2j\") pod \"dnsmasq-dns-6b9cd4dcf7-dmhrm\" (UID: \"0afbd7e3-8d43-4de7-8016-5183747a3db1\") " pod="openstack/dnsmasq-dns-6b9cd4dcf7-dmhrm"
Mar 08 00:51:35.760039 master-0 kubenswrapper[23041]: I0308 00:51:35.759969 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0afbd7e3-8d43-4de7-8016-5183747a3db1-ovsdbserver-nb\") pod \"dnsmasq-dns-6b9cd4dcf7-dmhrm\" (UID: \"0afbd7e3-8d43-4de7-8016-5183747a3db1\") " pod="openstack/dnsmasq-dns-6b9cd4dcf7-dmhrm"
Mar 08 00:51:35.760993 master-0 kubenswrapper[23041]: I0308 00:51:35.760959 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0afbd7e3-8d43-4de7-8016-5183747a3db1-config\") pod \"dnsmasq-dns-6b9cd4dcf7-dmhrm\" (UID: \"0afbd7e3-8d43-4de7-8016-5183747a3db1\") " pod="openstack/dnsmasq-dns-6b9cd4dcf7-dmhrm"
Mar 08 00:51:35.761137 master-0 kubenswrapper[23041]: I0308 00:51:35.761111 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0afbd7e3-8d43-4de7-8016-5183747a3db1-ovsdbserver-sb\") pod \"dnsmasq-dns-6b9cd4dcf7-dmhrm\" (UID: \"0afbd7e3-8d43-4de7-8016-5183747a3db1\") " pod="openstack/dnsmasq-dns-6b9cd4dcf7-dmhrm"
Mar 08 00:51:35.764168 master-0 kubenswrapper[23041]: I0308 00:51:35.763760 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0afbd7e3-8d43-4de7-8016-5183747a3db1-dns-svc\") pod \"dnsmasq-dns-6b9cd4dcf7-dmhrm\" (UID: \"0afbd7e3-8d43-4de7-8016-5183747a3db1\") " pod="openstack/dnsmasq-dns-6b9cd4dcf7-dmhrm"
Mar 08 00:51:35.766243 master-0 kubenswrapper[23041]: I0308 00:51:35.766189 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0afbd7e3-8d43-4de7-8016-5183747a3db1-ovsdbserver-nb\") pod \"dnsmasq-dns-6b9cd4dcf7-dmhrm\" (UID: \"0afbd7e3-8d43-4de7-8016-5183747a3db1\") " pod="openstack/dnsmasq-dns-6b9cd4dcf7-dmhrm"
Mar 08 00:51:35.769026 master-0 kubenswrapper[23041]: I0308 00:51:35.769003 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0afbd7e3-8d43-4de7-8016-5183747a3db1-config\") pod \"dnsmasq-dns-6b9cd4dcf7-dmhrm\" (UID: \"0afbd7e3-8d43-4de7-8016-5183747a3db1\") " pod="openstack/dnsmasq-dns-6b9cd4dcf7-dmhrm"
Mar 08 00:51:35.770123 master-0 kubenswrapper[23041]: I0308 00:51:35.770101 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0afbd7e3-8d43-4de7-8016-5183747a3db1-ovsdbserver-sb\") pod \"dnsmasq-dns-6b9cd4dcf7-dmhrm\" (UID: \"0afbd7e3-8d43-4de7-8016-5183747a3db1\") " pod="openstack/dnsmasq-dns-6b9cd4dcf7-dmhrm"
Mar 08 00:51:35.786737 master-0 kubenswrapper[23041]: I0308 00:51:35.785787 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tn2j\" (UniqueName: \"kubernetes.io/projected/0afbd7e3-8d43-4de7-8016-5183747a3db1-kube-api-access-4tn2j\") pod \"dnsmasq-dns-6b9cd4dcf7-dmhrm\" (UID: \"0afbd7e3-8d43-4de7-8016-5183747a3db1\") " pod="openstack/dnsmasq-dns-6b9cd4dcf7-dmhrm"
Mar 08 00:51:35.867275 master-0 kubenswrapper[23041]: I0308 00:51:35.864986 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Mar 08 00:51:35.880693 master-0 kubenswrapper[23041]: I0308 00:51:35.876530 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Mar 08 00:51:35.890046 master-0 kubenswrapper[23041]: I0308 00:51:35.885291 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Mar 08 00:51:35.890046 master-0 kubenswrapper[23041]: I0308 00:51:35.885615 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Mar 08 00:51:35.890046 master-0 kubenswrapper[23041]: I0308 00:51:35.885767 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Mar 08 00:51:35.901417 master-0 kubenswrapper[23041]: I0308 00:51:35.899237 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Mar 08 00:51:35.962658 master-0 kubenswrapper[23041]: I0308 00:51:35.962306 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b9cd4dcf7-dmhrm"
Mar 08 00:51:35.977555 master-0 kubenswrapper[23041]: I0308 00:51:35.972929 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgv8v\" (UniqueName: \"kubernetes.io/projected/8e765f28-c1ac-485a-9df9-fe45e6f364ae-kube-api-access-dgv8v\") pod \"ovn-northd-0\" (UID: \"8e765f28-c1ac-485a-9df9-fe45e6f364ae\") " pod="openstack/ovn-northd-0"
Mar 08 00:51:35.977555 master-0 kubenswrapper[23041]: I0308 00:51:35.973042 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e765f28-c1ac-485a-9df9-fe45e6f364ae-config\") pod \"ovn-northd-0\" (UID: \"8e765f28-c1ac-485a-9df9-fe45e6f364ae\") " pod="openstack/ovn-northd-0"
Mar 08 00:51:35.977555 master-0 kubenswrapper[23041]: I0308 00:51:35.973078 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e765f28-c1ac-485a-9df9-fe45e6f364ae-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"8e765f28-c1ac-485a-9df9-fe45e6f364ae\") " pod="openstack/ovn-northd-0"
Mar 08 00:51:35.977555 master-0 kubenswrapper[23041]: I0308 00:51:35.973119 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e765f28-c1ac-485a-9df9-fe45e6f364ae-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"8e765f28-c1ac-485a-9df9-fe45e6f364ae\") " pod="openstack/ovn-northd-0"
Mar 08 00:51:35.977555 master-0 kubenswrapper[23041]: I0308 00:51:35.973139 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e765f28-c1ac-485a-9df9-fe45e6f364ae-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"8e765f28-c1ac-485a-9df9-fe45e6f364ae\") " pod="openstack/ovn-northd-0"
Mar 08 00:51:35.977555 master-0 kubenswrapper[23041]: I0308 00:51:35.973251 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e765f28-c1ac-485a-9df9-fe45e6f364ae-scripts\") pod \"ovn-northd-0\" (UID: \"8e765f28-c1ac-485a-9df9-fe45e6f364ae\") " pod="openstack/ovn-northd-0"
Mar 08 00:51:35.977555 master-0 kubenswrapper[23041]: I0308 00:51:35.973278 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8e765f28-c1ac-485a-9df9-fe45e6f364ae-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"8e765f28-c1ac-485a-9df9-fe45e6f364ae\") " pod="openstack/ovn-northd-0"
Mar 08 00:51:36.075609 master-0 kubenswrapper[23041]: I0308 00:51:36.074409 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8e765f28-c1ac-485a-9df9-fe45e6f364ae-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"8e765f28-c1ac-485a-9df9-fe45e6f364ae\") " pod="openstack/ovn-northd-0"
Mar 08 00:51:36.075609 master-0 kubenswrapper[23041]: I0308 00:51:36.074493 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgv8v\" (UniqueName: \"kubernetes.io/projected/8e765f28-c1ac-485a-9df9-fe45e6f364ae-kube-api-access-dgv8v\") pod \"ovn-northd-0\" (UID: \"8e765f28-c1ac-485a-9df9-fe45e6f364ae\") " pod="openstack/ovn-northd-0"
Mar 08 00:51:36.075609 master-0 kubenswrapper[23041]: I0308 00:51:36.074563 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e765f28-c1ac-485a-9df9-fe45e6f364ae-config\") pod \"ovn-northd-0\" (UID: \"8e765f28-c1ac-485a-9df9-fe45e6f364ae\") " pod="openstack/ovn-northd-0"
Mar 08 00:51:36.075609 master-0 kubenswrapper[23041]: I0308 00:51:36.074592 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e765f28-c1ac-485a-9df9-fe45e6f364ae-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"8e765f28-c1ac-485a-9df9-fe45e6f364ae\") " pod="openstack/ovn-northd-0"
Mar 08 00:51:36.075609 master-0 kubenswrapper[23041]: I0308 00:51:36.074627 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e765f28-c1ac-485a-9df9-fe45e6f364ae-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"8e765f28-c1ac-485a-9df9-fe45e6f364ae\") " pod="openstack/ovn-northd-0"
Mar 08 00:51:36.075609 master-0 kubenswrapper[23041]: I0308 00:51:36.074647 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e765f28-c1ac-485a-9df9-fe45e6f364ae-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"8e765f28-c1ac-485a-9df9-fe45e6f364ae\") " pod="openstack/ovn-northd-0"
Mar 08 00:51:36.075609 master-0 kubenswrapper[23041]: I0308 00:51:36.074726 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e765f28-c1ac-485a-9df9-fe45e6f364ae-scripts\") pod \"ovn-northd-0\" (UID: \"8e765f28-c1ac-485a-9df9-fe45e6f364ae\") " pod="openstack/ovn-northd-0"
Mar 08 00:51:36.075937 master-0 kubenswrapper[23041]: I0308 00:51:36.075747 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e765f28-c1ac-485a-9df9-fe45e6f364ae-scripts\") pod \"ovn-northd-0\" (UID: \"8e765f28-c1ac-485a-9df9-fe45e6f364ae\") " pod="openstack/ovn-northd-0"
Mar 08 00:51:36.076487 master-0 kubenswrapper[23041]: I0308 00:51:36.076035 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8e765f28-c1ac-485a-9df9-fe45e6f364ae-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"8e765f28-c1ac-485a-9df9-fe45e6f364ae\") " pod="openstack/ovn-northd-0"
Mar 08 00:51:36.077372 master-0 kubenswrapper[23041]: I0308 00:51:36.076884 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e765f28-c1ac-485a-9df9-fe45e6f364ae-config\") pod \"ovn-northd-0\" (UID: \"8e765f28-c1ac-485a-9df9-fe45e6f364ae\") " pod="openstack/ovn-northd-0"
Mar 08 00:51:36.084698 master-0 kubenswrapper[23041]: I0308 00:51:36.084626 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e765f28-c1ac-485a-9df9-fe45e6f364ae-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"8e765f28-c1ac-485a-9df9-fe45e6f364ae\") " pod="openstack/ovn-northd-0"
Mar 08 00:51:36.089847 master-0 kubenswrapper[23041]: I0308 00:51:36.089816 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e765f28-c1ac-485a-9df9-fe45e6f364ae-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"8e765f28-c1ac-485a-9df9-fe45e6f364ae\") " pod="openstack/ovn-northd-0"
Mar 08 00:51:36.097852 master-0 kubenswrapper[23041]: I0308 00:51:36.094070 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e765f28-c1ac-485a-9df9-fe45e6f364ae-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"8e765f28-c1ac-485a-9df9-fe45e6f364ae\") " pod="openstack/ovn-northd-0"
Mar 08 00:51:36.104491 master-0 kubenswrapper[23041]: I0308 00:51:36.103299 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgv8v\" (UniqueName: \"kubernetes.io/projected/8e765f28-c1ac-485a-9df9-fe45e6f364ae-kube-api-access-dgv8v\") pod \"ovn-northd-0\" (UID: \"8e765f28-c1ac-485a-9df9-fe45e6f364ae\") " pod="openstack/ovn-northd-0"
Mar 08 00:51:36.140121 master-0 kubenswrapper[23041]: I0308 00:51:36.139406 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-795f757f69-cvqcm"]
Mar 08 00:51:36.176773 master-0 kubenswrapper[23041]: I0308 00:51:36.176723 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/724cd646-a717-4778-82a6-9471c70e13c5-etc-swift\") pod \"swift-storage-0\" (UID: \"724cd646-a717-4778-82a6-9471c70e13c5\") " pod="openstack/swift-storage-0"
Mar 08 00:51:36.176991 master-0 kubenswrapper[23041]: E0308 00:51:36.176961 23041 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 08 00:51:36.176991 master-0 kubenswrapper[23041]: E0308 00:51:36.176977 23041 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 08 00:51:36.177299 master-0 kubenswrapper[23041]: E0308 00:51:36.177019 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/724cd646-a717-4778-82a6-9471c70e13c5-etc-swift podName:724cd646-a717-4778-82a6-9471c70e13c5 nodeName:}" failed. No retries permitted until 2026-03-08 00:51:38.177005078 +0000 UTC m=+1203.649841632 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/724cd646-a717-4778-82a6-9471c70e13c5-etc-swift") pod "swift-storage-0" (UID: "724cd646-a717-4778-82a6-9471c70e13c5") : configmap "swift-ring-files" not found
Mar 08 00:51:36.230999 master-0 kubenswrapper[23041]: I0308 00:51:36.230909 23041 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/ovn-northd-0" Mar 08 00:51:36.238393 master-0 kubenswrapper[23041]: I0308 00:51:36.238326 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795f757f69-cvqcm" event={"ID":"784b83fc-5e3f-49e4-b610-032444dec900","Type":"ContainerStarted","Data":"8b4f9f61eec12df17c51ff0cad997028b70cb2485012e51119c1e5293bc57e3c"} Mar 08 00:51:36.238947 master-0 kubenswrapper[23041]: I0308 00:51:36.238921 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-998757459-wdrgr" podUID="8efffbf6-e0e9-4591-a140-e95ecb667ea0" containerName="dnsmasq-dns" containerID="cri-o://0c37ea88548eae571547d4f35ba721b6718649d4e1bd806fe6406bcf6ab65e73" gracePeriod=10 Mar 08 00:51:36.239812 master-0 kubenswrapper[23041]: I0308 00:51:36.239785 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b569bfd0-b2b4-45a5-9a8b-96201e72fc57\" (UniqueName: \"kubernetes.io/csi/topolvm.io^465a85d8-7fdb-4f04-acda-9637f8941b8b\") pod \"swift-storage-0\" (UID: \"724cd646-a717-4778-82a6-9471c70e13c5\") " pod="openstack/swift-storage-0" Mar 08 00:51:36.258280 master-0 kubenswrapper[23041]: I0308 00:51:36.258001 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-wtqbk"] Mar 08 00:51:36.471077 master-0 kubenswrapper[23041]: I0308 00:51:36.471003 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b9cd4dcf7-dmhrm"] Mar 08 00:51:36.491509 master-0 kubenswrapper[23041]: I0308 00:51:36.488594 23041 trace.go:236] Trace[289700210]: "Calculate volume metrics of persistence for pod openstack/rabbitmq-cell1-server-0" (08-Mar-2026 00:51:35.351) (total time: 1137ms): Mar 08 00:51:36.491509 master-0 kubenswrapper[23041]: Trace[289700210]: [1.137292949s] [1.137292949s] END Mar 08 00:51:36.658741 master-0 kubenswrapper[23041]: I0308 00:51:36.652862 23041 trace.go:236] Trace[244953907]: "Calculate volume metrics of 
mysql-db for pod openstack/openstack-cell1-galera-0" (08-Mar-2026 00:51:35.346) (total time: 1306ms): Mar 08 00:51:36.658741 master-0 kubenswrapper[23041]: Trace[244953907]: [1.306364737s] [1.306364737s] END Mar 08 00:51:36.748989 master-0 kubenswrapper[23041]: I0308 00:51:36.748881 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 08 00:51:36.847857 master-0 kubenswrapper[23041]: I0308 00:51:36.847661 23041 trace.go:236] Trace[1743235137]: "Calculate volume metrics of ovndbcluster-sb-etc-ovn for pod openstack/ovsdbserver-sb-0" (08-Mar-2026 00:51:35.349) (total time: 1498ms): Mar 08 00:51:36.847857 master-0 kubenswrapper[23041]: Trace[1743235137]: [1.498098149s] [1.498098149s] END Mar 08 00:51:36.908794 master-0 kubenswrapper[23041]: I0308 00:51:36.908738 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 08 00:51:36.909070 master-0 kubenswrapper[23041]: I0308 00:51:36.909058 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 08 00:51:36.916978 master-0 kubenswrapper[23041]: I0308 00:51:36.916935 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-gl66q"] Mar 08 00:51:36.920692 master-0 kubenswrapper[23041]: I0308 00:51:36.918177 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-gl66q" Mar 08 00:51:36.924118 master-0 kubenswrapper[23041]: I0308 00:51:36.924093 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 08 00:51:36.924541 master-0 kubenswrapper[23041]: I0308 00:51:36.924307 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 08 00:51:36.924541 master-0 kubenswrapper[23041]: I0308 00:51:36.924421 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 08 00:51:36.949232 master-0 kubenswrapper[23041]: I0308 00:51:36.947359 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-gl66q"] Mar 08 00:51:37.001623 master-0 kubenswrapper[23041]: I0308 00:51:37.001350 23041 trace.go:236] Trace[380128401]: "Calculate volume metrics of ovndbcluster-nb-etc-ovn for pod openstack/ovsdbserver-nb-0" (08-Mar-2026 00:51:35.344) (total time: 1656ms): Mar 08 00:51:37.001623 master-0 kubenswrapper[23041]: Trace[380128401]: [1.656811883s] [1.656811883s] END Mar 08 00:51:37.050301 master-0 kubenswrapper[23041]: I0308 00:51:37.050262 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-998757459-wdrgr" Mar 08 00:51:37.097275 master-0 kubenswrapper[23041]: I0308 00:51:37.097119 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-dispersionconf\") pod \"swift-ring-rebalance-gl66q\" (UID: \"31b5c8ab-cbab-4c6e-a021-3ba2895102bb\") " pod="openstack/swift-ring-rebalance-gl66q" Mar 08 00:51:37.097564 master-0 kubenswrapper[23041]: I0308 00:51:37.097484 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-scripts\") pod \"swift-ring-rebalance-gl66q\" (UID: \"31b5c8ab-cbab-4c6e-a021-3ba2895102bb\") " pod="openstack/swift-ring-rebalance-gl66q" Mar 08 00:51:37.097564 master-0 kubenswrapper[23041]: I0308 00:51:37.097539 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-ring-data-devices\") pod \"swift-ring-rebalance-gl66q\" (UID: \"31b5c8ab-cbab-4c6e-a021-3ba2895102bb\") " pod="openstack/swift-ring-rebalance-gl66q" Mar 08 00:51:37.097687 master-0 kubenswrapper[23041]: I0308 00:51:37.097576 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-combined-ca-bundle\") pod \"swift-ring-rebalance-gl66q\" (UID: \"31b5c8ab-cbab-4c6e-a021-3ba2895102bb\") " pod="openstack/swift-ring-rebalance-gl66q" Mar 08 00:51:37.097739 master-0 kubenswrapper[23041]: I0308 00:51:37.097720 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-swiftconf\") pod \"swift-ring-rebalance-gl66q\" (UID: \"31b5c8ab-cbab-4c6e-a021-3ba2895102bb\") " pod="openstack/swift-ring-rebalance-gl66q" Mar 08 00:51:37.097784 master-0 kubenswrapper[23041]: I0308 00:51:37.097753 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-etc-swift\") pod \"swift-ring-rebalance-gl66q\" (UID: \"31b5c8ab-cbab-4c6e-a021-3ba2895102bb\") " pod="openstack/swift-ring-rebalance-gl66q" Mar 08 00:51:37.097897 master-0 kubenswrapper[23041]: I0308 00:51:37.097864 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ptt9\" (UniqueName: \"kubernetes.io/projected/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-kube-api-access-6ptt9\") pod \"swift-ring-rebalance-gl66q\" (UID: \"31b5c8ab-cbab-4c6e-a021-3ba2895102bb\") " pod="openstack/swift-ring-rebalance-gl66q" Mar 08 00:51:37.158956 master-0 kubenswrapper[23041]: I0308 00:51:37.158857 23041 trace.go:236] Trace[994857057]: "Calculate volume metrics of persistence for pod openstack/rabbitmq-server-0" (08-Mar-2026 00:51:35.350) (total time: 1808ms): Mar 08 00:51:37.158956 master-0 kubenswrapper[23041]: Trace[994857057]: [1.808074007s] [1.808074007s] END Mar 08 00:51:37.199949 master-0 kubenswrapper[23041]: I0308 00:51:37.199855 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8efffbf6-e0e9-4591-a140-e95ecb667ea0-dns-svc\") pod \"8efffbf6-e0e9-4591-a140-e95ecb667ea0\" (UID: \"8efffbf6-e0e9-4591-a140-e95ecb667ea0\") " Mar 08 00:51:37.200179 master-0 kubenswrapper[23041]: I0308 00:51:37.200131 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8efffbf6-e0e9-4591-a140-e95ecb667ea0-config\") 
pod \"8efffbf6-e0e9-4591-a140-e95ecb667ea0\" (UID: \"8efffbf6-e0e9-4591-a140-e95ecb667ea0\") " Mar 08 00:51:37.200242 master-0 kubenswrapper[23041]: I0308 00:51:37.200208 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjdwb\" (UniqueName: \"kubernetes.io/projected/8efffbf6-e0e9-4591-a140-e95ecb667ea0-kube-api-access-vjdwb\") pod \"8efffbf6-e0e9-4591-a140-e95ecb667ea0\" (UID: \"8efffbf6-e0e9-4591-a140-e95ecb667ea0\") " Mar 08 00:51:37.200521 master-0 kubenswrapper[23041]: I0308 00:51:37.200490 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ptt9\" (UniqueName: \"kubernetes.io/projected/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-kube-api-access-6ptt9\") pod \"swift-ring-rebalance-gl66q\" (UID: \"31b5c8ab-cbab-4c6e-a021-3ba2895102bb\") " pod="openstack/swift-ring-rebalance-gl66q" Mar 08 00:51:37.200628 master-0 kubenswrapper[23041]: I0308 00:51:37.200569 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-dispersionconf\") pod \"swift-ring-rebalance-gl66q\" (UID: \"31b5c8ab-cbab-4c6e-a021-3ba2895102bb\") " pod="openstack/swift-ring-rebalance-gl66q" Mar 08 00:51:37.200730 master-0 kubenswrapper[23041]: I0308 00:51:37.200710 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-scripts\") pod \"swift-ring-rebalance-gl66q\" (UID: \"31b5c8ab-cbab-4c6e-a021-3ba2895102bb\") " pod="openstack/swift-ring-rebalance-gl66q" Mar 08 00:51:37.200794 master-0 kubenswrapper[23041]: I0308 00:51:37.200735 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-ring-data-devices\") pod \"swift-ring-rebalance-gl66q\" (UID: 
\"31b5c8ab-cbab-4c6e-a021-3ba2895102bb\") " pod="openstack/swift-ring-rebalance-gl66q" Mar 08 00:51:37.200794 master-0 kubenswrapper[23041]: I0308 00:51:37.200755 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-combined-ca-bundle\") pod \"swift-ring-rebalance-gl66q\" (UID: \"31b5c8ab-cbab-4c6e-a021-3ba2895102bb\") " pod="openstack/swift-ring-rebalance-gl66q" Mar 08 00:51:37.200794 master-0 kubenswrapper[23041]: I0308 00:51:37.200793 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-swiftconf\") pod \"swift-ring-rebalance-gl66q\" (UID: \"31b5c8ab-cbab-4c6e-a021-3ba2895102bb\") " pod="openstack/swift-ring-rebalance-gl66q" Mar 08 00:51:37.200885 master-0 kubenswrapper[23041]: I0308 00:51:37.200810 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-etc-swift\") pod \"swift-ring-rebalance-gl66q\" (UID: \"31b5c8ab-cbab-4c6e-a021-3ba2895102bb\") " pod="openstack/swift-ring-rebalance-gl66q" Mar 08 00:51:37.201497 master-0 kubenswrapper[23041]: I0308 00:51:37.201299 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-etc-swift\") pod \"swift-ring-rebalance-gl66q\" (UID: \"31b5c8ab-cbab-4c6e-a021-3ba2895102bb\") " pod="openstack/swift-ring-rebalance-gl66q" Mar 08 00:51:37.206236 master-0 kubenswrapper[23041]: I0308 00:51:37.202281 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-scripts\") pod \"swift-ring-rebalance-gl66q\" (UID: \"31b5c8ab-cbab-4c6e-a021-3ba2895102bb\") " 
pod="openstack/swift-ring-rebalance-gl66q" Mar 08 00:51:37.206236 master-0 kubenswrapper[23041]: I0308 00:51:37.202467 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-ring-data-devices\") pod \"swift-ring-rebalance-gl66q\" (UID: \"31b5c8ab-cbab-4c6e-a021-3ba2895102bb\") " pod="openstack/swift-ring-rebalance-gl66q" Mar 08 00:51:37.206599 master-0 kubenswrapper[23041]: I0308 00:51:37.206562 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8efffbf6-e0e9-4591-a140-e95ecb667ea0-kube-api-access-vjdwb" (OuterVolumeSpecName: "kube-api-access-vjdwb") pod "8efffbf6-e0e9-4591-a140-e95ecb667ea0" (UID: "8efffbf6-e0e9-4591-a140-e95ecb667ea0"). InnerVolumeSpecName "kube-api-access-vjdwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:51:37.214068 master-0 kubenswrapper[23041]: I0308 00:51:37.214010 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-swiftconf\") pod \"swift-ring-rebalance-gl66q\" (UID: \"31b5c8ab-cbab-4c6e-a021-3ba2895102bb\") " pod="openstack/swift-ring-rebalance-gl66q" Mar 08 00:51:37.214261 master-0 kubenswrapper[23041]: I0308 00:51:37.214076 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-combined-ca-bundle\") pod \"swift-ring-rebalance-gl66q\" (UID: \"31b5c8ab-cbab-4c6e-a021-3ba2895102bb\") " pod="openstack/swift-ring-rebalance-gl66q" Mar 08 00:51:37.214261 master-0 kubenswrapper[23041]: I0308 00:51:37.214148 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-dispersionconf\") pod \"swift-ring-rebalance-gl66q\" (UID: 
\"31b5c8ab-cbab-4c6e-a021-3ba2895102bb\") " pod="openstack/swift-ring-rebalance-gl66q" Mar 08 00:51:37.216783 master-0 kubenswrapper[23041]: I0308 00:51:37.216759 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ptt9\" (UniqueName: \"kubernetes.io/projected/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-kube-api-access-6ptt9\") pod \"swift-ring-rebalance-gl66q\" (UID: \"31b5c8ab-cbab-4c6e-a021-3ba2895102bb\") " pod="openstack/swift-ring-rebalance-gl66q" Mar 08 00:51:37.252442 master-0 kubenswrapper[23041]: I0308 00:51:37.252386 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-gl66q" Mar 08 00:51:37.293436 master-0 kubenswrapper[23041]: I0308 00:51:37.293321 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8efffbf6-e0e9-4591-a140-e95ecb667ea0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8efffbf6-e0e9-4591-a140-e95ecb667ea0" (UID: "8efffbf6-e0e9-4591-a140-e95ecb667ea0"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:51:37.303697 master-0 kubenswrapper[23041]: I0308 00:51:37.303267 23041 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8efffbf6-e0e9-4591-a140-e95ecb667ea0-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 08 00:51:37.303697 master-0 kubenswrapper[23041]: I0308 00:51:37.303315 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjdwb\" (UniqueName: \"kubernetes.io/projected/8efffbf6-e0e9-4591-a140-e95ecb667ea0-kube-api-access-vjdwb\") on node \"master-0\" DevicePath \"\"" Mar 08 00:51:37.305286 master-0 kubenswrapper[23041]: I0308 00:51:37.305179 23041 generic.go:334] "Generic (PLEG): container finished" podID="0afbd7e3-8d43-4de7-8016-5183747a3db1" containerID="eb10640cf6e20d214aac1741c7a761383b3661d62b126d734de6010883ff4d31" exitCode=0 Mar 08 00:51:37.305352 master-0 kubenswrapper[23041]: I0308 00:51:37.305319 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9cd4dcf7-dmhrm" event={"ID":"0afbd7e3-8d43-4de7-8016-5183747a3db1","Type":"ContainerDied","Data":"eb10640cf6e20d214aac1741c7a761383b3661d62b126d734de6010883ff4d31"} Mar 08 00:51:37.305398 master-0 kubenswrapper[23041]: I0308 00:51:37.305357 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9cd4dcf7-dmhrm" event={"ID":"0afbd7e3-8d43-4de7-8016-5183747a3db1","Type":"ContainerStarted","Data":"be52c5deecf3088082976fa8099a7287ee459e33921a5df3b63a93d762851fa5"} Mar 08 00:51:37.320708 master-0 kubenswrapper[23041]: I0308 00:51:37.311856 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8efffbf6-e0e9-4591-a140-e95ecb667ea0-config" (OuterVolumeSpecName: "config") pod "8efffbf6-e0e9-4591-a140-e95ecb667ea0" (UID: "8efffbf6-e0e9-4591-a140-e95ecb667ea0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:51:37.320708 master-0 kubenswrapper[23041]: I0308 00:51:37.312916 23041 generic.go:334] "Generic (PLEG): container finished" podID="8efffbf6-e0e9-4591-a140-e95ecb667ea0" containerID="0c37ea88548eae571547d4f35ba721b6718649d4e1bd806fe6406bcf6ab65e73" exitCode=0 Mar 08 00:51:37.320708 master-0 kubenswrapper[23041]: I0308 00:51:37.313072 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-998757459-wdrgr" Mar 08 00:51:37.320708 master-0 kubenswrapper[23041]: I0308 00:51:37.313065 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-998757459-wdrgr" event={"ID":"8efffbf6-e0e9-4591-a140-e95ecb667ea0","Type":"ContainerDied","Data":"0c37ea88548eae571547d4f35ba721b6718649d4e1bd806fe6406bcf6ab65e73"} Mar 08 00:51:37.320708 master-0 kubenswrapper[23041]: I0308 00:51:37.314175 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-998757459-wdrgr" event={"ID":"8efffbf6-e0e9-4591-a140-e95ecb667ea0","Type":"ContainerDied","Data":"7a48ee10f77d39a03ce68cba39be491359f393bd62b7c38bb5e983f8c2f7cfb0"} Mar 08 00:51:37.320708 master-0 kubenswrapper[23041]: I0308 00:51:37.314207 23041 scope.go:117] "RemoveContainer" containerID="0c37ea88548eae571547d4f35ba721b6718649d4e1bd806fe6406bcf6ab65e73" Mar 08 00:51:37.320708 master-0 kubenswrapper[23041]: I0308 00:51:37.316154 23041 generic.go:334] "Generic (PLEG): container finished" podID="784b83fc-5e3f-49e4-b610-032444dec900" containerID="741362874b051f328014993f178f33a413c1d4ac938d6950b8864d414bf605e1" exitCode=0 Mar 08 00:51:37.320708 master-0 kubenswrapper[23041]: I0308 00:51:37.316291 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795f757f69-cvqcm" event={"ID":"784b83fc-5e3f-49e4-b610-032444dec900","Type":"ContainerDied","Data":"741362874b051f328014993f178f33a413c1d4ac938d6950b8864d414bf605e1"} Mar 08 00:51:37.320708 master-0 
kubenswrapper[23041]: I0308 00:51:37.319411 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-wtqbk" event={"ID":"780b711c-875f-41d0-a4e3-c1afac254d8d","Type":"ContainerStarted","Data":"7ed780136532ac0df3af6c4e8210194b2b9a21d0f5fb5eab786a7f02f19e5cf1"} Mar 08 00:51:37.320708 master-0 kubenswrapper[23041]: I0308 00:51:37.319438 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-wtqbk" event={"ID":"780b711c-875f-41d0-a4e3-c1afac254d8d","Type":"ContainerStarted","Data":"e3788c562c32d6e6a7323823bb5f4745b340b86d2fda0e9c9dc385c4d6158188"} Mar 08 00:51:37.334119 master-0 kubenswrapper[23041]: I0308 00:51:37.333791 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"8e765f28-c1ac-485a-9df9-fe45e6f364ae","Type":"ContainerStarted","Data":"6de6b767488295537f1f15669043e6f9903daa866544b72de825fdc4e0bd802f"} Mar 08 00:51:37.369595 master-0 kubenswrapper[23041]: I0308 00:51:37.369340 23041 scope.go:117] "RemoveContainer" containerID="97c4b634f6daab3f1bf90f6e274a93af6f6d56694917d892a308cf1b487fd441" Mar 08 00:51:37.409460 master-0 kubenswrapper[23041]: I0308 00:51:37.409407 23041 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8efffbf6-e0e9-4591-a140-e95ecb667ea0-config\") on node \"master-0\" DevicePath \"\"" Mar 08 00:51:37.410908 master-0 kubenswrapper[23041]: I0308 00:51:37.410741 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-wtqbk" podStartSLOduration=2.410721761 podStartE2EDuration="2.410721761s" podCreationTimestamp="2026-03-08 00:51:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:51:37.395766326 +0000 UTC m=+1202.868602890" watchObservedRunningTime="2026-03-08 00:51:37.410721761 +0000 UTC m=+1202.883558305" Mar 08 
00:51:37.455308 master-0 kubenswrapper[23041]: I0308 00:51:37.452291 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-998757459-wdrgr"] Mar 08 00:51:37.461585 master-0 kubenswrapper[23041]: I0308 00:51:37.460057 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-998757459-wdrgr"] Mar 08 00:51:37.510277 master-0 kubenswrapper[23041]: I0308 00:51:37.499583 23041 scope.go:117] "RemoveContainer" containerID="0c37ea88548eae571547d4f35ba721b6718649d4e1bd806fe6406bcf6ab65e73" Mar 08 00:51:37.510277 master-0 kubenswrapper[23041]: E0308 00:51:37.505901 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c37ea88548eae571547d4f35ba721b6718649d4e1bd806fe6406bcf6ab65e73\": container with ID starting with 0c37ea88548eae571547d4f35ba721b6718649d4e1bd806fe6406bcf6ab65e73 not found: ID does not exist" containerID="0c37ea88548eae571547d4f35ba721b6718649d4e1bd806fe6406bcf6ab65e73" Mar 08 00:51:37.510277 master-0 kubenswrapper[23041]: I0308 00:51:37.505952 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c37ea88548eae571547d4f35ba721b6718649d4e1bd806fe6406bcf6ab65e73"} err="failed to get container status \"0c37ea88548eae571547d4f35ba721b6718649d4e1bd806fe6406bcf6ab65e73\": rpc error: code = NotFound desc = could not find container \"0c37ea88548eae571547d4f35ba721b6718649d4e1bd806fe6406bcf6ab65e73\": container with ID starting with 0c37ea88548eae571547d4f35ba721b6718649d4e1bd806fe6406bcf6ab65e73 not found: ID does not exist" Mar 08 00:51:37.510277 master-0 kubenswrapper[23041]: I0308 00:51:37.505979 23041 scope.go:117] "RemoveContainer" containerID="97c4b634f6daab3f1bf90f6e274a93af6f6d56694917d892a308cf1b487fd441" Mar 08 00:51:37.514077 master-0 kubenswrapper[23041]: E0308 00:51:37.514003 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"97c4b634f6daab3f1bf90f6e274a93af6f6d56694917d892a308cf1b487fd441\": container with ID starting with 97c4b634f6daab3f1bf90f6e274a93af6f6d56694917d892a308cf1b487fd441 not found: ID does not exist" containerID="97c4b634f6daab3f1bf90f6e274a93af6f6d56694917d892a308cf1b487fd441" Mar 08 00:51:37.514153 master-0 kubenswrapper[23041]: I0308 00:51:37.514078 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97c4b634f6daab3f1bf90f6e274a93af6f6d56694917d892a308cf1b487fd441"} err="failed to get container status \"97c4b634f6daab3f1bf90f6e274a93af6f6d56694917d892a308cf1b487fd441\": rpc error: code = NotFound desc = could not find container \"97c4b634f6daab3f1bf90f6e274a93af6f6d56694917d892a308cf1b487fd441\": container with ID starting with 97c4b634f6daab3f1bf90f6e274a93af6f6d56694917d892a308cf1b487fd441 not found: ID does not exist" Mar 08 00:51:37.724302 master-0 kubenswrapper[23041]: I0308 00:51:37.723857 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 08 00:51:37.724302 master-0 kubenswrapper[23041]: I0308 00:51:37.723933 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 08 00:51:37.825891 master-0 kubenswrapper[23041]: I0308 00:51:37.825809 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-gl66q"] Mar 08 00:51:38.047662 master-0 kubenswrapper[23041]: W0308 00:51:38.045754 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31b5c8ab_cbab_4c6e_a021_3ba2895102bb.slice/crio-723aad89168391b4c721cf32ead0cb512d22003ebdec722084e5639daff3b498 WatchSource:0}: Error finding container 723aad89168391b4c721cf32ead0cb512d22003ebdec722084e5639daff3b498: Status 404 returned error can't find the container with id 
723aad89168391b4c721cf32ead0cb512d22003ebdec722084e5639daff3b498
Mar 08 00:51:38.180268 master-0 kubenswrapper[23041]: I0308 00:51:38.179590 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-795f757f69-cvqcm"
Mar 08 00:51:38.227223 master-0 kubenswrapper[23041]: I0308 00:51:38.227148 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/724cd646-a717-4778-82a6-9471c70e13c5-etc-swift\") pod \"swift-storage-0\" (UID: \"724cd646-a717-4778-82a6-9471c70e13c5\") " pod="openstack/swift-storage-0"
Mar 08 00:51:38.227422 master-0 kubenswrapper[23041]: E0308 00:51:38.227335 23041 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 08 00:51:38.227422 master-0 kubenswrapper[23041]: E0308 00:51:38.227358 23041 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 08 00:51:38.227422 master-0 kubenswrapper[23041]: E0308 00:51:38.227410 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/724cd646-a717-4778-82a6-9471c70e13c5-etc-swift podName:724cd646-a717-4778-82a6-9471c70e13c5 nodeName:}" failed. No retries permitted until 2026-03-08 00:51:42.227394672 +0000 UTC m=+1207.700231226 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/724cd646-a717-4778-82a6-9471c70e13c5-etc-swift") pod "swift-storage-0" (UID: "724cd646-a717-4778-82a6-9471c70e13c5") : configmap "swift-ring-files" not found
Mar 08 00:51:38.328531 master-0 kubenswrapper[23041]: I0308 00:51:38.328475 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/784b83fc-5e3f-49e4-b610-032444dec900-dns-svc\") pod \"784b83fc-5e3f-49e4-b610-032444dec900\" (UID: \"784b83fc-5e3f-49e4-b610-032444dec900\") "
Mar 08 00:51:38.328700 master-0 kubenswrapper[23041]: I0308 00:51:38.328553 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/784b83fc-5e3f-49e4-b610-032444dec900-config\") pod \"784b83fc-5e3f-49e4-b610-032444dec900\" (UID: \"784b83fc-5e3f-49e4-b610-032444dec900\") "
Mar 08 00:51:38.328700 master-0 kubenswrapper[23041]: I0308 00:51:38.328571 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/784b83fc-5e3f-49e4-b610-032444dec900-ovsdbserver-sb\") pod \"784b83fc-5e3f-49e4-b610-032444dec900\" (UID: \"784b83fc-5e3f-49e4-b610-032444dec900\") "
Mar 08 00:51:38.328792 master-0 kubenswrapper[23041]: I0308 00:51:38.328713 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqgn7\" (UniqueName: \"kubernetes.io/projected/784b83fc-5e3f-49e4-b610-032444dec900-kube-api-access-jqgn7\") pod \"784b83fc-5e3f-49e4-b610-032444dec900\" (UID: \"784b83fc-5e3f-49e4-b610-032444dec900\") "
Mar 08 00:51:38.333981 master-0 kubenswrapper[23041]: I0308 00:51:38.333933 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/784b83fc-5e3f-49e4-b610-032444dec900-kube-api-access-jqgn7" (OuterVolumeSpecName: "kube-api-access-jqgn7") pod "784b83fc-5e3f-49e4-b610-032444dec900" (UID: "784b83fc-5e3f-49e4-b610-032444dec900"). InnerVolumeSpecName "kube-api-access-jqgn7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:51:38.343569 master-0 kubenswrapper[23041]: I0308 00:51:38.338842 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795f757f69-cvqcm" event={"ID":"784b83fc-5e3f-49e4-b610-032444dec900","Type":"ContainerDied","Data":"8b4f9f61eec12df17c51ff0cad997028b70cb2485012e51119c1e5293bc57e3c"}
Mar 08 00:51:38.343569 master-0 kubenswrapper[23041]: I0308 00:51:38.338894 23041 scope.go:117] "RemoveContainer" containerID="741362874b051f328014993f178f33a413c1d4ac938d6950b8864d414bf605e1"
Mar 08 00:51:38.343569 master-0 kubenswrapper[23041]: I0308 00:51:38.338988 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-795f757f69-cvqcm"
Mar 08 00:51:38.343569 master-0 kubenswrapper[23041]: I0308 00:51:38.343486 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-gl66q" event={"ID":"31b5c8ab-cbab-4c6e-a021-3ba2895102bb","Type":"ContainerStarted","Data":"723aad89168391b4c721cf32ead0cb512d22003ebdec722084e5639daff3b498"}
Mar 08 00:51:38.346546 master-0 kubenswrapper[23041]: I0308 00:51:38.346488 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"8e765f28-c1ac-485a-9df9-fe45e6f364ae","Type":"ContainerStarted","Data":"f67bf953d75540f9bfed036b3f073b4cdad67f49400e298fbb3fa216454525f4"}
Mar 08 00:51:38.362446 master-0 kubenswrapper[23041]: I0308 00:51:38.353184 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9cd4dcf7-dmhrm" event={"ID":"0afbd7e3-8d43-4de7-8016-5183747a3db1","Type":"ContainerStarted","Data":"3f5c4c3c294763ff6fa374ef92214194edd69d216dd9e1dc398c649e28f084be"}
Mar 08 00:51:38.362446 master-0 kubenswrapper[23041]: I0308 00:51:38.353368 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b9cd4dcf7-dmhrm"
Mar 08 00:51:38.371260 master-0 kubenswrapper[23041]: I0308 00:51:38.370136 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/784b83fc-5e3f-49e4-b610-032444dec900-config" (OuterVolumeSpecName: "config") pod "784b83fc-5e3f-49e4-b610-032444dec900" (UID: "784b83fc-5e3f-49e4-b610-032444dec900"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:51:38.374949 master-0 kubenswrapper[23041]: I0308 00:51:38.374898 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/784b83fc-5e3f-49e4-b610-032444dec900-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "784b83fc-5e3f-49e4-b610-032444dec900" (UID: "784b83fc-5e3f-49e4-b610-032444dec900"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:51:38.381570 master-0 kubenswrapper[23041]: I0308 00:51:38.380865 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b9cd4dcf7-dmhrm" podStartSLOduration=3.380845558 podStartE2EDuration="3.380845558s" podCreationTimestamp="2026-03-08 00:51:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:51:38.378810208 +0000 UTC m=+1203.851646772" watchObservedRunningTime="2026-03-08 00:51:38.380845558 +0000 UTC m=+1203.853682112"
Mar 08 00:51:38.381806 master-0 kubenswrapper[23041]: I0308 00:51:38.381570 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/784b83fc-5e3f-49e4-b610-032444dec900-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "784b83fc-5e3f-49e4-b610-032444dec900" (UID: "784b83fc-5e3f-49e4-b610-032444dec900"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:51:38.431912 master-0 kubenswrapper[23041]: I0308 00:51:38.431835 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqgn7\" (UniqueName: \"kubernetes.io/projected/784b83fc-5e3f-49e4-b610-032444dec900-kube-api-access-jqgn7\") on node \"master-0\" DevicePath \"\""
Mar 08 00:51:38.431912 master-0 kubenswrapper[23041]: I0308 00:51:38.431889 23041 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/784b83fc-5e3f-49e4-b610-032444dec900-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 08 00:51:38.431912 master-0 kubenswrapper[23041]: I0308 00:51:38.431903 23041 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/784b83fc-5e3f-49e4-b610-032444dec900-config\") on node \"master-0\" DevicePath \"\""
Mar 08 00:51:38.431912 master-0 kubenswrapper[23041]: I0308 00:51:38.431914 23041 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/784b83fc-5e3f-49e4-b610-032444dec900-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\""
Mar 08 00:51:38.717229 master-0 kubenswrapper[23041]: I0308 00:51:38.715072 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-795f757f69-cvqcm"]
Mar 08 00:51:38.720820 master-0 kubenswrapper[23041]: I0308 00:51:38.720588 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-795f757f69-cvqcm"]
Mar 08 00:51:38.825626 master-0 kubenswrapper[23041]: I0308 00:51:38.825574 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="784b83fc-5e3f-49e4-b610-032444dec900" path="/var/lib/kubelet/pods/784b83fc-5e3f-49e4-b610-032444dec900/volumes"
Mar 08 00:51:38.826407 master-0 kubenswrapper[23041]: I0308 00:51:38.826386 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8efffbf6-e0e9-4591-a140-e95ecb667ea0" path="/var/lib/kubelet/pods/8efffbf6-e0e9-4591-a140-e95ecb667ea0/volumes"
Mar 08 00:51:39.378988 master-0 kubenswrapper[23041]: I0308 00:51:39.378924 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"8e765f28-c1ac-485a-9df9-fe45e6f364ae","Type":"ContainerStarted","Data":"d2948834f916d2df999a4f91ae98a5c07e9f102b5b845e55a6601a0a24262ae3"}
Mar 08 00:51:39.378988 master-0 kubenswrapper[23041]: I0308 00:51:39.378995 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Mar 08 00:51:39.620900 master-0 kubenswrapper[23041]: I0308 00:51:39.620840 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Mar 08 00:51:39.691537 master-0 kubenswrapper[23041]: I0308 00:51:39.691401 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Mar 08 00:51:39.856422 master-0 kubenswrapper[23041]: I0308 00:51:39.852738 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.596183415 podStartE2EDuration="4.852721816s" podCreationTimestamp="2026-03-08 00:51:35 +0000 UTC" firstStartedPulling="2026-03-08 00:51:36.836725886 +0000 UTC m=+1202.309562430" lastFinishedPulling="2026-03-08 00:51:38.093264277 +0000 UTC m=+1203.566100831" observedRunningTime="2026-03-08 00:51:39.849835195 +0000 UTC m=+1205.322671779" watchObservedRunningTime="2026-03-08 00:51:39.852721816 +0000 UTC m=+1205.325558370"
Mar 08 00:51:40.294253 master-0 kubenswrapper[23041]: I0308 00:51:40.294116 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-8d80-account-create-update-dxjzn"]
Mar 08 00:51:40.295049 master-0 kubenswrapper[23041]: E0308 00:51:40.294638 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8efffbf6-e0e9-4591-a140-e95ecb667ea0" containerName="dnsmasq-dns"
Mar 08 00:51:40.295049 master-0 kubenswrapper[23041]: I0308 00:51:40.294658 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="8efffbf6-e0e9-4591-a140-e95ecb667ea0" containerName="dnsmasq-dns"
Mar 08 00:51:40.295049 master-0 kubenswrapper[23041]: E0308 00:51:40.294693 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="784b83fc-5e3f-49e4-b610-032444dec900" containerName="init"
Mar 08 00:51:40.295049 master-0 kubenswrapper[23041]: I0308 00:51:40.294700 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="784b83fc-5e3f-49e4-b610-032444dec900" containerName="init"
Mar 08 00:51:40.295049 master-0 kubenswrapper[23041]: E0308 00:51:40.294725 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8efffbf6-e0e9-4591-a140-e95ecb667ea0" containerName="init"
Mar 08 00:51:40.295049 master-0 kubenswrapper[23041]: I0308 00:51:40.294733 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="8efffbf6-e0e9-4591-a140-e95ecb667ea0" containerName="init"
Mar 08 00:51:40.295049 master-0 kubenswrapper[23041]: I0308 00:51:40.294948 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="784b83fc-5e3f-49e4-b610-032444dec900" containerName="init"
Mar 08 00:51:40.295049 master-0 kubenswrapper[23041]: I0308 00:51:40.294972 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="8efffbf6-e0e9-4591-a140-e95ecb667ea0" containerName="dnsmasq-dns"
Mar 08 00:51:40.307308 master-0 kubenswrapper[23041]: I0308 00:51:40.304644 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8d80-account-create-update-dxjzn"
Mar 08 00:51:40.307308 master-0 kubenswrapper[23041]: I0308 00:51:40.307155 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Mar 08 00:51:40.323654 master-0 kubenswrapper[23041]: I0308 00:51:40.321315 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-bbghv"]
Mar 08 00:51:40.323654 master-0 kubenswrapper[23041]: I0308 00:51:40.323586 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-bbghv"
Mar 08 00:51:40.333380 master-0 kubenswrapper[23041]: I0308 00:51:40.332934 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8d80-account-create-update-dxjzn"]
Mar 08 00:51:40.354895 master-0 kubenswrapper[23041]: I0308 00:51:40.354856 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-bbghv"]
Mar 08 00:51:40.378644 master-0 kubenswrapper[23041]: I0308 00:51:40.378470 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tpj4\" (UniqueName: \"kubernetes.io/projected/453dbbfd-6893-4826-92e9-8aaa7987b743-kube-api-access-7tpj4\") pod \"placement-db-create-bbghv\" (UID: \"453dbbfd-6893-4826-92e9-8aaa7987b743\") " pod="openstack/placement-db-create-bbghv"
Mar 08 00:51:40.378964 master-0 kubenswrapper[23041]: I0308 00:51:40.378728 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/453dbbfd-6893-4826-92e9-8aaa7987b743-operator-scripts\") pod \"placement-db-create-bbghv\" (UID: \"453dbbfd-6893-4826-92e9-8aaa7987b743\") " pod="openstack/placement-db-create-bbghv"
Mar 08 00:51:40.379031 master-0 kubenswrapper[23041]: I0308 00:51:40.378979 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14da4175-aa45-42ff-ad82-8253a03c1697-operator-scripts\") pod \"placement-8d80-account-create-update-dxjzn\" (UID: \"14da4175-aa45-42ff-ad82-8253a03c1697\") " pod="openstack/placement-8d80-account-create-update-dxjzn"
Mar 08 00:51:40.379031 master-0 kubenswrapper[23041]: I0308 00:51:40.379021 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6znn\" (UniqueName: \"kubernetes.io/projected/14da4175-aa45-42ff-ad82-8253a03c1697-kube-api-access-t6znn\") pod \"placement-8d80-account-create-update-dxjzn\" (UID: \"14da4175-aa45-42ff-ad82-8253a03c1697\") " pod="openstack/placement-8d80-account-create-update-dxjzn"
Mar 08 00:51:40.482632 master-0 kubenswrapper[23041]: I0308 00:51:40.482175 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14da4175-aa45-42ff-ad82-8253a03c1697-operator-scripts\") pod \"placement-8d80-account-create-update-dxjzn\" (UID: \"14da4175-aa45-42ff-ad82-8253a03c1697\") " pod="openstack/placement-8d80-account-create-update-dxjzn"
Mar 08 00:51:40.482632 master-0 kubenswrapper[23041]: I0308 00:51:40.482617 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6znn\" (UniqueName: \"kubernetes.io/projected/14da4175-aa45-42ff-ad82-8253a03c1697-kube-api-access-t6znn\") pod \"placement-8d80-account-create-update-dxjzn\" (UID: \"14da4175-aa45-42ff-ad82-8253a03c1697\") " pod="openstack/placement-8d80-account-create-update-dxjzn"
Mar 08 00:51:40.483085 master-0 kubenswrapper[23041]: I0308 00:51:40.483050 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tpj4\" (UniqueName: \"kubernetes.io/projected/453dbbfd-6893-4826-92e9-8aaa7987b743-kube-api-access-7tpj4\") pod \"placement-db-create-bbghv\" (UID: \"453dbbfd-6893-4826-92e9-8aaa7987b743\") " pod="openstack/placement-db-create-bbghv"
Mar 08 00:51:40.483128 master-0 kubenswrapper[23041]: I0308 00:51:40.483112 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/453dbbfd-6893-4826-92e9-8aaa7987b743-operator-scripts\") pod \"placement-db-create-bbghv\" (UID: \"453dbbfd-6893-4826-92e9-8aaa7987b743\") " pod="openstack/placement-db-create-bbghv"
Mar 08 00:51:40.483420 master-0 kubenswrapper[23041]: I0308 00:51:40.483328 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14da4175-aa45-42ff-ad82-8253a03c1697-operator-scripts\") pod \"placement-8d80-account-create-update-dxjzn\" (UID: \"14da4175-aa45-42ff-ad82-8253a03c1697\") " pod="openstack/placement-8d80-account-create-update-dxjzn"
Mar 08 00:51:40.484236 master-0 kubenswrapper[23041]: I0308 00:51:40.484187 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/453dbbfd-6893-4826-92e9-8aaa7987b743-operator-scripts\") pod \"placement-db-create-bbghv\" (UID: \"453dbbfd-6893-4826-92e9-8aaa7987b743\") " pod="openstack/placement-db-create-bbghv"
Mar 08 00:51:40.496954 master-0 kubenswrapper[23041]: I0308 00:51:40.496921 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6znn\" (UniqueName: \"kubernetes.io/projected/14da4175-aa45-42ff-ad82-8253a03c1697-kube-api-access-t6znn\") pod \"placement-8d80-account-create-update-dxjzn\" (UID: \"14da4175-aa45-42ff-ad82-8253a03c1697\") " pod="openstack/placement-8d80-account-create-update-dxjzn"
Mar 08 00:51:40.499477 master-0 kubenswrapper[23041]: I0308 00:51:40.499403 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tpj4\" (UniqueName: \"kubernetes.io/projected/453dbbfd-6893-4826-92e9-8aaa7987b743-kube-api-access-7tpj4\") pod \"placement-db-create-bbghv\" (UID: \"453dbbfd-6893-4826-92e9-8aaa7987b743\") " pod="openstack/placement-db-create-bbghv"
Mar 08 00:51:40.647558 master-0 kubenswrapper[23041]: I0308 00:51:40.647424 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8d80-account-create-update-dxjzn"
Mar 08 00:51:40.653456 master-0 kubenswrapper[23041]: I0308 00:51:40.653398 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-bbghv"
Mar 08 00:51:41.853334 master-0 kubenswrapper[23041]: I0308 00:51:41.852478 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-bbghv"]
Mar 08 00:51:41.875094 master-0 kubenswrapper[23041]: I0308 00:51:41.874998 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8d80-account-create-update-dxjzn"]
Mar 08 00:51:42.082033 master-0 kubenswrapper[23041]: I0308 00:51:42.081845 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Mar 08 00:51:42.083276 master-0 kubenswrapper[23041]: I0308 00:51:42.083240 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-4n48d"]
Mar 08 00:51:42.086247 master-0 kubenswrapper[23041]: I0308 00:51:42.084805 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-4n48d"
Mar 08 00:51:42.130435 master-0 kubenswrapper[23041]: I0308 00:51:42.130253 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-4n48d"]
Mar 08 00:51:42.156610 master-0 kubenswrapper[23041]: I0308 00:51:42.153703 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15de7491-f5c0-4ac3-b07b-6de4eac70ade-operator-scripts\") pod \"glance-db-create-4n48d\" (UID: \"15de7491-f5c0-4ac3-b07b-6de4eac70ade\") " pod="openstack/glance-db-create-4n48d"
Mar 08 00:51:42.156610 master-0 kubenswrapper[23041]: I0308 00:51:42.153967 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9plbg\" (UniqueName: \"kubernetes.io/projected/15de7491-f5c0-4ac3-b07b-6de4eac70ade-kube-api-access-9plbg\") pod \"glance-db-create-4n48d\" (UID: \"15de7491-f5c0-4ac3-b07b-6de4eac70ade\") " pod="openstack/glance-db-create-4n48d"
Mar 08 00:51:42.202428 master-0 kubenswrapper[23041]: I0308 00:51:42.202364 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Mar 08 00:51:42.209974 master-0 kubenswrapper[23041]: I0308 00:51:42.209936 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-df7d-account-create-update-pbhhl"]
Mar 08 00:51:42.212015 master-0 kubenswrapper[23041]: I0308 00:51:42.211985 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-df7d-account-create-update-pbhhl"
Mar 08 00:51:42.214222 master-0 kubenswrapper[23041]: I0308 00:51:42.214179 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Mar 08 00:51:42.223606 master-0 kubenswrapper[23041]: I0308 00:51:42.223543 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-df7d-account-create-update-pbhhl"]
Mar 08 00:51:42.257783 master-0 kubenswrapper[23041]: I0308 00:51:42.257730 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15de7491-f5c0-4ac3-b07b-6de4eac70ade-operator-scripts\") pod \"glance-db-create-4n48d\" (UID: \"15de7491-f5c0-4ac3-b07b-6de4eac70ade\") " pod="openstack/glance-db-create-4n48d"
Mar 08 00:51:42.259473 master-0 kubenswrapper[23041]: I0308 00:51:42.257965 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9plbg\" (UniqueName: \"kubernetes.io/projected/15de7491-f5c0-4ac3-b07b-6de4eac70ade-kube-api-access-9plbg\") pod \"glance-db-create-4n48d\" (UID: \"15de7491-f5c0-4ac3-b07b-6de4eac70ade\") " pod="openstack/glance-db-create-4n48d"
Mar 08 00:51:42.259473 master-0 kubenswrapper[23041]: I0308 00:51:42.257997 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/724cd646-a717-4778-82a6-9471c70e13c5-etc-swift\") pod \"swift-storage-0\" (UID: \"724cd646-a717-4778-82a6-9471c70e13c5\") " pod="openstack/swift-storage-0"
Mar 08 00:51:42.259686 master-0 kubenswrapper[23041]: E0308 00:51:42.259530 23041 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 08 00:51:42.259686 master-0 kubenswrapper[23041]: E0308 00:51:42.259547 23041 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 08 00:51:42.259686 master-0 kubenswrapper[23041]: E0308 00:51:42.259584 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/724cd646-a717-4778-82a6-9471c70e13c5-etc-swift podName:724cd646-a717-4778-82a6-9471c70e13c5 nodeName:}" failed. No retries permitted until 2026-03-08 00:51:50.259569572 +0000 UTC m=+1215.732406126 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/724cd646-a717-4778-82a6-9471c70e13c5-etc-swift") pod "swift-storage-0" (UID: "724cd646-a717-4778-82a6-9471c70e13c5") : configmap "swift-ring-files" not found
Mar 08 00:51:42.261002 master-0 kubenswrapper[23041]: I0308 00:51:42.260940 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15de7491-f5c0-4ac3-b07b-6de4eac70ade-operator-scripts\") pod \"glance-db-create-4n48d\" (UID: \"15de7491-f5c0-4ac3-b07b-6de4eac70ade\") " pod="openstack/glance-db-create-4n48d"
Mar 08 00:51:42.305257 master-0 kubenswrapper[23041]: I0308 00:51:42.295131 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9plbg\" (UniqueName: \"kubernetes.io/projected/15de7491-f5c0-4ac3-b07b-6de4eac70ade-kube-api-access-9plbg\") pod \"glance-db-create-4n48d\" (UID: \"15de7491-f5c0-4ac3-b07b-6de4eac70ade\") " pod="openstack/glance-db-create-4n48d"
Mar 08 00:51:42.362798 master-0 kubenswrapper[23041]: I0308 00:51:42.362702 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz2w4\" (UniqueName: \"kubernetes.io/projected/85b9a245-4ef0-43b9-9bf9-70c4609fda33-kube-api-access-wz2w4\") pod \"glance-df7d-account-create-update-pbhhl\" (UID: \"85b9a245-4ef0-43b9-9bf9-70c4609fda33\") " pod="openstack/glance-df7d-account-create-update-pbhhl"
Mar 08 00:51:42.363186 master-0 kubenswrapper[23041]: I0308 00:51:42.363112 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85b9a245-4ef0-43b9-9bf9-70c4609fda33-operator-scripts\") pod \"glance-df7d-account-create-update-pbhhl\" (UID: \"85b9a245-4ef0-43b9-9bf9-70c4609fda33\") " pod="openstack/glance-df7d-account-create-update-pbhhl"
Mar 08 00:51:42.450996 master-0 kubenswrapper[23041]: I0308 00:51:42.450824 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-4n48d"
Mar 08 00:51:42.465953 master-0 kubenswrapper[23041]: I0308 00:51:42.465884 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz2w4\" (UniqueName: \"kubernetes.io/projected/85b9a245-4ef0-43b9-9bf9-70c4609fda33-kube-api-access-wz2w4\") pod \"glance-df7d-account-create-update-pbhhl\" (UID: \"85b9a245-4ef0-43b9-9bf9-70c4609fda33\") " pod="openstack/glance-df7d-account-create-update-pbhhl"
Mar 08 00:51:42.466157 master-0 kubenswrapper[23041]: I0308 00:51:42.466009 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85b9a245-4ef0-43b9-9bf9-70c4609fda33-operator-scripts\") pod \"glance-df7d-account-create-update-pbhhl\" (UID: \"85b9a245-4ef0-43b9-9bf9-70c4609fda33\") " pod="openstack/glance-df7d-account-create-update-pbhhl"
Mar 08 00:51:42.466932 master-0 kubenswrapper[23041]: I0308 00:51:42.466900 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85b9a245-4ef0-43b9-9bf9-70c4609fda33-operator-scripts\") pod \"glance-df7d-account-create-update-pbhhl\" (UID: \"85b9a245-4ef0-43b9-9bf9-70c4609fda33\") " pod="openstack/glance-df7d-account-create-update-pbhhl"
Mar 08 00:51:42.496482 master-0 kubenswrapper[23041]: I0308 00:51:42.494971 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz2w4\" (UniqueName: \"kubernetes.io/projected/85b9a245-4ef0-43b9-9bf9-70c4609fda33-kube-api-access-wz2w4\") pod \"glance-df7d-account-create-update-pbhhl\" (UID: \"85b9a245-4ef0-43b9-9bf9-70c4609fda33\") " pod="openstack/glance-df7d-account-create-update-pbhhl"
Mar 08 00:51:42.549134 master-0 kubenswrapper[23041]: I0308 00:51:42.549068 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-df7d-account-create-update-pbhhl"
Mar 08 00:51:43.865736 master-0 kubenswrapper[23041]: W0308 00:51:43.864289 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod453dbbfd_6893_4826_92e9_8aaa7987b743.slice/crio-703226af248e4ef9415472943a60acf8b1d70abf563c0a5df3f1a58468db237a WatchSource:0}: Error finding container 703226af248e4ef9415472943a60acf8b1d70abf563c0a5df3f1a58468db237a: Status 404 returned error can't find the container with id 703226af248e4ef9415472943a60acf8b1d70abf563c0a5df3f1a58468db237a
Mar 08 00:51:43.903022 master-0 kubenswrapper[23041]: W0308 00:51:43.902803 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14da4175_aa45_42ff_ad82_8253a03c1697.slice/crio-14b5fb5e23e7e8c18616cc54d9f3507675e61ef3419c76c6561f622c0a774f41 WatchSource:0}: Error finding container 14b5fb5e23e7e8c18616cc54d9f3507675e61ef3419c76c6561f622c0a774f41: Status 404 returned error can't find the container with id 14b5fb5e23e7e8c18616cc54d9f3507675e61ef3419c76c6561f622c0a774f41
Mar 08 00:51:44.415342 master-0 kubenswrapper[23041]: I0308 00:51:44.415173 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-df7d-account-create-update-pbhhl"]
Mar 08 00:51:44.437483 master-0 kubenswrapper[23041]: I0308 00:51:44.437416 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8d80-account-create-update-dxjzn" event={"ID":"14da4175-aa45-42ff-ad82-8253a03c1697","Type":"ContainerStarted","Data":"1a83266a3676513fdbf7bb77b177696bece7737b89266bde3e3821d8350c4cfe"}
Mar 08 00:51:44.437577 master-0 kubenswrapper[23041]: I0308 00:51:44.437493 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8d80-account-create-update-dxjzn" event={"ID":"14da4175-aa45-42ff-ad82-8253a03c1697","Type":"ContainerStarted","Data":"14b5fb5e23e7e8c18616cc54d9f3507675e61ef3419c76c6561f622c0a774f41"}
Mar 08 00:51:44.438718 master-0 kubenswrapper[23041]: I0308 00:51:44.438671 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-gl66q" event={"ID":"31b5c8ab-cbab-4c6e-a021-3ba2895102bb","Type":"ContainerStarted","Data":"cba1735d0381e015df09756a0d0036f53eec2145b57bb1135fd353208f6d7bea"}
Mar 08 00:51:44.441064 master-0 kubenswrapper[23041]: I0308 00:51:44.441008 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-bbghv" event={"ID":"453dbbfd-6893-4826-92e9-8aaa7987b743","Type":"ContainerStarted","Data":"cdee05a1d6e89d32e93d16e518aa4e44677c016e6e54bf79abb0ea63d6d3bab7"}
Mar 08 00:51:44.441144 master-0 kubenswrapper[23041]: I0308 00:51:44.441069 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-bbghv" event={"ID":"453dbbfd-6893-4826-92e9-8aaa7987b743","Type":"ContainerStarted","Data":"703226af248e4ef9415472943a60acf8b1d70abf563c0a5df3f1a58468db237a"}
Mar 08 00:51:44.446338 master-0 kubenswrapper[23041]: I0308 00:51:44.446292 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-df7d-account-create-update-pbhhl" event={"ID":"85b9a245-4ef0-43b9-9bf9-70c4609fda33","Type":"ContainerStarted","Data":"2d72e2607f41482ffc59623a4c97a03f8a0b9586aff786aee0b50e94b9fb7731"}
Mar 08 00:51:44.472265 master-0 kubenswrapper[23041]: I0308 00:51:44.472131 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-8d80-account-create-update-dxjzn" podStartSLOduration=4.472110185 podStartE2EDuration="4.472110185s" podCreationTimestamp="2026-03-08 00:51:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:51:44.46945111 +0000 UTC m=+1209.942287684" watchObservedRunningTime="2026-03-08 00:51:44.472110185 +0000 UTC m=+1209.944946749"
Mar 08 00:51:44.498856 master-0 kubenswrapper[23041]: I0308 00:51:44.498755 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-bbghv" podStartSLOduration=4.498695724 podStartE2EDuration="4.498695724s" podCreationTimestamp="2026-03-08 00:51:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:51:44.496233684 +0000 UTC m=+1209.969070248" watchObservedRunningTime="2026-03-08 00:51:44.498695724 +0000 UTC m=+1209.971532278"
Mar 08 00:51:44.519646 master-0 kubenswrapper[23041]: I0308 00:51:44.519560 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-gl66q" podStartSLOduration=2.578271109 podStartE2EDuration="8.519541063s" podCreationTimestamp="2026-03-08 00:51:36 +0000 UTC" firstStartedPulling="2026-03-08 00:51:38.047606602 +0000 UTC m=+1203.520443156" lastFinishedPulling="2026-03-08 00:51:43.988876556 +0000 UTC m=+1209.461713110" observedRunningTime="2026-03-08 00:51:44.513485665 +0000 UTC m=+1209.986322239" watchObservedRunningTime="2026-03-08 00:51:44.519541063 +0000 UTC m=+1209.992377617"
Mar 08 00:51:44.570323 master-0 kubenswrapper[23041]: I0308 00:51:44.570264 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-4n48d"]
Mar 08 00:51:44.575805 master-0 kubenswrapper[23041]: W0308 00:51:44.575754 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15de7491_f5c0_4ac3_b07b_6de4eac70ade.slice/crio-c69659774819eff69f8cd44b3091e69d923800b7bce1e973dbfad11d728b8c9d WatchSource:0}: Error finding container c69659774819eff69f8cd44b3091e69d923800b7bce1e973dbfad11d728b8c9d: Status 404 returned error can't find the container with id c69659774819eff69f8cd44b3091e69d923800b7bce1e973dbfad11d728b8c9d
Mar 08 00:51:44.929938 master-0 kubenswrapper[23041]: I0308 00:51:44.929819 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-bm6p5"]
Mar 08 00:51:44.935498 master-0 kubenswrapper[23041]: I0308 00:51:44.935430 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bm6p5"
Mar 08 00:51:44.940464 master-0 kubenswrapper[23041]: I0308 00:51:44.940394 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-bm6p5"]
Mar 08 00:51:44.940649 master-0 kubenswrapper[23041]: I0308 00:51:44.940572 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Mar 08 00:51:44.958135 master-0 kubenswrapper[23041]: I0308 00:51:44.958067 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frcvx\" (UniqueName: \"kubernetes.io/projected/95cdca6f-aaad-46fe-955b-bff59046b2d3-kube-api-access-frcvx\") pod \"root-account-create-update-bm6p5\" (UID: \"95cdca6f-aaad-46fe-955b-bff59046b2d3\") " pod="openstack/root-account-create-update-bm6p5"
Mar 08 00:51:44.958344 master-0 kubenswrapper[23041]: I0308 00:51:44.958163 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95cdca6f-aaad-46fe-955b-bff59046b2d3-operator-scripts\") pod \"root-account-create-update-bm6p5\" (UID: \"95cdca6f-aaad-46fe-955b-bff59046b2d3\") " pod="openstack/root-account-create-update-bm6p5"
Mar 08 00:51:45.059541 master-0 kubenswrapper[23041]: I0308 00:51:45.059481 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frcvx\" (UniqueName: \"kubernetes.io/projected/95cdca6f-aaad-46fe-955b-bff59046b2d3-kube-api-access-frcvx\") pod \"root-account-create-update-bm6p5\" (UID: \"95cdca6f-aaad-46fe-955b-bff59046b2d3\") " pod="openstack/root-account-create-update-bm6p5"
Mar 08 00:51:45.059782 master-0 kubenswrapper[23041]: I0308 00:51:45.059562 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95cdca6f-aaad-46fe-955b-bff59046b2d3-operator-scripts\") pod \"root-account-create-update-bm6p5\" (UID: \"95cdca6f-aaad-46fe-955b-bff59046b2d3\") " pod="openstack/root-account-create-update-bm6p5"
Mar 08 00:51:45.060382 master-0 kubenswrapper[23041]: I0308 00:51:45.060354 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95cdca6f-aaad-46fe-955b-bff59046b2d3-operator-scripts\") pod \"root-account-create-update-bm6p5\" (UID: \"95cdca6f-aaad-46fe-955b-bff59046b2d3\") " pod="openstack/root-account-create-update-bm6p5"
Mar 08 00:51:45.079525 master-0 kubenswrapper[23041]: I0308 00:51:45.076308 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frcvx\" (UniqueName: \"kubernetes.io/projected/95cdca6f-aaad-46fe-955b-bff59046b2d3-kube-api-access-frcvx\") pod \"root-account-create-update-bm6p5\" (UID: \"95cdca6f-aaad-46fe-955b-bff59046b2d3\") " pod="openstack/root-account-create-update-bm6p5"
Mar 08 00:51:45.267669 master-0 kubenswrapper[23041]: I0308 00:51:45.267610 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bm6p5"
Mar 08 00:51:45.468469 master-0 kubenswrapper[23041]: I0308 00:51:45.468314 23041 generic.go:334] "Generic (PLEG): container finished" podID="15de7491-f5c0-4ac3-b07b-6de4eac70ade" containerID="764d90934ea63856db19c5c9d69d2a73211ad9beaa4a378349a4377309b94b10" exitCode=0
Mar 08 00:51:45.468688 master-0 kubenswrapper[23041]: I0308 00:51:45.468500 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4n48d" event={"ID":"15de7491-f5c0-4ac3-b07b-6de4eac70ade","Type":"ContainerDied","Data":"764d90934ea63856db19c5c9d69d2a73211ad9beaa4a378349a4377309b94b10"}
Mar 08 00:51:45.468688 master-0 kubenswrapper[23041]: I0308 00:51:45.468557 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4n48d" event={"ID":"15de7491-f5c0-4ac3-b07b-6de4eac70ade","Type":"ContainerStarted","Data":"c69659774819eff69f8cd44b3091e69d923800b7bce1e973dbfad11d728b8c9d"}
Mar 08 00:51:45.477984 master-0 kubenswrapper[23041]: I0308 00:51:45.476258 23041 generic.go:334] "Generic (PLEG): container finished" podID="14da4175-aa45-42ff-ad82-8253a03c1697" containerID="1a83266a3676513fdbf7bb77b177696bece7737b89266bde3e3821d8350c4cfe" exitCode=0
Mar 08 00:51:45.477984 master-0 kubenswrapper[23041]: I0308 00:51:45.476426 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8d80-account-create-update-dxjzn" event={"ID":"14da4175-aa45-42ff-ad82-8253a03c1697","Type":"ContainerDied","Data":"1a83266a3676513fdbf7bb77b177696bece7737b89266bde3e3821d8350c4cfe"}
Mar 08 00:51:45.477984 master-0 kubenswrapper[23041]: I0308 00:51:45.477933 23041 generic.go:334] "Generic (PLEG): container finished" podID="453dbbfd-6893-4826-92e9-8aaa7987b743" containerID="cdee05a1d6e89d32e93d16e518aa4e44677c016e6e54bf79abb0ea63d6d3bab7" exitCode=0
Mar 08 00:51:45.477984 master-0 kubenswrapper[23041]: I0308 00:51:45.477995 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openstack/placement-db-create-bbghv" event={"ID":"453dbbfd-6893-4826-92e9-8aaa7987b743","Type":"ContainerDied","Data":"cdee05a1d6e89d32e93d16e518aa4e44677c016e6e54bf79abb0ea63d6d3bab7"} Mar 08 00:51:45.479366 master-0 kubenswrapper[23041]: I0308 00:51:45.479081 23041 generic.go:334] "Generic (PLEG): container finished" podID="85b9a245-4ef0-43b9-9bf9-70c4609fda33" containerID="49c67861ac1519e2baedb99721dd55678d3c6c75af7bd3f9557dbffad1bd428b" exitCode=0 Mar 08 00:51:45.480426 master-0 kubenswrapper[23041]: I0308 00:51:45.479960 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-df7d-account-create-update-pbhhl" event={"ID":"85b9a245-4ef0-43b9-9bf9-70c4609fda33","Type":"ContainerDied","Data":"49c67861ac1519e2baedb99721dd55678d3c6c75af7bd3f9557dbffad1bd428b"} Mar 08 00:51:45.721146 master-0 kubenswrapper[23041]: I0308 00:51:45.721059 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-bm6p5"] Mar 08 00:51:45.726139 master-0 kubenswrapper[23041]: W0308 00:51:45.725993 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95cdca6f_aaad_46fe_955b_bff59046b2d3.slice/crio-e46142d5cd6f88f630559574423ff95e7de4d1026929f97b44f7f53fd64812ea WatchSource:0}: Error finding container e46142d5cd6f88f630559574423ff95e7de4d1026929f97b44f7f53fd64812ea: Status 404 returned error can't find the container with id e46142d5cd6f88f630559574423ff95e7de4d1026929f97b44f7f53fd64812ea Mar 08 00:51:45.964123 master-0 kubenswrapper[23041]: I0308 00:51:45.964068 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b9cd4dcf7-dmhrm" Mar 08 00:51:46.042978 master-0 kubenswrapper[23041]: I0308 00:51:46.042432 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f75dd7cd9-k7sq4"] Mar 08 00:51:46.042978 master-0 kubenswrapper[23041]: I0308 00:51:46.042696 23041 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6f75dd7cd9-k7sq4" podUID="c4107c15-d018-41f3-ba8d-52fec2bf1397" containerName="dnsmasq-dns" containerID="cri-o://43fb29ba9b85458fb594b814bc115f22ce4a72799e56b1d8895270734c06d9fa" gracePeriod=10 Mar 08 00:51:46.495242 master-0 kubenswrapper[23041]: I0308 00:51:46.494843 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bm6p5" event={"ID":"95cdca6f-aaad-46fe-955b-bff59046b2d3","Type":"ContainerDied","Data":"277a8c4533f1025d947afcc3d79ecf0c0976c5908aa9ea64d7ca29cbdcceeffb"} Mar 08 00:51:46.499243 master-0 kubenswrapper[23041]: I0308 00:51:46.498330 23041 generic.go:334] "Generic (PLEG): container finished" podID="95cdca6f-aaad-46fe-955b-bff59046b2d3" containerID="277a8c4533f1025d947afcc3d79ecf0c0976c5908aa9ea64d7ca29cbdcceeffb" exitCode=0 Mar 08 00:51:46.499243 master-0 kubenswrapper[23041]: I0308 00:51:46.498561 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bm6p5" event={"ID":"95cdca6f-aaad-46fe-955b-bff59046b2d3","Type":"ContainerStarted","Data":"e46142d5cd6f88f630559574423ff95e7de4d1026929f97b44f7f53fd64812ea"} Mar 08 00:51:46.500909 master-0 kubenswrapper[23041]: I0308 00:51:46.500726 23041 generic.go:334] "Generic (PLEG): container finished" podID="c4107c15-d018-41f3-ba8d-52fec2bf1397" containerID="43fb29ba9b85458fb594b814bc115f22ce4a72799e56b1d8895270734c06d9fa" exitCode=0 Mar 08 00:51:46.500909 master-0 kubenswrapper[23041]: I0308 00:51:46.500876 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f75dd7cd9-k7sq4" event={"ID":"c4107c15-d018-41f3-ba8d-52fec2bf1397","Type":"ContainerDied","Data":"43fb29ba9b85458fb594b814bc115f22ce4a72799e56b1d8895270734c06d9fa"} Mar 08 00:51:46.760883 master-0 kubenswrapper[23041]: I0308 00:51:46.760829 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f75dd7cd9-k7sq4" Mar 08 00:51:46.817993 master-0 kubenswrapper[23041]: I0308 00:51:46.817897 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzbhs\" (UniqueName: \"kubernetes.io/projected/c4107c15-d018-41f3-ba8d-52fec2bf1397-kube-api-access-qzbhs\") pod \"c4107c15-d018-41f3-ba8d-52fec2bf1397\" (UID: \"c4107c15-d018-41f3-ba8d-52fec2bf1397\") " Mar 08 00:51:46.819268 master-0 kubenswrapper[23041]: I0308 00:51:46.818398 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4107c15-d018-41f3-ba8d-52fec2bf1397-dns-svc\") pod \"c4107c15-d018-41f3-ba8d-52fec2bf1397\" (UID: \"c4107c15-d018-41f3-ba8d-52fec2bf1397\") " Mar 08 00:51:46.819268 master-0 kubenswrapper[23041]: I0308 00:51:46.818476 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4107c15-d018-41f3-ba8d-52fec2bf1397-config\") pod \"c4107c15-d018-41f3-ba8d-52fec2bf1397\" (UID: \"c4107c15-d018-41f3-ba8d-52fec2bf1397\") " Mar 08 00:51:46.829351 master-0 kubenswrapper[23041]: I0308 00:51:46.829280 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4107c15-d018-41f3-ba8d-52fec2bf1397-kube-api-access-qzbhs" (OuterVolumeSpecName: "kube-api-access-qzbhs") pod "c4107c15-d018-41f3-ba8d-52fec2bf1397" (UID: "c4107c15-d018-41f3-ba8d-52fec2bf1397"). InnerVolumeSpecName "kube-api-access-qzbhs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:51:46.897678 master-0 kubenswrapper[23041]: I0308 00:51:46.897552 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4107c15-d018-41f3-ba8d-52fec2bf1397-config" (OuterVolumeSpecName: "config") pod "c4107c15-d018-41f3-ba8d-52fec2bf1397" (UID: "c4107c15-d018-41f3-ba8d-52fec2bf1397"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:51:46.899760 master-0 kubenswrapper[23041]: I0308 00:51:46.899426 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4107c15-d018-41f3-ba8d-52fec2bf1397-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c4107c15-d018-41f3-ba8d-52fec2bf1397" (UID: "c4107c15-d018-41f3-ba8d-52fec2bf1397"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:51:46.924169 master-0 kubenswrapper[23041]: I0308 00:51:46.924102 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzbhs\" (UniqueName: \"kubernetes.io/projected/c4107c15-d018-41f3-ba8d-52fec2bf1397-kube-api-access-qzbhs\") on node \"master-0\" DevicePath \"\"" Mar 08 00:51:46.924169 master-0 kubenswrapper[23041]: I0308 00:51:46.924164 23041 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4107c15-d018-41f3-ba8d-52fec2bf1397-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 08 00:51:46.924169 master-0 kubenswrapper[23041]: I0308 00:51:46.924178 23041 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4107c15-d018-41f3-ba8d-52fec2bf1397-config\") on node \"master-0\" DevicePath \"\"" Mar 08 00:51:47.085684 master-0 kubenswrapper[23041]: I0308 00:51:47.085130 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-4n48d" Mar 08 00:51:47.129365 master-0 kubenswrapper[23041]: I0308 00:51:47.129179 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9plbg\" (UniqueName: \"kubernetes.io/projected/15de7491-f5c0-4ac3-b07b-6de4eac70ade-kube-api-access-9plbg\") pod \"15de7491-f5c0-4ac3-b07b-6de4eac70ade\" (UID: \"15de7491-f5c0-4ac3-b07b-6de4eac70ade\") " Mar 08 00:51:47.129640 master-0 kubenswrapper[23041]: I0308 00:51:47.129475 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15de7491-f5c0-4ac3-b07b-6de4eac70ade-operator-scripts\") pod \"15de7491-f5c0-4ac3-b07b-6de4eac70ade\" (UID: \"15de7491-f5c0-4ac3-b07b-6de4eac70ade\") " Mar 08 00:51:47.131036 master-0 kubenswrapper[23041]: I0308 00:51:47.130986 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15de7491-f5c0-4ac3-b07b-6de4eac70ade-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "15de7491-f5c0-4ac3-b07b-6de4eac70ade" (UID: "15de7491-f5c0-4ac3-b07b-6de4eac70ade"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:51:47.135954 master-0 kubenswrapper[23041]: I0308 00:51:47.135912 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15de7491-f5c0-4ac3-b07b-6de4eac70ade-kube-api-access-9plbg" (OuterVolumeSpecName: "kube-api-access-9plbg") pod "15de7491-f5c0-4ac3-b07b-6de4eac70ade" (UID: "15de7491-f5c0-4ac3-b07b-6de4eac70ade"). InnerVolumeSpecName "kube-api-access-9plbg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:51:47.234730 master-0 kubenswrapper[23041]: I0308 00:51:47.233007 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9plbg\" (UniqueName: \"kubernetes.io/projected/15de7491-f5c0-4ac3-b07b-6de4eac70ade-kube-api-access-9plbg\") on node \"master-0\" DevicePath \"\"" Mar 08 00:51:47.234730 master-0 kubenswrapper[23041]: I0308 00:51:47.233082 23041 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15de7491-f5c0-4ac3-b07b-6de4eac70ade-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 00:51:47.495430 master-0 kubenswrapper[23041]: I0308 00:51:47.495359 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-df7d-account-create-update-pbhhl" Mar 08 00:51:47.503730 master-0 kubenswrapper[23041]: I0308 00:51:47.503684 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-bbghv" Mar 08 00:51:47.521439 master-0 kubenswrapper[23041]: I0308 00:51:47.521401 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8d80-account-create-update-dxjzn" Mar 08 00:51:47.522606 master-0 kubenswrapper[23041]: I0308 00:51:47.522503 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-bbghv" event={"ID":"453dbbfd-6893-4826-92e9-8aaa7987b743","Type":"ContainerDied","Data":"703226af248e4ef9415472943a60acf8b1d70abf563c0a5df3f1a58468db237a"} Mar 08 00:51:47.523891 master-0 kubenswrapper[23041]: I0308 00:51:47.522832 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="703226af248e4ef9415472943a60acf8b1d70abf563c0a5df3f1a58468db237a" Mar 08 00:51:47.524104 master-0 kubenswrapper[23041]: I0308 00:51:47.524063 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-bbghv" Mar 08 00:51:47.533247 master-0 kubenswrapper[23041]: I0308 00:51:47.532949 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-df7d-account-create-update-pbhhl" event={"ID":"85b9a245-4ef0-43b9-9bf9-70c4609fda33","Type":"ContainerDied","Data":"2d72e2607f41482ffc59623a4c97a03f8a0b9586aff786aee0b50e94b9fb7731"} Mar 08 00:51:47.533247 master-0 kubenswrapper[23041]: I0308 00:51:47.532997 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d72e2607f41482ffc59623a4c97a03f8a0b9586aff786aee0b50e94b9fb7731" Mar 08 00:51:47.533247 master-0 kubenswrapper[23041]: I0308 00:51:47.533057 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-df7d-account-create-update-pbhhl" Mar 08 00:51:47.536649 master-0 kubenswrapper[23041]: I0308 00:51:47.536618 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-4n48d" Mar 08 00:51:47.541858 master-0 kubenswrapper[23041]: I0308 00:51:47.536632 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4n48d" event={"ID":"15de7491-f5c0-4ac3-b07b-6de4eac70ade","Type":"ContainerDied","Data":"c69659774819eff69f8cd44b3091e69d923800b7bce1e973dbfad11d728b8c9d"} Mar 08 00:51:47.541966 master-0 kubenswrapper[23041]: I0308 00:51:47.541865 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c69659774819eff69f8cd44b3091e69d923800b7bce1e973dbfad11d728b8c9d" Mar 08 00:51:47.541966 master-0 kubenswrapper[23041]: I0308 00:51:47.538928 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14da4175-aa45-42ff-ad82-8253a03c1697-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "14da4175-aa45-42ff-ad82-8253a03c1697" (UID: "14da4175-aa45-42ff-ad82-8253a03c1697"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:51:47.541966 master-0 kubenswrapper[23041]: I0308 00:51:47.537238 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14da4175-aa45-42ff-ad82-8253a03c1697-operator-scripts\") pod \"14da4175-aa45-42ff-ad82-8253a03c1697\" (UID: \"14da4175-aa45-42ff-ad82-8253a03c1697\") " Mar 08 00:51:47.542186 master-0 kubenswrapper[23041]: I0308 00:51:47.542128 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/453dbbfd-6893-4826-92e9-8aaa7987b743-operator-scripts\") pod \"453dbbfd-6893-4826-92e9-8aaa7987b743\" (UID: \"453dbbfd-6893-4826-92e9-8aaa7987b743\") " Mar 08 00:51:47.542186 master-0 kubenswrapper[23041]: I0308 00:51:47.542173 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85b9a245-4ef0-43b9-9bf9-70c4609fda33-operator-scripts\") pod \"85b9a245-4ef0-43b9-9bf9-70c4609fda33\" (UID: \"85b9a245-4ef0-43b9-9bf9-70c4609fda33\") " Mar 08 00:51:47.542300 master-0 kubenswrapper[23041]: I0308 00:51:47.542260 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tpj4\" (UniqueName: \"kubernetes.io/projected/453dbbfd-6893-4826-92e9-8aaa7987b743-kube-api-access-7tpj4\") pod \"453dbbfd-6893-4826-92e9-8aaa7987b743\" (UID: \"453dbbfd-6893-4826-92e9-8aaa7987b743\") " Mar 08 00:51:47.542963 master-0 kubenswrapper[23041]: I0308 00:51:47.542939 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wz2w4\" (UniqueName: \"kubernetes.io/projected/85b9a245-4ef0-43b9-9bf9-70c4609fda33-kube-api-access-wz2w4\") pod \"85b9a245-4ef0-43b9-9bf9-70c4609fda33\" (UID: \"85b9a245-4ef0-43b9-9bf9-70c4609fda33\") " Mar 08 00:51:47.543048 
master-0 kubenswrapper[23041]: I0308 00:51:47.543000 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6znn\" (UniqueName: \"kubernetes.io/projected/14da4175-aa45-42ff-ad82-8253a03c1697-kube-api-access-t6znn\") pod \"14da4175-aa45-42ff-ad82-8253a03c1697\" (UID: \"14da4175-aa45-42ff-ad82-8253a03c1697\") " Mar 08 00:51:47.544056 master-0 kubenswrapper[23041]: I0308 00:51:47.543956 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85b9a245-4ef0-43b9-9bf9-70c4609fda33-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "85b9a245-4ef0-43b9-9bf9-70c4609fda33" (UID: "85b9a245-4ef0-43b9-9bf9-70c4609fda33"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:51:47.544905 master-0 kubenswrapper[23041]: I0308 00:51:47.544826 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/453dbbfd-6893-4826-92e9-8aaa7987b743-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "453dbbfd-6893-4826-92e9-8aaa7987b743" (UID: "453dbbfd-6893-4826-92e9-8aaa7987b743"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:51:47.546587 master-0 kubenswrapper[23041]: I0308 00:51:47.546316 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f75dd7cd9-k7sq4" event={"ID":"c4107c15-d018-41f3-ba8d-52fec2bf1397","Type":"ContainerDied","Data":"2aa60f68c2cb565899f7b39a893e2a1f663df64e08515cd54b1f23734028b1b4"} Mar 08 00:51:47.546587 master-0 kubenswrapper[23041]: I0308 00:51:47.546371 23041 scope.go:117] "RemoveContainer" containerID="43fb29ba9b85458fb594b814bc115f22ce4a72799e56b1d8895270734c06d9fa" Mar 08 00:51:47.546587 master-0 kubenswrapper[23041]: I0308 00:51:47.546489 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f75dd7cd9-k7sq4" Mar 08 00:51:47.546737 master-0 kubenswrapper[23041]: I0308 00:51:47.546691 23041 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14da4175-aa45-42ff-ad82-8253a03c1697-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 00:51:47.547250 master-0 kubenswrapper[23041]: I0308 00:51:47.547227 23041 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85b9a245-4ef0-43b9-9bf9-70c4609fda33-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 00:51:47.547869 master-0 kubenswrapper[23041]: I0308 00:51:47.547723 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14da4175-aa45-42ff-ad82-8253a03c1697-kube-api-access-t6znn" (OuterVolumeSpecName: "kube-api-access-t6znn") pod "14da4175-aa45-42ff-ad82-8253a03c1697" (UID: "14da4175-aa45-42ff-ad82-8253a03c1697"). InnerVolumeSpecName "kube-api-access-t6znn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:51:47.551020 master-0 kubenswrapper[23041]: I0308 00:51:47.550948 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/453dbbfd-6893-4826-92e9-8aaa7987b743-kube-api-access-7tpj4" (OuterVolumeSpecName: "kube-api-access-7tpj4") pod "453dbbfd-6893-4826-92e9-8aaa7987b743" (UID: "453dbbfd-6893-4826-92e9-8aaa7987b743"). InnerVolumeSpecName "kube-api-access-7tpj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:51:47.555923 master-0 kubenswrapper[23041]: I0308 00:51:47.555699 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-8d80-account-create-update-dxjzn" Mar 08 00:51:47.557403 master-0 kubenswrapper[23041]: I0308 00:51:47.557135 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85b9a245-4ef0-43b9-9bf9-70c4609fda33-kube-api-access-wz2w4" (OuterVolumeSpecName: "kube-api-access-wz2w4") pod "85b9a245-4ef0-43b9-9bf9-70c4609fda33" (UID: "85b9a245-4ef0-43b9-9bf9-70c4609fda33"). InnerVolumeSpecName "kube-api-access-wz2w4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:51:47.566541 master-0 kubenswrapper[23041]: I0308 00:51:47.566473 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8d80-account-create-update-dxjzn" event={"ID":"14da4175-aa45-42ff-ad82-8253a03c1697","Type":"ContainerDied","Data":"14b5fb5e23e7e8c18616cc54d9f3507675e61ef3419c76c6561f622c0a774f41"} Mar 08 00:51:47.566742 master-0 kubenswrapper[23041]: I0308 00:51:47.566554 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14b5fb5e23e7e8c18616cc54d9f3507675e61ef3419c76c6561f622c0a774f41" Mar 08 00:51:47.580617 master-0 kubenswrapper[23041]: I0308 00:51:47.580440 23041 scope.go:117] "RemoveContainer" containerID="5babef6bc80766e64966245df29a902f1a6f8cd2d7a965f252c4a71b12f89d21" Mar 08 00:51:47.608343 master-0 kubenswrapper[23041]: I0308 00:51:47.608283 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f75dd7cd9-k7sq4"] Mar 08 00:51:47.618773 master-0 kubenswrapper[23041]: I0308 00:51:47.618711 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f75dd7cd9-k7sq4"] Mar 08 00:51:47.649228 master-0 kubenswrapper[23041]: I0308 00:51:47.648628 23041 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/453dbbfd-6893-4826-92e9-8aaa7987b743-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 00:51:47.649228 
master-0 kubenswrapper[23041]: I0308 00:51:47.648669 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tpj4\" (UniqueName: \"kubernetes.io/projected/453dbbfd-6893-4826-92e9-8aaa7987b743-kube-api-access-7tpj4\") on node \"master-0\" DevicePath \"\"" Mar 08 00:51:47.649228 master-0 kubenswrapper[23041]: I0308 00:51:47.648683 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wz2w4\" (UniqueName: \"kubernetes.io/projected/85b9a245-4ef0-43b9-9bf9-70c4609fda33-kube-api-access-wz2w4\") on node \"master-0\" DevicePath \"\"" Mar 08 00:51:47.649228 master-0 kubenswrapper[23041]: I0308 00:51:47.648692 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6znn\" (UniqueName: \"kubernetes.io/projected/14da4175-aa45-42ff-ad82-8253a03c1697-kube-api-access-t6znn\") on node \"master-0\" DevicePath \"\"" Mar 08 00:51:48.052239 master-0 kubenswrapper[23041]: I0308 00:51:48.052163 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-bm6p5" Mar 08 00:51:48.056352 master-0 kubenswrapper[23041]: I0308 00:51:48.056308 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95cdca6f-aaad-46fe-955b-bff59046b2d3-operator-scripts\") pod \"95cdca6f-aaad-46fe-955b-bff59046b2d3\" (UID: \"95cdca6f-aaad-46fe-955b-bff59046b2d3\") " Mar 08 00:51:48.056545 master-0 kubenswrapper[23041]: I0308 00:51:48.056464 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frcvx\" (UniqueName: \"kubernetes.io/projected/95cdca6f-aaad-46fe-955b-bff59046b2d3-kube-api-access-frcvx\") pod \"95cdca6f-aaad-46fe-955b-bff59046b2d3\" (UID: \"95cdca6f-aaad-46fe-955b-bff59046b2d3\") " Mar 08 00:51:48.057411 master-0 kubenswrapper[23041]: I0308 00:51:48.057366 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95cdca6f-aaad-46fe-955b-bff59046b2d3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "95cdca6f-aaad-46fe-955b-bff59046b2d3" (UID: "95cdca6f-aaad-46fe-955b-bff59046b2d3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:51:48.060475 master-0 kubenswrapper[23041]: I0308 00:51:48.060407 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95cdca6f-aaad-46fe-955b-bff59046b2d3-kube-api-access-frcvx" (OuterVolumeSpecName: "kube-api-access-frcvx") pod "95cdca6f-aaad-46fe-955b-bff59046b2d3" (UID: "95cdca6f-aaad-46fe-955b-bff59046b2d3"). InnerVolumeSpecName "kube-api-access-frcvx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:51:48.158147 master-0 kubenswrapper[23041]: I0308 00:51:48.158025 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frcvx\" (UniqueName: \"kubernetes.io/projected/95cdca6f-aaad-46fe-955b-bff59046b2d3-kube-api-access-frcvx\") on node \"master-0\" DevicePath \"\"" Mar 08 00:51:48.158147 master-0 kubenswrapper[23041]: I0308 00:51:48.158064 23041 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95cdca6f-aaad-46fe-955b-bff59046b2d3-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 00:51:48.585813 master-0 kubenswrapper[23041]: I0308 00:51:48.584935 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bm6p5" event={"ID":"95cdca6f-aaad-46fe-955b-bff59046b2d3","Type":"ContainerDied","Data":"e46142d5cd6f88f630559574423ff95e7de4d1026929f97b44f7f53fd64812ea"} Mar 08 00:51:48.585813 master-0 kubenswrapper[23041]: I0308 00:51:48.585800 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e46142d5cd6f88f630559574423ff95e7de4d1026929f97b44f7f53fd64812ea" Mar 08 00:51:48.585813 master-0 kubenswrapper[23041]: I0308 00:51:48.585011 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-bm6p5" Mar 08 00:51:48.822521 master-0 kubenswrapper[23041]: I0308 00:51:48.822483 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4107c15-d018-41f3-ba8d-52fec2bf1397" path="/var/lib/kubelet/pods/c4107c15-d018-41f3-ba8d-52fec2bf1397/volumes" Mar 08 00:51:50.241926 master-0 kubenswrapper[23041]: I0308 00:51:50.241825 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-cl2fb"] Mar 08 00:51:50.242582 master-0 kubenswrapper[23041]: E0308 00:51:50.242522 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15de7491-f5c0-4ac3-b07b-6de4eac70ade" containerName="mariadb-database-create" Mar 08 00:51:50.242582 master-0 kubenswrapper[23041]: I0308 00:51:50.242542 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="15de7491-f5c0-4ac3-b07b-6de4eac70ade" containerName="mariadb-database-create" Mar 08 00:51:50.242674 master-0 kubenswrapper[23041]: E0308 00:51:50.242593 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14da4175-aa45-42ff-ad82-8253a03c1697" containerName="mariadb-account-create-update" Mar 08 00:51:50.242674 master-0 kubenswrapper[23041]: I0308 00:51:50.242603 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="14da4175-aa45-42ff-ad82-8253a03c1697" containerName="mariadb-account-create-update" Mar 08 00:51:50.242674 master-0 kubenswrapper[23041]: E0308 00:51:50.242621 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4107c15-d018-41f3-ba8d-52fec2bf1397" containerName="init" Mar 08 00:51:50.242674 master-0 kubenswrapper[23041]: I0308 00:51:50.242630 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4107c15-d018-41f3-ba8d-52fec2bf1397" containerName="init" Mar 08 00:51:50.242674 master-0 kubenswrapper[23041]: E0308 00:51:50.242648 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4107c15-d018-41f3-ba8d-52fec2bf1397" 
containerName="dnsmasq-dns" Mar 08 00:51:50.242674 master-0 kubenswrapper[23041]: I0308 00:51:50.242657 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4107c15-d018-41f3-ba8d-52fec2bf1397" containerName="dnsmasq-dns" Mar 08 00:51:50.242674 master-0 kubenswrapper[23041]: E0308 00:51:50.242677 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="453dbbfd-6893-4826-92e9-8aaa7987b743" containerName="mariadb-database-create" Mar 08 00:51:50.242967 master-0 kubenswrapper[23041]: I0308 00:51:50.242686 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="453dbbfd-6893-4826-92e9-8aaa7987b743" containerName="mariadb-database-create" Mar 08 00:51:50.242967 master-0 kubenswrapper[23041]: E0308 00:51:50.242701 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95cdca6f-aaad-46fe-955b-bff59046b2d3" containerName="mariadb-account-create-update" Mar 08 00:51:50.242967 master-0 kubenswrapper[23041]: I0308 00:51:50.242710 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="95cdca6f-aaad-46fe-955b-bff59046b2d3" containerName="mariadb-account-create-update" Mar 08 00:51:50.242967 master-0 kubenswrapper[23041]: E0308 00:51:50.242722 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85b9a245-4ef0-43b9-9bf9-70c4609fda33" containerName="mariadb-account-create-update" Mar 08 00:51:50.242967 master-0 kubenswrapper[23041]: I0308 00:51:50.242730 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="85b9a245-4ef0-43b9-9bf9-70c4609fda33" containerName="mariadb-account-create-update" Mar 08 00:51:50.243158 master-0 kubenswrapper[23041]: I0308 00:51:50.242984 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="15de7491-f5c0-4ac3-b07b-6de4eac70ade" containerName="mariadb-database-create" Mar 08 00:51:50.243158 master-0 kubenswrapper[23041]: I0308 00:51:50.242999 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="453dbbfd-6893-4826-92e9-8aaa7987b743" 
containerName="mariadb-database-create" Mar 08 00:51:50.243158 master-0 kubenswrapper[23041]: I0308 00:51:50.243020 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="95cdca6f-aaad-46fe-955b-bff59046b2d3" containerName="mariadb-account-create-update" Mar 08 00:51:50.243158 master-0 kubenswrapper[23041]: I0308 00:51:50.243032 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4107c15-d018-41f3-ba8d-52fec2bf1397" containerName="dnsmasq-dns" Mar 08 00:51:50.243158 master-0 kubenswrapper[23041]: I0308 00:51:50.243057 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="14da4175-aa45-42ff-ad82-8253a03c1697" containerName="mariadb-account-create-update" Mar 08 00:51:50.243158 master-0 kubenswrapper[23041]: I0308 00:51:50.243069 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="85b9a245-4ef0-43b9-9bf9-70c4609fda33" containerName="mariadb-account-create-update" Mar 08 00:51:50.244028 master-0 kubenswrapper[23041]: I0308 00:51:50.243989 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-cl2fb" Mar 08 00:51:50.305526 master-0 kubenswrapper[23041]: I0308 00:51:50.305439 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/724cd646-a717-4778-82a6-9471c70e13c5-etc-swift\") pod \"swift-storage-0\" (UID: \"724cd646-a717-4778-82a6-9471c70e13c5\") " pod="openstack/swift-storage-0" Mar 08 00:51:50.305960 master-0 kubenswrapper[23041]: E0308 00:51:50.305916 23041 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 08 00:51:50.305960 master-0 kubenswrapper[23041]: E0308 00:51:50.305950 23041 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 08 00:51:50.306117 master-0 kubenswrapper[23041]: E0308 00:51:50.306018 23041 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/724cd646-a717-4778-82a6-9471c70e13c5-etc-swift podName:724cd646-a717-4778-82a6-9471c70e13c5 nodeName:}" failed. No retries permitted until 2026-03-08 00:52:06.305993067 +0000 UTC m=+1231.778829641 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/724cd646-a717-4778-82a6-9471c70e13c5-etc-swift") pod "swift-storage-0" (UID: "724cd646-a717-4778-82a6-9471c70e13c5") : configmap "swift-ring-files" not found Mar 08 00:51:50.407821 master-0 kubenswrapper[23041]: I0308 00:51:50.407740 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r85zc\" (UniqueName: \"kubernetes.io/projected/fed95fb2-570e-4b8a-9ddb-697e3a3606a8-kube-api-access-r85zc\") pod \"keystone-db-create-cl2fb\" (UID: \"fed95fb2-570e-4b8a-9ddb-697e3a3606a8\") " pod="openstack/keystone-db-create-cl2fb" Mar 08 00:51:50.408240 master-0 kubenswrapper[23041]: I0308 00:51:50.408156 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fed95fb2-570e-4b8a-9ddb-697e3a3606a8-operator-scripts\") pod \"keystone-db-create-cl2fb\" (UID: \"fed95fb2-570e-4b8a-9ddb-697e3a3606a8\") " pod="openstack/keystone-db-create-cl2fb" Mar 08 00:51:50.510058 master-0 kubenswrapper[23041]: I0308 00:51:50.509989 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fed95fb2-570e-4b8a-9ddb-697e3a3606a8-operator-scripts\") pod \"keystone-db-create-cl2fb\" (UID: \"fed95fb2-570e-4b8a-9ddb-697e3a3606a8\") " pod="openstack/keystone-db-create-cl2fb" Mar 08 00:51:50.510517 master-0 kubenswrapper[23041]: I0308 00:51:50.510484 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r85zc\" (UniqueName: \"kubernetes.io/projected/fed95fb2-570e-4b8a-9ddb-697e3a3606a8-kube-api-access-r85zc\") pod \"keystone-db-create-cl2fb\" (UID: \"fed95fb2-570e-4b8a-9ddb-697e3a3606a8\") " pod="openstack/keystone-db-create-cl2fb" Mar 08 00:51:50.511342 master-0 kubenswrapper[23041]: I0308 00:51:50.511284 23041 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fed95fb2-570e-4b8a-9ddb-697e3a3606a8-operator-scripts\") pod \"keystone-db-create-cl2fb\" (UID: \"fed95fb2-570e-4b8a-9ddb-697e3a3606a8\") " pod="openstack/keystone-db-create-cl2fb" Mar 08 00:51:50.565740 master-0 kubenswrapper[23041]: I0308 00:51:50.565685 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r85zc\" (UniqueName: \"kubernetes.io/projected/fed95fb2-570e-4b8a-9ddb-697e3a3606a8-kube-api-access-r85zc\") pod \"keystone-db-create-cl2fb\" (UID: \"fed95fb2-570e-4b8a-9ddb-697e3a3606a8\") " pod="openstack/keystone-db-create-cl2fb" Mar 08 00:51:50.597892 master-0 kubenswrapper[23041]: I0308 00:51:50.597275 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-cl2fb"] Mar 08 00:51:50.630975 master-0 kubenswrapper[23041]: I0308 00:51:50.630897 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-4837-account-create-update-9nldk"] Mar 08 00:51:50.633595 master-0 kubenswrapper[23041]: I0308 00:51:50.633553 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-4837-account-create-update-9nldk" Mar 08 00:51:50.636404 master-0 kubenswrapper[23041]: I0308 00:51:50.636355 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 08 00:51:50.641982 master-0 kubenswrapper[23041]: I0308 00:51:50.641893 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4837-account-create-update-9nldk"] Mar 08 00:51:50.818513 master-0 kubenswrapper[23041]: I0308 00:51:50.817838 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/240177ba-2f39-4ab2-a12c-a4c545a9fb1a-operator-scripts\") pod \"keystone-4837-account-create-update-9nldk\" (UID: \"240177ba-2f39-4ab2-a12c-a4c545a9fb1a\") " pod="openstack/keystone-4837-account-create-update-9nldk" Mar 08 00:51:50.818513 master-0 kubenswrapper[23041]: I0308 00:51:50.818138 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtllt\" (UniqueName: \"kubernetes.io/projected/240177ba-2f39-4ab2-a12c-a4c545a9fb1a-kube-api-access-dtllt\") pod \"keystone-4837-account-create-update-9nldk\" (UID: \"240177ba-2f39-4ab2-a12c-a4c545a9fb1a\") " pod="openstack/keystone-4837-account-create-update-9nldk" Mar 08 00:51:50.865675 master-0 kubenswrapper[23041]: I0308 00:51:50.865602 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-cl2fb" Mar 08 00:51:50.923328 master-0 kubenswrapper[23041]: I0308 00:51:50.920310 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/240177ba-2f39-4ab2-a12c-a4c545a9fb1a-operator-scripts\") pod \"keystone-4837-account-create-update-9nldk\" (UID: \"240177ba-2f39-4ab2-a12c-a4c545a9fb1a\") " pod="openstack/keystone-4837-account-create-update-9nldk" Mar 08 00:51:50.923328 master-0 kubenswrapper[23041]: I0308 00:51:50.920523 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtllt\" (UniqueName: \"kubernetes.io/projected/240177ba-2f39-4ab2-a12c-a4c545a9fb1a-kube-api-access-dtllt\") pod \"keystone-4837-account-create-update-9nldk\" (UID: \"240177ba-2f39-4ab2-a12c-a4c545a9fb1a\") " pod="openstack/keystone-4837-account-create-update-9nldk" Mar 08 00:51:50.923328 master-0 kubenswrapper[23041]: I0308 00:51:50.922579 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/240177ba-2f39-4ab2-a12c-a4c545a9fb1a-operator-scripts\") pod \"keystone-4837-account-create-update-9nldk\" (UID: \"240177ba-2f39-4ab2-a12c-a4c545a9fb1a\") " pod="openstack/keystone-4837-account-create-update-9nldk" Mar 08 00:51:50.939731 master-0 kubenswrapper[23041]: I0308 00:51:50.939683 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtllt\" (UniqueName: \"kubernetes.io/projected/240177ba-2f39-4ab2-a12c-a4c545a9fb1a-kube-api-access-dtllt\") pod \"keystone-4837-account-create-update-9nldk\" (UID: \"240177ba-2f39-4ab2-a12c-a4c545a9fb1a\") " pod="openstack/keystone-4837-account-create-update-9nldk" Mar 08 00:51:50.959750 master-0 kubenswrapper[23041]: I0308 00:51:50.959687 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-4837-account-create-update-9nldk" Mar 08 00:51:51.388375 master-0 kubenswrapper[23041]: I0308 00:51:51.385832 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-cl2fb"] Mar 08 00:51:51.514518 master-0 kubenswrapper[23041]: I0308 00:51:51.514259 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4837-account-create-update-9nldk"] Mar 08 00:51:51.619916 master-0 kubenswrapper[23041]: I0308 00:51:51.618908 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-cl2fb" event={"ID":"fed95fb2-570e-4b8a-9ddb-697e3a3606a8","Type":"ContainerStarted","Data":"fd3a0f11f9c83f168cd37aa9c13497cb51edcfc07076cd97a64e6e4dfd6f6624"} Mar 08 00:51:51.619916 master-0 kubenswrapper[23041]: I0308 00:51:51.618970 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-cl2fb" event={"ID":"fed95fb2-570e-4b8a-9ddb-697e3a3606a8","Type":"ContainerStarted","Data":"c730abf33befcbce5d249e14785bdca3598117e50a00328ff3f8d6125de890c8"} Mar 08 00:51:51.624827 master-0 kubenswrapper[23041]: I0308 00:51:51.624775 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4837-account-create-update-9nldk" event={"ID":"240177ba-2f39-4ab2-a12c-a4c545a9fb1a","Type":"ContainerStarted","Data":"f2ea8a9c7243441e89b0b436b3d9bf9a2f6ad89412be75b8fd35a23e60caad91"} Mar 08 00:51:51.646526 master-0 kubenswrapper[23041]: I0308 00:51:51.646427 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-cl2fb" podStartSLOduration=1.646405885 podStartE2EDuration="1.646405885s" podCreationTimestamp="2026-03-08 00:51:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:51:51.636251197 +0000 UTC m=+1217.109087811" watchObservedRunningTime="2026-03-08 00:51:51.646405885 +0000 
UTC m=+1217.119242439" Mar 08 00:51:52.335884 master-0 kubenswrapper[23041]: I0308 00:51:52.335805 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-wtgz7"] Mar 08 00:51:52.337492 master-0 kubenswrapper[23041]: I0308 00:51:52.337456 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-wtgz7" Mar 08 00:51:52.341904 master-0 kubenswrapper[23041]: I0308 00:51:52.341856 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-1280f-config-data" Mar 08 00:51:52.385223 master-0 kubenswrapper[23041]: I0308 00:51:52.385140 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-wtgz7"] Mar 08 00:51:52.386345 master-0 kubenswrapper[23041]: I0308 00:51:52.386315 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/32cca4f7-a751-48d2-b93f-211bb7f12697-db-sync-config-data\") pod \"glance-db-sync-wtgz7\" (UID: \"32cca4f7-a751-48d2-b93f-211bb7f12697\") " pod="openstack/glance-db-sync-wtgz7" Mar 08 00:51:52.386430 master-0 kubenswrapper[23041]: I0308 00:51:52.386386 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prs5n\" (UniqueName: \"kubernetes.io/projected/32cca4f7-a751-48d2-b93f-211bb7f12697-kube-api-access-prs5n\") pod \"glance-db-sync-wtgz7\" (UID: \"32cca4f7-a751-48d2-b93f-211bb7f12697\") " pod="openstack/glance-db-sync-wtgz7" Mar 08 00:51:52.386506 master-0 kubenswrapper[23041]: I0308 00:51:52.386484 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32cca4f7-a751-48d2-b93f-211bb7f12697-combined-ca-bundle\") pod \"glance-db-sync-wtgz7\" (UID: \"32cca4f7-a751-48d2-b93f-211bb7f12697\") " pod="openstack/glance-db-sync-wtgz7" Mar 08 00:51:52.386574 
master-0 kubenswrapper[23041]: I0308 00:51:52.386553 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32cca4f7-a751-48d2-b93f-211bb7f12697-config-data\") pod \"glance-db-sync-wtgz7\" (UID: \"32cca4f7-a751-48d2-b93f-211bb7f12697\") " pod="openstack/glance-db-sync-wtgz7" Mar 08 00:51:52.488220 master-0 kubenswrapper[23041]: I0308 00:51:52.488123 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/32cca4f7-a751-48d2-b93f-211bb7f12697-db-sync-config-data\") pod \"glance-db-sync-wtgz7\" (UID: \"32cca4f7-a751-48d2-b93f-211bb7f12697\") " pod="openstack/glance-db-sync-wtgz7" Mar 08 00:51:52.488862 master-0 kubenswrapper[23041]: I0308 00:51:52.488544 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prs5n\" (UniqueName: \"kubernetes.io/projected/32cca4f7-a751-48d2-b93f-211bb7f12697-kube-api-access-prs5n\") pod \"glance-db-sync-wtgz7\" (UID: \"32cca4f7-a751-48d2-b93f-211bb7f12697\") " pod="openstack/glance-db-sync-wtgz7" Mar 08 00:51:52.489175 master-0 kubenswrapper[23041]: I0308 00:51:52.489129 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32cca4f7-a751-48d2-b93f-211bb7f12697-combined-ca-bundle\") pod \"glance-db-sync-wtgz7\" (UID: \"32cca4f7-a751-48d2-b93f-211bb7f12697\") " pod="openstack/glance-db-sync-wtgz7" Mar 08 00:51:52.489518 master-0 kubenswrapper[23041]: I0308 00:51:52.489495 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32cca4f7-a751-48d2-b93f-211bb7f12697-config-data\") pod \"glance-db-sync-wtgz7\" (UID: \"32cca4f7-a751-48d2-b93f-211bb7f12697\") " pod="openstack/glance-db-sync-wtgz7" Mar 08 00:51:52.493669 master-0 kubenswrapper[23041]: I0308 
00:51:52.493621 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32cca4f7-a751-48d2-b93f-211bb7f12697-config-data\") pod \"glance-db-sync-wtgz7\" (UID: \"32cca4f7-a751-48d2-b93f-211bb7f12697\") " pod="openstack/glance-db-sync-wtgz7" Mar 08 00:51:52.494442 master-0 kubenswrapper[23041]: I0308 00:51:52.494372 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32cca4f7-a751-48d2-b93f-211bb7f12697-combined-ca-bundle\") pod \"glance-db-sync-wtgz7\" (UID: \"32cca4f7-a751-48d2-b93f-211bb7f12697\") " pod="openstack/glance-db-sync-wtgz7" Mar 08 00:51:52.498060 master-0 kubenswrapper[23041]: I0308 00:51:52.496771 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/32cca4f7-a751-48d2-b93f-211bb7f12697-db-sync-config-data\") pod \"glance-db-sync-wtgz7\" (UID: \"32cca4f7-a751-48d2-b93f-211bb7f12697\") " pod="openstack/glance-db-sync-wtgz7" Mar 08 00:51:52.510268 master-0 kubenswrapper[23041]: I0308 00:51:52.509988 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prs5n\" (UniqueName: \"kubernetes.io/projected/32cca4f7-a751-48d2-b93f-211bb7f12697-kube-api-access-prs5n\") pod \"glance-db-sync-wtgz7\" (UID: \"32cca4f7-a751-48d2-b93f-211bb7f12697\") " pod="openstack/glance-db-sync-wtgz7" Mar 08 00:51:52.637049 master-0 kubenswrapper[23041]: I0308 00:51:52.636917 23041 generic.go:334] "Generic (PLEG): container finished" podID="240177ba-2f39-4ab2-a12c-a4c545a9fb1a" containerID="55f9e26ff2aba11c8a815eccae80d4332603db3e403b21484464996b6f214b29" exitCode=0 Mar 08 00:51:52.637342 master-0 kubenswrapper[23041]: I0308 00:51:52.637035 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4837-account-create-update-9nldk" 
event={"ID":"240177ba-2f39-4ab2-a12c-a4c545a9fb1a","Type":"ContainerDied","Data":"55f9e26ff2aba11c8a815eccae80d4332603db3e403b21484464996b6f214b29"} Mar 08 00:51:52.639237 master-0 kubenswrapper[23041]: I0308 00:51:52.639166 23041 generic.go:334] "Generic (PLEG): container finished" podID="31b5c8ab-cbab-4c6e-a021-3ba2895102bb" containerID="cba1735d0381e015df09756a0d0036f53eec2145b57bb1135fd353208f6d7bea" exitCode=0 Mar 08 00:51:52.639331 master-0 kubenswrapper[23041]: I0308 00:51:52.639291 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-gl66q" event={"ID":"31b5c8ab-cbab-4c6e-a021-3ba2895102bb","Type":"ContainerDied","Data":"cba1735d0381e015df09756a0d0036f53eec2145b57bb1135fd353208f6d7bea"} Mar 08 00:51:52.641542 master-0 kubenswrapper[23041]: I0308 00:51:52.641504 23041 generic.go:334] "Generic (PLEG): container finished" podID="fed95fb2-570e-4b8a-9ddb-697e3a3606a8" containerID="fd3a0f11f9c83f168cd37aa9c13497cb51edcfc07076cd97a64e6e4dfd6f6624" exitCode=0 Mar 08 00:51:52.641625 master-0 kubenswrapper[23041]: I0308 00:51:52.641554 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-cl2fb" event={"ID":"fed95fb2-570e-4b8a-9ddb-697e3a3606a8","Type":"ContainerDied","Data":"fd3a0f11f9c83f168cd37aa9c13497cb51edcfc07076cd97a64e6e4dfd6f6624"} Mar 08 00:51:52.656692 master-0 kubenswrapper[23041]: I0308 00:51:52.656634 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-wtgz7" Mar 08 00:51:53.272191 master-0 kubenswrapper[23041]: I0308 00:51:53.272132 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-wtgz7"] Mar 08 00:51:53.653388 master-0 kubenswrapper[23041]: I0308 00:51:53.653288 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wtgz7" event={"ID":"32cca4f7-a751-48d2-b93f-211bb7f12697","Type":"ContainerStarted","Data":"5d57ab35c9d1f89da60ae57586f8f57b0f5f3929d5606a0c7bc50c7126d40b9d"} Mar 08 00:51:54.053965 master-0 kubenswrapper[23041]: I0308 00:51:54.053873 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-gl66q" Mar 08 00:51:54.128456 master-0 kubenswrapper[23041]: I0308 00:51:54.128399 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-scripts\") pod \"31b5c8ab-cbab-4c6e-a021-3ba2895102bb\" (UID: \"31b5c8ab-cbab-4c6e-a021-3ba2895102bb\") " Mar 08 00:51:54.128917 master-0 kubenswrapper[23041]: I0308 00:51:54.128897 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-combined-ca-bundle\") pod \"31b5c8ab-cbab-4c6e-a021-3ba2895102bb\" (UID: \"31b5c8ab-cbab-4c6e-a021-3ba2895102bb\") " Mar 08 00:51:54.129174 master-0 kubenswrapper[23041]: I0308 00:51:54.129158 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-dispersionconf\") pod \"31b5c8ab-cbab-4c6e-a021-3ba2895102bb\" (UID: \"31b5c8ab-cbab-4c6e-a021-3ba2895102bb\") " Mar 08 00:51:54.129426 master-0 kubenswrapper[23041]: I0308 00:51:54.129401 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-6ptt9\" (UniqueName: \"kubernetes.io/projected/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-kube-api-access-6ptt9\") pod \"31b5c8ab-cbab-4c6e-a021-3ba2895102bb\" (UID: \"31b5c8ab-cbab-4c6e-a021-3ba2895102bb\") " Mar 08 00:51:54.129639 master-0 kubenswrapper[23041]: I0308 00:51:54.129622 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-swiftconf\") pod \"31b5c8ab-cbab-4c6e-a021-3ba2895102bb\" (UID: \"31b5c8ab-cbab-4c6e-a021-3ba2895102bb\") " Mar 08 00:51:54.129856 master-0 kubenswrapper[23041]: I0308 00:51:54.129840 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-etc-swift\") pod \"31b5c8ab-cbab-4c6e-a021-3ba2895102bb\" (UID: \"31b5c8ab-cbab-4c6e-a021-3ba2895102bb\") " Mar 08 00:51:54.130087 master-0 kubenswrapper[23041]: I0308 00:51:54.130071 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-ring-data-devices\") pod \"31b5c8ab-cbab-4c6e-a021-3ba2895102bb\" (UID: \"31b5c8ab-cbab-4c6e-a021-3ba2895102bb\") " Mar 08 00:51:54.130738 master-0 kubenswrapper[23041]: I0308 00:51:54.130682 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "31b5c8ab-cbab-4c6e-a021-3ba2895102bb" (UID: "31b5c8ab-cbab-4c6e-a021-3ba2895102bb"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:51:54.131383 master-0 kubenswrapper[23041]: I0308 00:51:54.131338 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "31b5c8ab-cbab-4c6e-a021-3ba2895102bb" (UID: "31b5c8ab-cbab-4c6e-a021-3ba2895102bb"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:51:54.131929 master-0 kubenswrapper[23041]: I0308 00:51:54.131903 23041 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-etc-swift\") on node \"master-0\" DevicePath \"\"" Mar 08 00:51:54.133165 master-0 kubenswrapper[23041]: I0308 00:51:54.133111 23041 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-ring-data-devices\") on node \"master-0\" DevicePath \"\"" Mar 08 00:51:54.135951 master-0 kubenswrapper[23041]: I0308 00:51:54.135886 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-kube-api-access-6ptt9" (OuterVolumeSpecName: "kube-api-access-6ptt9") pod "31b5c8ab-cbab-4c6e-a021-3ba2895102bb" (UID: "31b5c8ab-cbab-4c6e-a021-3ba2895102bb"). InnerVolumeSpecName "kube-api-access-6ptt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:51:54.138432 master-0 kubenswrapper[23041]: I0308 00:51:54.138362 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "31b5c8ab-cbab-4c6e-a021-3ba2895102bb" (UID: "31b5c8ab-cbab-4c6e-a021-3ba2895102bb"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:51:54.155425 master-0 kubenswrapper[23041]: I0308 00:51:54.155367 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31b5c8ab-cbab-4c6e-a021-3ba2895102bb" (UID: "31b5c8ab-cbab-4c6e-a021-3ba2895102bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:51:54.168500 master-0 kubenswrapper[23041]: I0308 00:51:54.168440 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-scripts" (OuterVolumeSpecName: "scripts") pod "31b5c8ab-cbab-4c6e-a021-3ba2895102bb" (UID: "31b5c8ab-cbab-4c6e-a021-3ba2895102bb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:51:54.171095 master-0 kubenswrapper[23041]: I0308 00:51:54.171035 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "31b5c8ab-cbab-4c6e-a021-3ba2895102bb" (UID: "31b5c8ab-cbab-4c6e-a021-3ba2895102bb"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:51:54.235917 master-0 kubenswrapper[23041]: I0308 00:51:54.235847 23041 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 00:51:54.235917 master-0 kubenswrapper[23041]: I0308 00:51:54.235889 23041 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 00:51:54.235917 master-0 kubenswrapper[23041]: I0308 00:51:54.235901 23041 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-dispersionconf\") on node \"master-0\" DevicePath \"\"" Mar 08 00:51:54.235917 master-0 kubenswrapper[23041]: I0308 00:51:54.235912 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ptt9\" (UniqueName: \"kubernetes.io/projected/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-kube-api-access-6ptt9\") on node \"master-0\" DevicePath \"\"" Mar 08 00:51:54.235917 master-0 kubenswrapper[23041]: I0308 00:51:54.235921 23041 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/31b5c8ab-cbab-4c6e-a021-3ba2895102bb-swiftconf\") on node \"master-0\" DevicePath \"\"" Mar 08 00:51:54.265073 master-0 kubenswrapper[23041]: I0308 00:51:54.265018 23041 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-8nzj6" podUID="4aeedb5e-326b-4294-a7be-9569b908b49c" containerName="ovn-controller" probeResult="failure" output=< Mar 08 00:51:54.265073 master-0 kubenswrapper[23041]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 08 00:51:54.265073 master-0 kubenswrapper[23041]: > Mar 08 00:51:54.297053 master-0 
kubenswrapper[23041]: I0308 00:51:54.297003 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-cl2fb" Mar 08 00:51:54.346463 master-0 kubenswrapper[23041]: I0308 00:51:54.339492 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r85zc\" (UniqueName: \"kubernetes.io/projected/fed95fb2-570e-4b8a-9ddb-697e3a3606a8-kube-api-access-r85zc\") pod \"fed95fb2-570e-4b8a-9ddb-697e3a3606a8\" (UID: \"fed95fb2-570e-4b8a-9ddb-697e3a3606a8\") " Mar 08 00:51:54.346463 master-0 kubenswrapper[23041]: I0308 00:51:54.339574 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fed95fb2-570e-4b8a-9ddb-697e3a3606a8-operator-scripts\") pod \"fed95fb2-570e-4b8a-9ddb-697e3a3606a8\" (UID: \"fed95fb2-570e-4b8a-9ddb-697e3a3606a8\") " Mar 08 00:51:54.346463 master-0 kubenswrapper[23041]: I0308 00:51:54.340880 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fed95fb2-570e-4b8a-9ddb-697e3a3606a8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fed95fb2-570e-4b8a-9ddb-697e3a3606a8" (UID: "fed95fb2-570e-4b8a-9ddb-697e3a3606a8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:51:54.346463 master-0 kubenswrapper[23041]: I0308 00:51:54.346323 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fed95fb2-570e-4b8a-9ddb-697e3a3606a8-kube-api-access-r85zc" (OuterVolumeSpecName: "kube-api-access-r85zc") pod "fed95fb2-570e-4b8a-9ddb-697e3a3606a8" (UID: "fed95fb2-570e-4b8a-9ddb-697e3a3606a8"). InnerVolumeSpecName "kube-api-access-r85zc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:51:54.402333 master-0 kubenswrapper[23041]: I0308 00:51:54.402273 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-bm6p5"] Mar 08 00:51:54.412152 master-0 kubenswrapper[23041]: I0308 00:51:54.412101 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-bm6p5"] Mar 08 00:51:54.427493 master-0 kubenswrapper[23041]: I0308 00:51:54.427455 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4837-account-create-update-9nldk" Mar 08 00:51:54.449363 master-0 kubenswrapper[23041]: I0308 00:51:54.447123 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r85zc\" (UniqueName: \"kubernetes.io/projected/fed95fb2-570e-4b8a-9ddb-697e3a3606a8-kube-api-access-r85zc\") on node \"master-0\" DevicePath \"\"" Mar 08 00:51:54.449363 master-0 kubenswrapper[23041]: I0308 00:51:54.447181 23041 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fed95fb2-570e-4b8a-9ddb-697e3a3606a8-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 00:51:54.549584 master-0 kubenswrapper[23041]: I0308 00:51:54.549462 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/240177ba-2f39-4ab2-a12c-a4c545a9fb1a-operator-scripts\") pod \"240177ba-2f39-4ab2-a12c-a4c545a9fb1a\" (UID: \"240177ba-2f39-4ab2-a12c-a4c545a9fb1a\") " Mar 08 00:51:54.549823 master-0 kubenswrapper[23041]: I0308 00:51:54.549582 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtllt\" (UniqueName: \"kubernetes.io/projected/240177ba-2f39-4ab2-a12c-a4c545a9fb1a-kube-api-access-dtllt\") pod \"240177ba-2f39-4ab2-a12c-a4c545a9fb1a\" (UID: \"240177ba-2f39-4ab2-a12c-a4c545a9fb1a\") " Mar 08 
00:51:54.550930 master-0 kubenswrapper[23041]: I0308 00:51:54.550846 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/240177ba-2f39-4ab2-a12c-a4c545a9fb1a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "240177ba-2f39-4ab2-a12c-a4c545a9fb1a" (UID: "240177ba-2f39-4ab2-a12c-a4c545a9fb1a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:51:54.553990 master-0 kubenswrapper[23041]: I0308 00:51:54.553962 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/240177ba-2f39-4ab2-a12c-a4c545a9fb1a-kube-api-access-dtllt" (OuterVolumeSpecName: "kube-api-access-dtllt") pod "240177ba-2f39-4ab2-a12c-a4c545a9fb1a" (UID: "240177ba-2f39-4ab2-a12c-a4c545a9fb1a"). InnerVolumeSpecName "kube-api-access-dtllt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:51:54.653800 master-0 kubenswrapper[23041]: I0308 00:51:54.653704 23041 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/240177ba-2f39-4ab2-a12c-a4c545a9fb1a-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 00:51:54.653800 master-0 kubenswrapper[23041]: I0308 00:51:54.653753 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtllt\" (UniqueName: \"kubernetes.io/projected/240177ba-2f39-4ab2-a12c-a4c545a9fb1a-kube-api-access-dtllt\") on node \"master-0\" DevicePath \"\"" Mar 08 00:51:54.665237 master-0 kubenswrapper[23041]: I0308 00:51:54.665166 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-gl66q" event={"ID":"31b5c8ab-cbab-4c6e-a021-3ba2895102bb","Type":"ContainerDied","Data":"723aad89168391b4c721cf32ead0cb512d22003ebdec722084e5639daff3b498"} Mar 08 00:51:54.665237 master-0 kubenswrapper[23041]: I0308 00:51:54.665232 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-gl66q" Mar 08 00:51:54.665467 master-0 kubenswrapper[23041]: I0308 00:51:54.665242 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="723aad89168391b4c721cf32ead0cb512d22003ebdec722084e5639daff3b498" Mar 08 00:51:54.666849 master-0 kubenswrapper[23041]: I0308 00:51:54.666772 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-cl2fb" Mar 08 00:51:54.666952 master-0 kubenswrapper[23041]: I0308 00:51:54.666792 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-cl2fb" event={"ID":"fed95fb2-570e-4b8a-9ddb-697e3a3606a8","Type":"ContainerDied","Data":"c730abf33befcbce5d249e14785bdca3598117e50a00328ff3f8d6125de890c8"} Mar 08 00:51:54.667016 master-0 kubenswrapper[23041]: I0308 00:51:54.666960 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c730abf33befcbce5d249e14785bdca3598117e50a00328ff3f8d6125de890c8" Mar 08 00:51:54.668128 master-0 kubenswrapper[23041]: I0308 00:51:54.668065 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4837-account-create-update-9nldk" event={"ID":"240177ba-2f39-4ab2-a12c-a4c545a9fb1a","Type":"ContainerDied","Data":"f2ea8a9c7243441e89b0b436b3d9bf9a2f6ad89412be75b8fd35a23e60caad91"} Mar 08 00:51:54.668128 master-0 kubenswrapper[23041]: I0308 00:51:54.668120 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2ea8a9c7243441e89b0b436b3d9bf9a2f6ad89412be75b8fd35a23e60caad91" Mar 08 00:51:54.668128 master-0 kubenswrapper[23041]: I0308 00:51:54.668096 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-4837-account-create-update-9nldk" Mar 08 00:51:54.821241 master-0 kubenswrapper[23041]: I0308 00:51:54.821140 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95cdca6f-aaad-46fe-955b-bff59046b2d3" path="/var/lib/kubelet/pods/95cdca6f-aaad-46fe-955b-bff59046b2d3/volumes" Mar 08 00:51:56.296067 master-0 kubenswrapper[23041]: I0308 00:51:56.295983 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 08 00:51:57.712382 master-0 kubenswrapper[23041]: I0308 00:51:57.712192 23041 generic.go:334] "Generic (PLEG): container finished" podID="e610ec98-66ae-412c-bab9-fab6413ef654" containerID="fa9227ea2884fa39ea153bc0c495216a912d5ade6b4f4c415d2cf56ba2bbbe26" exitCode=0 Mar 08 00:51:57.712382 master-0 kubenswrapper[23041]: I0308 00:51:57.712298 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e610ec98-66ae-412c-bab9-fab6413ef654","Type":"ContainerDied","Data":"fa9227ea2884fa39ea153bc0c495216a912d5ade6b4f4c415d2cf56ba2bbbe26"} Mar 08 00:51:57.716150 master-0 kubenswrapper[23041]: I0308 00:51:57.715018 23041 generic.go:334] "Generic (PLEG): container finished" podID="973c5591-ef0e-4a00-9107-bf5c09b1782d" containerID="7f556a88094c105c61e23407179be8573353dc0602f0f7583b69c9ecf703b324" exitCode=0 Mar 08 00:51:57.716150 master-0 kubenswrapper[23041]: I0308 00:51:57.715053 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"973c5591-ef0e-4a00-9107-bf5c09b1782d","Type":"ContainerDied","Data":"7f556a88094c105c61e23407179be8573353dc0602f0f7583b69c9ecf703b324"} Mar 08 00:51:58.730906 master-0 kubenswrapper[23041]: I0308 00:51:58.730845 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"973c5591-ef0e-4a00-9107-bf5c09b1782d","Type":"ContainerStarted","Data":"a6e5c9f39e7b990656544f2d718ded6fe2fe254ec23b4c81d2b4b7e9e9a27c08"} Mar 08 00:51:58.731582 master-0 kubenswrapper[23041]: I0308 00:51:58.731153 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 08 00:51:58.734975 master-0 kubenswrapper[23041]: I0308 00:51:58.734924 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e610ec98-66ae-412c-bab9-fab6413ef654","Type":"ContainerStarted","Data":"7027828ed7f3b661974176e2e8dd1a3b163b67dc48bb3b4f752355a7bc5b62cf"} Mar 08 00:51:58.735188 master-0 kubenswrapper[23041]: I0308 00:51:58.735147 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 08 00:51:58.815580 master-0 kubenswrapper[23041]: I0308 00:51:58.815515 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=56.628867164 podStartE2EDuration="1m5.815497029s" podCreationTimestamp="2026-03-08 00:50:53 +0000 UTC" firstStartedPulling="2026-03-08 00:51:14.375956224 +0000 UTC m=+1179.848792778" lastFinishedPulling="2026-03-08 00:51:23.562586089 +0000 UTC m=+1189.035422643" observedRunningTime="2026-03-08 00:51:58.812579797 +0000 UTC m=+1224.285416371" watchObservedRunningTime="2026-03-08 00:51:58.815497029 +0000 UTC m=+1224.288333583" Mar 08 00:51:58.983809 master-0 kubenswrapper[23041]: I0308 00:51:58.983674 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=55.360325978 podStartE2EDuration="1m4.983653125s" podCreationTimestamp="2026-03-08 00:50:54 +0000 UTC" firstStartedPulling="2026-03-08 00:51:13.949582744 +0000 UTC m=+1179.422419308" lastFinishedPulling="2026-03-08 00:51:23.572909901 +0000 UTC m=+1189.045746455" observedRunningTime="2026-03-08 00:51:58.983558912 +0000 
UTC m=+1224.456395476" watchObservedRunningTime="2026-03-08 00:51:58.983653125 +0000 UTC m=+1224.456489679" Mar 08 00:51:59.259240 master-0 kubenswrapper[23041]: I0308 00:51:59.257505 23041 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-8nzj6" podUID="4aeedb5e-326b-4294-a7be-9569b908b49c" containerName="ovn-controller" probeResult="failure" output=< Mar 08 00:51:59.259240 master-0 kubenswrapper[23041]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 08 00:51:59.259240 master-0 kubenswrapper[23041]: > Mar 08 00:51:59.335276 master-0 kubenswrapper[23041]: I0308 00:51:59.335162 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-2cgqz" Mar 08 00:51:59.400065 master-0 kubenswrapper[23041]: I0308 00:51:59.399967 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-2cgqz" Mar 08 00:51:59.928630 master-0 kubenswrapper[23041]: I0308 00:51:59.928539 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-468nq"] Mar 08 00:51:59.929146 master-0 kubenswrapper[23041]: E0308 00:51:59.929098 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31b5c8ab-cbab-4c6e-a021-3ba2895102bb" containerName="swift-ring-rebalance" Mar 08 00:51:59.929146 master-0 kubenswrapper[23041]: I0308 00:51:59.929115 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="31b5c8ab-cbab-4c6e-a021-3ba2895102bb" containerName="swift-ring-rebalance" Mar 08 00:51:59.929239 master-0 kubenswrapper[23041]: E0308 00:51:59.929158 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fed95fb2-570e-4b8a-9ddb-697e3a3606a8" containerName="mariadb-database-create" Mar 08 00:51:59.929239 master-0 kubenswrapper[23041]: I0308 00:51:59.929166 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="fed95fb2-570e-4b8a-9ddb-697e3a3606a8" 
containerName="mariadb-database-create" Mar 08 00:51:59.929239 master-0 kubenswrapper[23041]: E0308 00:51:59.929189 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="240177ba-2f39-4ab2-a12c-a4c545a9fb1a" containerName="mariadb-account-create-update" Mar 08 00:51:59.929239 master-0 kubenswrapper[23041]: I0308 00:51:59.929196 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="240177ba-2f39-4ab2-a12c-a4c545a9fb1a" containerName="mariadb-account-create-update" Mar 08 00:51:59.929479 master-0 kubenswrapper[23041]: I0308 00:51:59.929405 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="fed95fb2-570e-4b8a-9ddb-697e3a3606a8" containerName="mariadb-database-create" Mar 08 00:51:59.929479 master-0 kubenswrapper[23041]: I0308 00:51:59.929429 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="240177ba-2f39-4ab2-a12c-a4c545a9fb1a" containerName="mariadb-account-create-update" Mar 08 00:51:59.929479 master-0 kubenswrapper[23041]: I0308 00:51:59.929458 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="31b5c8ab-cbab-4c6e-a021-3ba2895102bb" containerName="swift-ring-rebalance" Mar 08 00:51:59.930290 master-0 kubenswrapper[23041]: I0308 00:51:59.930268 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-468nq" Mar 08 00:51:59.932742 master-0 kubenswrapper[23041]: I0308 00:51:59.932699 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 08 00:51:59.939236 master-0 kubenswrapper[23041]: I0308 00:51:59.939155 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-468nq"] Mar 08 00:52:00.111106 master-0 kubenswrapper[23041]: I0308 00:52:00.110797 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-8nzj6-config-kblpr"] Mar 08 00:52:00.112637 master-0 kubenswrapper[23041]: I0308 00:52:00.112608 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-8nzj6-config-kblpr" Mar 08 00:52:00.116950 master-0 kubenswrapper[23041]: I0308 00:52:00.114956 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 08 00:52:00.118113 master-0 kubenswrapper[23041]: I0308 00:52:00.118058 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thbkv\" (UniqueName: \"kubernetes.io/projected/87454c8d-819b-4ee6-8291-ccdf7c81f77b-kube-api-access-thbkv\") pod \"root-account-create-update-468nq\" (UID: \"87454c8d-819b-4ee6-8291-ccdf7c81f77b\") " pod="openstack/root-account-create-update-468nq" Mar 08 00:52:00.118187 master-0 kubenswrapper[23041]: I0308 00:52:00.118134 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87454c8d-819b-4ee6-8291-ccdf7c81f77b-operator-scripts\") pod \"root-account-create-update-468nq\" (UID: \"87454c8d-819b-4ee6-8291-ccdf7c81f77b\") " pod="openstack/root-account-create-update-468nq" Mar 08 00:52:00.126510 master-0 kubenswrapper[23041]: I0308 00:52:00.126455 23041 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-8nzj6-config-kblpr"] Mar 08 00:52:00.220143 master-0 kubenswrapper[23041]: I0308 00:52:00.220017 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bc5w\" (UniqueName: \"kubernetes.io/projected/484a3fe3-0512-40e5-bce4-5cb8e182664d-kube-api-access-6bc5w\") pod \"ovn-controller-8nzj6-config-kblpr\" (UID: \"484a3fe3-0512-40e5-bce4-5cb8e182664d\") " pod="openstack/ovn-controller-8nzj6-config-kblpr" Mar 08 00:52:00.220143 master-0 kubenswrapper[23041]: I0308 00:52:00.220071 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/484a3fe3-0512-40e5-bce4-5cb8e182664d-scripts\") pod \"ovn-controller-8nzj6-config-kblpr\" (UID: \"484a3fe3-0512-40e5-bce4-5cb8e182664d\") " pod="openstack/ovn-controller-8nzj6-config-kblpr" Mar 08 00:52:00.220143 master-0 kubenswrapper[23041]: I0308 00:52:00.220119 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thbkv\" (UniqueName: \"kubernetes.io/projected/87454c8d-819b-4ee6-8291-ccdf7c81f77b-kube-api-access-thbkv\") pod \"root-account-create-update-468nq\" (UID: \"87454c8d-819b-4ee6-8291-ccdf7c81f77b\") " pod="openstack/root-account-create-update-468nq" Mar 08 00:52:00.220143 master-0 kubenswrapper[23041]: I0308 00:52:00.220142 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87454c8d-819b-4ee6-8291-ccdf7c81f77b-operator-scripts\") pod \"root-account-create-update-468nq\" (UID: \"87454c8d-819b-4ee6-8291-ccdf7c81f77b\") " pod="openstack/root-account-create-update-468nq" Mar 08 00:52:00.220475 master-0 kubenswrapper[23041]: I0308 00:52:00.220165 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/484a3fe3-0512-40e5-bce4-5cb8e182664d-additional-scripts\") pod \"ovn-controller-8nzj6-config-kblpr\" (UID: \"484a3fe3-0512-40e5-bce4-5cb8e182664d\") " pod="openstack/ovn-controller-8nzj6-config-kblpr" Mar 08 00:52:00.220475 master-0 kubenswrapper[23041]: I0308 00:52:00.220185 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/484a3fe3-0512-40e5-bce4-5cb8e182664d-var-log-ovn\") pod \"ovn-controller-8nzj6-config-kblpr\" (UID: \"484a3fe3-0512-40e5-bce4-5cb8e182664d\") " pod="openstack/ovn-controller-8nzj6-config-kblpr" Mar 08 00:52:00.220475 master-0 kubenswrapper[23041]: I0308 00:52:00.220300 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/484a3fe3-0512-40e5-bce4-5cb8e182664d-var-run\") pod \"ovn-controller-8nzj6-config-kblpr\" (UID: \"484a3fe3-0512-40e5-bce4-5cb8e182664d\") " pod="openstack/ovn-controller-8nzj6-config-kblpr" Mar 08 00:52:00.220475 master-0 kubenswrapper[23041]: I0308 00:52:00.220365 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/484a3fe3-0512-40e5-bce4-5cb8e182664d-var-run-ovn\") pod \"ovn-controller-8nzj6-config-kblpr\" (UID: \"484a3fe3-0512-40e5-bce4-5cb8e182664d\") " pod="openstack/ovn-controller-8nzj6-config-kblpr" Mar 08 00:52:00.221388 master-0 kubenswrapper[23041]: I0308 00:52:00.221347 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87454c8d-819b-4ee6-8291-ccdf7c81f77b-operator-scripts\") pod \"root-account-create-update-468nq\" (UID: \"87454c8d-819b-4ee6-8291-ccdf7c81f77b\") " pod="openstack/root-account-create-update-468nq" Mar 08 00:52:00.237798 master-0 kubenswrapper[23041]: I0308 
00:52:00.237761 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thbkv\" (UniqueName: \"kubernetes.io/projected/87454c8d-819b-4ee6-8291-ccdf7c81f77b-kube-api-access-thbkv\") pod \"root-account-create-update-468nq\" (UID: \"87454c8d-819b-4ee6-8291-ccdf7c81f77b\") " pod="openstack/root-account-create-update-468nq" Mar 08 00:52:00.269889 master-0 kubenswrapper[23041]: I0308 00:52:00.269815 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-468nq" Mar 08 00:52:00.323006 master-0 kubenswrapper[23041]: I0308 00:52:00.322900 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bc5w\" (UniqueName: \"kubernetes.io/projected/484a3fe3-0512-40e5-bce4-5cb8e182664d-kube-api-access-6bc5w\") pod \"ovn-controller-8nzj6-config-kblpr\" (UID: \"484a3fe3-0512-40e5-bce4-5cb8e182664d\") " pod="openstack/ovn-controller-8nzj6-config-kblpr" Mar 08 00:52:00.323006 master-0 kubenswrapper[23041]: I0308 00:52:00.322995 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/484a3fe3-0512-40e5-bce4-5cb8e182664d-scripts\") pod \"ovn-controller-8nzj6-config-kblpr\" (UID: \"484a3fe3-0512-40e5-bce4-5cb8e182664d\") " pod="openstack/ovn-controller-8nzj6-config-kblpr" Mar 08 00:52:00.323287 master-0 kubenswrapper[23041]: I0308 00:52:00.323117 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/484a3fe3-0512-40e5-bce4-5cb8e182664d-var-log-ovn\") pod \"ovn-controller-8nzj6-config-kblpr\" (UID: \"484a3fe3-0512-40e5-bce4-5cb8e182664d\") " pod="openstack/ovn-controller-8nzj6-config-kblpr" Mar 08 00:52:00.323458 master-0 kubenswrapper[23041]: I0308 00:52:00.323420 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/484a3fe3-0512-40e5-bce4-5cb8e182664d-additional-scripts\") pod \"ovn-controller-8nzj6-config-kblpr\" (UID: \"484a3fe3-0512-40e5-bce4-5cb8e182664d\") " pod="openstack/ovn-controller-8nzj6-config-kblpr" Mar 08 00:52:00.323669 master-0 kubenswrapper[23041]: I0308 00:52:00.323630 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/484a3fe3-0512-40e5-bce4-5cb8e182664d-var-run\") pod \"ovn-controller-8nzj6-config-kblpr\" (UID: \"484a3fe3-0512-40e5-bce4-5cb8e182664d\") " pod="openstack/ovn-controller-8nzj6-config-kblpr" Mar 08 00:52:00.323736 master-0 kubenswrapper[23041]: I0308 00:52:00.323682 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/484a3fe3-0512-40e5-bce4-5cb8e182664d-var-log-ovn\") pod \"ovn-controller-8nzj6-config-kblpr\" (UID: \"484a3fe3-0512-40e5-bce4-5cb8e182664d\") " pod="openstack/ovn-controller-8nzj6-config-kblpr" Mar 08 00:52:00.323786 master-0 kubenswrapper[23041]: I0308 00:52:00.323733 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/484a3fe3-0512-40e5-bce4-5cb8e182664d-var-run\") pod \"ovn-controller-8nzj6-config-kblpr\" (UID: \"484a3fe3-0512-40e5-bce4-5cb8e182664d\") " pod="openstack/ovn-controller-8nzj6-config-kblpr" Mar 08 00:52:00.323786 master-0 kubenswrapper[23041]: I0308 00:52:00.323771 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/484a3fe3-0512-40e5-bce4-5cb8e182664d-var-run-ovn\") pod \"ovn-controller-8nzj6-config-kblpr\" (UID: \"484a3fe3-0512-40e5-bce4-5cb8e182664d\") " pod="openstack/ovn-controller-8nzj6-config-kblpr" Mar 08 00:52:00.324352 master-0 kubenswrapper[23041]: I0308 00:52:00.324312 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" 
(UniqueName: \"kubernetes.io/configmap/484a3fe3-0512-40e5-bce4-5cb8e182664d-additional-scripts\") pod \"ovn-controller-8nzj6-config-kblpr\" (UID: \"484a3fe3-0512-40e5-bce4-5cb8e182664d\") " pod="openstack/ovn-controller-8nzj6-config-kblpr" Mar 08 00:52:00.324442 master-0 kubenswrapper[23041]: I0308 00:52:00.324374 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/484a3fe3-0512-40e5-bce4-5cb8e182664d-var-run-ovn\") pod \"ovn-controller-8nzj6-config-kblpr\" (UID: \"484a3fe3-0512-40e5-bce4-5cb8e182664d\") " pod="openstack/ovn-controller-8nzj6-config-kblpr" Mar 08 00:52:00.325995 master-0 kubenswrapper[23041]: I0308 00:52:00.325630 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/484a3fe3-0512-40e5-bce4-5cb8e182664d-scripts\") pod \"ovn-controller-8nzj6-config-kblpr\" (UID: \"484a3fe3-0512-40e5-bce4-5cb8e182664d\") " pod="openstack/ovn-controller-8nzj6-config-kblpr" Mar 08 00:52:00.341285 master-0 kubenswrapper[23041]: I0308 00:52:00.341223 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bc5w\" (UniqueName: \"kubernetes.io/projected/484a3fe3-0512-40e5-bce4-5cb8e182664d-kube-api-access-6bc5w\") pod \"ovn-controller-8nzj6-config-kblpr\" (UID: \"484a3fe3-0512-40e5-bce4-5cb8e182664d\") " pod="openstack/ovn-controller-8nzj6-config-kblpr" Mar 08 00:52:00.456791 master-0 kubenswrapper[23041]: I0308 00:52:00.456741 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-8nzj6-config-kblpr" Mar 08 00:52:04.235831 master-0 kubenswrapper[23041]: I0308 00:52:04.235333 23041 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-8nzj6" podUID="4aeedb5e-326b-4294-a7be-9569b908b49c" containerName="ovn-controller" probeResult="failure" output=< Mar 08 00:52:04.235831 master-0 kubenswrapper[23041]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 08 00:52:04.235831 master-0 kubenswrapper[23041]: > Mar 08 00:52:06.387445 master-0 kubenswrapper[23041]: I0308 00:52:06.386404 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/724cd646-a717-4778-82a6-9471c70e13c5-etc-swift\") pod \"swift-storage-0\" (UID: \"724cd646-a717-4778-82a6-9471c70e13c5\") " pod="openstack/swift-storage-0" Mar 08 00:52:06.396846 master-0 kubenswrapper[23041]: I0308 00:52:06.390724 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/724cd646-a717-4778-82a6-9471c70e13c5-etc-swift\") pod \"swift-storage-0\" (UID: \"724cd646-a717-4778-82a6-9471c70e13c5\") " pod="openstack/swift-storage-0" Mar 08 00:52:06.537846 master-0 kubenswrapper[23041]: I0308 00:52:06.537323 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 08 00:52:06.678518 master-0 kubenswrapper[23041]: W0308 00:52:06.678447 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87454c8d_819b_4ee6_8291_ccdf7c81f77b.slice/crio-fae35e406b81edf21abf92f9ac0792b934361e9c0157b94f1386caba13e38951 WatchSource:0}: Error finding container fae35e406b81edf21abf92f9ac0792b934361e9c0157b94f1386caba13e38951: Status 404 returned error can't find the container with id fae35e406b81edf21abf92f9ac0792b934361e9c0157b94f1386caba13e38951 Mar 08 00:52:06.687011 master-0 kubenswrapper[23041]: I0308 00:52:06.686956 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-468nq"] Mar 08 00:52:06.800240 master-0 kubenswrapper[23041]: I0308 00:52:06.800180 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-8nzj6-config-kblpr"] Mar 08 00:52:06.853713 master-0 kubenswrapper[23041]: I0308 00:52:06.852444 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8nzj6-config-kblpr" event={"ID":"484a3fe3-0512-40e5-bce4-5cb8e182664d","Type":"ContainerStarted","Data":"10417d6760c62878b3b47624169db55c76afb67130c7f4119a65d33a20f81c40"} Mar 08 00:52:06.853898 master-0 kubenswrapper[23041]: I0308 00:52:06.853748 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-468nq" event={"ID":"87454c8d-819b-4ee6-8291-ccdf7c81f77b","Type":"ContainerStarted","Data":"66c04aae6df2f2eefabc570610e2b0c561df8682462bc8b952a89ec422e4a85a"} Mar 08 00:52:06.853898 master-0 kubenswrapper[23041]: I0308 00:52:06.853775 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-468nq" event={"ID":"87454c8d-819b-4ee6-8291-ccdf7c81f77b","Type":"ContainerStarted","Data":"fae35e406b81edf21abf92f9ac0792b934361e9c0157b94f1386caba13e38951"} Mar 08 00:52:06.885226 
master-0 kubenswrapper[23041]: I0308 00:52:06.885140 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-468nq" podStartSLOduration=7.88511771 podStartE2EDuration="7.88511771s" podCreationTimestamp="2026-03-08 00:51:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:52:06.875530716 +0000 UTC m=+1232.348367270" watchObservedRunningTime="2026-03-08 00:52:06.88511771 +0000 UTC m=+1232.357954264" Mar 08 00:52:07.020357 master-0 kubenswrapper[23041]: I0308 00:52:07.020318 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 08 00:52:07.025700 master-0 kubenswrapper[23041]: W0308 00:52:07.025650 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod724cd646_a717_4778_82a6_9471c70e13c5.slice/crio-e3a9d6b8a5c811bc921e5aa5717b1b7b42dca11ed9f86da60d6b280b20331e57 WatchSource:0}: Error finding container e3a9d6b8a5c811bc921e5aa5717b1b7b42dca11ed9f86da60d6b280b20331e57: Status 404 returned error can't find the container with id e3a9d6b8a5c811bc921e5aa5717b1b7b42dca11ed9f86da60d6b280b20331e57 Mar 08 00:52:07.869890 master-0 kubenswrapper[23041]: I0308 00:52:07.869826 23041 generic.go:334] "Generic (PLEG): container finished" podID="87454c8d-819b-4ee6-8291-ccdf7c81f77b" containerID="66c04aae6df2f2eefabc570610e2b0c561df8682462bc8b952a89ec422e4a85a" exitCode=0 Mar 08 00:52:07.870624 master-0 kubenswrapper[23041]: I0308 00:52:07.869908 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-468nq" event={"ID":"87454c8d-819b-4ee6-8291-ccdf7c81f77b","Type":"ContainerDied","Data":"66c04aae6df2f2eefabc570610e2b0c561df8682462bc8b952a89ec422e4a85a"} Mar 08 00:52:07.877052 master-0 kubenswrapper[23041]: I0308 00:52:07.876976 23041 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/swift-storage-0" event={"ID":"724cd646-a717-4778-82a6-9471c70e13c5","Type":"ContainerStarted","Data":"e3a9d6b8a5c811bc921e5aa5717b1b7b42dca11ed9f86da60d6b280b20331e57"} Mar 08 00:52:07.880604 master-0 kubenswrapper[23041]: I0308 00:52:07.880532 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wtgz7" event={"ID":"32cca4f7-a751-48d2-b93f-211bb7f12697","Type":"ContainerStarted","Data":"3f6fabd754937090775079da9396a4489b34d8439d2e4928f0a252790ba96dea"} Mar 08 00:52:07.894040 master-0 kubenswrapper[23041]: I0308 00:52:07.893928 23041 generic.go:334] "Generic (PLEG): container finished" podID="484a3fe3-0512-40e5-bce4-5cb8e182664d" containerID="b5cc00b2a62bf8b551efcd4ca48fe56eb34d7ac2ce41eaf49cceecb283aedcf2" exitCode=0 Mar 08 00:52:07.894040 master-0 kubenswrapper[23041]: I0308 00:52:07.893998 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8nzj6-config-kblpr" event={"ID":"484a3fe3-0512-40e5-bce4-5cb8e182664d","Type":"ContainerDied","Data":"b5cc00b2a62bf8b551efcd4ca48fe56eb34d7ac2ce41eaf49cceecb283aedcf2"} Mar 08 00:52:07.964263 master-0 kubenswrapper[23041]: I0308 00:52:07.963984 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-wtgz7" podStartSLOduration=2.946736296 podStartE2EDuration="15.963964821s" podCreationTimestamp="2026-03-08 00:51:52 +0000 UTC" firstStartedPulling="2026-03-08 00:51:53.267866245 +0000 UTC m=+1218.740702799" lastFinishedPulling="2026-03-08 00:52:06.28509477 +0000 UTC m=+1231.757931324" observedRunningTime="2026-03-08 00:52:07.956101429 +0000 UTC m=+1233.428937983" watchObservedRunningTime="2026-03-08 00:52:07.963964821 +0000 UTC m=+1233.436801375" Mar 08 00:52:08.936875 master-0 kubenswrapper[23041]: I0308 00:52:08.936818 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"724cd646-a717-4778-82a6-9471c70e13c5","Type":"ContainerStarted","Data":"4c820928544a984f42a08e5d4b4dd4f48930f79a51072c6caed27a80d21b98e2"} Mar 08 00:52:08.936875 master-0 kubenswrapper[23041]: I0308 00:52:08.936873 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"724cd646-a717-4778-82a6-9471c70e13c5","Type":"ContainerStarted","Data":"afd76cd95872c4e4c1d3c2360a3c1c80048fc21a590d096faabda08247bc686e"} Mar 08 00:52:08.944604 master-0 kubenswrapper[23041]: I0308 00:52:08.936886 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"724cd646-a717-4778-82a6-9471c70e13c5","Type":"ContainerStarted","Data":"11aaa6b5ae65b3a4ed53ba785f877520abbda85dab24069b58c797f751cb782b"} Mar 08 00:52:09.266258 master-0 kubenswrapper[23041]: I0308 00:52:09.264936 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-8nzj6" Mar 08 00:52:09.517027 master-0 kubenswrapper[23041]: I0308 00:52:09.516967 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-468nq" Mar 08 00:52:09.525122 master-0 kubenswrapper[23041]: I0308 00:52:09.525034 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-8nzj6-config-kblpr"
Mar 08 00:52:09.678506 master-0 kubenswrapper[23041]: I0308 00:52:09.677449 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/484a3fe3-0512-40e5-bce4-5cb8e182664d-var-log-ovn\") pod \"484a3fe3-0512-40e5-bce4-5cb8e182664d\" (UID: \"484a3fe3-0512-40e5-bce4-5cb8e182664d\") "
Mar 08 00:52:09.678506 master-0 kubenswrapper[23041]: I0308 00:52:09.677555 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thbkv\" (UniqueName: \"kubernetes.io/projected/87454c8d-819b-4ee6-8291-ccdf7c81f77b-kube-api-access-thbkv\") pod \"87454c8d-819b-4ee6-8291-ccdf7c81f77b\" (UID: \"87454c8d-819b-4ee6-8291-ccdf7c81f77b\") "
Mar 08 00:52:09.678506 master-0 kubenswrapper[23041]: I0308 00:52:09.677613 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/484a3fe3-0512-40e5-bce4-5cb8e182664d-scripts\") pod \"484a3fe3-0512-40e5-bce4-5cb8e182664d\" (UID: \"484a3fe3-0512-40e5-bce4-5cb8e182664d\") "
Mar 08 00:52:09.678506 master-0 kubenswrapper[23041]: I0308 00:52:09.677695 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/484a3fe3-0512-40e5-bce4-5cb8e182664d-var-run\") pod \"484a3fe3-0512-40e5-bce4-5cb8e182664d\" (UID: \"484a3fe3-0512-40e5-bce4-5cb8e182664d\") "
Mar 08 00:52:09.678506 master-0 kubenswrapper[23041]: I0308 00:52:09.677726 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/484a3fe3-0512-40e5-bce4-5cb8e182664d-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "484a3fe3-0512-40e5-bce4-5cb8e182664d" (UID: "484a3fe3-0512-40e5-bce4-5cb8e182664d"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:52:09.678506 master-0 kubenswrapper[23041]: I0308 00:52:09.677760 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87454c8d-819b-4ee6-8291-ccdf7c81f77b-operator-scripts\") pod \"87454c8d-819b-4ee6-8291-ccdf7c81f77b\" (UID: \"87454c8d-819b-4ee6-8291-ccdf7c81f77b\") "
Mar 08 00:52:09.678506 master-0 kubenswrapper[23041]: I0308 00:52:09.677831 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/484a3fe3-0512-40e5-bce4-5cb8e182664d-var-run" (OuterVolumeSpecName: "var-run") pod "484a3fe3-0512-40e5-bce4-5cb8e182664d" (UID: "484a3fe3-0512-40e5-bce4-5cb8e182664d"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:52:09.678506 master-0 kubenswrapper[23041]: I0308 00:52:09.677878 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/484a3fe3-0512-40e5-bce4-5cb8e182664d-var-run-ovn\") pod \"484a3fe3-0512-40e5-bce4-5cb8e182664d\" (UID: \"484a3fe3-0512-40e5-bce4-5cb8e182664d\") "
Mar 08 00:52:09.678506 master-0 kubenswrapper[23041]: I0308 00:52:09.677920 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bc5w\" (UniqueName: \"kubernetes.io/projected/484a3fe3-0512-40e5-bce4-5cb8e182664d-kube-api-access-6bc5w\") pod \"484a3fe3-0512-40e5-bce4-5cb8e182664d\" (UID: \"484a3fe3-0512-40e5-bce4-5cb8e182664d\") "
Mar 08 00:52:09.678506 master-0 kubenswrapper[23041]: I0308 00:52:09.677960 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/484a3fe3-0512-40e5-bce4-5cb8e182664d-additional-scripts\") pod \"484a3fe3-0512-40e5-bce4-5cb8e182664d\" (UID: \"484a3fe3-0512-40e5-bce4-5cb8e182664d\") "
Mar 08 00:52:09.678506 master-0 kubenswrapper[23041]: I0308 00:52:09.678224 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/484a3fe3-0512-40e5-bce4-5cb8e182664d-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "484a3fe3-0512-40e5-bce4-5cb8e182664d" (UID: "484a3fe3-0512-40e5-bce4-5cb8e182664d"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:52:09.679196 master-0 kubenswrapper[23041]: I0308 00:52:09.678580 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87454c8d-819b-4ee6-8291-ccdf7c81f77b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "87454c8d-819b-4ee6-8291-ccdf7c81f77b" (UID: "87454c8d-819b-4ee6-8291-ccdf7c81f77b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:52:09.679196 master-0 kubenswrapper[23041]: I0308 00:52:09.678656 23041 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/484a3fe3-0512-40e5-bce4-5cb8e182664d-var-run\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:09.679196 master-0 kubenswrapper[23041]: I0308 00:52:09.678682 23041 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/484a3fe3-0512-40e5-bce4-5cb8e182664d-var-run-ovn\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:09.679196 master-0 kubenswrapper[23041]: I0308 00:52:09.678695 23041 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/484a3fe3-0512-40e5-bce4-5cb8e182664d-var-log-ovn\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:09.679196 master-0 kubenswrapper[23041]: I0308 00:52:09.678877 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/484a3fe3-0512-40e5-bce4-5cb8e182664d-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "484a3fe3-0512-40e5-bce4-5cb8e182664d" (UID: "484a3fe3-0512-40e5-bce4-5cb8e182664d"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:52:09.679196 master-0 kubenswrapper[23041]: I0308 00:52:09.679063 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/484a3fe3-0512-40e5-bce4-5cb8e182664d-scripts" (OuterVolumeSpecName: "scripts") pod "484a3fe3-0512-40e5-bce4-5cb8e182664d" (UID: "484a3fe3-0512-40e5-bce4-5cb8e182664d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:52:09.682349 master-0 kubenswrapper[23041]: I0308 00:52:09.682314 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87454c8d-819b-4ee6-8291-ccdf7c81f77b-kube-api-access-thbkv" (OuterVolumeSpecName: "kube-api-access-thbkv") pod "87454c8d-819b-4ee6-8291-ccdf7c81f77b" (UID: "87454c8d-819b-4ee6-8291-ccdf7c81f77b"). InnerVolumeSpecName "kube-api-access-thbkv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:52:09.688596 master-0 kubenswrapper[23041]: I0308 00:52:09.687894 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/484a3fe3-0512-40e5-bce4-5cb8e182664d-kube-api-access-6bc5w" (OuterVolumeSpecName: "kube-api-access-6bc5w") pod "484a3fe3-0512-40e5-bce4-5cb8e182664d" (UID: "484a3fe3-0512-40e5-bce4-5cb8e182664d"). InnerVolumeSpecName "kube-api-access-6bc5w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:52:09.783311 master-0 kubenswrapper[23041]: I0308 00:52:09.780289 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bc5w\" (UniqueName: \"kubernetes.io/projected/484a3fe3-0512-40e5-bce4-5cb8e182664d-kube-api-access-6bc5w\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:09.783311 master-0 kubenswrapper[23041]: I0308 00:52:09.780352 23041 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/484a3fe3-0512-40e5-bce4-5cb8e182664d-additional-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:09.783311 master-0 kubenswrapper[23041]: I0308 00:52:09.780382 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thbkv\" (UniqueName: \"kubernetes.io/projected/87454c8d-819b-4ee6-8291-ccdf7c81f77b-kube-api-access-thbkv\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:09.783311 master-0 kubenswrapper[23041]: I0308 00:52:09.780397 23041 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/484a3fe3-0512-40e5-bce4-5cb8e182664d-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:09.783311 master-0 kubenswrapper[23041]: I0308 00:52:09.780413 23041 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87454c8d-819b-4ee6-8291-ccdf7c81f77b-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:09.948402 master-0 kubenswrapper[23041]: I0308 00:52:09.948346 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8nzj6-config-kblpr" event={"ID":"484a3fe3-0512-40e5-bce4-5cb8e182664d","Type":"ContainerDied","Data":"10417d6760c62878b3b47624169db55c76afb67130c7f4119a65d33a20f81c40"}
Mar 08 00:52:09.948905 master-0 kubenswrapper[23041]: I0308 00:52:09.948424 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10417d6760c62878b3b47624169db55c76afb67130c7f4119a65d33a20f81c40"
Mar 08 00:52:09.948905 master-0 kubenswrapper[23041]: I0308 00:52:09.948429 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-8nzj6-config-kblpr"
Mar 08 00:52:09.950801 master-0 kubenswrapper[23041]: I0308 00:52:09.950766 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-468nq"
Mar 08 00:52:09.950949 master-0 kubenswrapper[23041]: I0308 00:52:09.950885 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-468nq" event={"ID":"87454c8d-819b-4ee6-8291-ccdf7c81f77b","Type":"ContainerDied","Data":"fae35e406b81edf21abf92f9ac0792b934361e9c0157b94f1386caba13e38951"}
Mar 08 00:52:09.950949 master-0 kubenswrapper[23041]: I0308 00:52:09.950919 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fae35e406b81edf21abf92f9ac0792b934361e9c0157b94f1386caba13e38951"
Mar 08 00:52:09.954345 master-0 kubenswrapper[23041]: I0308 00:52:09.954305 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"724cd646-a717-4778-82a6-9471c70e13c5","Type":"ContainerStarted","Data":"e6752d789e064cedea941137be6e1a858657baad255acd300849aced243b2f0e"}
Mar 08 00:52:10.675689 master-0 kubenswrapper[23041]: I0308 00:52:10.673118 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-8nzj6-config-kblpr"]
Mar 08 00:52:10.693532 master-0 kubenswrapper[23041]: I0308 00:52:10.693468 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-8nzj6-config-kblpr"]
Mar 08 00:52:10.828234 master-0 kubenswrapper[23041]: I0308 00:52:10.828149 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="484a3fe3-0512-40e5-bce4-5cb8e182664d" path="/var/lib/kubelet/pods/484a3fe3-0512-40e5-bce4-5cb8e182664d/volumes"
Mar 08 00:52:10.983610 master-0 kubenswrapper[23041]: I0308 00:52:10.983544 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"724cd646-a717-4778-82a6-9471c70e13c5","Type":"ContainerStarted","Data":"69b3f00b3269684512bae35dd047f201e10d425ad79e72c6a07343481e060e14"}
Mar 08 00:52:10.983610 master-0 kubenswrapper[23041]: I0308 00:52:10.983602 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"724cd646-a717-4778-82a6-9471c70e13c5","Type":"ContainerStarted","Data":"36dd8660772e4b6edfe431f2a5d58ecc03c05fb6195851f073b3f0d70b833367"}
Mar 08 00:52:10.983610 master-0 kubenswrapper[23041]: I0308 00:52:10.983619 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"724cd646-a717-4778-82a6-9471c70e13c5","Type":"ContainerStarted","Data":"2d0be07164674d4afb775992a62bb9a71ac67f07e84289c15ec64e2cbbe112e2"}
Mar 08 00:52:12.005744 master-0 kubenswrapper[23041]: I0308 00:52:12.005691 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"724cd646-a717-4778-82a6-9471c70e13c5","Type":"ContainerStarted","Data":"8d8342fab97d8aaa7d3d93870e9f9f9084847ed9c8a621c32ee279ae930d3b19"}
Mar 08 00:52:14.034936 master-0 kubenswrapper[23041]: I0308 00:52:14.034848 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"724cd646-a717-4778-82a6-9471c70e13c5","Type":"ContainerStarted","Data":"8dc068283c7b7aa0c90a19d3f11755d73fc980b96e2a1ce5f32d30faf2586cab"}
Mar 08 00:52:14.035381 master-0 kubenswrapper[23041]: I0308 00:52:14.034949 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"724cd646-a717-4778-82a6-9471c70e13c5","Type":"ContainerStarted","Data":"bcb30603c19ecdd805523e4c20284dd888dc7373c3a05eddaa04beac05afd511"}
Mar 08 00:52:14.035381 master-0 kubenswrapper[23041]: I0308 00:52:14.034963 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"724cd646-a717-4778-82a6-9471c70e13c5","Type":"ContainerStarted","Data":"8bbcd4acd59c652196d1cb64148ac7925f6f184ad927641c7179393dbf74212b"}
Mar 08 00:52:14.054533 master-0 kubenswrapper[23041]: I0308 00:52:14.054468 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Mar 08 00:52:15.054567 master-0 kubenswrapper[23041]: I0308 00:52:15.053480 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"724cd646-a717-4778-82a6-9471c70e13c5","Type":"ContainerStarted","Data":"86423872ba1037e38020db9b60d763ea9ccc29f1b992b3e48b76f9a16c510f88"}
Mar 08 00:52:15.054567 master-0 kubenswrapper[23041]: I0308 00:52:15.053535 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"724cd646-a717-4778-82a6-9471c70e13c5","Type":"ContainerStarted","Data":"776eaf52f7564a41f59f42387464c7098f291141c5c8813c572bf6275599c2b2"}
Mar 08 00:52:15.057454 master-0 kubenswrapper[23041]: I0308 00:52:15.053546 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"724cd646-a717-4778-82a6-9471c70e13c5","Type":"ContainerStarted","Data":"97b5f18296a97b48ded5cbabbc94dee2c5efabd1e449723a198d5f10070d84be"}
Mar 08 00:52:15.668466 master-0 kubenswrapper[23041]: I0308 00:52:15.668401 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Mar 08 00:52:16.071558 master-0 kubenswrapper[23041]: I0308 00:52:16.071505 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"724cd646-a717-4778-82a6-9471c70e13c5","Type":"ContainerStarted","Data":"870161903a73e4d37a20b9a004f38b63668f41d0ef0d132026122d7400a4f5d3"}
Mar 08 00:52:16.400162 master-0 kubenswrapper[23041]: I0308 00:52:16.397345 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=38.5340064 podStartE2EDuration="44.397320332s" podCreationTimestamp="2026-03-08 00:51:32 +0000 UTC" firstStartedPulling="2026-03-08 00:52:07.029148286 +0000 UTC m=+1232.501984840" lastFinishedPulling="2026-03-08 00:52:12.892462218 +0000 UTC m=+1238.365298772" observedRunningTime="2026-03-08 00:52:16.127125315 +0000 UTC m=+1241.599961889" watchObservedRunningTime="2026-03-08 00:52:16.397320332 +0000 UTC m=+1241.870156886"
Mar 08 00:52:16.409259 master-0 kubenswrapper[23041]: I0308 00:52:16.408507 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-dd6667767-7bv69"]
Mar 08 00:52:16.409259 master-0 kubenswrapper[23041]: E0308 00:52:16.409135 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="484a3fe3-0512-40e5-bce4-5cb8e182664d" containerName="ovn-config"
Mar 08 00:52:16.409259 master-0 kubenswrapper[23041]: I0308 00:52:16.409158 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="484a3fe3-0512-40e5-bce4-5cb8e182664d" containerName="ovn-config"
Mar 08 00:52:16.409259 master-0 kubenswrapper[23041]: E0308 00:52:16.409180 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87454c8d-819b-4ee6-8291-ccdf7c81f77b" containerName="mariadb-account-create-update"
Mar 08 00:52:16.409259 master-0 kubenswrapper[23041]: I0308 00:52:16.409189 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="87454c8d-819b-4ee6-8291-ccdf7c81f77b" containerName="mariadb-account-create-update"
Mar 08 00:52:16.409678 master-0 kubenswrapper[23041]: I0308 00:52:16.409511 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="87454c8d-819b-4ee6-8291-ccdf7c81f77b" containerName="mariadb-account-create-update"
Mar 08 00:52:16.409678 master-0 kubenswrapper[23041]: I0308 00:52:16.409564 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="484a3fe3-0512-40e5-bce4-5cb8e182664d" containerName="ovn-config"
Mar 08 00:52:16.412451 master-0 kubenswrapper[23041]: I0308 00:52:16.410938 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dd6667767-7bv69"
Mar 08 00:52:16.421252 master-0 kubenswrapper[23041]: I0308 00:52:16.418616 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Mar 08 00:52:16.443175 master-0 kubenswrapper[23041]: I0308 00:52:16.437946 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dd6667767-7bv69"]
Mar 08 00:52:16.464151 master-0 kubenswrapper[23041]: I0308 00:52:16.464082 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ba67633-35bb-450a-aeaa-9fb88d428e39-config\") pod \"dnsmasq-dns-dd6667767-7bv69\" (UID: \"0ba67633-35bb-450a-aeaa-9fb88d428e39\") " pod="openstack/dnsmasq-dns-dd6667767-7bv69"
Mar 08 00:52:16.464151 master-0 kubenswrapper[23041]: I0308 00:52:16.464125 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ba67633-35bb-450a-aeaa-9fb88d428e39-ovsdbserver-sb\") pod \"dnsmasq-dns-dd6667767-7bv69\" (UID: \"0ba67633-35bb-450a-aeaa-9fb88d428e39\") " pod="openstack/dnsmasq-dns-dd6667767-7bv69"
Mar 08 00:52:16.464151 master-0 kubenswrapper[23041]: I0308 00:52:16.464153 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0ba67633-35bb-450a-aeaa-9fb88d428e39-dns-swift-storage-0\") pod \"dnsmasq-dns-dd6667767-7bv69\" (UID: \"0ba67633-35bb-450a-aeaa-9fb88d428e39\") " pod="openstack/dnsmasq-dns-dd6667767-7bv69"
Mar 08 00:52:16.464508 master-0 kubenswrapper[23041]: I0308 00:52:16.464232 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jjjv\" (UniqueName: \"kubernetes.io/projected/0ba67633-35bb-450a-aeaa-9fb88d428e39-kube-api-access-5jjjv\") pod \"dnsmasq-dns-dd6667767-7bv69\" (UID: \"0ba67633-35bb-450a-aeaa-9fb88d428e39\") " pod="openstack/dnsmasq-dns-dd6667767-7bv69"
Mar 08 00:52:16.464692 master-0 kubenswrapper[23041]: I0308 00:52:16.464653 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ba67633-35bb-450a-aeaa-9fb88d428e39-ovsdbserver-nb\") pod \"dnsmasq-dns-dd6667767-7bv69\" (UID: \"0ba67633-35bb-450a-aeaa-9fb88d428e39\") " pod="openstack/dnsmasq-dns-dd6667767-7bv69"
Mar 08 00:52:16.464755 master-0 kubenswrapper[23041]: I0308 00:52:16.464690 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ba67633-35bb-450a-aeaa-9fb88d428e39-dns-svc\") pod \"dnsmasq-dns-dd6667767-7bv69\" (UID: \"0ba67633-35bb-450a-aeaa-9fb88d428e39\") " pod="openstack/dnsmasq-dns-dd6667767-7bv69"
Mar 08 00:52:16.567547 master-0 kubenswrapper[23041]: I0308 00:52:16.567497 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jjjv\" (UniqueName: \"kubernetes.io/projected/0ba67633-35bb-450a-aeaa-9fb88d428e39-kube-api-access-5jjjv\") pod \"dnsmasq-dns-dd6667767-7bv69\" (UID: \"0ba67633-35bb-450a-aeaa-9fb88d428e39\") " pod="openstack/dnsmasq-dns-dd6667767-7bv69"
Mar 08 00:52:16.567778 master-0 kubenswrapper[23041]: I0308 00:52:16.567559 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ba67633-35bb-450a-aeaa-9fb88d428e39-ovsdbserver-nb\") pod \"dnsmasq-dns-dd6667767-7bv69\" (UID: \"0ba67633-35bb-450a-aeaa-9fb88d428e39\") " pod="openstack/dnsmasq-dns-dd6667767-7bv69"
Mar 08 00:52:16.567778 master-0 kubenswrapper[23041]: I0308 00:52:16.567580 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ba67633-35bb-450a-aeaa-9fb88d428e39-dns-svc\") pod \"dnsmasq-dns-dd6667767-7bv69\" (UID: \"0ba67633-35bb-450a-aeaa-9fb88d428e39\") " pod="openstack/dnsmasq-dns-dd6667767-7bv69"
Mar 08 00:52:16.567778 master-0 kubenswrapper[23041]: I0308 00:52:16.567666 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ba67633-35bb-450a-aeaa-9fb88d428e39-config\") pod \"dnsmasq-dns-dd6667767-7bv69\" (UID: \"0ba67633-35bb-450a-aeaa-9fb88d428e39\") " pod="openstack/dnsmasq-dns-dd6667767-7bv69"
Mar 08 00:52:16.567778 master-0 kubenswrapper[23041]: I0308 00:52:16.567686 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ba67633-35bb-450a-aeaa-9fb88d428e39-ovsdbserver-sb\") pod \"dnsmasq-dns-dd6667767-7bv69\" (UID: \"0ba67633-35bb-450a-aeaa-9fb88d428e39\") " pod="openstack/dnsmasq-dns-dd6667767-7bv69"
Mar 08 00:52:16.567778 master-0 kubenswrapper[23041]: I0308 00:52:16.567708 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0ba67633-35bb-450a-aeaa-9fb88d428e39-dns-swift-storage-0\") pod \"dnsmasq-dns-dd6667767-7bv69\" (UID: \"0ba67633-35bb-450a-aeaa-9fb88d428e39\") " pod="openstack/dnsmasq-dns-dd6667767-7bv69"
Mar 08 00:52:16.568872 master-0 kubenswrapper[23041]: I0308 00:52:16.568834 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ba67633-35bb-450a-aeaa-9fb88d428e39-config\") pod \"dnsmasq-dns-dd6667767-7bv69\" (UID: \"0ba67633-35bb-450a-aeaa-9fb88d428e39\") " pod="openstack/dnsmasq-dns-dd6667767-7bv69"
Mar 08 00:52:16.569436 master-0 kubenswrapper[23041]: I0308 00:52:16.569366 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0ba67633-35bb-450a-aeaa-9fb88d428e39-dns-swift-storage-0\") pod \"dnsmasq-dns-dd6667767-7bv69\" (UID: \"0ba67633-35bb-450a-aeaa-9fb88d428e39\") " pod="openstack/dnsmasq-dns-dd6667767-7bv69"
Mar 08 00:52:16.569517 master-0 kubenswrapper[23041]: I0308 00:52:16.569477 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ba67633-35bb-450a-aeaa-9fb88d428e39-dns-svc\") pod \"dnsmasq-dns-dd6667767-7bv69\" (UID: \"0ba67633-35bb-450a-aeaa-9fb88d428e39\") " pod="openstack/dnsmasq-dns-dd6667767-7bv69"
Mar 08 00:52:16.569617 master-0 kubenswrapper[23041]: I0308 00:52:16.569566 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ba67633-35bb-450a-aeaa-9fb88d428e39-ovsdbserver-sb\") pod \"dnsmasq-dns-dd6667767-7bv69\" (UID: \"0ba67633-35bb-450a-aeaa-9fb88d428e39\") " pod="openstack/dnsmasq-dns-dd6667767-7bv69"
Mar 08 00:52:16.570042 master-0 kubenswrapper[23041]: I0308 00:52:16.570016 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ba67633-35bb-450a-aeaa-9fb88d428e39-ovsdbserver-nb\") pod \"dnsmasq-dns-dd6667767-7bv69\" (UID: \"0ba67633-35bb-450a-aeaa-9fb88d428e39\") " pod="openstack/dnsmasq-dns-dd6667767-7bv69"
Mar 08 00:52:16.589856 master-0 kubenswrapper[23041]: I0308 00:52:16.589497 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jjjv\" (UniqueName: \"kubernetes.io/projected/0ba67633-35bb-450a-aeaa-9fb88d428e39-kube-api-access-5jjjv\") pod \"dnsmasq-dns-dd6667767-7bv69\" (UID: \"0ba67633-35bb-450a-aeaa-9fb88d428e39\") " pod="openstack/dnsmasq-dns-dd6667767-7bv69"
Mar 08 00:52:16.737287 master-0 kubenswrapper[23041]: I0308 00:52:16.737132 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-h7pn9"]
Mar 08 00:52:16.738960 master-0 kubenswrapper[23041]: I0308 00:52:16.738935 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-h7pn9"
Mar 08 00:52:16.764030 master-0 kubenswrapper[23041]: I0308 00:52:16.763913 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dd6667767-7bv69"
Mar 08 00:52:16.797392 master-0 kubenswrapper[23041]: I0308 00:52:16.797330 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-h7pn9"]
Mar 08 00:52:16.880069 master-0 kubenswrapper[23041]: I0308 00:52:16.877391 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49a404f8-c225-48fb-987f-e77945275680-operator-scripts\") pod \"cinder-db-create-h7pn9\" (UID: \"49a404f8-c225-48fb-987f-e77945275680\") " pod="openstack/cinder-db-create-h7pn9"
Mar 08 00:52:16.880069 master-0 kubenswrapper[23041]: I0308 00:52:16.877492 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6mb2\" (UniqueName: \"kubernetes.io/projected/49a404f8-c225-48fb-987f-e77945275680-kube-api-access-c6mb2\") pod \"cinder-db-create-h7pn9\" (UID: \"49a404f8-c225-48fb-987f-e77945275680\") " pod="openstack/cinder-db-create-h7pn9"
Mar 08 00:52:16.883635 master-0 kubenswrapper[23041]: I0308 00:52:16.883506 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-f0a6-account-create-update-7g79m"]
Mar 08 00:52:16.892536 master-0 kubenswrapper[23041]: I0308 00:52:16.892458 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f0a6-account-create-update-7g79m"
Mar 08 00:52:16.905322 master-0 kubenswrapper[23041]: I0308 00:52:16.903856 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Mar 08 00:52:16.913658 master-0 kubenswrapper[23041]: I0308 00:52:16.913593 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-f0a6-account-create-update-7g79m"]
Mar 08 00:52:16.984026 master-0 kubenswrapper[23041]: I0308 00:52:16.983086 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd49f4f1-8a3f-4c73-a2b1-65f552d39926-operator-scripts\") pod \"cinder-f0a6-account-create-update-7g79m\" (UID: \"cd49f4f1-8a3f-4c73-a2b1-65f552d39926\") " pod="openstack/cinder-f0a6-account-create-update-7g79m"
Mar 08 00:52:16.984026 master-0 kubenswrapper[23041]: I0308 00:52:16.983156 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49a404f8-c225-48fb-987f-e77945275680-operator-scripts\") pod \"cinder-db-create-h7pn9\" (UID: \"49a404f8-c225-48fb-987f-e77945275680\") " pod="openstack/cinder-db-create-h7pn9"
Mar 08 00:52:16.984026 master-0 kubenswrapper[23041]: I0308 00:52:16.983178 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6mb2\" (UniqueName: \"kubernetes.io/projected/49a404f8-c225-48fb-987f-e77945275680-kube-api-access-c6mb2\") pod \"cinder-db-create-h7pn9\" (UID: \"49a404f8-c225-48fb-987f-e77945275680\") " pod="openstack/cinder-db-create-h7pn9"
Mar 08 00:52:16.984026 master-0 kubenswrapper[23041]: I0308 00:52:16.983314 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmzpt\" (UniqueName: \"kubernetes.io/projected/cd49f4f1-8a3f-4c73-a2b1-65f552d39926-kube-api-access-rmzpt\") pod \"cinder-f0a6-account-create-update-7g79m\" (UID: \"cd49f4f1-8a3f-4c73-a2b1-65f552d39926\") " pod="openstack/cinder-f0a6-account-create-update-7g79m"
Mar 08 00:52:16.985291 master-0 kubenswrapper[23041]: I0308 00:52:16.985217 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49a404f8-c225-48fb-987f-e77945275680-operator-scripts\") pod \"cinder-db-create-h7pn9\" (UID: \"49a404f8-c225-48fb-987f-e77945275680\") " pod="openstack/cinder-db-create-h7pn9"
Mar 08 00:52:17.013327 master-0 kubenswrapper[23041]: I0308 00:52:17.007953 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6mb2\" (UniqueName: \"kubernetes.io/projected/49a404f8-c225-48fb-987f-e77945275680-kube-api-access-c6mb2\") pod \"cinder-db-create-h7pn9\" (UID: \"49a404f8-c225-48fb-987f-e77945275680\") " pod="openstack/cinder-db-create-h7pn9"
Mar 08 00:52:17.063482 master-0 kubenswrapper[23041]: I0308 00:52:17.062025 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-h7pn9"
Mar 08 00:52:17.086653 master-0 kubenswrapper[23041]: I0308 00:52:17.086578 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd49f4f1-8a3f-4c73-a2b1-65f552d39926-operator-scripts\") pod \"cinder-f0a6-account-create-update-7g79m\" (UID: \"cd49f4f1-8a3f-4c73-a2b1-65f552d39926\") " pod="openstack/cinder-f0a6-account-create-update-7g79m"
Mar 08 00:52:17.087167 master-0 kubenswrapper[23041]: I0308 00:52:17.086714 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmzpt\" (UniqueName: \"kubernetes.io/projected/cd49f4f1-8a3f-4c73-a2b1-65f552d39926-kube-api-access-rmzpt\") pod \"cinder-f0a6-account-create-update-7g79m\" (UID: \"cd49f4f1-8a3f-4c73-a2b1-65f552d39926\") " pod="openstack/cinder-f0a6-account-create-update-7g79m"
Mar 08 00:52:17.091274 master-0 kubenswrapper[23041]: I0308 00:52:17.091222 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd49f4f1-8a3f-4c73-a2b1-65f552d39926-operator-scripts\") pod \"cinder-f0a6-account-create-update-7g79m\" (UID: \"cd49f4f1-8a3f-4c73-a2b1-65f552d39926\") " pod="openstack/cinder-f0a6-account-create-update-7g79m"
Mar 08 00:52:17.100495 master-0 kubenswrapper[23041]: I0308 00:52:17.100417 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-4vctw"]
Mar 08 00:52:17.102352 master-0 kubenswrapper[23041]: I0308 00:52:17.102259 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4vctw"
Mar 08 00:52:17.114577 master-0 kubenswrapper[23041]: I0308 00:52:17.111958 23041 generic.go:334] "Generic (PLEG): container finished" podID="32cca4f7-a751-48d2-b93f-211bb7f12697" containerID="3f6fabd754937090775079da9396a4489b34d8439d2e4928f0a252790ba96dea" exitCode=0
Mar 08 00:52:17.114577 master-0 kubenswrapper[23041]: I0308 00:52:17.114419 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wtgz7" event={"ID":"32cca4f7-a751-48d2-b93f-211bb7f12697","Type":"ContainerDied","Data":"3f6fabd754937090775079da9396a4489b34d8439d2e4928f0a252790ba96dea"}
Mar 08 00:52:17.127252 master-0 kubenswrapper[23041]: I0308 00:52:17.125726 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmzpt\" (UniqueName: \"kubernetes.io/projected/cd49f4f1-8a3f-4c73-a2b1-65f552d39926-kube-api-access-rmzpt\") pod \"cinder-f0a6-account-create-update-7g79m\" (UID: \"cd49f4f1-8a3f-4c73-a2b1-65f552d39926\") " pod="openstack/cinder-f0a6-account-create-update-7g79m"
Mar 08 00:52:17.148935 master-0 kubenswrapper[23041]: I0308 00:52:17.146025 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-4vctw"]
Mar 08 00:52:17.180243 master-0 kubenswrapper[23041]: I0308 00:52:17.179670 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-mz8x8"]
Mar 08 00:52:17.181400 master-0 kubenswrapper[23041]: I0308 00:52:17.181349 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-mz8x8"
Mar 08 00:52:17.184755 master-0 kubenswrapper[23041]: I0308 00:52:17.184710 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 08 00:52:17.185222 master-0 kubenswrapper[23041]: I0308 00:52:17.185188 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 08 00:52:17.191377 master-0 kubenswrapper[23041]: I0308 00:52:17.185523 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 08 00:52:17.198842 master-0 kubenswrapper[23041]: I0308 00:52:17.198143 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-ab9f-account-create-update-wxwc6"]
Mar 08 00:52:17.209246 master-0 kubenswrapper[23041]: I0308 00:52:17.207061 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ab9f-account-create-update-wxwc6"
Mar 08 00:52:17.211418 master-0 kubenswrapper[23041]: I0308 00:52:17.211369 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Mar 08 00:52:17.239307 master-0 kubenswrapper[23041]: I0308 00:52:17.235039 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f0a6-account-create-update-7g79m"
Mar 08 00:52:17.290856 master-0 kubenswrapper[23041]: I0308 00:52:17.290779 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ab9f-account-create-update-wxwc6"]
Mar 08 00:52:17.292260 master-0 kubenswrapper[23041]: I0308 00:52:17.292195 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2w7k\" (UniqueName: \"kubernetes.io/projected/274fc42b-c842-4a84-b407-2e9bd971a75b-kube-api-access-v2w7k\") pod \"keystone-db-sync-mz8x8\" (UID: \"274fc42b-c842-4a84-b407-2e9bd971a75b\") " pod="openstack/keystone-db-sync-mz8x8"
Mar 08 00:52:17.292458 master-0 kubenswrapper[23041]: I0308 00:52:17.292435 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lghm2\" (UniqueName: \"kubernetes.io/projected/ca7ecee5-d829-4f07-a4c3-ef6cf98b6519-kube-api-access-lghm2\") pod \"neutron-db-create-4vctw\" (UID: \"ca7ecee5-d829-4f07-a4c3-ef6cf98b6519\") " pod="openstack/neutron-db-create-4vctw"
Mar 08 00:52:17.292607 master-0 kubenswrapper[23041]: I0308 00:52:17.292583 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/274fc42b-c842-4a84-b407-2e9bd971a75b-combined-ca-bundle\") pod \"keystone-db-sync-mz8x8\" (UID: \"274fc42b-c842-4a84-b407-2e9bd971a75b\") " pod="openstack/keystone-db-sync-mz8x8"
Mar 08 00:52:17.292748 master-0 kubenswrapper[23041]: I0308 00:52:17.292729 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca7ecee5-d829-4f07-a4c3-ef6cf98b6519-operator-scripts\") pod \"neutron-db-create-4vctw\" (UID: \"ca7ecee5-d829-4f07-a4c3-ef6cf98b6519\") " pod="openstack/neutron-db-create-4vctw"
Mar 08 00:52:17.292987 master-0 kubenswrapper[23041]: I0308 00:52:17.292942 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/274fc42b-c842-4a84-b407-2e9bd971a75b-config-data\") pod \"keystone-db-sync-mz8x8\" (UID: \"274fc42b-c842-4a84-b407-2e9bd971a75b\") " pod="openstack/keystone-db-sync-mz8x8"
Mar 08 00:52:17.301871 master-0 kubenswrapper[23041]: I0308 00:52:17.301595 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-mz8x8"]
Mar 08 00:52:17.338942 master-0 kubenswrapper[23041]: I0308 00:52:17.338852 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dd6667767-7bv69"]
Mar 08 00:52:17.399351 master-0 kubenswrapper[23041]: I0308 00:52:17.398587 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2w7k\" (UniqueName: \"kubernetes.io/projected/274fc42b-c842-4a84-b407-2e9bd971a75b-kube-api-access-v2w7k\") pod \"keystone-db-sync-mz8x8\" (UID: \"274fc42b-c842-4a84-b407-2e9bd971a75b\") " pod="openstack/keystone-db-sync-mz8x8"
Mar 08 00:52:17.399351 master-0 kubenswrapper[23041]: I0308 00:52:17.398679 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lghm2\" (UniqueName: \"kubernetes.io/projected/ca7ecee5-d829-4f07-a4c3-ef6cf98b6519-kube-api-access-lghm2\") pod \"neutron-db-create-4vctw\" (UID: \"ca7ecee5-d829-4f07-a4c3-ef6cf98b6519\") " pod="openstack/neutron-db-create-4vctw"
Mar 08 00:52:17.399351 master-0 kubenswrapper[23041]: I0308 00:52:17.398739 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/274fc42b-c842-4a84-b407-2e9bd971a75b-combined-ca-bundle\") pod \"keystone-db-sync-mz8x8\" (UID: \"274fc42b-c842-4a84-b407-2e9bd971a75b\") " pod="openstack/keystone-db-sync-mz8x8"
Mar 08 00:52:17.399351 master-0 kubenswrapper[23041]: I0308 00:52:17.398811 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca7ecee5-d829-4f07-a4c3-ef6cf98b6519-operator-scripts\") pod \"neutron-db-create-4vctw\" (UID: \"ca7ecee5-d829-4f07-a4c3-ef6cf98b6519\") " pod="openstack/neutron-db-create-4vctw"
Mar 08 00:52:17.399351 master-0 kubenswrapper[23041]: I0308 00:52:17.398928 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/274fc42b-c842-4a84-b407-2e9bd971a75b-config-data\") pod \"keystone-db-sync-mz8x8\" (UID: \"274fc42b-c842-4a84-b407-2e9bd971a75b\") " pod="openstack/keystone-db-sync-mz8x8"
Mar 08 00:52:17.399351 master-0 kubenswrapper[23041]: I0308 00:52:17.399054 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22t8f\" (UniqueName: \"kubernetes.io/projected/1f976074-a681-4867-96f3-089f7cfabd9e-kube-api-access-22t8f\") pod \"neutron-ab9f-account-create-update-wxwc6\" (UID: \"1f976074-a681-4867-96f3-089f7cfabd9e\") " pod="openstack/neutron-ab9f-account-create-update-wxwc6"
Mar 08 00:52:17.399351 master-0 kubenswrapper[23041]: I0308 00:52:17.399089 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f976074-a681-4867-96f3-089f7cfabd9e-operator-scripts\") pod \"neutron-ab9f-account-create-update-wxwc6\" (UID: \"1f976074-a681-4867-96f3-089f7cfabd9e\") " pod="openstack/neutron-ab9f-account-create-update-wxwc6"
Mar 08 00:52:17.405889 master-0 kubenswrapper[23041]: I0308 00:52:17.405803 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca7ecee5-d829-4f07-a4c3-ef6cf98b6519-operator-scripts\") pod \"neutron-db-create-4vctw\" (UID: \"ca7ecee5-d829-4f07-a4c3-ef6cf98b6519\") "
pod="openstack/neutron-db-create-4vctw" Mar 08 00:52:17.407322 master-0 kubenswrapper[23041]: I0308 00:52:17.407011 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/274fc42b-c842-4a84-b407-2e9bd971a75b-combined-ca-bundle\") pod \"keystone-db-sync-mz8x8\" (UID: \"274fc42b-c842-4a84-b407-2e9bd971a75b\") " pod="openstack/keystone-db-sync-mz8x8" Mar 08 00:52:17.407642 master-0 kubenswrapper[23041]: I0308 00:52:17.407575 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/274fc42b-c842-4a84-b407-2e9bd971a75b-config-data\") pod \"keystone-db-sync-mz8x8\" (UID: \"274fc42b-c842-4a84-b407-2e9bd971a75b\") " pod="openstack/keystone-db-sync-mz8x8" Mar 08 00:52:17.417099 master-0 kubenswrapper[23041]: I0308 00:52:17.416774 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2w7k\" (UniqueName: \"kubernetes.io/projected/274fc42b-c842-4a84-b407-2e9bd971a75b-kube-api-access-v2w7k\") pod \"keystone-db-sync-mz8x8\" (UID: \"274fc42b-c842-4a84-b407-2e9bd971a75b\") " pod="openstack/keystone-db-sync-mz8x8" Mar 08 00:52:17.421496 master-0 kubenswrapper[23041]: I0308 00:52:17.421422 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lghm2\" (UniqueName: \"kubernetes.io/projected/ca7ecee5-d829-4f07-a4c3-ef6cf98b6519-kube-api-access-lghm2\") pod \"neutron-db-create-4vctw\" (UID: \"ca7ecee5-d829-4f07-a4c3-ef6cf98b6519\") " pod="openstack/neutron-db-create-4vctw" Mar 08 00:52:17.502913 master-0 kubenswrapper[23041]: I0308 00:52:17.502840 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22t8f\" (UniqueName: \"kubernetes.io/projected/1f976074-a681-4867-96f3-089f7cfabd9e-kube-api-access-22t8f\") pod \"neutron-ab9f-account-create-update-wxwc6\" (UID: \"1f976074-a681-4867-96f3-089f7cfabd9e\") " 
pod="openstack/neutron-ab9f-account-create-update-wxwc6" Mar 08 00:52:17.502913 master-0 kubenswrapper[23041]: I0308 00:52:17.502908 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f976074-a681-4867-96f3-089f7cfabd9e-operator-scripts\") pod \"neutron-ab9f-account-create-update-wxwc6\" (UID: \"1f976074-a681-4867-96f3-089f7cfabd9e\") " pod="openstack/neutron-ab9f-account-create-update-wxwc6" Mar 08 00:52:17.504493 master-0 kubenswrapper[23041]: I0308 00:52:17.504224 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f976074-a681-4867-96f3-089f7cfabd9e-operator-scripts\") pod \"neutron-ab9f-account-create-update-wxwc6\" (UID: \"1f976074-a681-4867-96f3-089f7cfabd9e\") " pod="openstack/neutron-ab9f-account-create-update-wxwc6" Mar 08 00:52:17.524559 master-0 kubenswrapper[23041]: I0308 00:52:17.524512 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22t8f\" (UniqueName: \"kubernetes.io/projected/1f976074-a681-4867-96f3-089f7cfabd9e-kube-api-access-22t8f\") pod \"neutron-ab9f-account-create-update-wxwc6\" (UID: \"1f976074-a681-4867-96f3-089f7cfabd9e\") " pod="openstack/neutron-ab9f-account-create-update-wxwc6" Mar 08 00:52:17.531662 master-0 kubenswrapper[23041]: I0308 00:52:17.531593 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4vctw" Mar 08 00:52:17.696041 master-0 kubenswrapper[23041]: I0308 00:52:17.691541 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-mz8x8" Mar 08 00:52:17.696041 master-0 kubenswrapper[23041]: I0308 00:52:17.695049 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-ab9f-account-create-update-wxwc6" Mar 08 00:52:17.703154 master-0 kubenswrapper[23041]: I0308 00:52:17.700277 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-h7pn9"] Mar 08 00:52:17.798274 master-0 kubenswrapper[23041]: I0308 00:52:17.795780 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-f0a6-account-create-update-7g79m"] Mar 08 00:52:18.027966 master-0 kubenswrapper[23041]: I0308 00:52:18.027899 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-4vctw"] Mar 08 00:52:18.052151 master-0 kubenswrapper[23041]: W0308 00:52:18.052091 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca7ecee5_d829_4f07_a4c3_ef6cf98b6519.slice/crio-5f9a992baa5cec957fb8b41cce859f96165f6b124c9e7866f365c569422af90b WatchSource:0}: Error finding container 5f9a992baa5cec957fb8b41cce859f96165f6b124c9e7866f365c569422af90b: Status 404 returned error can't find the container with id 5f9a992baa5cec957fb8b41cce859f96165f6b124c9e7866f365c569422af90b Mar 08 00:52:18.124888 master-0 kubenswrapper[23041]: I0308 00:52:18.124801 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f0a6-account-create-update-7g79m" event={"ID":"cd49f4f1-8a3f-4c73-a2b1-65f552d39926","Type":"ContainerStarted","Data":"b6a4782c1efb8d5c92a22ecd80e85dfc1a2aeb2e873ed3394e56fb53c87985a3"} Mar 08 00:52:18.124888 master-0 kubenswrapper[23041]: I0308 00:52:18.124865 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f0a6-account-create-update-7g79m" event={"ID":"cd49f4f1-8a3f-4c73-a2b1-65f552d39926","Type":"ContainerStarted","Data":"781a2075b044676462c7c72e3e078e2a482435d350c421c8f40e07619fee8641"} Mar 08 00:52:18.127977 master-0 kubenswrapper[23041]: I0308 00:52:18.127918 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-db-create-h7pn9" event={"ID":"49a404f8-c225-48fb-987f-e77945275680","Type":"ContainerStarted","Data":"91336a7ee881d19c4cd2e5b5eca4b0b8dc5cdd87f2f7c7c25769785127726d01"} Mar 08 00:52:18.127977 master-0 kubenswrapper[23041]: I0308 00:52:18.127967 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-h7pn9" event={"ID":"49a404f8-c225-48fb-987f-e77945275680","Type":"ContainerStarted","Data":"31420e70f5b5a973375a9e8f9c5de6f3caff606f49abaad878c30cfeffbf5586"} Mar 08 00:52:18.130537 master-0 kubenswrapper[23041]: I0308 00:52:18.130451 23041 generic.go:334] "Generic (PLEG): container finished" podID="0ba67633-35bb-450a-aeaa-9fb88d428e39" containerID="bc179f54b6b4e1cc8e99d888bd98789601dc08c09fb1d0f4e08f2b46c045b3b0" exitCode=0 Mar 08 00:52:18.130537 master-0 kubenswrapper[23041]: I0308 00:52:18.130508 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dd6667767-7bv69" event={"ID":"0ba67633-35bb-450a-aeaa-9fb88d428e39","Type":"ContainerDied","Data":"bc179f54b6b4e1cc8e99d888bd98789601dc08c09fb1d0f4e08f2b46c045b3b0"} Mar 08 00:52:18.130537 master-0 kubenswrapper[23041]: I0308 00:52:18.130533 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dd6667767-7bv69" event={"ID":"0ba67633-35bb-450a-aeaa-9fb88d428e39","Type":"ContainerStarted","Data":"9c1da92e742101a9ee8f7cbf67325455678f35af1f0f289d913f0055a86a803a"} Mar 08 00:52:18.132992 master-0 kubenswrapper[23041]: I0308 00:52:18.132799 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4vctw" event={"ID":"ca7ecee5-d829-4f07-a4c3-ef6cf98b6519","Type":"ContainerStarted","Data":"5f9a992baa5cec957fb8b41cce859f96165f6b124c9e7866f365c569422af90b"} Mar 08 00:52:18.169287 master-0 kubenswrapper[23041]: I0308 00:52:18.166488 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-f0a6-account-create-update-7g79m" podStartSLOduration=2.166462538 
podStartE2EDuration="2.166462538s" podCreationTimestamp="2026-03-08 00:52:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:52:18.142709978 +0000 UTC m=+1243.615546542" watchObservedRunningTime="2026-03-08 00:52:18.166462538 +0000 UTC m=+1243.639299092" Mar 08 00:52:18.224720 master-0 kubenswrapper[23041]: I0308 00:52:18.224606 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-h7pn9" podStartSLOduration=2.224577797 podStartE2EDuration="2.224577797s" podCreationTimestamp="2026-03-08 00:52:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:52:18.205311047 +0000 UTC m=+1243.678147601" watchObservedRunningTime="2026-03-08 00:52:18.224577797 +0000 UTC m=+1243.697414341" Mar 08 00:52:18.334740 master-0 kubenswrapper[23041]: I0308 00:52:18.334649 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-mz8x8"] Mar 08 00:52:18.353498 master-0 kubenswrapper[23041]: I0308 00:52:18.353414 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ab9f-account-create-update-wxwc6"] Mar 08 00:52:18.402791 master-0 kubenswrapper[23041]: W0308 00:52:18.402720 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f976074_a681_4867_96f3_089f7cfabd9e.slice/crio-170cb51a874550ae4777930e9ebd70d98f7c9b71fce2668c9df3196d8df4581d WatchSource:0}: Error finding container 170cb51a874550ae4777930e9ebd70d98f7c9b71fce2668c9df3196d8df4581d: Status 404 returned error can't find the container with id 170cb51a874550ae4777930e9ebd70d98f7c9b71fce2668c9df3196d8df4581d Mar 08 00:52:18.403168 master-0 kubenswrapper[23041]: W0308 00:52:18.403103 23041 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod274fc42b_c842_4a84_b407_2e9bd971a75b.slice/crio-139a226fbdf8955a88902f8ca6557d5e64746c06153456e6ffd11919143013e6 WatchSource:0}: Error finding container 139a226fbdf8955a88902f8ca6557d5e64746c06153456e6ffd11919143013e6: Status 404 returned error can't find the container with id 139a226fbdf8955a88902f8ca6557d5e64746c06153456e6ffd11919143013e6 Mar 08 00:52:18.634269 master-0 kubenswrapper[23041]: I0308 00:52:18.633319 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-wtgz7" Mar 08 00:52:18.756164 master-0 kubenswrapper[23041]: I0308 00:52:18.756093 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32cca4f7-a751-48d2-b93f-211bb7f12697-config-data\") pod \"32cca4f7-a751-48d2-b93f-211bb7f12697\" (UID: \"32cca4f7-a751-48d2-b93f-211bb7f12697\") " Mar 08 00:52:18.756164 master-0 kubenswrapper[23041]: I0308 00:52:18.756158 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prs5n\" (UniqueName: \"kubernetes.io/projected/32cca4f7-a751-48d2-b93f-211bb7f12697-kube-api-access-prs5n\") pod \"32cca4f7-a751-48d2-b93f-211bb7f12697\" (UID: \"32cca4f7-a751-48d2-b93f-211bb7f12697\") " Mar 08 00:52:18.756497 master-0 kubenswrapper[23041]: I0308 00:52:18.756340 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32cca4f7-a751-48d2-b93f-211bb7f12697-combined-ca-bundle\") pod \"32cca4f7-a751-48d2-b93f-211bb7f12697\" (UID: \"32cca4f7-a751-48d2-b93f-211bb7f12697\") " Mar 08 00:52:18.756497 master-0 kubenswrapper[23041]: I0308 00:52:18.756387 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/32cca4f7-a751-48d2-b93f-211bb7f12697-db-sync-config-data\") pod \"32cca4f7-a751-48d2-b93f-211bb7f12697\" (UID: \"32cca4f7-a751-48d2-b93f-211bb7f12697\") " Mar 08 00:52:18.760759 master-0 kubenswrapper[23041]: I0308 00:52:18.760710 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32cca4f7-a751-48d2-b93f-211bb7f12697-kube-api-access-prs5n" (OuterVolumeSpecName: "kube-api-access-prs5n") pod "32cca4f7-a751-48d2-b93f-211bb7f12697" (UID: "32cca4f7-a751-48d2-b93f-211bb7f12697"). InnerVolumeSpecName "kube-api-access-prs5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:52:18.761043 master-0 kubenswrapper[23041]: I0308 00:52:18.761009 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32cca4f7-a751-48d2-b93f-211bb7f12697-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "32cca4f7-a751-48d2-b93f-211bb7f12697" (UID: "32cca4f7-a751-48d2-b93f-211bb7f12697"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:52:18.790819 master-0 kubenswrapper[23041]: I0308 00:52:18.790740 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32cca4f7-a751-48d2-b93f-211bb7f12697-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32cca4f7-a751-48d2-b93f-211bb7f12697" (UID: "32cca4f7-a751-48d2-b93f-211bb7f12697"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:52:18.815912 master-0 kubenswrapper[23041]: I0308 00:52:18.813739 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32cca4f7-a751-48d2-b93f-211bb7f12697-config-data" (OuterVolumeSpecName: "config-data") pod "32cca4f7-a751-48d2-b93f-211bb7f12697" (UID: "32cca4f7-a751-48d2-b93f-211bb7f12697"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:52:18.859990 master-0 kubenswrapper[23041]: I0308 00:52:18.858581 23041 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32cca4f7-a751-48d2-b93f-211bb7f12697-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 00:52:18.859990 master-0 kubenswrapper[23041]: I0308 00:52:18.858634 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prs5n\" (UniqueName: \"kubernetes.io/projected/32cca4f7-a751-48d2-b93f-211bb7f12697-kube-api-access-prs5n\") on node \"master-0\" DevicePath \"\"" Mar 08 00:52:18.859990 master-0 kubenswrapper[23041]: I0308 00:52:18.858648 23041 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32cca4f7-a751-48d2-b93f-211bb7f12697-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 00:52:18.859990 master-0 kubenswrapper[23041]: I0308 00:52:18.858657 23041 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/32cca4f7-a751-48d2-b93f-211bb7f12697-db-sync-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 00:52:19.147539 master-0 kubenswrapper[23041]: I0308 00:52:19.147470 23041 generic.go:334] "Generic (PLEG): container finished" podID="49a404f8-c225-48fb-987f-e77945275680" containerID="91336a7ee881d19c4cd2e5b5eca4b0b8dc5cdd87f2f7c7c25769785127726d01" exitCode=0 Mar 08 00:52:19.148697 master-0 kubenswrapper[23041]: I0308 00:52:19.147983 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-h7pn9" event={"ID":"49a404f8-c225-48fb-987f-e77945275680","Type":"ContainerDied","Data":"91336a7ee881d19c4cd2e5b5eca4b0b8dc5cdd87f2f7c7c25769785127726d01"} Mar 08 00:52:19.155616 master-0 kubenswrapper[23041]: I0308 00:52:19.155521 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dd6667767-7bv69" 
event={"ID":"0ba67633-35bb-450a-aeaa-9fb88d428e39","Type":"ContainerStarted","Data":"e72880eb4cf53789436d58d38335f2e7c6f1d407449ea14ddf8200eccf581a9d"} Mar 08 00:52:19.155820 master-0 kubenswrapper[23041]: I0308 00:52:19.155667 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-dd6667767-7bv69" Mar 08 00:52:19.157968 master-0 kubenswrapper[23041]: I0308 00:52:19.157904 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wtgz7" event={"ID":"32cca4f7-a751-48d2-b93f-211bb7f12697","Type":"ContainerDied","Data":"5d57ab35c9d1f89da60ae57586f8f57b0f5f3929d5606a0c7bc50c7126d40b9d"} Mar 08 00:52:19.157968 master-0 kubenswrapper[23041]: I0308 00:52:19.157958 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d57ab35c9d1f89da60ae57586f8f57b0f5f3929d5606a0c7bc50c7126d40b9d" Mar 08 00:52:19.158116 master-0 kubenswrapper[23041]: I0308 00:52:19.158020 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-wtgz7" Mar 08 00:52:19.161716 master-0 kubenswrapper[23041]: I0308 00:52:19.161664 23041 generic.go:334] "Generic (PLEG): container finished" podID="ca7ecee5-d829-4f07-a4c3-ef6cf98b6519" containerID="3c102884ffbcff7f27aee8088a4eecd249dc5ece2fb3e752177de26423658859" exitCode=0 Mar 08 00:52:19.161800 master-0 kubenswrapper[23041]: I0308 00:52:19.161741 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4vctw" event={"ID":"ca7ecee5-d829-4f07-a4c3-ef6cf98b6519","Type":"ContainerDied","Data":"3c102884ffbcff7f27aee8088a4eecd249dc5ece2fb3e752177de26423658859"} Mar 08 00:52:19.164559 master-0 kubenswrapper[23041]: I0308 00:52:19.163765 23041 generic.go:334] "Generic (PLEG): container finished" podID="cd49f4f1-8a3f-4c73-a2b1-65f552d39926" containerID="b6a4782c1efb8d5c92a22ecd80e85dfc1a2aeb2e873ed3394e56fb53c87985a3" exitCode=0 Mar 08 00:52:19.164559 master-0 kubenswrapper[23041]: I0308 00:52:19.163816 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f0a6-account-create-update-7g79m" event={"ID":"cd49f4f1-8a3f-4c73-a2b1-65f552d39926","Type":"ContainerDied","Data":"b6a4782c1efb8d5c92a22ecd80e85dfc1a2aeb2e873ed3394e56fb53c87985a3"} Mar 08 00:52:19.165638 master-0 kubenswrapper[23041]: I0308 00:52:19.165593 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-mz8x8" event={"ID":"274fc42b-c842-4a84-b407-2e9bd971a75b","Type":"ContainerStarted","Data":"139a226fbdf8955a88902f8ca6557d5e64746c06153456e6ffd11919143013e6"} Mar 08 00:52:19.168406 master-0 kubenswrapper[23041]: I0308 00:52:19.168362 23041 generic.go:334] "Generic (PLEG): container finished" podID="1f976074-a681-4867-96f3-089f7cfabd9e" containerID="9d1747541108b14e4d824687bf46211f82014e5951617bbf0ae29b1a155f3a38" exitCode=0 Mar 08 00:52:19.168503 master-0 kubenswrapper[23041]: I0308 00:52:19.168423 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-ab9f-account-create-update-wxwc6" event={"ID":"1f976074-a681-4867-96f3-089f7cfabd9e","Type":"ContainerDied","Data":"9d1747541108b14e4d824687bf46211f82014e5951617bbf0ae29b1a155f3a38"} Mar 08 00:52:19.168503 master-0 kubenswrapper[23041]: I0308 00:52:19.168451 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ab9f-account-create-update-wxwc6" event={"ID":"1f976074-a681-4867-96f3-089f7cfabd9e","Type":"ContainerStarted","Data":"170cb51a874550ae4777930e9ebd70d98f7c9b71fce2668c9df3196d8df4581d"} Mar 08 00:52:19.256721 master-0 kubenswrapper[23041]: I0308 00:52:19.256645 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-dd6667767-7bv69" podStartSLOduration=3.256617376 podStartE2EDuration="3.256617376s" podCreationTimestamp="2026-03-08 00:52:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:52:19.24737836 +0000 UTC m=+1244.720214924" watchObservedRunningTime="2026-03-08 00:52:19.256617376 +0000 UTC m=+1244.729453930" Mar 08 00:52:19.611142 master-0 kubenswrapper[23041]: I0308 00:52:19.611076 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dd6667767-7bv69"] Mar 08 00:52:19.692285 master-0 kubenswrapper[23041]: I0308 00:52:19.689573 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89fcc4dcf-gml6g"] Mar 08 00:52:19.692285 master-0 kubenswrapper[23041]: E0308 00:52:19.690179 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32cca4f7-a751-48d2-b93f-211bb7f12697" containerName="glance-db-sync" Mar 08 00:52:19.692285 master-0 kubenswrapper[23041]: I0308 00:52:19.690209 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="32cca4f7-a751-48d2-b93f-211bb7f12697" containerName="glance-db-sync" Mar 08 00:52:19.692285 master-0 kubenswrapper[23041]: I0308 00:52:19.690508 23041 
memory_manager.go:354] "RemoveStaleState removing state" podUID="32cca4f7-a751-48d2-b93f-211bb7f12697" containerName="glance-db-sync" Mar 08 00:52:19.693774 master-0 kubenswrapper[23041]: I0308 00:52:19.692897 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89fcc4dcf-gml6g" Mar 08 00:52:19.715295 master-0 kubenswrapper[23041]: I0308 00:52:19.714735 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89fcc4dcf-gml6g"] Mar 08 00:52:19.800727 master-0 kubenswrapper[23041]: I0308 00:52:19.800507 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98c1faca-b20d-4243-a40b-da58d311ddf6-ovsdbserver-nb\") pod \"dnsmasq-dns-89fcc4dcf-gml6g\" (UID: \"98c1faca-b20d-4243-a40b-da58d311ddf6\") " pod="openstack/dnsmasq-dns-89fcc4dcf-gml6g" Mar 08 00:52:19.800727 master-0 kubenswrapper[23041]: I0308 00:52:19.800637 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98c1faca-b20d-4243-a40b-da58d311ddf6-ovsdbserver-sb\") pod \"dnsmasq-dns-89fcc4dcf-gml6g\" (UID: \"98c1faca-b20d-4243-a40b-da58d311ddf6\") " pod="openstack/dnsmasq-dns-89fcc4dcf-gml6g" Mar 08 00:52:19.800727 master-0 kubenswrapper[23041]: I0308 00:52:19.800679 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmgpt\" (UniqueName: \"kubernetes.io/projected/98c1faca-b20d-4243-a40b-da58d311ddf6-kube-api-access-rmgpt\") pod \"dnsmasq-dns-89fcc4dcf-gml6g\" (UID: \"98c1faca-b20d-4243-a40b-da58d311ddf6\") " pod="openstack/dnsmasq-dns-89fcc4dcf-gml6g" Mar 08 00:52:19.801000 master-0 kubenswrapper[23041]: I0308 00:52:19.800740 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/98c1faca-b20d-4243-a40b-da58d311ddf6-dns-svc\") pod \"dnsmasq-dns-89fcc4dcf-gml6g\" (UID: \"98c1faca-b20d-4243-a40b-da58d311ddf6\") " pod="openstack/dnsmasq-dns-89fcc4dcf-gml6g" Mar 08 00:52:19.801031 master-0 kubenswrapper[23041]: I0308 00:52:19.800994 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98c1faca-b20d-4243-a40b-da58d311ddf6-dns-swift-storage-0\") pod \"dnsmasq-dns-89fcc4dcf-gml6g\" (UID: \"98c1faca-b20d-4243-a40b-da58d311ddf6\") " pod="openstack/dnsmasq-dns-89fcc4dcf-gml6g" Mar 08 00:52:19.801117 master-0 kubenswrapper[23041]: I0308 00:52:19.801089 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98c1faca-b20d-4243-a40b-da58d311ddf6-config\") pod \"dnsmasq-dns-89fcc4dcf-gml6g\" (UID: \"98c1faca-b20d-4243-a40b-da58d311ddf6\") " pod="openstack/dnsmasq-dns-89fcc4dcf-gml6g" Mar 08 00:52:19.903537 master-0 kubenswrapper[23041]: I0308 00:52:19.903477 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98c1faca-b20d-4243-a40b-da58d311ddf6-ovsdbserver-sb\") pod \"dnsmasq-dns-89fcc4dcf-gml6g\" (UID: \"98c1faca-b20d-4243-a40b-da58d311ddf6\") " pod="openstack/dnsmasq-dns-89fcc4dcf-gml6g" Mar 08 00:52:19.903761 master-0 kubenswrapper[23041]: I0308 00:52:19.903551 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmgpt\" (UniqueName: \"kubernetes.io/projected/98c1faca-b20d-4243-a40b-da58d311ddf6-kube-api-access-rmgpt\") pod \"dnsmasq-dns-89fcc4dcf-gml6g\" (UID: \"98c1faca-b20d-4243-a40b-da58d311ddf6\") " pod="openstack/dnsmasq-dns-89fcc4dcf-gml6g" Mar 08 00:52:19.903921 master-0 kubenswrapper[23041]: I0308 00:52:19.903886 23041 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98c1faca-b20d-4243-a40b-da58d311ddf6-dns-svc\") pod \"dnsmasq-dns-89fcc4dcf-gml6g\" (UID: \"98c1faca-b20d-4243-a40b-da58d311ddf6\") " pod="openstack/dnsmasq-dns-89fcc4dcf-gml6g" Mar 08 00:52:19.904018 master-0 kubenswrapper[23041]: I0308 00:52:19.903975 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98c1faca-b20d-4243-a40b-da58d311ddf6-dns-swift-storage-0\") pod \"dnsmasq-dns-89fcc4dcf-gml6g\" (UID: \"98c1faca-b20d-4243-a40b-da58d311ddf6\") " pod="openstack/dnsmasq-dns-89fcc4dcf-gml6g" Mar 08 00:52:19.904090 master-0 kubenswrapper[23041]: I0308 00:52:19.904022 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98c1faca-b20d-4243-a40b-da58d311ddf6-config\") pod \"dnsmasq-dns-89fcc4dcf-gml6g\" (UID: \"98c1faca-b20d-4243-a40b-da58d311ddf6\") " pod="openstack/dnsmasq-dns-89fcc4dcf-gml6g" Mar 08 00:52:19.904169 master-0 kubenswrapper[23041]: I0308 00:52:19.904141 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98c1faca-b20d-4243-a40b-da58d311ddf6-ovsdbserver-nb\") pod \"dnsmasq-dns-89fcc4dcf-gml6g\" (UID: \"98c1faca-b20d-4243-a40b-da58d311ddf6\") " pod="openstack/dnsmasq-dns-89fcc4dcf-gml6g" Mar 08 00:52:19.904827 master-0 kubenswrapper[23041]: I0308 00:52:19.904748 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98c1faca-b20d-4243-a40b-da58d311ddf6-dns-svc\") pod \"dnsmasq-dns-89fcc4dcf-gml6g\" (UID: \"98c1faca-b20d-4243-a40b-da58d311ddf6\") " pod="openstack/dnsmasq-dns-89fcc4dcf-gml6g" Mar 08 00:52:19.906705 master-0 kubenswrapper[23041]: I0308 00:52:19.905079 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/98c1faca-b20d-4243-a40b-da58d311ddf6-ovsdbserver-nb\") pod \"dnsmasq-dns-89fcc4dcf-gml6g\" (UID: \"98c1faca-b20d-4243-a40b-da58d311ddf6\") " pod="openstack/dnsmasq-dns-89fcc4dcf-gml6g" Mar 08 00:52:19.906705 master-0 kubenswrapper[23041]: I0308 00:52:19.905079 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98c1faca-b20d-4243-a40b-da58d311ddf6-ovsdbserver-sb\") pod \"dnsmasq-dns-89fcc4dcf-gml6g\" (UID: \"98c1faca-b20d-4243-a40b-da58d311ddf6\") " pod="openstack/dnsmasq-dns-89fcc4dcf-gml6g" Mar 08 00:52:19.906705 master-0 kubenswrapper[23041]: I0308 00:52:19.905185 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98c1faca-b20d-4243-a40b-da58d311ddf6-config\") pod \"dnsmasq-dns-89fcc4dcf-gml6g\" (UID: \"98c1faca-b20d-4243-a40b-da58d311ddf6\") " pod="openstack/dnsmasq-dns-89fcc4dcf-gml6g" Mar 08 00:52:19.906705 master-0 kubenswrapper[23041]: I0308 00:52:19.905495 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98c1faca-b20d-4243-a40b-da58d311ddf6-dns-swift-storage-0\") pod \"dnsmasq-dns-89fcc4dcf-gml6g\" (UID: \"98c1faca-b20d-4243-a40b-da58d311ddf6\") " pod="openstack/dnsmasq-dns-89fcc4dcf-gml6g" Mar 08 00:52:19.923219 master-0 kubenswrapper[23041]: I0308 00:52:19.920939 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmgpt\" (UniqueName: \"kubernetes.io/projected/98c1faca-b20d-4243-a40b-da58d311ddf6-kube-api-access-rmgpt\") pod \"dnsmasq-dns-89fcc4dcf-gml6g\" (UID: \"98c1faca-b20d-4243-a40b-da58d311ddf6\") " pod="openstack/dnsmasq-dns-89fcc4dcf-gml6g" Mar 08 00:52:20.038640 master-0 kubenswrapper[23041]: I0308 00:52:20.038586 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89fcc4dcf-gml6g"
Mar 08 00:52:20.872844 master-0 kubenswrapper[23041]: I0308 00:52:20.872787 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89fcc4dcf-gml6g"]
Mar 08 00:52:20.936837 master-0 kubenswrapper[23041]: I0308 00:52:20.936734 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ab9f-account-create-update-wxwc6"
Mar 08 00:52:21.047829 master-0 kubenswrapper[23041]: I0308 00:52:21.047772 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f976074-a681-4867-96f3-089f7cfabd9e-operator-scripts\") pod \"1f976074-a681-4867-96f3-089f7cfabd9e\" (UID: \"1f976074-a681-4867-96f3-089f7cfabd9e\") "
Mar 08 00:52:21.048022 master-0 kubenswrapper[23041]: I0308 00:52:21.047861 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22t8f\" (UniqueName: \"kubernetes.io/projected/1f976074-a681-4867-96f3-089f7cfabd9e-kube-api-access-22t8f\") pod \"1f976074-a681-4867-96f3-089f7cfabd9e\" (UID: \"1f976074-a681-4867-96f3-089f7cfabd9e\") "
Mar 08 00:52:21.049448 master-0 kubenswrapper[23041]: I0308 00:52:21.049411 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f976074-a681-4867-96f3-089f7cfabd9e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1f976074-a681-4867-96f3-089f7cfabd9e" (UID: "1f976074-a681-4867-96f3-089f7cfabd9e"). InnerVolumeSpecName "operator-scripts".
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:52:21.052278 master-0 kubenswrapper[23041]: I0308 00:52:21.052244 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f976074-a681-4867-96f3-089f7cfabd9e-kube-api-access-22t8f" (OuterVolumeSpecName: "kube-api-access-22t8f") pod "1f976074-a681-4867-96f3-089f7cfabd9e" (UID: "1f976074-a681-4867-96f3-089f7cfabd9e"). InnerVolumeSpecName "kube-api-access-22t8f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:52:21.053610 master-0 kubenswrapper[23041]: I0308 00:52:21.053581 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-h7pn9"
Mar 08 00:52:21.062303 master-0 kubenswrapper[23041]: I0308 00:52:21.061522 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4vctw"
Mar 08 00:52:21.107552 master-0 kubenswrapper[23041]: I0308 00:52:21.107497 23041 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/cinder-f0a6-account-create-update-7g79m"
Mar 08 00:52:21.156838 master-0 kubenswrapper[23041]: I0308 00:52:21.156731 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lghm2\" (UniqueName: \"kubernetes.io/projected/ca7ecee5-d829-4f07-a4c3-ef6cf98b6519-kube-api-access-lghm2\") pod \"ca7ecee5-d829-4f07-a4c3-ef6cf98b6519\" (UID: \"ca7ecee5-d829-4f07-a4c3-ef6cf98b6519\") "
Mar 08 00:52:21.157094 master-0 kubenswrapper[23041]: I0308 00:52:21.156882 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6mb2\" (UniqueName: \"kubernetes.io/projected/49a404f8-c225-48fb-987f-e77945275680-kube-api-access-c6mb2\") pod \"49a404f8-c225-48fb-987f-e77945275680\" (UID: \"49a404f8-c225-48fb-987f-e77945275680\") "
Mar 08 00:52:21.157094 master-0 kubenswrapper[23041]: I0308 00:52:21.156958 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca7ecee5-d829-4f07-a4c3-ef6cf98b6519-operator-scripts\") pod \"ca7ecee5-d829-4f07-a4c3-ef6cf98b6519\" (UID: \"ca7ecee5-d829-4f07-a4c3-ef6cf98b6519\") "
Mar 08 00:52:21.157094 master-0 kubenswrapper[23041]: I0308 00:52:21.157009 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49a404f8-c225-48fb-987f-e77945275680-operator-scripts\") pod \"49a404f8-c225-48fb-987f-e77945275680\" (UID: \"49a404f8-c225-48fb-987f-e77945275680\") "
Mar 08 00:52:21.157536 master-0 kubenswrapper[23041]: I0308 00:52:21.157489 23041 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f976074-a681-4867-96f3-089f7cfabd9e-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:21.157536 master-0 kubenswrapper[23041]: I0308 00:52:21.157509 23041 reconciler_common.go:293] "Volume
detached for volume \"kube-api-access-22t8f\" (UniqueName: \"kubernetes.io/projected/1f976074-a681-4867-96f3-089f7cfabd9e-kube-api-access-22t8f\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:21.163040 master-0 kubenswrapper[23041]: I0308 00:52:21.162936 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca7ecee5-d829-4f07-a4c3-ef6cf98b6519-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ca7ecee5-d829-4f07-a4c3-ef6cf98b6519" (UID: "ca7ecee5-d829-4f07-a4c3-ef6cf98b6519"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:52:21.167455 master-0 kubenswrapper[23041]: I0308 00:52:21.167396 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49a404f8-c225-48fb-987f-e77945275680-kube-api-access-c6mb2" (OuterVolumeSpecName: "kube-api-access-c6mb2") pod "49a404f8-c225-48fb-987f-e77945275680" (UID: "49a404f8-c225-48fb-987f-e77945275680"). InnerVolumeSpecName "kube-api-access-c6mb2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:52:21.168876 master-0 kubenswrapper[23041]: I0308 00:52:21.168825 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca7ecee5-d829-4f07-a4c3-ef6cf98b6519-kube-api-access-lghm2" (OuterVolumeSpecName: "kube-api-access-lghm2") pod "ca7ecee5-d829-4f07-a4c3-ef6cf98b6519" (UID: "ca7ecee5-d829-4f07-a4c3-ef6cf98b6519"). InnerVolumeSpecName "kube-api-access-lghm2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:52:21.184036 master-0 kubenswrapper[23041]: I0308 00:52:21.183919 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49a404f8-c225-48fb-987f-e77945275680-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "49a404f8-c225-48fb-987f-e77945275680" (UID: "49a404f8-c225-48fb-987f-e77945275680").
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:52:21.213507 master-0 kubenswrapper[23041]: I0308 00:52:21.213138 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f0a6-account-create-update-7g79m" event={"ID":"cd49f4f1-8a3f-4c73-a2b1-65f552d39926","Type":"ContainerDied","Data":"781a2075b044676462c7c72e3e078e2a482435d350c421c8f40e07619fee8641"}
Mar 08 00:52:21.213507 master-0 kubenswrapper[23041]: I0308 00:52:21.213249 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="781a2075b044676462c7c72e3e078e2a482435d350c421c8f40e07619fee8641"
Mar 08 00:52:21.213507 master-0 kubenswrapper[23041]: I0308 00:52:21.213313 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f0a6-account-create-update-7g79m"
Mar 08 00:52:21.216692 master-0 kubenswrapper[23041]: I0308 00:52:21.216651 23041 generic.go:334] "Generic (PLEG): container finished" podID="98c1faca-b20d-4243-a40b-da58d311ddf6" containerID="89e3e6fe927b906cff829baa1c4f2c30e2d885a4008e2e5418bb395e0efb2131" exitCode=0
Mar 08 00:52:21.216749 master-0 kubenswrapper[23041]: I0308 00:52:21.216719 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89fcc4dcf-gml6g" event={"ID":"98c1faca-b20d-4243-a40b-da58d311ddf6","Type":"ContainerDied","Data":"89e3e6fe927b906cff829baa1c4f2c30e2d885a4008e2e5418bb395e0efb2131"}
Mar 08 00:52:21.216749 master-0 kubenswrapper[23041]: I0308 00:52:21.216746 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89fcc4dcf-gml6g" event={"ID":"98c1faca-b20d-4243-a40b-da58d311ddf6","Type":"ContainerStarted","Data":"031278e5ce1e1295c7dcbbdb4b3968d81cf0909d7a54bbdebee8de642d5d0308"}
Mar 08 00:52:21.225690 master-0 kubenswrapper[23041]: I0308 00:52:21.225650 23041 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/neutron-ab9f-account-create-update-wxwc6"
Mar 08 00:52:21.226444 master-0 kubenswrapper[23041]: I0308 00:52:21.226399 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ab9f-account-create-update-wxwc6" event={"ID":"1f976074-a681-4867-96f3-089f7cfabd9e","Type":"ContainerDied","Data":"170cb51a874550ae4777930e9ebd70d98f7c9b71fce2668c9df3196d8df4581d"}
Mar 08 00:52:21.226502 master-0 kubenswrapper[23041]: I0308 00:52:21.226476 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="170cb51a874550ae4777930e9ebd70d98f7c9b71fce2668c9df3196d8df4581d"
Mar 08 00:52:21.237322 master-0 kubenswrapper[23041]: I0308 00:52:21.228056 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-h7pn9"
Mar 08 00:52:21.237322 master-0 kubenswrapper[23041]: I0308 00:52:21.228156 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-h7pn9" event={"ID":"49a404f8-c225-48fb-987f-e77945275680","Type":"ContainerDied","Data":"31420e70f5b5a973375a9e8f9c5de6f3caff606f49abaad878c30cfeffbf5586"}
Mar 08 00:52:21.237322 master-0 kubenswrapper[23041]: I0308 00:52:21.228176 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31420e70f5b5a973375a9e8f9c5de6f3caff606f49abaad878c30cfeffbf5586"
Mar 08 00:52:21.237322 master-0 kubenswrapper[23041]: I0308 00:52:21.231437 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-dd6667767-7bv69" podUID="0ba67633-35bb-450a-aeaa-9fb88d428e39" containerName="dnsmasq-dns" containerID="cri-o://e72880eb4cf53789436d58d38335f2e7c6f1d407449ea14ddf8200eccf581a9d" gracePeriod=10
Mar 08 00:52:21.237322 master-0 kubenswrapper[23041]: I0308 00:52:21.231887 23041 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/neutron-db-create-4vctw"
Mar 08 00:52:21.255906 master-0 kubenswrapper[23041]: I0308 00:52:21.245789 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4vctw" event={"ID":"ca7ecee5-d829-4f07-a4c3-ef6cf98b6519","Type":"ContainerDied","Data":"5f9a992baa5cec957fb8b41cce859f96165f6b124c9e7866f365c569422af90b"}
Mar 08 00:52:21.255906 master-0 kubenswrapper[23041]: I0308 00:52:21.245880 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f9a992baa5cec957fb8b41cce859f96165f6b124c9e7866f365c569422af90b"
Mar 08 00:52:21.264840 master-0 kubenswrapper[23041]: I0308 00:52:21.263065 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmzpt\" (UniqueName: \"kubernetes.io/projected/cd49f4f1-8a3f-4c73-a2b1-65f552d39926-kube-api-access-rmzpt\") pod \"cd49f4f1-8a3f-4c73-a2b1-65f552d39926\" (UID: \"cd49f4f1-8a3f-4c73-a2b1-65f552d39926\") "
Mar 08 00:52:21.264840 master-0 kubenswrapper[23041]: I0308 00:52:21.263395 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd49f4f1-8a3f-4c73-a2b1-65f552d39926-operator-scripts\") pod \"cd49f4f1-8a3f-4c73-a2b1-65f552d39926\" (UID: \"cd49f4f1-8a3f-4c73-a2b1-65f552d39926\") "
Mar 08 00:52:21.264840 master-0 kubenswrapper[23041]: I0308 00:52:21.264769 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lghm2\" (UniqueName: \"kubernetes.io/projected/ca7ecee5-d829-4f07-a4c3-ef6cf98b6519-kube-api-access-lghm2\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:21.264840 master-0 kubenswrapper[23041]: I0308 00:52:21.264794 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6mb2\" (UniqueName: \"kubernetes.io/projected/49a404f8-c225-48fb-987f-e77945275680-kube-api-access-c6mb2\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:21.264840
master-0 kubenswrapper[23041]: I0308 00:52:21.264809 23041 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca7ecee5-d829-4f07-a4c3-ef6cf98b6519-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:21.264840 master-0 kubenswrapper[23041]: I0308 00:52:21.264827 23041 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49a404f8-c225-48fb-987f-e77945275680-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:21.265701 master-0 kubenswrapper[23041]: I0308 00:52:21.265320 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd49f4f1-8a3f-4c73-a2b1-65f552d39926-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cd49f4f1-8a3f-4c73-a2b1-65f552d39926" (UID: "cd49f4f1-8a3f-4c73-a2b1-65f552d39926"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:52:21.286405 master-0 kubenswrapper[23041]: I0308 00:52:21.285997 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd49f4f1-8a3f-4c73-a2b1-65f552d39926-kube-api-access-rmzpt" (OuterVolumeSpecName: "kube-api-access-rmzpt") pod "cd49f4f1-8a3f-4c73-a2b1-65f552d39926" (UID: "cd49f4f1-8a3f-4c73-a2b1-65f552d39926"). InnerVolumeSpecName "kube-api-access-rmzpt".
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:52:21.368063 master-0 kubenswrapper[23041]: I0308 00:52:21.367562 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmzpt\" (UniqueName: \"kubernetes.io/projected/cd49f4f1-8a3f-4c73-a2b1-65f552d39926-kube-api-access-rmzpt\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:21.368063 master-0 kubenswrapper[23041]: I0308 00:52:21.367611 23041 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd49f4f1-8a3f-4c73-a2b1-65f552d39926-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:21.778300 master-0 kubenswrapper[23041]: I0308 00:52:21.776089 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dd6667767-7bv69"
Mar 08 00:52:21.880221 master-0 kubenswrapper[23041]: I0308 00:52:21.880139 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ba67633-35bb-450a-aeaa-9fb88d428e39-ovsdbserver-sb\") pod \"0ba67633-35bb-450a-aeaa-9fb88d428e39\" (UID: \"0ba67633-35bb-450a-aeaa-9fb88d428e39\") "
Mar 08 00:52:21.880783 master-0 kubenswrapper[23041]: I0308 00:52:21.880504 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jjjv\" (UniqueName: \"kubernetes.io/projected/0ba67633-35bb-450a-aeaa-9fb88d428e39-kube-api-access-5jjjv\") pod \"0ba67633-35bb-450a-aeaa-9fb88d428e39\" (UID: \"0ba67633-35bb-450a-aeaa-9fb88d428e39\") "
Mar 08 00:52:21.880783 master-0 kubenswrapper[23041]: I0308 00:52:21.880591 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ba67633-35bb-450a-aeaa-9fb88d428e39-dns-svc\") pod \"0ba67633-35bb-450a-aeaa-9fb88d428e39\" (UID: \"0ba67633-35bb-450a-aeaa-9fb88d428e39\") "
Mar 08 00:52:21.881543 master-0
kubenswrapper[23041]: I0308 00:52:21.881509 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ba67633-35bb-450a-aeaa-9fb88d428e39-ovsdbserver-nb\") pod \"0ba67633-35bb-450a-aeaa-9fb88d428e39\" (UID: \"0ba67633-35bb-450a-aeaa-9fb88d428e39\") "
Mar 08 00:52:21.882273 master-0 kubenswrapper[23041]: I0308 00:52:21.882251 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0ba67633-35bb-450a-aeaa-9fb88d428e39-dns-swift-storage-0\") pod \"0ba67633-35bb-450a-aeaa-9fb88d428e39\" (UID: \"0ba67633-35bb-450a-aeaa-9fb88d428e39\") "
Mar 08 00:52:21.882358 master-0 kubenswrapper[23041]: I0308 00:52:21.882328 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ba67633-35bb-450a-aeaa-9fb88d428e39-config\") pod \"0ba67633-35bb-450a-aeaa-9fb88d428e39\" (UID: \"0ba67633-35bb-450a-aeaa-9fb88d428e39\") "
Mar 08 00:52:21.884753 master-0 kubenswrapper[23041]: I0308 00:52:21.884709 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ba67633-35bb-450a-aeaa-9fb88d428e39-kube-api-access-5jjjv" (OuterVolumeSpecName: "kube-api-access-5jjjv") pod "0ba67633-35bb-450a-aeaa-9fb88d428e39" (UID: "0ba67633-35bb-450a-aeaa-9fb88d428e39"). InnerVolumeSpecName "kube-api-access-5jjjv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:52:21.936262 master-0 kubenswrapper[23041]: I0308 00:52:21.935345 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ba67633-35bb-450a-aeaa-9fb88d428e39-config" (OuterVolumeSpecName: "config") pod "0ba67633-35bb-450a-aeaa-9fb88d428e39" (UID: "0ba67633-35bb-450a-aeaa-9fb88d428e39"). InnerVolumeSpecName "config".
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:52:21.936262 master-0 kubenswrapper[23041]: I0308 00:52:21.935538 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ba67633-35bb-450a-aeaa-9fb88d428e39-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0ba67633-35bb-450a-aeaa-9fb88d428e39" (UID: "0ba67633-35bb-450a-aeaa-9fb88d428e39"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:52:21.941401 master-0 kubenswrapper[23041]: I0308 00:52:21.941332 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ba67633-35bb-450a-aeaa-9fb88d428e39-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0ba67633-35bb-450a-aeaa-9fb88d428e39" (UID: "0ba67633-35bb-450a-aeaa-9fb88d428e39"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:52:21.974473 master-0 kubenswrapper[23041]: I0308 00:52:21.974414 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ba67633-35bb-450a-aeaa-9fb88d428e39-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0ba67633-35bb-450a-aeaa-9fb88d428e39" (UID: "0ba67633-35bb-450a-aeaa-9fb88d428e39"). InnerVolumeSpecName "ovsdbserver-nb".
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:52:21.988447 master-0 kubenswrapper[23041]: I0308 00:52:21.985426 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jjjv\" (UniqueName: \"kubernetes.io/projected/0ba67633-35bb-450a-aeaa-9fb88d428e39-kube-api-access-5jjjv\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:21.988447 master-0 kubenswrapper[23041]: I0308 00:52:21.985474 23041 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ba67633-35bb-450a-aeaa-9fb88d428e39-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:21.988447 master-0 kubenswrapper[23041]: I0308 00:52:21.985484 23041 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ba67633-35bb-450a-aeaa-9fb88d428e39-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:21.988447 master-0 kubenswrapper[23041]: I0308 00:52:21.985493 23041 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0ba67633-35bb-450a-aeaa-9fb88d428e39-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:21.988447 master-0 kubenswrapper[23041]: I0308 00:52:21.985502 23041 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ba67633-35bb-450a-aeaa-9fb88d428e39-config\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:22.013085 master-0 kubenswrapper[23041]: I0308 00:52:22.013017 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ba67633-35bb-450a-aeaa-9fb88d428e39-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0ba67633-35bb-450a-aeaa-9fb88d428e39" (UID: "0ba67633-35bb-450a-aeaa-9fb88d428e39"). InnerVolumeSpecName "ovsdbserver-sb".
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:52:22.092702 master-0 kubenswrapper[23041]: I0308 00:52:22.092584 23041 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ba67633-35bb-450a-aeaa-9fb88d428e39-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:22.246855 master-0 kubenswrapper[23041]: I0308 00:52:22.246806 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89fcc4dcf-gml6g" event={"ID":"98c1faca-b20d-4243-a40b-da58d311ddf6","Type":"ContainerStarted","Data":"725ec18a38b4b14b393ffca0993fd4a57983922c2c8c38951656c5f15db6cf89"}
Mar 08 00:52:22.247130 master-0 kubenswrapper[23041]: I0308 00:52:22.247115 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89fcc4dcf-gml6g"
Mar 08 00:52:22.252177 master-0 kubenswrapper[23041]: I0308 00:52:22.252123 23041 generic.go:334] "Generic (PLEG): container finished" podID="0ba67633-35bb-450a-aeaa-9fb88d428e39" containerID="e72880eb4cf53789436d58d38335f2e7c6f1d407449ea14ddf8200eccf581a9d" exitCode=0
Mar 08 00:52:22.252177 master-0 kubenswrapper[23041]: I0308 00:52:22.252173 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dd6667767-7bv69" event={"ID":"0ba67633-35bb-450a-aeaa-9fb88d428e39","Type":"ContainerDied","Data":"e72880eb4cf53789436d58d38335f2e7c6f1d407449ea14ddf8200eccf581a9d"}
Mar 08 00:52:22.252567 master-0 kubenswrapper[23041]: I0308 00:52:22.252217 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dd6667767-7bv69" event={"ID":"0ba67633-35bb-450a-aeaa-9fb88d428e39","Type":"ContainerDied","Data":"9c1da92e742101a9ee8f7cbf67325455678f35af1f0f289d913f0055a86a803a"}
Mar 08 00:52:22.252567 master-0 kubenswrapper[23041]: I0308 00:52:22.252240 23041 scope.go:117] "RemoveContainer" containerID="e72880eb4cf53789436d58d38335f2e7c6f1d407449ea14ddf8200eccf581a9d"
Mar 08 00:52:22.252567
master-0 kubenswrapper[23041]: I0308 00:52:22.252396 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dd6667767-7bv69"
Mar 08 00:52:22.311111 master-0 kubenswrapper[23041]: I0308 00:52:22.310772 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89fcc4dcf-gml6g" podStartSLOduration=3.310752186 podStartE2EDuration="3.310752186s" podCreationTimestamp="2026-03-08 00:52:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:52:22.270928374 +0000 UTC m=+1247.743764938" watchObservedRunningTime="2026-03-08 00:52:22.310752186 +0000 UTC m=+1247.783588750"
Mar 08 00:52:22.336458 master-0 kubenswrapper[23041]: I0308 00:52:22.336416 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dd6667767-7bv69"]
Mar 08 00:52:22.352618 master-0 kubenswrapper[23041]: I0308 00:52:22.352468 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-dd6667767-7bv69"]
Mar 08 00:52:22.819813 master-0 kubenswrapper[23041]: I0308 00:52:22.819740 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ba67633-35bb-450a-aeaa-9fb88d428e39" path="/var/lib/kubelet/pods/0ba67633-35bb-450a-aeaa-9fb88d428e39/volumes"
Mar 08 00:52:24.748445 master-0 kubenswrapper[23041]: I0308 00:52:24.747413 23041 scope.go:117] "RemoveContainer" containerID="bc179f54b6b4e1cc8e99d888bd98789601dc08c09fb1d0f4e08f2b46c045b3b0"
Mar 08 00:52:24.797943 master-0 kubenswrapper[23041]: I0308 00:52:24.797885 23041 scope.go:117] "RemoveContainer" containerID="e72880eb4cf53789436d58d38335f2e7c6f1d407449ea14ddf8200eccf581a9d"
Mar 08 00:52:24.798702 master-0 kubenswrapper[23041]: E0308 00:52:24.798641 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container
\"e72880eb4cf53789436d58d38335f2e7c6f1d407449ea14ddf8200eccf581a9d\": container with ID starting with e72880eb4cf53789436d58d38335f2e7c6f1d407449ea14ddf8200eccf581a9d not found: ID does not exist" containerID="e72880eb4cf53789436d58d38335f2e7c6f1d407449ea14ddf8200eccf581a9d"
Mar 08 00:52:24.798780 master-0 kubenswrapper[23041]: I0308 00:52:24.798693 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e72880eb4cf53789436d58d38335f2e7c6f1d407449ea14ddf8200eccf581a9d"} err="failed to get container status \"e72880eb4cf53789436d58d38335f2e7c6f1d407449ea14ddf8200eccf581a9d\": rpc error: code = NotFound desc = could not find container \"e72880eb4cf53789436d58d38335f2e7c6f1d407449ea14ddf8200eccf581a9d\": container with ID starting with e72880eb4cf53789436d58d38335f2e7c6f1d407449ea14ddf8200eccf581a9d not found: ID does not exist"
Mar 08 00:52:24.798780 master-0 kubenswrapper[23041]: I0308 00:52:24.798726 23041 scope.go:117] "RemoveContainer" containerID="bc179f54b6b4e1cc8e99d888bd98789601dc08c09fb1d0f4e08f2b46c045b3b0"
Mar 08 00:52:24.799407 master-0 kubenswrapper[23041]: E0308 00:52:24.799360 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc179f54b6b4e1cc8e99d888bd98789601dc08c09fb1d0f4e08f2b46c045b3b0\": container with ID starting with bc179f54b6b4e1cc8e99d888bd98789601dc08c09fb1d0f4e08f2b46c045b3b0 not found: ID does not exist" containerID="bc179f54b6b4e1cc8e99d888bd98789601dc08c09fb1d0f4e08f2b46c045b3b0"
Mar 08 00:52:24.799463 master-0 kubenswrapper[23041]: I0308 00:52:24.799413 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc179f54b6b4e1cc8e99d888bd98789601dc08c09fb1d0f4e08f2b46c045b3b0"} err="failed to get container status \"bc179f54b6b4e1cc8e99d888bd98789601dc08c09fb1d0f4e08f2b46c045b3b0\": rpc error: code = NotFound desc = could not find container
\"bc179f54b6b4e1cc8e99d888bd98789601dc08c09fb1d0f4e08f2b46c045b3b0\": container with ID starting with bc179f54b6b4e1cc8e99d888bd98789601dc08c09fb1d0f4e08f2b46c045b3b0 not found: ID does not exist"
Mar 08 00:52:25.296789 master-0 kubenswrapper[23041]: I0308 00:52:25.296637 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-mz8x8" event={"ID":"274fc42b-c842-4a84-b407-2e9bd971a75b","Type":"ContainerStarted","Data":"db025edfa385d00e2189d07cb5e3a2e77391c81156a7bc8d0d2dd08409dbca73"}
Mar 08 00:52:25.345922 master-0 kubenswrapper[23041]: I0308 00:52:25.345779 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-mz8x8" podStartSLOduration=1.940247881 podStartE2EDuration="8.34575315s" podCreationTimestamp="2026-03-08 00:52:17 +0000 UTC" firstStartedPulling="2026-03-08 00:52:18.412794723 +0000 UTC m=+1243.885631277" lastFinishedPulling="2026-03-08 00:52:24.818299992 +0000 UTC m=+1250.291136546" observedRunningTime="2026-03-08 00:52:25.324343607 +0000 UTC m=+1250.797180181" watchObservedRunningTime="2026-03-08 00:52:25.34575315 +0000 UTC m=+1250.818589704"
Mar 08 00:52:30.041157 master-0 kubenswrapper[23041]: I0308 00:52:30.041100 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89fcc4dcf-gml6g"
Mar 08 00:52:30.431274 master-0 kubenswrapper[23041]: I0308 00:52:30.431118 23041 generic.go:334] "Generic (PLEG): container finished" podID="274fc42b-c842-4a84-b407-2e9bd971a75b" containerID="db025edfa385d00e2189d07cb5e3a2e77391c81156a7bc8d0d2dd08409dbca73" exitCode=0
Mar 08 00:52:30.431274 master-0 kubenswrapper[23041]: I0308 00:52:30.431169 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-mz8x8" event={"ID":"274fc42b-c842-4a84-b407-2e9bd971a75b","Type":"ContainerDied","Data":"db025edfa385d00e2189d07cb5e3a2e77391c81156a7bc8d0d2dd08409dbca73"}
Mar 08 00:52:31.834903 master-0 kubenswrapper[23041]: I0308
00:52:31.834827 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-mz8x8"
Mar 08 00:52:31.975808 master-0 kubenswrapper[23041]: I0308 00:52:31.975721 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/274fc42b-c842-4a84-b407-2e9bd971a75b-combined-ca-bundle\") pod \"274fc42b-c842-4a84-b407-2e9bd971a75b\" (UID: \"274fc42b-c842-4a84-b407-2e9bd971a75b\") "
Mar 08 00:52:31.976058 master-0 kubenswrapper[23041]: I0308 00:52:31.975839 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/274fc42b-c842-4a84-b407-2e9bd971a75b-config-data\") pod \"274fc42b-c842-4a84-b407-2e9bd971a75b\" (UID: \"274fc42b-c842-4a84-b407-2e9bd971a75b\") "
Mar 08 00:52:31.976163 master-0 kubenswrapper[23041]: I0308 00:52:31.976113 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2w7k\" (UniqueName: \"kubernetes.io/projected/274fc42b-c842-4a84-b407-2e9bd971a75b-kube-api-access-v2w7k\") pod \"274fc42b-c842-4a84-b407-2e9bd971a75b\" (UID: \"274fc42b-c842-4a84-b407-2e9bd971a75b\") "
Mar 08 00:52:31.979353 master-0 kubenswrapper[23041]: I0308 00:52:31.979174 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/274fc42b-c842-4a84-b407-2e9bd971a75b-kube-api-access-v2w7k" (OuterVolumeSpecName: "kube-api-access-v2w7k") pod "274fc42b-c842-4a84-b407-2e9bd971a75b" (UID: "274fc42b-c842-4a84-b407-2e9bd971a75b"). InnerVolumeSpecName "kube-api-access-v2w7k".
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:52:31.999998 master-0 kubenswrapper[23041]: I0308 00:52:31.999895 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/274fc42b-c842-4a84-b407-2e9bd971a75b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "274fc42b-c842-4a84-b407-2e9bd971a75b" (UID: "274fc42b-c842-4a84-b407-2e9bd971a75b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:52:32.021819 master-0 kubenswrapper[23041]: I0308 00:52:32.021740 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/274fc42b-c842-4a84-b407-2e9bd971a75b-config-data" (OuterVolumeSpecName: "config-data") pod "274fc42b-c842-4a84-b407-2e9bd971a75b" (UID: "274fc42b-c842-4a84-b407-2e9bd971a75b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:52:32.079758 master-0 kubenswrapper[23041]: I0308 00:52:32.079689 23041 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/274fc42b-c842-4a84-b407-2e9bd971a75b-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:32.079758 master-0 kubenswrapper[23041]: I0308 00:52:32.079736 23041 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/274fc42b-c842-4a84-b407-2e9bd971a75b-config-data\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:32.079758 master-0 kubenswrapper[23041]: I0308 00:52:32.079747 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2w7k\" (UniqueName: \"kubernetes.io/projected/274fc42b-c842-4a84-b407-2e9bd971a75b-kube-api-access-v2w7k\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:32.456064 master-0 kubenswrapper[23041]: I0308 00:52:32.455932 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openstack/keystone-db-sync-mz8x8" event={"ID":"274fc42b-c842-4a84-b407-2e9bd971a75b","Type":"ContainerDied","Data":"139a226fbdf8955a88902f8ca6557d5e64746c06153456e6ffd11919143013e6"} Mar 08 00:52:32.456064 master-0 kubenswrapper[23041]: I0308 00:52:32.455987 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="139a226fbdf8955a88902f8ca6557d5e64746c06153456e6ffd11919143013e6" Mar 08 00:52:32.456064 master-0 kubenswrapper[23041]: I0308 00:52:32.456050 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-mz8x8" Mar 08 00:52:36.046084 master-0 kubenswrapper[23041]: I0308 00:52:36.039647 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b9cd4dcf7-dmhrm"] Mar 08 00:52:36.046084 master-0 kubenswrapper[23041]: I0308 00:52:36.039900 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b9cd4dcf7-dmhrm" podUID="0afbd7e3-8d43-4de7-8016-5183747a3db1" containerName="dnsmasq-dns" containerID="cri-o://3f5c4c3c294763ff6fa374ef92214194edd69d216dd9e1dc398c649e28f084be" gracePeriod=10 Mar 08 00:52:36.512898 master-0 kubenswrapper[23041]: I0308 00:52:36.512846 23041 generic.go:334] "Generic (PLEG): container finished" podID="0afbd7e3-8d43-4de7-8016-5183747a3db1" containerID="3f5c4c3c294763ff6fa374ef92214194edd69d216dd9e1dc398c649e28f084be" exitCode=0 Mar 08 00:52:36.512898 master-0 kubenswrapper[23041]: I0308 00:52:36.512897 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9cd4dcf7-dmhrm" event={"ID":"0afbd7e3-8d43-4de7-8016-5183747a3db1","Type":"ContainerDied","Data":"3f5c4c3c294763ff6fa374ef92214194edd69d216dd9e1dc398c649e28f084be"} Mar 08 00:52:36.608622 master-0 kubenswrapper[23041]: I0308 00:52:36.608561 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b9cd4dcf7-dmhrm" Mar 08 00:52:36.729185 master-0 kubenswrapper[23041]: I0308 00:52:36.729070 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0afbd7e3-8d43-4de7-8016-5183747a3db1-ovsdbserver-nb\") pod \"0afbd7e3-8d43-4de7-8016-5183747a3db1\" (UID: \"0afbd7e3-8d43-4de7-8016-5183747a3db1\") " Mar 08 00:52:36.729551 master-0 kubenswrapper[23041]: I0308 00:52:36.729466 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0afbd7e3-8d43-4de7-8016-5183747a3db1-ovsdbserver-sb\") pod \"0afbd7e3-8d43-4de7-8016-5183747a3db1\" (UID: \"0afbd7e3-8d43-4de7-8016-5183747a3db1\") " Mar 08 00:52:36.729775 master-0 kubenswrapper[23041]: I0308 00:52:36.729739 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0afbd7e3-8d43-4de7-8016-5183747a3db1-config\") pod \"0afbd7e3-8d43-4de7-8016-5183747a3db1\" (UID: \"0afbd7e3-8d43-4de7-8016-5183747a3db1\") " Mar 08 00:52:36.730021 master-0 kubenswrapper[23041]: I0308 00:52:36.729980 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tn2j\" (UniqueName: \"kubernetes.io/projected/0afbd7e3-8d43-4de7-8016-5183747a3db1-kube-api-access-4tn2j\") pod \"0afbd7e3-8d43-4de7-8016-5183747a3db1\" (UID: \"0afbd7e3-8d43-4de7-8016-5183747a3db1\") " Mar 08 00:52:36.730088 master-0 kubenswrapper[23041]: I0308 00:52:36.730032 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0afbd7e3-8d43-4de7-8016-5183747a3db1-dns-svc\") pod \"0afbd7e3-8d43-4de7-8016-5183747a3db1\" (UID: \"0afbd7e3-8d43-4de7-8016-5183747a3db1\") " Mar 08 00:52:36.732871 master-0 kubenswrapper[23041]: I0308 00:52:36.732826 23041 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0afbd7e3-8d43-4de7-8016-5183747a3db1-kube-api-access-4tn2j" (OuterVolumeSpecName: "kube-api-access-4tn2j") pod "0afbd7e3-8d43-4de7-8016-5183747a3db1" (UID: "0afbd7e3-8d43-4de7-8016-5183747a3db1"). InnerVolumeSpecName "kube-api-access-4tn2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:52:36.782081 master-0 kubenswrapper[23041]: I0308 00:52:36.782025 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0afbd7e3-8d43-4de7-8016-5183747a3db1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0afbd7e3-8d43-4de7-8016-5183747a3db1" (UID: "0afbd7e3-8d43-4de7-8016-5183747a3db1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:52:36.802224 master-0 kubenswrapper[23041]: I0308 00:52:36.792386 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0afbd7e3-8d43-4de7-8016-5183747a3db1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0afbd7e3-8d43-4de7-8016-5183747a3db1" (UID: "0afbd7e3-8d43-4de7-8016-5183747a3db1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:52:36.824349 master-0 kubenswrapper[23041]: I0308 00:52:36.822977 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0afbd7e3-8d43-4de7-8016-5183747a3db1-config" (OuterVolumeSpecName: "config") pod "0afbd7e3-8d43-4de7-8016-5183747a3db1" (UID: "0afbd7e3-8d43-4de7-8016-5183747a3db1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:52:36.833139 master-0 kubenswrapper[23041]: I0308 00:52:36.833076 23041 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0afbd7e3-8d43-4de7-8016-5183747a3db1-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 08 00:52:36.833139 master-0 kubenswrapper[23041]: I0308 00:52:36.833110 23041 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0afbd7e3-8d43-4de7-8016-5183747a3db1-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 08 00:52:36.833139 master-0 kubenswrapper[23041]: I0308 00:52:36.833119 23041 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0afbd7e3-8d43-4de7-8016-5183747a3db1-config\") on node \"master-0\" DevicePath \"\"" Mar 08 00:52:36.833139 master-0 kubenswrapper[23041]: I0308 00:52:36.833128 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tn2j\" (UniqueName: \"kubernetes.io/projected/0afbd7e3-8d43-4de7-8016-5183747a3db1-kube-api-access-4tn2j\") on node \"master-0\" DevicePath \"\"" Mar 08 00:52:36.837768 master-0 kubenswrapper[23041]: I0308 00:52:36.837735 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0afbd7e3-8d43-4de7-8016-5183747a3db1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0afbd7e3-8d43-4de7-8016-5183747a3db1" (UID: "0afbd7e3-8d43-4de7-8016-5183747a3db1"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:52:36.935510 master-0 kubenswrapper[23041]: I0308 00:52:36.935430 23041 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0afbd7e3-8d43-4de7-8016-5183747a3db1-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 08 00:52:37.524211 master-0 kubenswrapper[23041]: I0308 00:52:37.524130 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9cd4dcf7-dmhrm" event={"ID":"0afbd7e3-8d43-4de7-8016-5183747a3db1","Type":"ContainerDied","Data":"be52c5deecf3088082976fa8099a7287ee459e33921a5df3b63a93d762851fa5"} Mar 08 00:52:37.524765 master-0 kubenswrapper[23041]: I0308 00:52:37.524231 23041 scope.go:117] "RemoveContainer" containerID="3f5c4c3c294763ff6fa374ef92214194edd69d216dd9e1dc398c649e28f084be" Mar 08 00:52:37.524765 master-0 kubenswrapper[23041]: I0308 00:52:37.524461 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b9cd4dcf7-dmhrm" Mar 08 00:52:37.551637 master-0 kubenswrapper[23041]: I0308 00:52:37.551589 23041 scope.go:117] "RemoveContainer" containerID="eb10640cf6e20d214aac1741c7a761383b3661d62b126d734de6010883ff4d31" Mar 08 00:52:37.991127 master-0 kubenswrapper[23041]: I0308 00:52:37.990983 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5584778f8f-xqg9r"] Mar 08 00:52:37.994301 master-0 kubenswrapper[23041]: E0308 00:52:37.991841 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="274fc42b-c842-4a84-b407-2e9bd971a75b" containerName="keystone-db-sync" Mar 08 00:52:37.994301 master-0 kubenswrapper[23041]: I0308 00:52:37.991869 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="274fc42b-c842-4a84-b407-2e9bd971a75b" containerName="keystone-db-sync" Mar 08 00:52:37.994301 master-0 kubenswrapper[23041]: E0308 00:52:37.991897 23041 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0afbd7e3-8d43-4de7-8016-5183747a3db1" containerName="dnsmasq-dns" Mar 08 00:52:37.994301 master-0 kubenswrapper[23041]: I0308 00:52:37.991905 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="0afbd7e3-8d43-4de7-8016-5183747a3db1" containerName="dnsmasq-dns" Mar 08 00:52:37.994301 master-0 kubenswrapper[23041]: E0308 00:52:37.991925 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ba67633-35bb-450a-aeaa-9fb88d428e39" containerName="dnsmasq-dns" Mar 08 00:52:37.994301 master-0 kubenswrapper[23041]: I0308 00:52:37.991933 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ba67633-35bb-450a-aeaa-9fb88d428e39" containerName="dnsmasq-dns" Mar 08 00:52:37.994301 master-0 kubenswrapper[23041]: E0308 00:52:37.991951 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49a404f8-c225-48fb-987f-e77945275680" containerName="mariadb-database-create" Mar 08 00:52:37.994301 master-0 kubenswrapper[23041]: I0308 00:52:37.991960 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="49a404f8-c225-48fb-987f-e77945275680" containerName="mariadb-database-create" Mar 08 00:52:37.994301 master-0 kubenswrapper[23041]: E0308 00:52:37.991976 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd49f4f1-8a3f-4c73-a2b1-65f552d39926" containerName="mariadb-account-create-update" Mar 08 00:52:37.994301 master-0 kubenswrapper[23041]: I0308 00:52:37.991984 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd49f4f1-8a3f-4c73-a2b1-65f552d39926" containerName="mariadb-account-create-update" Mar 08 00:52:37.994301 master-0 kubenswrapper[23041]: E0308 00:52:37.991995 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f976074-a681-4867-96f3-089f7cfabd9e" containerName="mariadb-account-create-update" Mar 08 00:52:37.994301 master-0 kubenswrapper[23041]: I0308 00:52:37.992004 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f976074-a681-4867-96f3-089f7cfabd9e" 
containerName="mariadb-account-create-update" Mar 08 00:52:37.994301 master-0 kubenswrapper[23041]: E0308 00:52:37.992025 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ba67633-35bb-450a-aeaa-9fb88d428e39" containerName="init" Mar 08 00:52:37.994301 master-0 kubenswrapper[23041]: I0308 00:52:37.992035 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ba67633-35bb-450a-aeaa-9fb88d428e39" containerName="init" Mar 08 00:52:37.994301 master-0 kubenswrapper[23041]: E0308 00:52:37.992055 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0afbd7e3-8d43-4de7-8016-5183747a3db1" containerName="init" Mar 08 00:52:37.994301 master-0 kubenswrapper[23041]: I0308 00:52:37.992062 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="0afbd7e3-8d43-4de7-8016-5183747a3db1" containerName="init" Mar 08 00:52:37.994301 master-0 kubenswrapper[23041]: E0308 00:52:37.992087 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca7ecee5-d829-4f07-a4c3-ef6cf98b6519" containerName="mariadb-database-create" Mar 08 00:52:37.994301 master-0 kubenswrapper[23041]: I0308 00:52:37.992095 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca7ecee5-d829-4f07-a4c3-ef6cf98b6519" containerName="mariadb-database-create" Mar 08 00:52:37.994301 master-0 kubenswrapper[23041]: I0308 00:52:37.992379 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="0afbd7e3-8d43-4de7-8016-5183747a3db1" containerName="dnsmasq-dns" Mar 08 00:52:37.994301 master-0 kubenswrapper[23041]: I0308 00:52:37.992406 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd49f4f1-8a3f-4c73-a2b1-65f552d39926" containerName="mariadb-account-create-update" Mar 08 00:52:37.994301 master-0 kubenswrapper[23041]: I0308 00:52:37.992422 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca7ecee5-d829-4f07-a4c3-ef6cf98b6519" containerName="mariadb-database-create" Mar 08 00:52:37.994301 master-0 
kubenswrapper[23041]: I0308 00:52:37.992435 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="49a404f8-c225-48fb-987f-e77945275680" containerName="mariadb-database-create" Mar 08 00:52:37.994301 master-0 kubenswrapper[23041]: I0308 00:52:37.992458 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f976074-a681-4867-96f3-089f7cfabd9e" containerName="mariadb-account-create-update" Mar 08 00:52:37.994301 master-0 kubenswrapper[23041]: I0308 00:52:37.992473 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ba67633-35bb-450a-aeaa-9fb88d428e39" containerName="dnsmasq-dns" Mar 08 00:52:37.994301 master-0 kubenswrapper[23041]: I0308 00:52:37.992487 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="274fc42b-c842-4a84-b407-2e9bd971a75b" containerName="keystone-db-sync" Mar 08 00:52:37.994301 master-0 kubenswrapper[23041]: I0308 00:52:37.993739 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5584778f8f-xqg9r" Mar 08 00:52:38.185516 master-0 kubenswrapper[23041]: I0308 00:52:38.173907 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60432080-f735-4274-970e-58d2fa71550f-dns-svc\") pod \"dnsmasq-dns-5584778f8f-xqg9r\" (UID: \"60432080-f735-4274-970e-58d2fa71550f\") " pod="openstack/dnsmasq-dns-5584778f8f-xqg9r" Mar 08 00:52:38.185516 master-0 kubenswrapper[23041]: I0308 00:52:38.174019 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/60432080-f735-4274-970e-58d2fa71550f-dns-swift-storage-0\") pod \"dnsmasq-dns-5584778f8f-xqg9r\" (UID: \"60432080-f735-4274-970e-58d2fa71550f\") " pod="openstack/dnsmasq-dns-5584778f8f-xqg9r" Mar 08 00:52:38.185516 master-0 kubenswrapper[23041]: I0308 00:52:38.174052 23041 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60432080-f735-4274-970e-58d2fa71550f-ovsdbserver-nb\") pod \"dnsmasq-dns-5584778f8f-xqg9r\" (UID: \"60432080-f735-4274-970e-58d2fa71550f\") " pod="openstack/dnsmasq-dns-5584778f8f-xqg9r" Mar 08 00:52:38.185516 master-0 kubenswrapper[23041]: I0308 00:52:38.174102 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60432080-f735-4274-970e-58d2fa71550f-ovsdbserver-sb\") pod \"dnsmasq-dns-5584778f8f-xqg9r\" (UID: \"60432080-f735-4274-970e-58d2fa71550f\") " pod="openstack/dnsmasq-dns-5584778f8f-xqg9r" Mar 08 00:52:38.185516 master-0 kubenswrapper[23041]: I0308 00:52:38.174137 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60432080-f735-4274-970e-58d2fa71550f-config\") pod \"dnsmasq-dns-5584778f8f-xqg9r\" (UID: \"60432080-f735-4274-970e-58d2fa71550f\") " pod="openstack/dnsmasq-dns-5584778f8f-xqg9r" Mar 08 00:52:38.185516 master-0 kubenswrapper[23041]: I0308 00:52:38.174160 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqqgm\" (UniqueName: \"kubernetes.io/projected/60432080-f735-4274-970e-58d2fa71550f-kube-api-access-tqqgm\") pod \"dnsmasq-dns-5584778f8f-xqg9r\" (UID: \"60432080-f735-4274-970e-58d2fa71550f\") " pod="openstack/dnsmasq-dns-5584778f8f-xqg9r" Mar 08 00:52:38.189680 master-0 kubenswrapper[23041]: I0308 00:52:38.189621 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-4czgd"] Mar 08 00:52:38.191400 master-0 kubenswrapper[23041]: I0308 00:52:38.191373 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-4czgd" Mar 08 00:52:38.194351 master-0 kubenswrapper[23041]: I0308 00:52:38.194068 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 08 00:52:38.194351 master-0 kubenswrapper[23041]: I0308 00:52:38.194286 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 08 00:52:38.194667 master-0 kubenswrapper[23041]: I0308 00:52:38.194427 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 08 00:52:38.194667 master-0 kubenswrapper[23041]: I0308 00:52:38.194559 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 08 00:52:38.199686 master-0 kubenswrapper[23041]: I0308 00:52:38.199627 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5584778f8f-xqg9r"] Mar 08 00:52:38.276438 master-0 kubenswrapper[23041]: I0308 00:52:38.276362 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60432080-f735-4274-970e-58d2fa71550f-ovsdbserver-sb\") pod \"dnsmasq-dns-5584778f8f-xqg9r\" (UID: \"60432080-f735-4274-970e-58d2fa71550f\") " pod="openstack/dnsmasq-dns-5584778f8f-xqg9r" Mar 08 00:52:38.276438 master-0 kubenswrapper[23041]: I0308 00:52:38.276448 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60432080-f735-4274-970e-58d2fa71550f-config\") pod \"dnsmasq-dns-5584778f8f-xqg9r\" (UID: \"60432080-f735-4274-970e-58d2fa71550f\") " pod="openstack/dnsmasq-dns-5584778f8f-xqg9r" Mar 08 00:52:38.276903 master-0 kubenswrapper[23041]: I0308 00:52:38.276876 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqqgm\" (UniqueName: 
\"kubernetes.io/projected/60432080-f735-4274-970e-58d2fa71550f-kube-api-access-tqqgm\") pod \"dnsmasq-dns-5584778f8f-xqg9r\" (UID: \"60432080-f735-4274-970e-58d2fa71550f\") " pod="openstack/dnsmasq-dns-5584778f8f-xqg9r" Mar 08 00:52:38.277144 master-0 kubenswrapper[23041]: I0308 00:52:38.277116 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60432080-f735-4274-970e-58d2fa71550f-dns-svc\") pod \"dnsmasq-dns-5584778f8f-xqg9r\" (UID: \"60432080-f735-4274-970e-58d2fa71550f\") " pod="openstack/dnsmasq-dns-5584778f8f-xqg9r" Mar 08 00:52:38.277431 master-0 kubenswrapper[23041]: I0308 00:52:38.277409 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/60432080-f735-4274-970e-58d2fa71550f-dns-swift-storage-0\") pod \"dnsmasq-dns-5584778f8f-xqg9r\" (UID: \"60432080-f735-4274-970e-58d2fa71550f\") " pod="openstack/dnsmasq-dns-5584778f8f-xqg9r" Mar 08 00:52:38.277575 master-0 kubenswrapper[23041]: I0308 00:52:38.277556 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60432080-f735-4274-970e-58d2fa71550f-ovsdbserver-nb\") pod \"dnsmasq-dns-5584778f8f-xqg9r\" (UID: \"60432080-f735-4274-970e-58d2fa71550f\") " pod="openstack/dnsmasq-dns-5584778f8f-xqg9r" Mar 08 00:52:38.277739 master-0 kubenswrapper[23041]: I0308 00:52:38.277568 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60432080-f735-4274-970e-58d2fa71550f-ovsdbserver-sb\") pod \"dnsmasq-dns-5584778f8f-xqg9r\" (UID: \"60432080-f735-4274-970e-58d2fa71550f\") " pod="openstack/dnsmasq-dns-5584778f8f-xqg9r" Mar 08 00:52:38.277996 master-0 kubenswrapper[23041]: I0308 00:52:38.277959 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/60432080-f735-4274-970e-58d2fa71550f-dns-svc\") pod \"dnsmasq-dns-5584778f8f-xqg9r\" (UID: \"60432080-f735-4274-970e-58d2fa71550f\") " pod="openstack/dnsmasq-dns-5584778f8f-xqg9r" Mar 08 00:52:38.278342 master-0 kubenswrapper[23041]: I0308 00:52:38.278308 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/60432080-f735-4274-970e-58d2fa71550f-dns-swift-storage-0\") pod \"dnsmasq-dns-5584778f8f-xqg9r\" (UID: \"60432080-f735-4274-970e-58d2fa71550f\") " pod="openstack/dnsmasq-dns-5584778f8f-xqg9r" Mar 08 00:52:38.278418 master-0 kubenswrapper[23041]: I0308 00:52:38.278332 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60432080-f735-4274-970e-58d2fa71550f-ovsdbserver-nb\") pod \"dnsmasq-dns-5584778f8f-xqg9r\" (UID: \"60432080-f735-4274-970e-58d2fa71550f\") " pod="openstack/dnsmasq-dns-5584778f8f-xqg9r" Mar 08 00:52:38.278418 master-0 kubenswrapper[23041]: I0308 00:52:38.278362 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60432080-f735-4274-970e-58d2fa71550f-config\") pod \"dnsmasq-dns-5584778f8f-xqg9r\" (UID: \"60432080-f735-4274-970e-58d2fa71550f\") " pod="openstack/dnsmasq-dns-5584778f8f-xqg9r" Mar 08 00:52:38.371231 master-0 kubenswrapper[23041]: I0308 00:52:38.367267 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b9cd4dcf7-dmhrm"] Mar 08 00:52:38.394231 master-0 kubenswrapper[23041]: I0308 00:52:38.383438 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d4a0f9d-e6ed-4676-8a30-350a4a6907ed-config-data\") pod \"keystone-bootstrap-4czgd\" (UID: \"8d4a0f9d-e6ed-4676-8a30-350a4a6907ed\") " pod="openstack/keystone-bootstrap-4czgd" Mar 08 00:52:38.394231 master-0 
kubenswrapper[23041]: I0308 00:52:38.383531 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8d4a0f9d-e6ed-4676-8a30-350a4a6907ed-fernet-keys\") pod \"keystone-bootstrap-4czgd\" (UID: \"8d4a0f9d-e6ed-4676-8a30-350a4a6907ed\") " pod="openstack/keystone-bootstrap-4czgd" Mar 08 00:52:38.394231 master-0 kubenswrapper[23041]: I0308 00:52:38.383567 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqtqx\" (UniqueName: \"kubernetes.io/projected/8d4a0f9d-e6ed-4676-8a30-350a4a6907ed-kube-api-access-dqtqx\") pod \"keystone-bootstrap-4czgd\" (UID: \"8d4a0f9d-e6ed-4676-8a30-350a4a6907ed\") " pod="openstack/keystone-bootstrap-4czgd" Mar 08 00:52:38.394231 master-0 kubenswrapper[23041]: I0308 00:52:38.383594 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d4a0f9d-e6ed-4676-8a30-350a4a6907ed-combined-ca-bundle\") pod \"keystone-bootstrap-4czgd\" (UID: \"8d4a0f9d-e6ed-4676-8a30-350a4a6907ed\") " pod="openstack/keystone-bootstrap-4czgd" Mar 08 00:52:38.394231 master-0 kubenswrapper[23041]: I0308 00:52:38.383670 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8d4a0f9d-e6ed-4676-8a30-350a4a6907ed-credential-keys\") pod \"keystone-bootstrap-4czgd\" (UID: \"8d4a0f9d-e6ed-4676-8a30-350a4a6907ed\") " pod="openstack/keystone-bootstrap-4czgd" Mar 08 00:52:38.394231 master-0 kubenswrapper[23041]: I0308 00:52:38.383715 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d4a0f9d-e6ed-4676-8a30-350a4a6907ed-scripts\") pod \"keystone-bootstrap-4czgd\" (UID: \"8d4a0f9d-e6ed-4676-8a30-350a4a6907ed\") " 
pod="openstack/keystone-bootstrap-4czgd" Mar 08 00:52:38.396943 master-0 kubenswrapper[23041]: I0308 00:52:38.395569 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqqgm\" (UniqueName: \"kubernetes.io/projected/60432080-f735-4274-970e-58d2fa71550f-kube-api-access-tqqgm\") pod \"dnsmasq-dns-5584778f8f-xqg9r\" (UID: \"60432080-f735-4274-970e-58d2fa71550f\") " pod="openstack/dnsmasq-dns-5584778f8f-xqg9r" Mar 08 00:52:38.396943 master-0 kubenswrapper[23041]: I0308 00:52:38.396895 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-4czgd"] Mar 08 00:52:38.405732 master-0 kubenswrapper[23041]: I0308 00:52:38.405676 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b9cd4dcf7-dmhrm"] Mar 08 00:52:38.487382 master-0 kubenswrapper[23041]: I0308 00:52:38.487331 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d4a0f9d-e6ed-4676-8a30-350a4a6907ed-config-data\") pod \"keystone-bootstrap-4czgd\" (UID: \"8d4a0f9d-e6ed-4676-8a30-350a4a6907ed\") " pod="openstack/keystone-bootstrap-4czgd" Mar 08 00:52:38.487382 master-0 kubenswrapper[23041]: I0308 00:52:38.487390 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8d4a0f9d-e6ed-4676-8a30-350a4a6907ed-fernet-keys\") pod \"keystone-bootstrap-4czgd\" (UID: \"8d4a0f9d-e6ed-4676-8a30-350a4a6907ed\") " pod="openstack/keystone-bootstrap-4czgd" Mar 08 00:52:38.487703 master-0 kubenswrapper[23041]: I0308 00:52:38.487431 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqtqx\" (UniqueName: \"kubernetes.io/projected/8d4a0f9d-e6ed-4676-8a30-350a4a6907ed-kube-api-access-dqtqx\") pod \"keystone-bootstrap-4czgd\" (UID: \"8d4a0f9d-e6ed-4676-8a30-350a4a6907ed\") " pod="openstack/keystone-bootstrap-4czgd" Mar 08 
00:52:38.487703 master-0 kubenswrapper[23041]: I0308 00:52:38.487460 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d4a0f9d-e6ed-4676-8a30-350a4a6907ed-combined-ca-bundle\") pod \"keystone-bootstrap-4czgd\" (UID: \"8d4a0f9d-e6ed-4676-8a30-350a4a6907ed\") " pod="openstack/keystone-bootstrap-4czgd" Mar 08 00:52:38.487703 master-0 kubenswrapper[23041]: I0308 00:52:38.487516 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8d4a0f9d-e6ed-4676-8a30-350a4a6907ed-credential-keys\") pod \"keystone-bootstrap-4czgd\" (UID: \"8d4a0f9d-e6ed-4676-8a30-350a4a6907ed\") " pod="openstack/keystone-bootstrap-4czgd" Mar 08 00:52:38.487703 master-0 kubenswrapper[23041]: I0308 00:52:38.487544 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d4a0f9d-e6ed-4676-8a30-350a4a6907ed-scripts\") pod \"keystone-bootstrap-4czgd\" (UID: \"8d4a0f9d-e6ed-4676-8a30-350a4a6907ed\") " pod="openstack/keystone-bootstrap-4czgd" Mar 08 00:52:38.491370 master-0 kubenswrapper[23041]: I0308 00:52:38.491284 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d4a0f9d-e6ed-4676-8a30-350a4a6907ed-combined-ca-bundle\") pod \"keystone-bootstrap-4czgd\" (UID: \"8d4a0f9d-e6ed-4676-8a30-350a4a6907ed\") " pod="openstack/keystone-bootstrap-4czgd" Mar 08 00:52:38.498110 master-0 kubenswrapper[23041]: I0308 00:52:38.497556 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d4a0f9d-e6ed-4676-8a30-350a4a6907ed-config-data\") pod \"keystone-bootstrap-4czgd\" (UID: \"8d4a0f9d-e6ed-4676-8a30-350a4a6907ed\") " pod="openstack/keystone-bootstrap-4czgd" Mar 08 00:52:38.498110 master-0 kubenswrapper[23041]: I0308 
00:52:38.497719 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d4a0f9d-e6ed-4676-8a30-350a4a6907ed-scripts\") pod \"keystone-bootstrap-4czgd\" (UID: \"8d4a0f9d-e6ed-4676-8a30-350a4a6907ed\") " pod="openstack/keystone-bootstrap-4czgd" Mar 08 00:52:38.498619 master-0 kubenswrapper[23041]: I0308 00:52:38.498171 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8d4a0f9d-e6ed-4676-8a30-350a4a6907ed-fernet-keys\") pod \"keystone-bootstrap-4czgd\" (UID: \"8d4a0f9d-e6ed-4676-8a30-350a4a6907ed\") " pod="openstack/keystone-bootstrap-4czgd" Mar 08 00:52:38.504087 master-0 kubenswrapper[23041]: I0308 00:52:38.504020 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8d4a0f9d-e6ed-4676-8a30-350a4a6907ed-credential-keys\") pod \"keystone-bootstrap-4czgd\" (UID: \"8d4a0f9d-e6ed-4676-8a30-350a4a6907ed\") " pod="openstack/keystone-bootstrap-4czgd" Mar 08 00:52:38.610623 master-0 kubenswrapper[23041]: I0308 00:52:38.609652 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5584778f8f-xqg9r" Mar 08 00:52:38.673074 master-0 kubenswrapper[23041]: I0308 00:52:38.672289 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqtqx\" (UniqueName: \"kubernetes.io/projected/8d4a0f9d-e6ed-4676-8a30-350a4a6907ed-kube-api-access-dqtqx\") pod \"keystone-bootstrap-4czgd\" (UID: \"8d4a0f9d-e6ed-4676-8a30-350a4a6907ed\") " pod="openstack/keystone-bootstrap-4czgd" Mar 08 00:52:38.813657 master-0 kubenswrapper[23041]: I0308 00:52:38.813479 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-4czgd" Mar 08 00:52:38.886395 master-0 kubenswrapper[23041]: I0308 00:52:38.886333 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0afbd7e3-8d43-4de7-8016-5183747a3db1" path="/var/lib/kubelet/pods/0afbd7e3-8d43-4de7-8016-5183747a3db1/volumes" Mar 08 00:52:39.037225 master-0 kubenswrapper[23041]: I0308 00:52:39.033661 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-db-create-9pxq8"] Mar 08 00:52:39.037225 master-0 kubenswrapper[23041]: I0308 00:52:39.035491 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-create-9pxq8" Mar 08 00:52:39.044223 master-0 kubenswrapper[23041]: I0308 00:52:39.043316 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-675ba-db-sync-8zxxl"] Mar 08 00:52:39.049264 master-0 kubenswrapper[23041]: I0308 00:52:39.044608 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-675ba-db-sync-8zxxl" Mar 08 00:52:39.049264 master-0 kubenswrapper[23041]: I0308 00:52:39.046302 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-675ba-config-data" Mar 08 00:52:39.049264 master-0 kubenswrapper[23041]: I0308 00:52:39.047234 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-675ba-scripts" Mar 08 00:52:39.198529 master-0 kubenswrapper[23041]: I0308 00:52:39.197822 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-create-9pxq8"] Mar 08 00:52:39.200334 master-0 kubenswrapper[23041]: I0308 00:52:39.200288 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw2j6\" (UniqueName: \"kubernetes.io/projected/309c80e9-6a3a-45cb-93c9-216d39c74f61-kube-api-access-vw2j6\") pod \"cinder-675ba-db-sync-8zxxl\" (UID: \"309c80e9-6a3a-45cb-93c9-216d39c74f61\") " 
pod="openstack/cinder-675ba-db-sync-8zxxl" Mar 08 00:52:39.200496 master-0 kubenswrapper[23041]: I0308 00:52:39.200353 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/309c80e9-6a3a-45cb-93c9-216d39c74f61-config-data\") pod \"cinder-675ba-db-sync-8zxxl\" (UID: \"309c80e9-6a3a-45cb-93c9-216d39c74f61\") " pod="openstack/cinder-675ba-db-sync-8zxxl" Mar 08 00:52:39.200496 master-0 kubenswrapper[23041]: I0308 00:52:39.200411 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/309c80e9-6a3a-45cb-93c9-216d39c74f61-scripts\") pod \"cinder-675ba-db-sync-8zxxl\" (UID: \"309c80e9-6a3a-45cb-93c9-216d39c74f61\") " pod="openstack/cinder-675ba-db-sync-8zxxl" Mar 08 00:52:39.200496 master-0 kubenswrapper[23041]: I0308 00:52:39.200469 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8f8q\" (UniqueName: \"kubernetes.io/projected/74432df1-6a53-4258-932b-6e6aa6c23448-kube-api-access-n8f8q\") pod \"ironic-db-create-9pxq8\" (UID: \"74432df1-6a53-4258-932b-6e6aa6c23448\") " pod="openstack/ironic-db-create-9pxq8" Mar 08 00:52:39.200699 master-0 kubenswrapper[23041]: I0308 00:52:39.200527 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74432df1-6a53-4258-932b-6e6aa6c23448-operator-scripts\") pod \"ironic-db-create-9pxq8\" (UID: \"74432df1-6a53-4258-932b-6e6aa6c23448\") " pod="openstack/ironic-db-create-9pxq8" Mar 08 00:52:39.200778 master-0 kubenswrapper[23041]: I0308 00:52:39.200712 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/309c80e9-6a3a-45cb-93c9-216d39c74f61-etc-machine-id\") pod 
\"cinder-675ba-db-sync-8zxxl\" (UID: \"309c80e9-6a3a-45cb-93c9-216d39c74f61\") " pod="openstack/cinder-675ba-db-sync-8zxxl" Mar 08 00:52:39.200867 master-0 kubenswrapper[23041]: I0308 00:52:39.200775 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/309c80e9-6a3a-45cb-93c9-216d39c74f61-combined-ca-bundle\") pod \"cinder-675ba-db-sync-8zxxl\" (UID: \"309c80e9-6a3a-45cb-93c9-216d39c74f61\") " pod="openstack/cinder-675ba-db-sync-8zxxl" Mar 08 00:52:39.200939 master-0 kubenswrapper[23041]: I0308 00:52:39.200884 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/309c80e9-6a3a-45cb-93c9-216d39c74f61-db-sync-config-data\") pod \"cinder-675ba-db-sync-8zxxl\" (UID: \"309c80e9-6a3a-45cb-93c9-216d39c74f61\") " pod="openstack/cinder-675ba-db-sync-8zxxl" Mar 08 00:52:39.214399 master-0 kubenswrapper[23041]: I0308 00:52:39.214327 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-675ba-db-sync-8zxxl"] Mar 08 00:52:39.217784 master-0 kubenswrapper[23041]: I0308 00:52:39.217723 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-bxnnn"] Mar 08 00:52:39.218997 master-0 kubenswrapper[23041]: I0308 00:52:39.218973 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-bxnnn" Mar 08 00:52:39.220683 master-0 kubenswrapper[23041]: I0308 00:52:39.220633 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 08 00:52:39.221268 master-0 kubenswrapper[23041]: I0308 00:52:39.221244 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 08 00:52:39.283908 master-0 kubenswrapper[23041]: I0308 00:52:39.283852 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5584778f8f-xqg9r"] Mar 08 00:52:39.304523 master-0 kubenswrapper[23041]: I0308 00:52:39.301897 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-bxnnn"] Mar 08 00:52:39.304523 master-0 kubenswrapper[23041]: I0308 00:52:39.303415 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/309c80e9-6a3a-45cb-93c9-216d39c74f61-db-sync-config-data\") pod \"cinder-675ba-db-sync-8zxxl\" (UID: \"309c80e9-6a3a-45cb-93c9-216d39c74f61\") " pod="openstack/cinder-675ba-db-sync-8zxxl" Mar 08 00:52:39.304523 master-0 kubenswrapper[23041]: I0308 00:52:39.303536 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ef69671-8e3b-456f-9764-212721fba8e0-combined-ca-bundle\") pod \"neutron-db-sync-bxnnn\" (UID: \"9ef69671-8e3b-456f-9764-212721fba8e0\") " pod="openstack/neutron-db-sync-bxnnn" Mar 08 00:52:39.304523 master-0 kubenswrapper[23041]: I0308 00:52:39.303573 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw2j6\" (UniqueName: \"kubernetes.io/projected/309c80e9-6a3a-45cb-93c9-216d39c74f61-kube-api-access-vw2j6\") pod \"cinder-675ba-db-sync-8zxxl\" (UID: \"309c80e9-6a3a-45cb-93c9-216d39c74f61\") " pod="openstack/cinder-675ba-db-sync-8zxxl" Mar 
08 00:52:39.304523 master-0 kubenswrapper[23041]: I0308 00:52:39.303600 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/309c80e9-6a3a-45cb-93c9-216d39c74f61-config-data\") pod \"cinder-675ba-db-sync-8zxxl\" (UID: \"309c80e9-6a3a-45cb-93c9-216d39c74f61\") " pod="openstack/cinder-675ba-db-sync-8zxxl" Mar 08 00:52:39.304523 master-0 kubenswrapper[23041]: I0308 00:52:39.303639 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/309c80e9-6a3a-45cb-93c9-216d39c74f61-scripts\") pod \"cinder-675ba-db-sync-8zxxl\" (UID: \"309c80e9-6a3a-45cb-93c9-216d39c74f61\") " pod="openstack/cinder-675ba-db-sync-8zxxl" Mar 08 00:52:39.304523 master-0 kubenswrapper[23041]: I0308 00:52:39.303706 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8f8q\" (UniqueName: \"kubernetes.io/projected/74432df1-6a53-4258-932b-6e6aa6c23448-kube-api-access-n8f8q\") pod \"ironic-db-create-9pxq8\" (UID: \"74432df1-6a53-4258-932b-6e6aa6c23448\") " pod="openstack/ironic-db-create-9pxq8" Mar 08 00:52:39.304523 master-0 kubenswrapper[23041]: I0308 00:52:39.303747 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74432df1-6a53-4258-932b-6e6aa6c23448-operator-scripts\") pod \"ironic-db-create-9pxq8\" (UID: \"74432df1-6a53-4258-932b-6e6aa6c23448\") " pod="openstack/ironic-db-create-9pxq8" Mar 08 00:52:39.304523 master-0 kubenswrapper[23041]: I0308 00:52:39.303787 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/309c80e9-6a3a-45cb-93c9-216d39c74f61-etc-machine-id\") pod \"cinder-675ba-db-sync-8zxxl\" (UID: \"309c80e9-6a3a-45cb-93c9-216d39c74f61\") " pod="openstack/cinder-675ba-db-sync-8zxxl" Mar 08 00:52:39.304523 master-0 
kubenswrapper[23041]: I0308 00:52:39.303820 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9ef69671-8e3b-456f-9764-212721fba8e0-config\") pod \"neutron-db-sync-bxnnn\" (UID: \"9ef69671-8e3b-456f-9764-212721fba8e0\") " pod="openstack/neutron-db-sync-bxnnn" Mar 08 00:52:39.304523 master-0 kubenswrapper[23041]: I0308 00:52:39.303840 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/309c80e9-6a3a-45cb-93c9-216d39c74f61-combined-ca-bundle\") pod \"cinder-675ba-db-sync-8zxxl\" (UID: \"309c80e9-6a3a-45cb-93c9-216d39c74f61\") " pod="openstack/cinder-675ba-db-sync-8zxxl" Mar 08 00:52:39.304523 master-0 kubenswrapper[23041]: I0308 00:52:39.303925 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgzxg\" (UniqueName: \"kubernetes.io/projected/9ef69671-8e3b-456f-9764-212721fba8e0-kube-api-access-dgzxg\") pod \"neutron-db-sync-bxnnn\" (UID: \"9ef69671-8e3b-456f-9764-212721fba8e0\") " pod="openstack/neutron-db-sync-bxnnn" Mar 08 00:52:39.309452 master-0 kubenswrapper[23041]: I0308 00:52:39.308019 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74432df1-6a53-4258-932b-6e6aa6c23448-operator-scripts\") pod \"ironic-db-create-9pxq8\" (UID: \"74432df1-6a53-4258-932b-6e6aa6c23448\") " pod="openstack/ironic-db-create-9pxq8" Mar 08 00:52:39.309452 master-0 kubenswrapper[23041]: I0308 00:52:39.308070 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/309c80e9-6a3a-45cb-93c9-216d39c74f61-etc-machine-id\") pod \"cinder-675ba-db-sync-8zxxl\" (UID: \"309c80e9-6a3a-45cb-93c9-216d39c74f61\") " pod="openstack/cinder-675ba-db-sync-8zxxl" Mar 08 00:52:39.309452 master-0 
kubenswrapper[23041]: I0308 00:52:39.309397 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/309c80e9-6a3a-45cb-93c9-216d39c74f61-db-sync-config-data\") pod \"cinder-675ba-db-sync-8zxxl\" (UID: \"309c80e9-6a3a-45cb-93c9-216d39c74f61\") " pod="openstack/cinder-675ba-db-sync-8zxxl" Mar 08 00:52:39.309851 master-0 kubenswrapper[23041]: I0308 00:52:39.309471 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/309c80e9-6a3a-45cb-93c9-216d39c74f61-scripts\") pod \"cinder-675ba-db-sync-8zxxl\" (UID: \"309c80e9-6a3a-45cb-93c9-216d39c74f61\") " pod="openstack/cinder-675ba-db-sync-8zxxl" Mar 08 00:52:39.311773 master-0 kubenswrapper[23041]: I0308 00:52:39.311734 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/309c80e9-6a3a-45cb-93c9-216d39c74f61-combined-ca-bundle\") pod \"cinder-675ba-db-sync-8zxxl\" (UID: \"309c80e9-6a3a-45cb-93c9-216d39c74f61\") " pod="openstack/cinder-675ba-db-sync-8zxxl" Mar 08 00:52:39.312230 master-0 kubenswrapper[23041]: I0308 00:52:39.312193 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/309c80e9-6a3a-45cb-93c9-216d39c74f61-config-data\") pod \"cinder-675ba-db-sync-8zxxl\" (UID: \"309c80e9-6a3a-45cb-93c9-216d39c74f61\") " pod="openstack/cinder-675ba-db-sync-8zxxl" Mar 08 00:52:39.328387 master-0 kubenswrapper[23041]: W0308 00:52:39.327636 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60432080_f735_4274_970e_58d2fa71550f.slice/crio-d5f7824d1e06062da1afd06aafdf7a60427bd966d828c7bbfd1c6b5abc1dc648 WatchSource:0}: Error finding container d5f7824d1e06062da1afd06aafdf7a60427bd966d828c7bbfd1c6b5abc1dc648: Status 404 returned error can't find the container 
with id d5f7824d1e06062da1afd06aafdf7a60427bd966d828c7bbfd1c6b5abc1dc648 Mar 08 00:52:39.394258 master-0 kubenswrapper[23041]: I0308 00:52:39.391967 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw2j6\" (UniqueName: \"kubernetes.io/projected/309c80e9-6a3a-45cb-93c9-216d39c74f61-kube-api-access-vw2j6\") pod \"cinder-675ba-db-sync-8zxxl\" (UID: \"309c80e9-6a3a-45cb-93c9-216d39c74f61\") " pod="openstack/cinder-675ba-db-sync-8zxxl" Mar 08 00:52:39.405718 master-0 kubenswrapper[23041]: I0308 00:52:39.405631 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ef69671-8e3b-456f-9764-212721fba8e0-combined-ca-bundle\") pod \"neutron-db-sync-bxnnn\" (UID: \"9ef69671-8e3b-456f-9764-212721fba8e0\") " pod="openstack/neutron-db-sync-bxnnn" Mar 08 00:52:39.405918 master-0 kubenswrapper[23041]: I0308 00:52:39.405843 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9ef69671-8e3b-456f-9764-212721fba8e0-config\") pod \"neutron-db-sync-bxnnn\" (UID: \"9ef69671-8e3b-456f-9764-212721fba8e0\") " pod="openstack/neutron-db-sync-bxnnn" Mar 08 00:52:39.405955 master-0 kubenswrapper[23041]: I0308 00:52:39.405922 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgzxg\" (UniqueName: \"kubernetes.io/projected/9ef69671-8e3b-456f-9764-212721fba8e0-kube-api-access-dgzxg\") pod \"neutron-db-sync-bxnnn\" (UID: \"9ef69671-8e3b-456f-9764-212721fba8e0\") " pod="openstack/neutron-db-sync-bxnnn" Mar 08 00:52:39.410952 master-0 kubenswrapper[23041]: I0308 00:52:39.410906 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9ef69671-8e3b-456f-9764-212721fba8e0-config\") pod \"neutron-db-sync-bxnnn\" (UID: \"9ef69671-8e3b-456f-9764-212721fba8e0\") " 
pod="openstack/neutron-db-sync-bxnnn" Mar 08 00:52:39.420857 master-0 kubenswrapper[23041]: I0308 00:52:39.420414 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ef69671-8e3b-456f-9764-212721fba8e0-combined-ca-bundle\") pod \"neutron-db-sync-bxnnn\" (UID: \"9ef69671-8e3b-456f-9764-212721fba8e0\") " pod="openstack/neutron-db-sync-bxnnn" Mar 08 00:52:39.420857 master-0 kubenswrapper[23041]: I0308 00:52:39.420707 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8f8q\" (UniqueName: \"kubernetes.io/projected/74432df1-6a53-4258-932b-6e6aa6c23448-kube-api-access-n8f8q\") pod \"ironic-db-create-9pxq8\" (UID: \"74432df1-6a53-4258-932b-6e6aa6c23448\") " pod="openstack/ironic-db-create-9pxq8" Mar 08 00:52:39.528228 master-0 kubenswrapper[23041]: I0308 00:52:39.527573 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-4czgd"] Mar 08 00:52:39.591131 master-0 kubenswrapper[23041]: I0308 00:52:39.591063 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5584778f8f-xqg9r" event={"ID":"60432080-f735-4274-970e-58d2fa71550f","Type":"ContainerStarted","Data":"d5f7824d1e06062da1afd06aafdf7a60427bd966d828c7bbfd1c6b5abc1dc648"} Mar 08 00:52:39.593027 master-0 kubenswrapper[23041]: I0308 00:52:39.592986 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4czgd" event={"ID":"8d4a0f9d-e6ed-4676-8a30-350a4a6907ed","Type":"ContainerStarted","Data":"fed90c97670ca2ad7411d626257c615e08871169fb088c56384207d15d40dd4a"} Mar 08 00:52:39.646247 master-0 kubenswrapper[23041]: I0308 00:52:39.644286 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-39c2-account-create-update-jbfp7"] Mar 08 00:52:39.646247 master-0 kubenswrapper[23041]: I0308 00:52:39.646043 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-39c2-account-create-update-jbfp7" Mar 08 00:52:39.647373 master-0 kubenswrapper[23041]: I0308 00:52:39.647302 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgzxg\" (UniqueName: \"kubernetes.io/projected/9ef69671-8e3b-456f-9764-212721fba8e0-kube-api-access-dgzxg\") pod \"neutron-db-sync-bxnnn\" (UID: \"9ef69671-8e3b-456f-9764-212721fba8e0\") " pod="openstack/neutron-db-sync-bxnnn" Mar 08 00:52:39.648446 master-0 kubenswrapper[23041]: I0308 00:52:39.648409 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-db-secret" Mar 08 00:52:39.661468 master-0 kubenswrapper[23041]: I0308 00:52:39.661411 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-create-9pxq8" Mar 08 00:52:39.671507 master-0 kubenswrapper[23041]: I0308 00:52:39.670962 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-675ba-db-sync-8zxxl" Mar 08 00:52:39.716242 master-0 kubenswrapper[23041]: I0308 00:52:39.713615 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfaf8fd1-f241-4f4b-ab62-4d04cef718dd-operator-scripts\") pod \"ironic-39c2-account-create-update-jbfp7\" (UID: \"cfaf8fd1-f241-4f4b-ab62-4d04cef718dd\") " pod="openstack/ironic-39c2-account-create-update-jbfp7" Mar 08 00:52:39.716242 master-0 kubenswrapper[23041]: I0308 00:52:39.713865 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m48kw\" (UniqueName: \"kubernetes.io/projected/cfaf8fd1-f241-4f4b-ab62-4d04cef718dd-kube-api-access-m48kw\") pod \"ironic-39c2-account-create-update-jbfp7\" (UID: \"cfaf8fd1-f241-4f4b-ab62-4d04cef718dd\") " pod="openstack/ironic-39c2-account-create-update-jbfp7" Mar 08 00:52:39.818986 master-0 kubenswrapper[23041]: 
I0308 00:52:39.817634 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m48kw\" (UniqueName: \"kubernetes.io/projected/cfaf8fd1-f241-4f4b-ab62-4d04cef718dd-kube-api-access-m48kw\") pod \"ironic-39c2-account-create-update-jbfp7\" (UID: \"cfaf8fd1-f241-4f4b-ab62-4d04cef718dd\") " pod="openstack/ironic-39c2-account-create-update-jbfp7" Mar 08 00:52:39.818986 master-0 kubenswrapper[23041]: I0308 00:52:39.817749 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfaf8fd1-f241-4f4b-ab62-4d04cef718dd-operator-scripts\") pod \"ironic-39c2-account-create-update-jbfp7\" (UID: \"cfaf8fd1-f241-4f4b-ab62-4d04cef718dd\") " pod="openstack/ironic-39c2-account-create-update-jbfp7" Mar 08 00:52:39.822953 master-0 kubenswrapper[23041]: I0308 00:52:39.822896 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfaf8fd1-f241-4f4b-ab62-4d04cef718dd-operator-scripts\") pod \"ironic-39c2-account-create-update-jbfp7\" (UID: \"cfaf8fd1-f241-4f4b-ab62-4d04cef718dd\") " pod="openstack/ironic-39c2-account-create-update-jbfp7" Mar 08 00:52:39.838608 master-0 kubenswrapper[23041]: I0308 00:52:39.838492 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-bxnnn" Mar 08 00:52:39.964356 master-0 kubenswrapper[23041]: I0308 00:52:39.964150 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-39c2-account-create-update-jbfp7"] Mar 08 00:52:39.964496 master-0 kubenswrapper[23041]: I0308 00:52:39.964374 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m48kw\" (UniqueName: \"kubernetes.io/projected/cfaf8fd1-f241-4f4b-ab62-4d04cef718dd-kube-api-access-m48kw\") pod \"ironic-39c2-account-create-update-jbfp7\" (UID: \"cfaf8fd1-f241-4f4b-ab62-4d04cef718dd\") " pod="openstack/ironic-39c2-account-create-update-jbfp7" Mar 08 00:52:40.016686 master-0 kubenswrapper[23041]: I0308 00:52:40.015390 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-39c2-account-create-update-jbfp7" Mar 08 00:52:40.626311 master-0 kubenswrapper[23041]: I0308 00:52:40.625639 23041 generic.go:334] "Generic (PLEG): container finished" podID="60432080-f735-4274-970e-58d2fa71550f" containerID="1f3057176e0dd4562102f70b1be50a4ad34023eeb08b64d84f58b174131e2e86" exitCode=0 Mar 08 00:52:40.626311 master-0 kubenswrapper[23041]: I0308 00:52:40.625735 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5584778f8f-xqg9r" event={"ID":"60432080-f735-4274-970e-58d2fa71550f","Type":"ContainerDied","Data":"1f3057176e0dd4562102f70b1be50a4ad34023eeb08b64d84f58b174131e2e86"} Mar 08 00:52:40.630000 master-0 kubenswrapper[23041]: I0308 00:52:40.629949 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4czgd" event={"ID":"8d4a0f9d-e6ed-4676-8a30-350a4a6907ed","Type":"ContainerStarted","Data":"b61989e3130b4a8de0150f5c936dfb8c14d3db55340921bd2a521eb497b101dd"} Mar 08 00:52:41.029194 master-0 kubenswrapper[23041]: I0308 00:52:41.029116 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-8rrvj"] Mar 08 
00:52:41.030896 master-0 kubenswrapper[23041]: I0308 00:52:41.030866 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8rrvj" Mar 08 00:52:41.036686 master-0 kubenswrapper[23041]: I0308 00:52:41.036036 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 08 00:52:41.036686 master-0 kubenswrapper[23041]: I0308 00:52:41.036337 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 08 00:52:41.086726 master-0 kubenswrapper[23041]: E0308 00:52:41.086669 23041 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Mar 08 00:52:41.086726 master-0 kubenswrapper[23041]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/60432080-f735-4274-970e-58d2fa71550f/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 08 00:52:41.086726 master-0 kubenswrapper[23041]: > podSandboxID="d5f7824d1e06062da1afd06aafdf7a60427bd966d828c7bbfd1c6b5abc1dc648" Mar 08 00:52:41.086881 master-0 kubenswrapper[23041]: E0308 00:52:41.086856 23041 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 00:52:41.086881 master-0 kubenswrapper[23041]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n67dh645h59dh88h6dhfdh587h66ch566h97h5dch54h598h676h56bh55dh96h588h5d7h5fbh557h57bhfch687h55h96h5fdh549h55h5d7h588h64bq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-swift-storage-0,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-swift-storage-0,SubPath:dns-swift-storage-0,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tqqgm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000800000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5584778f8f-xqg9r_openstack(60432080-f735-4274-970e-58d2fa71550f): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/60432080-f735-4274-970e-58d2fa71550f/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 08 00:52:41.086881 master-0 kubenswrapper[23041]: > logger="UnhandledError" Mar 08 00:52:41.088093 master-0 kubenswrapper[23041]: E0308 00:52:41.088048 23041 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/60432080-f735-4274-970e-58d2fa71550f/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-5584778f8f-xqg9r" podUID="60432080-f735-4274-970e-58d2fa71550f" Mar 08 00:52:41.170919 master-0 kubenswrapper[23041]: I0308 00:52:41.170850 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lkv8\" (UniqueName: \"kubernetes.io/projected/88a16b7d-3a7c-4b23-9f7c-448fea1247e1-kube-api-access-9lkv8\") pod 
\"placement-db-sync-8rrvj\" (UID: \"88a16b7d-3a7c-4b23-9f7c-448fea1247e1\") " pod="openstack/placement-db-sync-8rrvj" Mar 08 00:52:41.171100 master-0 kubenswrapper[23041]: I0308 00:52:41.171059 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a16b7d-3a7c-4b23-9f7c-448fea1247e1-combined-ca-bundle\") pod \"placement-db-sync-8rrvj\" (UID: \"88a16b7d-3a7c-4b23-9f7c-448fea1247e1\") " pod="openstack/placement-db-sync-8rrvj" Mar 08 00:52:41.171412 master-0 kubenswrapper[23041]: I0308 00:52:41.171360 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88a16b7d-3a7c-4b23-9f7c-448fea1247e1-config-data\") pod \"placement-db-sync-8rrvj\" (UID: \"88a16b7d-3a7c-4b23-9f7c-448fea1247e1\") " pod="openstack/placement-db-sync-8rrvj" Mar 08 00:52:41.171519 master-0 kubenswrapper[23041]: I0308 00:52:41.171494 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88a16b7d-3a7c-4b23-9f7c-448fea1247e1-scripts\") pod \"placement-db-sync-8rrvj\" (UID: \"88a16b7d-3a7c-4b23-9f7c-448fea1247e1\") " pod="openstack/placement-db-sync-8rrvj" Mar 08 00:52:41.171831 master-0 kubenswrapper[23041]: I0308 00:52:41.171793 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88a16b7d-3a7c-4b23-9f7c-448fea1247e1-logs\") pod \"placement-db-sync-8rrvj\" (UID: \"88a16b7d-3a7c-4b23-9f7c-448fea1247e1\") " pod="openstack/placement-db-sync-8rrvj" Mar 08 00:52:41.273980 master-0 kubenswrapper[23041]: I0308 00:52:41.273873 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88a16b7d-3a7c-4b23-9f7c-448fea1247e1-logs\") pod 
\"placement-db-sync-8rrvj\" (UID: \"88a16b7d-3a7c-4b23-9f7c-448fea1247e1\") " pod="openstack/placement-db-sync-8rrvj" Mar 08 00:52:41.274367 master-0 kubenswrapper[23041]: I0308 00:52:41.274043 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lkv8\" (UniqueName: \"kubernetes.io/projected/88a16b7d-3a7c-4b23-9f7c-448fea1247e1-kube-api-access-9lkv8\") pod \"placement-db-sync-8rrvj\" (UID: \"88a16b7d-3a7c-4b23-9f7c-448fea1247e1\") " pod="openstack/placement-db-sync-8rrvj" Mar 08 00:52:41.274367 master-0 kubenswrapper[23041]: I0308 00:52:41.274301 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a16b7d-3a7c-4b23-9f7c-448fea1247e1-combined-ca-bundle\") pod \"placement-db-sync-8rrvj\" (UID: \"88a16b7d-3a7c-4b23-9f7c-448fea1247e1\") " pod="openstack/placement-db-sync-8rrvj" Mar 08 00:52:41.274577 master-0 kubenswrapper[23041]: I0308 00:52:41.274528 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88a16b7d-3a7c-4b23-9f7c-448fea1247e1-logs\") pod \"placement-db-sync-8rrvj\" (UID: \"88a16b7d-3a7c-4b23-9f7c-448fea1247e1\") " pod="openstack/placement-db-sync-8rrvj" Mar 08 00:52:41.274577 master-0 kubenswrapper[23041]: I0308 00:52:41.274532 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88a16b7d-3a7c-4b23-9f7c-448fea1247e1-config-data\") pod \"placement-db-sync-8rrvj\" (UID: \"88a16b7d-3a7c-4b23-9f7c-448fea1247e1\") " pod="openstack/placement-db-sync-8rrvj" Mar 08 00:52:41.274727 master-0 kubenswrapper[23041]: I0308 00:52:41.274692 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88a16b7d-3a7c-4b23-9f7c-448fea1247e1-scripts\") pod \"placement-db-sync-8rrvj\" (UID: 
\"88a16b7d-3a7c-4b23-9f7c-448fea1247e1\") " pod="openstack/placement-db-sync-8rrvj" Mar 08 00:52:41.278567 master-0 kubenswrapper[23041]: I0308 00:52:41.278511 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88a16b7d-3a7c-4b23-9f7c-448fea1247e1-scripts\") pod \"placement-db-sync-8rrvj\" (UID: \"88a16b7d-3a7c-4b23-9f7c-448fea1247e1\") " pod="openstack/placement-db-sync-8rrvj" Mar 08 00:52:41.278693 master-0 kubenswrapper[23041]: I0308 00:52:41.278631 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88a16b7d-3a7c-4b23-9f7c-448fea1247e1-config-data\") pod \"placement-db-sync-8rrvj\" (UID: \"88a16b7d-3a7c-4b23-9f7c-448fea1247e1\") " pod="openstack/placement-db-sync-8rrvj" Mar 08 00:52:41.280109 master-0 kubenswrapper[23041]: I0308 00:52:41.280052 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a16b7d-3a7c-4b23-9f7c-448fea1247e1-combined-ca-bundle\") pod \"placement-db-sync-8rrvj\" (UID: \"88a16b7d-3a7c-4b23-9f7c-448fea1247e1\") " pod="openstack/placement-db-sync-8rrvj" Mar 08 00:52:41.548346 master-0 kubenswrapper[23041]: I0308 00:52:41.548283 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-675ba-db-sync-8zxxl"] Mar 08 00:52:41.552332 master-0 kubenswrapper[23041]: W0308 00:52:41.552209 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod309c80e9_6a3a_45cb_93c9_216d39c74f61.slice/crio-8badf9b3e6d5150056109623fee8844f39470c396428a26e1e11d10da87b1768 WatchSource:0}: Error finding container 8badf9b3e6d5150056109623fee8844f39470c396428a26e1e11d10da87b1768: Status 404 returned error can't find the container with id 8badf9b3e6d5150056109623fee8844f39470c396428a26e1e11d10da87b1768 Mar 08 00:52:41.559758 master-0 
kubenswrapper[23041]: W0308 00:52:41.559708 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74432df1_6a53_4258_932b_6e6aa6c23448.slice/crio-c6b2b9f15bd9e6a1e00d15e7902c6a5ef60366237832804a93618f8f422a336d WatchSource:0}: Error finding container c6b2b9f15bd9e6a1e00d15e7902c6a5ef60366237832804a93618f8f422a336d: Status 404 returned error can't find the container with id c6b2b9f15bd9e6a1e00d15e7902c6a5ef60366237832804a93618f8f422a336d Mar 08 00:52:41.564866 master-0 kubenswrapper[23041]: W0308 00:52:41.564818 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ef69671_8e3b_456f_9764_212721fba8e0.slice/crio-f013535cae396e733b3d30a55b68bf4daa9e22dd4f9fced915ddb064b9caf226 WatchSource:0}: Error finding container f013535cae396e733b3d30a55b68bf4daa9e22dd4f9fced915ddb064b9caf226: Status 404 returned error can't find the container with id f013535cae396e733b3d30a55b68bf4daa9e22dd4f9fced915ddb064b9caf226 Mar 08 00:52:41.565487 master-0 kubenswrapper[23041]: I0308 00:52:41.565461 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-bxnnn"] Mar 08 00:52:41.590540 master-0 kubenswrapper[23041]: I0308 00:52:41.590460 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-create-9pxq8"] Mar 08 00:52:41.611136 master-0 kubenswrapper[23041]: I0308 00:52:41.611069 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-8rrvj"] Mar 08 00:52:41.643389 master-0 kubenswrapper[23041]: I0308 00:52:41.643265 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-675ba-db-sync-8zxxl" event={"ID":"309c80e9-6a3a-45cb-93c9-216d39c74f61","Type":"ContainerStarted","Data":"8badf9b3e6d5150056109623fee8844f39470c396428a26e1e11d10da87b1768"} Mar 08 00:52:41.644995 master-0 kubenswrapper[23041]: I0308 00:52:41.644952 
23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bxnnn" event={"ID":"9ef69671-8e3b-456f-9764-212721fba8e0","Type":"ContainerStarted","Data":"f013535cae396e733b3d30a55b68bf4daa9e22dd4f9fced915ddb064b9caf226"} Mar 08 00:52:41.647292 master-0 kubenswrapper[23041]: I0308 00:52:41.647249 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-9pxq8" event={"ID":"74432df1-6a53-4258-932b-6e6aa6c23448","Type":"ContainerStarted","Data":"c6b2b9f15bd9e6a1e00d15e7902c6a5ef60366237832804a93618f8f422a336d"} Mar 08 00:52:41.785869 master-0 kubenswrapper[23041]: W0308 00:52:41.785800 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcfaf8fd1_f241_4f4b_ab62_4d04cef718dd.slice/crio-db58e5c4af71bfba2e72b206c15c055785b1bb8ca3237a3a2ae052bbdda6cb0a WatchSource:0}: Error finding container db58e5c4af71bfba2e72b206c15c055785b1bb8ca3237a3a2ae052bbdda6cb0a: Status 404 returned error can't find the container with id db58e5c4af71bfba2e72b206c15c055785b1bb8ca3237a3a2ae052bbdda6cb0a Mar 08 00:52:41.788000 master-0 kubenswrapper[23041]: I0308 00:52:41.787944 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-39c2-account-create-update-jbfp7"] Mar 08 00:52:41.888543 master-0 kubenswrapper[23041]: I0308 00:52:41.888496 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lkv8\" (UniqueName: \"kubernetes.io/projected/88a16b7d-3a7c-4b23-9f7c-448fea1247e1-kube-api-access-9lkv8\") pod \"placement-db-sync-8rrvj\" (UID: \"88a16b7d-3a7c-4b23-9f7c-448fea1247e1\") " pod="openstack/placement-db-sync-8rrvj" Mar 08 00:52:41.915707 master-0 kubenswrapper[23041]: I0308 00:52:41.906295 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5584778f8f-xqg9r"] Mar 08 00:52:41.968866 master-0 kubenswrapper[23041]: I0308 00:52:41.953716 23041 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8rrvj" Mar 08 00:52:42.000224 master-0 kubenswrapper[23041]: I0308 00:52:41.995906 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-846459fb55-9x6r8"] Mar 08 00:52:42.015232 master-0 kubenswrapper[23041]: I0308 00:52:42.006702 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-846459fb55-9x6r8" Mar 08 00:52:42.044240 master-0 kubenswrapper[23041]: I0308 00:52:42.043249 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-4czgd" podStartSLOduration=5.043223072 podStartE2EDuration="5.043223072s" podCreationTimestamp="2026-03-08 00:52:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:52:41.968362384 +0000 UTC m=+1267.441198938" watchObservedRunningTime="2026-03-08 00:52:42.043223072 +0000 UTC m=+1267.516059626" Mar 08 00:52:42.062938 master-0 kubenswrapper[23041]: I0308 00:52:42.062880 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4528bfc0-76dc-47be-b4e0-cfddeb378c94-dns-svc\") pod \"dnsmasq-dns-846459fb55-9x6r8\" (UID: \"4528bfc0-76dc-47be-b4e0-cfddeb378c94\") " pod="openstack/dnsmasq-dns-846459fb55-9x6r8" Mar 08 00:52:42.077110 master-0 kubenswrapper[23041]: I0308 00:52:42.063149 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqhws\" (UniqueName: \"kubernetes.io/projected/4528bfc0-76dc-47be-b4e0-cfddeb378c94-kube-api-access-rqhws\") pod \"dnsmasq-dns-846459fb55-9x6r8\" (UID: \"4528bfc0-76dc-47be-b4e0-cfddeb378c94\") " pod="openstack/dnsmasq-dns-846459fb55-9x6r8" Mar 08 00:52:42.077110 master-0 kubenswrapper[23041]: I0308 00:52:42.063564 23041 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4528bfc0-76dc-47be-b4e0-cfddeb378c94-dns-swift-storage-0\") pod \"dnsmasq-dns-846459fb55-9x6r8\" (UID: \"4528bfc0-76dc-47be-b4e0-cfddeb378c94\") " pod="openstack/dnsmasq-dns-846459fb55-9x6r8" Mar 08 00:52:42.077110 master-0 kubenswrapper[23041]: I0308 00:52:42.063631 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4528bfc0-76dc-47be-b4e0-cfddeb378c94-ovsdbserver-sb\") pod \"dnsmasq-dns-846459fb55-9x6r8\" (UID: \"4528bfc0-76dc-47be-b4e0-cfddeb378c94\") " pod="openstack/dnsmasq-dns-846459fb55-9x6r8" Mar 08 00:52:42.077110 master-0 kubenswrapper[23041]: I0308 00:52:42.063650 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4528bfc0-76dc-47be-b4e0-cfddeb378c94-config\") pod \"dnsmasq-dns-846459fb55-9x6r8\" (UID: \"4528bfc0-76dc-47be-b4e0-cfddeb378c94\") " pod="openstack/dnsmasq-dns-846459fb55-9x6r8" Mar 08 00:52:42.077110 master-0 kubenswrapper[23041]: I0308 00:52:42.063670 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4528bfc0-76dc-47be-b4e0-cfddeb378c94-ovsdbserver-nb\") pod \"dnsmasq-dns-846459fb55-9x6r8\" (UID: \"4528bfc0-76dc-47be-b4e0-cfddeb378c94\") " pod="openstack/dnsmasq-dns-846459fb55-9x6r8" Mar 08 00:52:42.077110 master-0 kubenswrapper[23041]: I0308 00:52:42.067877 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-846459fb55-9x6r8"] Mar 08 00:52:42.180235 master-0 kubenswrapper[23041]: I0308 00:52:42.174895 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqhws\" (UniqueName: 
\"kubernetes.io/projected/4528bfc0-76dc-47be-b4e0-cfddeb378c94-kube-api-access-rqhws\") pod \"dnsmasq-dns-846459fb55-9x6r8\" (UID: \"4528bfc0-76dc-47be-b4e0-cfddeb378c94\") " pod="openstack/dnsmasq-dns-846459fb55-9x6r8" Mar 08 00:52:42.180235 master-0 kubenswrapper[23041]: I0308 00:52:42.175022 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4528bfc0-76dc-47be-b4e0-cfddeb378c94-dns-swift-storage-0\") pod \"dnsmasq-dns-846459fb55-9x6r8\" (UID: \"4528bfc0-76dc-47be-b4e0-cfddeb378c94\") " pod="openstack/dnsmasq-dns-846459fb55-9x6r8" Mar 08 00:52:42.180235 master-0 kubenswrapper[23041]: I0308 00:52:42.175080 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4528bfc0-76dc-47be-b4e0-cfddeb378c94-ovsdbserver-sb\") pod \"dnsmasq-dns-846459fb55-9x6r8\" (UID: \"4528bfc0-76dc-47be-b4e0-cfddeb378c94\") " pod="openstack/dnsmasq-dns-846459fb55-9x6r8" Mar 08 00:52:42.180235 master-0 kubenswrapper[23041]: I0308 00:52:42.175108 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4528bfc0-76dc-47be-b4e0-cfddeb378c94-config\") pod \"dnsmasq-dns-846459fb55-9x6r8\" (UID: \"4528bfc0-76dc-47be-b4e0-cfddeb378c94\") " pod="openstack/dnsmasq-dns-846459fb55-9x6r8" Mar 08 00:52:42.180235 master-0 kubenswrapper[23041]: I0308 00:52:42.175134 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4528bfc0-76dc-47be-b4e0-cfddeb378c94-ovsdbserver-nb\") pod \"dnsmasq-dns-846459fb55-9x6r8\" (UID: \"4528bfc0-76dc-47be-b4e0-cfddeb378c94\") " pod="openstack/dnsmasq-dns-846459fb55-9x6r8" Mar 08 00:52:42.180235 master-0 kubenswrapper[23041]: I0308 00:52:42.175191 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/4528bfc0-76dc-47be-b4e0-cfddeb378c94-dns-svc\") pod \"dnsmasq-dns-846459fb55-9x6r8\" (UID: \"4528bfc0-76dc-47be-b4e0-cfddeb378c94\") " pod="openstack/dnsmasq-dns-846459fb55-9x6r8" Mar 08 00:52:42.180235 master-0 kubenswrapper[23041]: I0308 00:52:42.176068 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4528bfc0-76dc-47be-b4e0-cfddeb378c94-config\") pod \"dnsmasq-dns-846459fb55-9x6r8\" (UID: \"4528bfc0-76dc-47be-b4e0-cfddeb378c94\") " pod="openstack/dnsmasq-dns-846459fb55-9x6r8" Mar 08 00:52:42.180235 master-0 kubenswrapper[23041]: I0308 00:52:42.176312 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4528bfc0-76dc-47be-b4e0-cfddeb378c94-dns-svc\") pod \"dnsmasq-dns-846459fb55-9x6r8\" (UID: \"4528bfc0-76dc-47be-b4e0-cfddeb378c94\") " pod="openstack/dnsmasq-dns-846459fb55-9x6r8" Mar 08 00:52:42.180235 master-0 kubenswrapper[23041]: I0308 00:52:42.176689 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4528bfc0-76dc-47be-b4e0-cfddeb378c94-ovsdbserver-nb\") pod \"dnsmasq-dns-846459fb55-9x6r8\" (UID: \"4528bfc0-76dc-47be-b4e0-cfddeb378c94\") " pod="openstack/dnsmasq-dns-846459fb55-9x6r8" Mar 08 00:52:42.180235 master-0 kubenswrapper[23041]: I0308 00:52:42.177388 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4528bfc0-76dc-47be-b4e0-cfddeb378c94-dns-swift-storage-0\") pod \"dnsmasq-dns-846459fb55-9x6r8\" (UID: \"4528bfc0-76dc-47be-b4e0-cfddeb378c94\") " pod="openstack/dnsmasq-dns-846459fb55-9x6r8" Mar 08 00:52:42.180235 master-0 kubenswrapper[23041]: I0308 00:52:42.177414 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/4528bfc0-76dc-47be-b4e0-cfddeb378c94-ovsdbserver-sb\") pod \"dnsmasq-dns-846459fb55-9x6r8\" (UID: \"4528bfc0-76dc-47be-b4e0-cfddeb378c94\") " pod="openstack/dnsmasq-dns-846459fb55-9x6r8" Mar 08 00:52:42.243226 master-0 kubenswrapper[23041]: I0308 00:52:42.243133 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqhws\" (UniqueName: \"kubernetes.io/projected/4528bfc0-76dc-47be-b4e0-cfddeb378c94-kube-api-access-rqhws\") pod \"dnsmasq-dns-846459fb55-9x6r8\" (UID: \"4528bfc0-76dc-47be-b4e0-cfddeb378c94\") " pod="openstack/dnsmasq-dns-846459fb55-9x6r8" Mar 08 00:52:42.370742 master-0 kubenswrapper[23041]: I0308 00:52:42.370637 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-846459fb55-9x6r8" Mar 08 00:52:42.752673 master-0 kubenswrapper[23041]: I0308 00:52:42.752100 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5584778f8f-xqg9r" event={"ID":"60432080-f735-4274-970e-58d2fa71550f","Type":"ContainerStarted","Data":"94d85430e09588b175b7af25e835389d40a2deeef634f44c26c99926aa3f6ed3"} Mar 08 00:52:42.752673 master-0 kubenswrapper[23041]: I0308 00:52:42.752434 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5584778f8f-xqg9r" podUID="60432080-f735-4274-970e-58d2fa71550f" containerName="dnsmasq-dns" containerID="cri-o://94d85430e09588b175b7af25e835389d40a2deeef634f44c26c99926aa3f6ed3" gracePeriod=10 Mar 08 00:52:42.755073 master-0 kubenswrapper[23041]: I0308 00:52:42.752784 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5584778f8f-xqg9r" Mar 08 00:52:42.770730 master-0 kubenswrapper[23041]: I0308 00:52:42.770456 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-39c2-account-create-update-jbfp7" 
event={"ID":"cfaf8fd1-f241-4f4b-ab62-4d04cef718dd","Type":"ContainerStarted","Data":"db58e5c4af71bfba2e72b206c15c055785b1bb8ca3237a3a2ae052bbdda6cb0a"} Mar 08 00:52:42.807257 master-0 kubenswrapper[23041]: I0308 00:52:42.792737 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-9pxq8" event={"ID":"74432df1-6a53-4258-932b-6e6aa6c23448","Type":"ContainerStarted","Data":"e591b2dfeb1423bd2b97243237ecba4ede86e72dca2ccd100d820256d41d4a91"} Mar 08 00:52:42.807257 master-0 kubenswrapper[23041]: I0308 00:52:42.801749 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5584778f8f-xqg9r" podStartSLOduration=5.801724261 podStartE2EDuration="5.801724261s" podCreationTimestamp="2026-03-08 00:52:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:52:42.789350059 +0000 UTC m=+1268.262186623" watchObservedRunningTime="2026-03-08 00:52:42.801724261 +0000 UTC m=+1268.274560815" Mar 08 00:52:42.861106 master-0 kubenswrapper[23041]: I0308 00:52:42.860980 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bxnnn" event={"ID":"9ef69671-8e3b-456f-9764-212721fba8e0","Type":"ContainerStarted","Data":"6fa2fc35d099d15db013f4024a180e05dbdce3a40bcd31c527ded344118bf564"} Mar 08 00:52:42.885355 master-0 kubenswrapper[23041]: I0308 00:52:42.882235 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-1280f-default-external-api-0"] Mar 08 00:52:42.896101 master-0 kubenswrapper[23041]: E0308 00:52:42.886659 23041 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74432df1_6a53_4258_932b_6e6aa6c23448.slice/crio-conmon-e591b2dfeb1423bd2b97243237ecba4ede86e72dca2ccd100d820256d41d4a91.scope\": RecentStats: unable to find data in memory cache]" 
Mar 08 00:52:42.896101 master-0 kubenswrapper[23041]: I0308 00:52:42.886786 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1280f-default-external-api-0" Mar 08 00:52:42.896101 master-0 kubenswrapper[23041]: I0308 00:52:42.890171 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-1280f-default-external-config-data" Mar 08 00:52:42.896101 master-0 kubenswrapper[23041]: I0308 00:52:42.892015 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 08 00:52:42.896101 master-0 kubenswrapper[23041]: I0308 00:52:42.895693 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 08 00:52:42.899131 master-0 kubenswrapper[23041]: I0308 00:52:42.898191 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-db-create-9pxq8" podStartSLOduration=4.898164976 podStartE2EDuration="4.898164976s" podCreationTimestamp="2026-03-08 00:52:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:52:42.883101718 +0000 UTC m=+1268.355938262" watchObservedRunningTime="2026-03-08 00:52:42.898164976 +0000 UTC m=+1268.371001530" Mar 08 00:52:42.915315 master-0 kubenswrapper[23041]: I0308 00:52:42.915163 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-8rrvj"] Mar 08 00:52:42.928834 master-0 kubenswrapper[23041]: I0308 00:52:42.928769 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-bxnnn" podStartSLOduration=4.928749563 podStartE2EDuration="4.928749563s" podCreationTimestamp="2026-03-08 00:52:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:52:42.914665019 +0000 UTC 
m=+1268.387501583" watchObservedRunningTime="2026-03-08 00:52:42.928749563 +0000 UTC m=+1268.401586117" Mar 08 00:52:42.930505 master-0 kubenswrapper[23041]: I0308 00:52:42.930316 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1280f-default-external-api-0"] Mar 08 00:52:43.024675 master-0 kubenswrapper[23041]: I0308 00:52:43.024618 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee2811c7-2c79-45af-afe7-5f3b9ad28474-logs\") pod \"glance-1280f-default-external-api-0\" (UID: \"ee2811c7-2c79-45af-afe7-5f3b9ad28474\") " pod="openstack/glance-1280f-default-external-api-0" Mar 08 00:52:43.024957 master-0 kubenswrapper[23041]: I0308 00:52:43.024683 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee2811c7-2c79-45af-afe7-5f3b9ad28474-combined-ca-bundle\") pod \"glance-1280f-default-external-api-0\" (UID: \"ee2811c7-2c79-45af-afe7-5f3b9ad28474\") " pod="openstack/glance-1280f-default-external-api-0" Mar 08 00:52:43.024957 master-0 kubenswrapper[23041]: I0308 00:52:43.024711 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee2811c7-2c79-45af-afe7-5f3b9ad28474-config-data\") pod \"glance-1280f-default-external-api-0\" (UID: \"ee2811c7-2c79-45af-afe7-5f3b9ad28474\") " pod="openstack/glance-1280f-default-external-api-0" Mar 08 00:52:43.024957 master-0 kubenswrapper[23041]: I0308 00:52:43.024760 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ee2811c7-2c79-45af-afe7-5f3b9ad28474-httpd-run\") pod \"glance-1280f-default-external-api-0\" (UID: \"ee2811c7-2c79-45af-afe7-5f3b9ad28474\") " pod="openstack/glance-1280f-default-external-api-0" 
Mar 08 00:52:43.024957 master-0 kubenswrapper[23041]: I0308 00:52:43.024799 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzgvv\" (UniqueName: \"kubernetes.io/projected/ee2811c7-2c79-45af-afe7-5f3b9ad28474-kube-api-access-xzgvv\") pod \"glance-1280f-default-external-api-0\" (UID: \"ee2811c7-2c79-45af-afe7-5f3b9ad28474\") " pod="openstack/glance-1280f-default-external-api-0" Mar 08 00:52:43.024957 master-0 kubenswrapper[23041]: I0308 00:52:43.024845 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee2811c7-2c79-45af-afe7-5f3b9ad28474-public-tls-certs\") pod \"glance-1280f-default-external-api-0\" (UID: \"ee2811c7-2c79-45af-afe7-5f3b9ad28474\") " pod="openstack/glance-1280f-default-external-api-0" Mar 08 00:52:43.024957 master-0 kubenswrapper[23041]: I0308 00:52:43.024884 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee2811c7-2c79-45af-afe7-5f3b9ad28474-scripts\") pod \"glance-1280f-default-external-api-0\" (UID: \"ee2811c7-2c79-45af-afe7-5f3b9ad28474\") " pod="openstack/glance-1280f-default-external-api-0" Mar 08 00:52:43.024957 master-0 kubenswrapper[23041]: I0308 00:52:43.024928 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-70c9925e-bbc2-47ea-836c-8b4fadf77223\" (UniqueName: \"kubernetes.io/csi/topolvm.io^60a4d41a-f098-4c1e-9a4b-472c7a2f79cf\") pod \"glance-1280f-default-external-api-0\" (UID: \"ee2811c7-2c79-45af-afe7-5f3b9ad28474\") " pod="openstack/glance-1280f-default-external-api-0" Mar 08 00:52:43.142732 master-0 kubenswrapper[23041]: I0308 00:52:43.138647 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ee2811c7-2c79-45af-afe7-5f3b9ad28474-logs\") pod \"glance-1280f-default-external-api-0\" (UID: \"ee2811c7-2c79-45af-afe7-5f3b9ad28474\") " pod="openstack/glance-1280f-default-external-api-0" Mar 08 00:52:43.142732 master-0 kubenswrapper[23041]: I0308 00:52:43.139799 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee2811c7-2c79-45af-afe7-5f3b9ad28474-logs\") pod \"glance-1280f-default-external-api-0\" (UID: \"ee2811c7-2c79-45af-afe7-5f3b9ad28474\") " pod="openstack/glance-1280f-default-external-api-0" Mar 08 00:52:43.142732 master-0 kubenswrapper[23041]: I0308 00:52:43.140008 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee2811c7-2c79-45af-afe7-5f3b9ad28474-combined-ca-bundle\") pod \"glance-1280f-default-external-api-0\" (UID: \"ee2811c7-2c79-45af-afe7-5f3b9ad28474\") " pod="openstack/glance-1280f-default-external-api-0" Mar 08 00:52:43.142732 master-0 kubenswrapper[23041]: I0308 00:52:43.140070 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee2811c7-2c79-45af-afe7-5f3b9ad28474-config-data\") pod \"glance-1280f-default-external-api-0\" (UID: \"ee2811c7-2c79-45af-afe7-5f3b9ad28474\") " pod="openstack/glance-1280f-default-external-api-0" Mar 08 00:52:43.142732 master-0 kubenswrapper[23041]: I0308 00:52:43.140169 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ee2811c7-2c79-45af-afe7-5f3b9ad28474-httpd-run\") pod \"glance-1280f-default-external-api-0\" (UID: \"ee2811c7-2c79-45af-afe7-5f3b9ad28474\") " pod="openstack/glance-1280f-default-external-api-0" Mar 08 00:52:43.142732 master-0 kubenswrapper[23041]: I0308 00:52:43.140676 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-xzgvv\" (UniqueName: \"kubernetes.io/projected/ee2811c7-2c79-45af-afe7-5f3b9ad28474-kube-api-access-xzgvv\") pod \"glance-1280f-default-external-api-0\" (UID: \"ee2811c7-2c79-45af-afe7-5f3b9ad28474\") " pod="openstack/glance-1280f-default-external-api-0" Mar 08 00:52:43.142732 master-0 kubenswrapper[23041]: I0308 00:52:43.140809 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee2811c7-2c79-45af-afe7-5f3b9ad28474-public-tls-certs\") pod \"glance-1280f-default-external-api-0\" (UID: \"ee2811c7-2c79-45af-afe7-5f3b9ad28474\") " pod="openstack/glance-1280f-default-external-api-0" Mar 08 00:52:43.142732 master-0 kubenswrapper[23041]: I0308 00:52:43.141116 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee2811c7-2c79-45af-afe7-5f3b9ad28474-scripts\") pod \"glance-1280f-default-external-api-0\" (UID: \"ee2811c7-2c79-45af-afe7-5f3b9ad28474\") " pod="openstack/glance-1280f-default-external-api-0" Mar 08 00:52:43.142732 master-0 kubenswrapper[23041]: I0308 00:52:43.141240 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-70c9925e-bbc2-47ea-836c-8b4fadf77223\" (UniqueName: \"kubernetes.io/csi/topolvm.io^60a4d41a-f098-4c1e-9a4b-472c7a2f79cf\") pod \"glance-1280f-default-external-api-0\" (UID: \"ee2811c7-2c79-45af-afe7-5f3b9ad28474\") " pod="openstack/glance-1280f-default-external-api-0" Mar 08 00:52:43.142732 master-0 kubenswrapper[23041]: I0308 00:52:43.141566 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ee2811c7-2c79-45af-afe7-5f3b9ad28474-httpd-run\") pod \"glance-1280f-default-external-api-0\" (UID: \"ee2811c7-2c79-45af-afe7-5f3b9ad28474\") " pod="openstack/glance-1280f-default-external-api-0" Mar 08 00:52:43.151797 master-0 kubenswrapper[23041]: I0308 
00:52:43.145448 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-846459fb55-9x6r8"] Mar 08 00:52:43.151797 master-0 kubenswrapper[23041]: I0308 00:52:43.148627 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee2811c7-2c79-45af-afe7-5f3b9ad28474-config-data\") pod \"glance-1280f-default-external-api-0\" (UID: \"ee2811c7-2c79-45af-afe7-5f3b9ad28474\") " pod="openstack/glance-1280f-default-external-api-0" Mar 08 00:52:43.151797 master-0 kubenswrapper[23041]: I0308 00:52:43.150148 23041 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 08 00:52:43.151797 master-0 kubenswrapper[23041]: I0308 00:52:43.150190 23041 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-70c9925e-bbc2-47ea-836c-8b4fadf77223\" (UniqueName: \"kubernetes.io/csi/topolvm.io^60a4d41a-f098-4c1e-9a4b-472c7a2f79cf\") pod \"glance-1280f-default-external-api-0\" (UID: \"ee2811c7-2c79-45af-afe7-5f3b9ad28474\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/fba811848206921d87eed675e9d53cf2e2311d13264fbd23c3492e9c4520fe29/globalmount\"" pod="openstack/glance-1280f-default-external-api-0" Mar 08 00:52:43.151797 master-0 kubenswrapper[23041]: I0308 00:52:43.151064 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee2811c7-2c79-45af-afe7-5f3b9ad28474-combined-ca-bundle\") pod \"glance-1280f-default-external-api-0\" (UID: \"ee2811c7-2c79-45af-afe7-5f3b9ad28474\") " pod="openstack/glance-1280f-default-external-api-0" Mar 08 00:52:43.156490 master-0 kubenswrapper[23041]: I0308 00:52:43.155791 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee2811c7-2c79-45af-afe7-5f3b9ad28474-public-tls-certs\") pod 
\"glance-1280f-default-external-api-0\" (UID: \"ee2811c7-2c79-45af-afe7-5f3b9ad28474\") " pod="openstack/glance-1280f-default-external-api-0" Mar 08 00:52:43.164395 master-0 kubenswrapper[23041]: I0308 00:52:43.163261 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzgvv\" (UniqueName: \"kubernetes.io/projected/ee2811c7-2c79-45af-afe7-5f3b9ad28474-kube-api-access-xzgvv\") pod \"glance-1280f-default-external-api-0\" (UID: \"ee2811c7-2c79-45af-afe7-5f3b9ad28474\") " pod="openstack/glance-1280f-default-external-api-0" Mar 08 00:52:43.177235 master-0 kubenswrapper[23041]: I0308 00:52:43.177121 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee2811c7-2c79-45af-afe7-5f3b9ad28474-scripts\") pod \"glance-1280f-default-external-api-0\" (UID: \"ee2811c7-2c79-45af-afe7-5f3b9ad28474\") " pod="openstack/glance-1280f-default-external-api-0" Mar 08 00:52:43.573657 master-0 kubenswrapper[23041]: I0308 00:52:43.572170 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5584778f8f-xqg9r" Mar 08 00:52:43.672677 master-0 kubenswrapper[23041]: I0308 00:52:43.672606 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqqgm\" (UniqueName: \"kubernetes.io/projected/60432080-f735-4274-970e-58d2fa71550f-kube-api-access-tqqgm\") pod \"60432080-f735-4274-970e-58d2fa71550f\" (UID: \"60432080-f735-4274-970e-58d2fa71550f\") " Mar 08 00:52:43.672677 master-0 kubenswrapper[23041]: I0308 00:52:43.672665 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60432080-f735-4274-970e-58d2fa71550f-ovsdbserver-nb\") pod \"60432080-f735-4274-970e-58d2fa71550f\" (UID: \"60432080-f735-4274-970e-58d2fa71550f\") " Mar 08 00:52:43.672964 master-0 kubenswrapper[23041]: I0308 00:52:43.672729 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/60432080-f735-4274-970e-58d2fa71550f-dns-swift-storage-0\") pod \"60432080-f735-4274-970e-58d2fa71550f\" (UID: \"60432080-f735-4274-970e-58d2fa71550f\") " Mar 08 00:52:43.672964 master-0 kubenswrapper[23041]: I0308 00:52:43.672947 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60432080-f735-4274-970e-58d2fa71550f-config\") pod \"60432080-f735-4274-970e-58d2fa71550f\" (UID: \"60432080-f735-4274-970e-58d2fa71550f\") " Mar 08 00:52:43.673030 master-0 kubenswrapper[23041]: I0308 00:52:43.673008 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60432080-f735-4274-970e-58d2fa71550f-ovsdbserver-sb\") pod \"60432080-f735-4274-970e-58d2fa71550f\" (UID: \"60432080-f735-4274-970e-58d2fa71550f\") " Mar 08 00:52:43.673068 master-0 kubenswrapper[23041]: I0308 00:52:43.673031 23041 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60432080-f735-4274-970e-58d2fa71550f-dns-svc\") pod \"60432080-f735-4274-970e-58d2fa71550f\" (UID: \"60432080-f735-4274-970e-58d2fa71550f\") " Mar 08 00:52:43.706395 master-0 kubenswrapper[23041]: I0308 00:52:43.703548 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60432080-f735-4274-970e-58d2fa71550f-kube-api-access-tqqgm" (OuterVolumeSpecName: "kube-api-access-tqqgm") pod "60432080-f735-4274-970e-58d2fa71550f" (UID: "60432080-f735-4274-970e-58d2fa71550f"). InnerVolumeSpecName "kube-api-access-tqqgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:52:43.713374 master-0 kubenswrapper[23041]: I0308 00:52:43.713309 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-1280f-default-internal-api-0"] Mar 08 00:52:43.714194 master-0 kubenswrapper[23041]: E0308 00:52:43.714177 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60432080-f735-4274-970e-58d2fa71550f" containerName="dnsmasq-dns" Mar 08 00:52:43.714435 master-0 kubenswrapper[23041]: I0308 00:52:43.714424 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="60432080-f735-4274-970e-58d2fa71550f" containerName="dnsmasq-dns" Mar 08 00:52:43.714747 master-0 kubenswrapper[23041]: E0308 00:52:43.714735 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60432080-f735-4274-970e-58d2fa71550f" containerName="init" Mar 08 00:52:43.714839 master-0 kubenswrapper[23041]: I0308 00:52:43.714829 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="60432080-f735-4274-970e-58d2fa71550f" containerName="init" Mar 08 00:52:43.715751 master-0 kubenswrapper[23041]: I0308 00:52:43.715734 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="60432080-f735-4274-970e-58d2fa71550f" containerName="dnsmasq-dns" Mar 08 00:52:43.717631 master-0 
kubenswrapper[23041]: I0308 00:52:43.717246 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:43.721312 master-0 kubenswrapper[23041]: I0308 00:52:43.721259 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 08 00:52:43.721733 master-0 kubenswrapper[23041]: I0308 00:52:43.721702 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-1280f-default-internal-config-data" Mar 08 00:52:43.747934 master-0 kubenswrapper[23041]: I0308 00:52:43.746891 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1280f-default-internal-api-0"] Mar 08 00:52:43.774991 master-0 kubenswrapper[23041]: I0308 00:52:43.773709 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60432080-f735-4274-970e-58d2fa71550f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "60432080-f735-4274-970e-58d2fa71550f" (UID: "60432080-f735-4274-970e-58d2fa71550f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:52:43.777570 master-0 kubenswrapper[23041]: I0308 00:52:43.777517 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60432080-f735-4274-970e-58d2fa71550f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "60432080-f735-4274-970e-58d2fa71550f" (UID: "60432080-f735-4274-970e-58d2fa71550f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:52:43.788040 master-0 kubenswrapper[23041]: W0308 00:52:43.787897 23041 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/60432080-f735-4274-970e-58d2fa71550f/volumes/kubernetes.io~configmap/ovsdbserver-nb Mar 08 00:52:43.788040 master-0 kubenswrapper[23041]: I0308 00:52:43.787937 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60432080-f735-4274-970e-58d2fa71550f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "60432080-f735-4274-970e-58d2fa71550f" (UID: "60432080-f735-4274-970e-58d2fa71550f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:52:43.790937 master-0 kubenswrapper[23041]: I0308 00:52:43.789474 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60432080-f735-4274-970e-58d2fa71550f-ovsdbserver-nb\") pod \"60432080-f735-4274-970e-58d2fa71550f\" (UID: \"60432080-f735-4274-970e-58d2fa71550f\") " Mar 08 00:52:43.791746 master-0 kubenswrapper[23041]: I0308 00:52:43.791705 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqqgm\" (UniqueName: \"kubernetes.io/projected/60432080-f735-4274-970e-58d2fa71550f-kube-api-access-tqqgm\") on node \"master-0\" DevicePath \"\"" Mar 08 00:52:43.791746 master-0 kubenswrapper[23041]: I0308 00:52:43.791728 23041 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60432080-f735-4274-970e-58d2fa71550f-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 08 00:52:43.791746 master-0 kubenswrapper[23041]: I0308 00:52:43.791738 23041 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60432080-f735-4274-970e-58d2fa71550f-dns-svc\") on node \"master-0\" DevicePath 
\"\"" Mar 08 00:52:43.831507 master-0 kubenswrapper[23041]: I0308 00:52:43.830287 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60432080-f735-4274-970e-58d2fa71550f-config" (OuterVolumeSpecName: "config") pod "60432080-f735-4274-970e-58d2fa71550f" (UID: "60432080-f735-4274-970e-58d2fa71550f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:52:43.861462 master-0 kubenswrapper[23041]: I0308 00:52:43.860695 23041 generic.go:334] "Generic (PLEG): container finished" podID="60432080-f735-4274-970e-58d2fa71550f" containerID="94d85430e09588b175b7af25e835389d40a2deeef634f44c26c99926aa3f6ed3" exitCode=0 Mar 08 00:52:43.861462 master-0 kubenswrapper[23041]: I0308 00:52:43.860863 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5584778f8f-xqg9r" event={"ID":"60432080-f735-4274-970e-58d2fa71550f","Type":"ContainerDied","Data":"94d85430e09588b175b7af25e835389d40a2deeef634f44c26c99926aa3f6ed3"} Mar 08 00:52:43.861462 master-0 kubenswrapper[23041]: I0308 00:52:43.860903 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5584778f8f-xqg9r" event={"ID":"60432080-f735-4274-970e-58d2fa71550f","Type":"ContainerDied","Data":"d5f7824d1e06062da1afd06aafdf7a60427bd966d828c7bbfd1c6b5abc1dc648"} Mar 08 00:52:43.861462 master-0 kubenswrapper[23041]: I0308 00:52:43.860928 23041 scope.go:117] "RemoveContainer" containerID="94d85430e09588b175b7af25e835389d40a2deeef634f44c26c99926aa3f6ed3" Mar 08 00:52:43.861462 master-0 kubenswrapper[23041]: I0308 00:52:43.861157 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5584778f8f-xqg9r" Mar 08 00:52:43.868336 master-0 kubenswrapper[23041]: I0308 00:52:43.867716 23041 generic.go:334] "Generic (PLEG): container finished" podID="cfaf8fd1-f241-4f4b-ab62-4d04cef718dd" containerID="80590e40ca7b769cd48b9e3222b8f5e97d534d881877bbcf124933dfae70a851" exitCode=0 Mar 08 00:52:43.868336 master-0 kubenswrapper[23041]: I0308 00:52:43.867887 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-39c2-account-create-update-jbfp7" event={"ID":"cfaf8fd1-f241-4f4b-ab62-4d04cef718dd","Type":"ContainerDied","Data":"80590e40ca7b769cd48b9e3222b8f5e97d534d881877bbcf124933dfae70a851"} Mar 08 00:52:43.871579 master-0 kubenswrapper[23041]: I0308 00:52:43.871446 23041 generic.go:334] "Generic (PLEG): container finished" podID="74432df1-6a53-4258-932b-6e6aa6c23448" containerID="e591b2dfeb1423bd2b97243237ecba4ede86e72dca2ccd100d820256d41d4a91" exitCode=0 Mar 08 00:52:43.871579 master-0 kubenswrapper[23041]: I0308 00:52:43.871521 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-9pxq8" event={"ID":"74432df1-6a53-4258-932b-6e6aa6c23448","Type":"ContainerDied","Data":"e591b2dfeb1423bd2b97243237ecba4ede86e72dca2ccd100d820256d41d4a91"} Mar 08 00:52:43.885228 master-0 kubenswrapper[23041]: I0308 00:52:43.882144 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-846459fb55-9x6r8" event={"ID":"4528bfc0-76dc-47be-b4e0-cfddeb378c94","Type":"ContainerStarted","Data":"d517b3c8ecd709447d94315b13c29e02c13f772acaeeeaafcb6cacac4846b82f"} Mar 08 00:52:43.911630 master-0 kubenswrapper[23041]: I0308 00:52:43.900383 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8rrvj" event={"ID":"88a16b7d-3a7c-4b23-9f7c-448fea1247e1","Type":"ContainerStarted","Data":"84166c7b8a1bea673bc799b61ba0f0a536fa212c315c56d2df5d9f316f477717"} Mar 08 00:52:43.911630 master-0 kubenswrapper[23041]: I0308 00:52:43.901886 
23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/329d163a-757d-4f6d-8497-31129004711b-scripts\") pod \"glance-1280f-default-internal-api-0\" (UID: \"329d163a-757d-4f6d-8497-31129004711b\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:43.911630 master-0 kubenswrapper[23041]: I0308 00:52:43.901967 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvm49\" (UniqueName: \"kubernetes.io/projected/329d163a-757d-4f6d-8497-31129004711b-kube-api-access-dvm49\") pod \"glance-1280f-default-internal-api-0\" (UID: \"329d163a-757d-4f6d-8497-31129004711b\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:43.911630 master-0 kubenswrapper[23041]: I0308 00:52:43.902432 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/329d163a-757d-4f6d-8497-31129004711b-config-data\") pod \"glance-1280f-default-internal-api-0\" (UID: \"329d163a-757d-4f6d-8497-31129004711b\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:43.911630 master-0 kubenswrapper[23041]: I0308 00:52:43.902628 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7512aa1f-2488-47af-b61f-945377082816\" (UniqueName: \"kubernetes.io/csi/topolvm.io^f5214b97-526b-4460-9399-392e4c5e0c2e\") pod \"glance-1280f-default-internal-api-0\" (UID: \"329d163a-757d-4f6d-8497-31129004711b\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:43.911630 master-0 kubenswrapper[23041]: I0308 00:52:43.902649 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/329d163a-757d-4f6d-8497-31129004711b-httpd-run\") pod \"glance-1280f-default-internal-api-0\" 
(UID: \"329d163a-757d-4f6d-8497-31129004711b\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:43.911630 master-0 kubenswrapper[23041]: I0308 00:52:43.902693 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/329d163a-757d-4f6d-8497-31129004711b-internal-tls-certs\") pod \"glance-1280f-default-internal-api-0\" (UID: \"329d163a-757d-4f6d-8497-31129004711b\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:43.911630 master-0 kubenswrapper[23041]: I0308 00:52:43.902723 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/329d163a-757d-4f6d-8497-31129004711b-combined-ca-bundle\") pod \"glance-1280f-default-internal-api-0\" (UID: \"329d163a-757d-4f6d-8497-31129004711b\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:43.911630 master-0 kubenswrapper[23041]: I0308 00:52:43.902749 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/329d163a-757d-4f6d-8497-31129004711b-logs\") pod \"glance-1280f-default-internal-api-0\" (UID: \"329d163a-757d-4f6d-8497-31129004711b\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:43.911630 master-0 kubenswrapper[23041]: I0308 00:52:43.902942 23041 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60432080-f735-4274-970e-58d2fa71550f-config\") on node \"master-0\" DevicePath \"\"" Mar 08 00:52:43.911630 master-0 kubenswrapper[23041]: I0308 00:52:43.910077 23041 scope.go:117] "RemoveContainer" containerID="1f3057176e0dd4562102f70b1be50a4ad34023eeb08b64d84f58b174131e2e86" Mar 08 00:52:43.936086 master-0 kubenswrapper[23041]: I0308 00:52:43.935945 23041 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/60432080-f735-4274-970e-58d2fa71550f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "60432080-f735-4274-970e-58d2fa71550f" (UID: "60432080-f735-4274-970e-58d2fa71550f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:52:43.975910 master-0 kubenswrapper[23041]: I0308 00:52:43.975843 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60432080-f735-4274-970e-58d2fa71550f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "60432080-f735-4274-970e-58d2fa71550f" (UID: "60432080-f735-4274-970e-58d2fa71550f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:52:43.976717 master-0 kubenswrapper[23041]: I0308 00:52:43.976621 23041 scope.go:117] "RemoveContainer" containerID="94d85430e09588b175b7af25e835389d40a2deeef634f44c26c99926aa3f6ed3" Mar 08 00:52:43.978011 master-0 kubenswrapper[23041]: E0308 00:52:43.977642 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94d85430e09588b175b7af25e835389d40a2deeef634f44c26c99926aa3f6ed3\": container with ID starting with 94d85430e09588b175b7af25e835389d40a2deeef634f44c26c99926aa3f6ed3 not found: ID does not exist" containerID="94d85430e09588b175b7af25e835389d40a2deeef634f44c26c99926aa3f6ed3" Mar 08 00:52:43.978432 master-0 kubenswrapper[23041]: I0308 00:52:43.978119 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94d85430e09588b175b7af25e835389d40a2deeef634f44c26c99926aa3f6ed3"} err="failed to get container status \"94d85430e09588b175b7af25e835389d40a2deeef634f44c26c99926aa3f6ed3\": rpc error: code = NotFound desc = could not find container \"94d85430e09588b175b7af25e835389d40a2deeef634f44c26c99926aa3f6ed3\": container with ID starting with 
94d85430e09588b175b7af25e835389d40a2deeef634f44c26c99926aa3f6ed3 not found: ID does not exist" Mar 08 00:52:43.978432 master-0 kubenswrapper[23041]: I0308 00:52:43.978367 23041 scope.go:117] "RemoveContainer" containerID="1f3057176e0dd4562102f70b1be50a4ad34023eeb08b64d84f58b174131e2e86" Mar 08 00:52:43.979655 master-0 kubenswrapper[23041]: E0308 00:52:43.979440 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f3057176e0dd4562102f70b1be50a4ad34023eeb08b64d84f58b174131e2e86\": container with ID starting with 1f3057176e0dd4562102f70b1be50a4ad34023eeb08b64d84f58b174131e2e86 not found: ID does not exist" containerID="1f3057176e0dd4562102f70b1be50a4ad34023eeb08b64d84f58b174131e2e86" Mar 08 00:52:43.979655 master-0 kubenswrapper[23041]: I0308 00:52:43.979502 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f3057176e0dd4562102f70b1be50a4ad34023eeb08b64d84f58b174131e2e86"} err="failed to get container status \"1f3057176e0dd4562102f70b1be50a4ad34023eeb08b64d84f58b174131e2e86\": rpc error: code = NotFound desc = could not find container \"1f3057176e0dd4562102f70b1be50a4ad34023eeb08b64d84f58b174131e2e86\": container with ID starting with 1f3057176e0dd4562102f70b1be50a4ad34023eeb08b64d84f58b174131e2e86 not found: ID does not exist" Mar 08 00:52:44.009620 master-0 kubenswrapper[23041]: I0308 00:52:44.006589 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/329d163a-757d-4f6d-8497-31129004711b-scripts\") pod \"glance-1280f-default-internal-api-0\" (UID: \"329d163a-757d-4f6d-8497-31129004711b\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:44.009620 master-0 kubenswrapper[23041]: I0308 00:52:44.006675 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvm49\" (UniqueName: 
\"kubernetes.io/projected/329d163a-757d-4f6d-8497-31129004711b-kube-api-access-dvm49\") pod \"glance-1280f-default-internal-api-0\" (UID: \"329d163a-757d-4f6d-8497-31129004711b\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:44.009620 master-0 kubenswrapper[23041]: I0308 00:52:44.006713 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/329d163a-757d-4f6d-8497-31129004711b-config-data\") pod \"glance-1280f-default-internal-api-0\" (UID: \"329d163a-757d-4f6d-8497-31129004711b\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:44.009620 master-0 kubenswrapper[23041]: I0308 00:52:44.006845 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7512aa1f-2488-47af-b61f-945377082816\" (UniqueName: \"kubernetes.io/csi/topolvm.io^f5214b97-526b-4460-9399-392e4c5e0c2e\") pod \"glance-1280f-default-internal-api-0\" (UID: \"329d163a-757d-4f6d-8497-31129004711b\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:44.009620 master-0 kubenswrapper[23041]: I0308 00:52:44.006863 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/329d163a-757d-4f6d-8497-31129004711b-httpd-run\") pod \"glance-1280f-default-internal-api-0\" (UID: \"329d163a-757d-4f6d-8497-31129004711b\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:44.009620 master-0 kubenswrapper[23041]: I0308 00:52:44.006879 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/329d163a-757d-4f6d-8497-31129004711b-internal-tls-certs\") pod \"glance-1280f-default-internal-api-0\" (UID: \"329d163a-757d-4f6d-8497-31129004711b\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:44.009620 master-0 kubenswrapper[23041]: I0308 00:52:44.006897 23041 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/329d163a-757d-4f6d-8497-31129004711b-combined-ca-bundle\") pod \"glance-1280f-default-internal-api-0\" (UID: \"329d163a-757d-4f6d-8497-31129004711b\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:44.009620 master-0 kubenswrapper[23041]: I0308 00:52:44.006918 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/329d163a-757d-4f6d-8497-31129004711b-logs\") pod \"glance-1280f-default-internal-api-0\" (UID: \"329d163a-757d-4f6d-8497-31129004711b\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:44.009620 master-0 kubenswrapper[23041]: I0308 00:52:44.007049 23041 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/60432080-f735-4274-970e-58d2fa71550f-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 08 00:52:44.009620 master-0 kubenswrapper[23041]: I0308 00:52:44.007062 23041 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60432080-f735-4274-970e-58d2fa71550f-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 08 00:52:44.010803 master-0 kubenswrapper[23041]: I0308 00:52:44.010769 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/329d163a-757d-4f6d-8497-31129004711b-scripts\") pod \"glance-1280f-default-internal-api-0\" (UID: \"329d163a-757d-4f6d-8497-31129004711b\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:44.011216 master-0 kubenswrapper[23041]: I0308 00:52:44.011173 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/329d163a-757d-4f6d-8497-31129004711b-httpd-run\") pod 
\"glance-1280f-default-internal-api-0\" (UID: \"329d163a-757d-4f6d-8497-31129004711b\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:44.011590 master-0 kubenswrapper[23041]: I0308 00:52:44.011551 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/329d163a-757d-4f6d-8497-31129004711b-logs\") pod \"glance-1280f-default-internal-api-0\" (UID: \"329d163a-757d-4f6d-8497-31129004711b\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:44.021686 master-0 kubenswrapper[23041]: I0308 00:52:44.016541 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/329d163a-757d-4f6d-8497-31129004711b-combined-ca-bundle\") pod \"glance-1280f-default-internal-api-0\" (UID: \"329d163a-757d-4f6d-8497-31129004711b\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:44.021686 master-0 kubenswrapper[23041]: I0308 00:52:44.021169 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/329d163a-757d-4f6d-8497-31129004711b-config-data\") pod \"glance-1280f-default-internal-api-0\" (UID: \"329d163a-757d-4f6d-8497-31129004711b\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:44.030928 master-0 kubenswrapper[23041]: I0308 00:52:44.030874 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/329d163a-757d-4f6d-8497-31129004711b-internal-tls-certs\") pod \"glance-1280f-default-internal-api-0\" (UID: \"329d163a-757d-4f6d-8497-31129004711b\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:44.031423 master-0 kubenswrapper[23041]: I0308 00:52:44.031383 23041 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 08 00:52:44.031599 master-0 kubenswrapper[23041]: I0308 00:52:44.031569 23041 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7512aa1f-2488-47af-b61f-945377082816\" (UniqueName: \"kubernetes.io/csi/topolvm.io^f5214b97-526b-4460-9399-392e4c5e0c2e\") pod \"glance-1280f-default-internal-api-0\" (UID: \"329d163a-757d-4f6d-8497-31129004711b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/7937ab9a4c8d614dfdb5fc98362cfde4f447c9044ef1b15cf0facb998bc5a885/globalmount\"" pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:44.035218 master-0 kubenswrapper[23041]: I0308 00:52:44.032987 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvm49\" (UniqueName: \"kubernetes.io/projected/329d163a-757d-4f6d-8497-31129004711b-kube-api-access-dvm49\") pod \"glance-1280f-default-internal-api-0\" (UID: \"329d163a-757d-4f6d-8497-31129004711b\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:44.235332 master-0 kubenswrapper[23041]: I0308 00:52:44.235284 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5584778f8f-xqg9r"] Mar 08 00:52:44.284314 master-0 kubenswrapper[23041]: I0308 00:52:44.283183 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5584778f8f-xqg9r"] Mar 08 00:52:44.673549 master-0 kubenswrapper[23041]: I0308 00:52:44.673498 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-70c9925e-bbc2-47ea-836c-8b4fadf77223\" (UniqueName: \"kubernetes.io/csi/topolvm.io^60a4d41a-f098-4c1e-9a4b-472c7a2f79cf\") pod \"glance-1280f-default-external-api-0\" (UID: \"ee2811c7-2c79-45af-afe7-5f3b9ad28474\") " pod="openstack/glance-1280f-default-external-api-0" Mar 08 00:52:44.890528 master-0 kubenswrapper[23041]: I0308 00:52:44.879448 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-1280f-default-external-api-0" Mar 08 00:52:44.890528 master-0 kubenswrapper[23041]: I0308 00:52:44.884446 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60432080-f735-4274-970e-58d2fa71550f" path="/var/lib/kubelet/pods/60432080-f735-4274-970e-58d2fa71550f/volumes" Mar 08 00:52:44.983524 master-0 kubenswrapper[23041]: I0308 00:52:44.980646 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-1280f-default-external-api-0"] Mar 08 00:52:44.990112 master-0 kubenswrapper[23041]: I0308 00:52:44.990026 23041 generic.go:334] "Generic (PLEG): container finished" podID="4528bfc0-76dc-47be-b4e0-cfddeb378c94" containerID="6378cbc250d89163b19065cb86a7c55b962ad4b72fb574257d448043fb441b78" exitCode=0 Mar 08 00:52:44.992429 master-0 kubenswrapper[23041]: I0308 00:52:44.992370 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-846459fb55-9x6r8" event={"ID":"4528bfc0-76dc-47be-b4e0-cfddeb378c94","Type":"ContainerDied","Data":"6378cbc250d89163b19065cb86a7c55b962ad4b72fb574257d448043fb441b78"} Mar 08 00:52:45.198801 master-0 kubenswrapper[23041]: I0308 00:52:45.198537 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-1280f-default-internal-api-0"] Mar 08 00:52:45.200142 master-0 kubenswrapper[23041]: E0308 00:52:45.199313 23041 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[glance], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-1280f-default-internal-api-0" podUID="329d163a-757d-4f6d-8497-31129004711b" Mar 08 00:52:45.812157 master-0 kubenswrapper[23041]: I0308 00:52:45.812103 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-create-9pxq8" Mar 08 00:52:45.859865 master-0 kubenswrapper[23041]: I0308 00:52:45.857234 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74432df1-6a53-4258-932b-6e6aa6c23448-operator-scripts\") pod \"74432df1-6a53-4258-932b-6e6aa6c23448\" (UID: \"74432df1-6a53-4258-932b-6e6aa6c23448\") " Mar 08 00:52:45.859865 master-0 kubenswrapper[23041]: I0308 00:52:45.857349 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8f8q\" (UniqueName: \"kubernetes.io/projected/74432df1-6a53-4258-932b-6e6aa6c23448-kube-api-access-n8f8q\") pod \"74432df1-6a53-4258-932b-6e6aa6c23448\" (UID: \"74432df1-6a53-4258-932b-6e6aa6c23448\") " Mar 08 00:52:45.864507 master-0 kubenswrapper[23041]: I0308 00:52:45.861150 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74432df1-6a53-4258-932b-6e6aa6c23448-kube-api-access-n8f8q" (OuterVolumeSpecName: "kube-api-access-n8f8q") pod "74432df1-6a53-4258-932b-6e6aa6c23448" (UID: "74432df1-6a53-4258-932b-6e6aa6c23448"). InnerVolumeSpecName "kube-api-access-n8f8q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:52:45.864824 master-0 kubenswrapper[23041]: I0308 00:52:45.864764 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74432df1-6a53-4258-932b-6e6aa6c23448-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "74432df1-6a53-4258-932b-6e6aa6c23448" (UID: "74432df1-6a53-4258-932b-6e6aa6c23448"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:52:45.960274 master-0 kubenswrapper[23041]: I0308 00:52:45.960178 23041 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74432df1-6a53-4258-932b-6e6aa6c23448-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 00:52:45.960274 master-0 kubenswrapper[23041]: I0308 00:52:45.960241 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8f8q\" (UniqueName: \"kubernetes.io/projected/74432df1-6a53-4258-932b-6e6aa6c23448-kube-api-access-n8f8q\") on node \"master-0\" DevicePath \"\"" Mar 08 00:52:45.970731 master-0 kubenswrapper[23041]: I0308 00:52:45.970665 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-39c2-account-create-update-jbfp7" Mar 08 00:52:46.010319 master-0 kubenswrapper[23041]: I0308 00:52:46.001528 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-39c2-account-create-update-jbfp7" event={"ID":"cfaf8fd1-f241-4f4b-ab62-4d04cef718dd","Type":"ContainerDied","Data":"db58e5c4af71bfba2e72b206c15c055785b1bb8ca3237a3a2ae052bbdda6cb0a"} Mar 08 00:52:46.010319 master-0 kubenswrapper[23041]: I0308 00:52:46.001574 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db58e5c4af71bfba2e72b206c15c055785b1bb8ca3237a3a2ae052bbdda6cb0a" Mar 08 00:52:46.010319 master-0 kubenswrapper[23041]: I0308 00:52:46.001637 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-39c2-account-create-update-jbfp7" Mar 08 00:52:46.019708 master-0 kubenswrapper[23041]: I0308 00:52:46.018517 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-9pxq8" event={"ID":"74432df1-6a53-4258-932b-6e6aa6c23448","Type":"ContainerDied","Data":"c6b2b9f15bd9e6a1e00d15e7902c6a5ef60366237832804a93618f8f422a336d"} Mar 08 00:52:46.019708 master-0 kubenswrapper[23041]: I0308 00:52:46.018569 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6b2b9f15bd9e6a1e00d15e7902c6a5ef60366237832804a93618f8f422a336d" Mar 08 00:52:46.019708 master-0 kubenswrapper[23041]: I0308 00:52:46.018642 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-create-9pxq8" Mar 08 00:52:46.035566 master-0 kubenswrapper[23041]: I0308 00:52:46.034521 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-1280f-default-external-api-0"] Mar 08 00:52:46.058474 master-0 kubenswrapper[23041]: I0308 00:52:46.058377 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:46.059230 master-0 kubenswrapper[23041]: I0308 00:52:46.059157 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-846459fb55-9x6r8" event={"ID":"4528bfc0-76dc-47be-b4e0-cfddeb378c94","Type":"ContainerStarted","Data":"d2ef1dc98f27539292343db5442637e3e74e48cd923adeb67381f39e1874f2d9"} Mar 08 00:52:46.060619 master-0 kubenswrapper[23041]: I0308 00:52:46.060491 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-846459fb55-9x6r8" Mar 08 00:52:46.061473 master-0 kubenswrapper[23041]: I0308 00:52:46.061439 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m48kw\" (UniqueName: \"kubernetes.io/projected/cfaf8fd1-f241-4f4b-ab62-4d04cef718dd-kube-api-access-m48kw\") pod \"cfaf8fd1-f241-4f4b-ab62-4d04cef718dd\" (UID: \"cfaf8fd1-f241-4f4b-ab62-4d04cef718dd\") " Mar 08 00:52:46.061779 master-0 kubenswrapper[23041]: I0308 00:52:46.061739 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfaf8fd1-f241-4f4b-ab62-4d04cef718dd-operator-scripts\") pod \"cfaf8fd1-f241-4f4b-ab62-4d04cef718dd\" (UID: \"cfaf8fd1-f241-4f4b-ab62-4d04cef718dd\") " Mar 08 00:52:46.064070 master-0 kubenswrapper[23041]: I0308 00:52:46.063959 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfaf8fd1-f241-4f4b-ab62-4d04cef718dd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cfaf8fd1-f241-4f4b-ab62-4d04cef718dd" (UID: "cfaf8fd1-f241-4f4b-ab62-4d04cef718dd"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:52:46.080826 master-0 kubenswrapper[23041]: I0308 00:52:46.080769 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfaf8fd1-f241-4f4b-ab62-4d04cef718dd-kube-api-access-m48kw" (OuterVolumeSpecName: "kube-api-access-m48kw") pod "cfaf8fd1-f241-4f4b-ab62-4d04cef718dd" (UID: "cfaf8fd1-f241-4f4b-ab62-4d04cef718dd"). InnerVolumeSpecName "kube-api-access-m48kw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:52:46.082360 master-0 kubenswrapper[23041]: I0308 00:52:46.082326 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m48kw\" (UniqueName: \"kubernetes.io/projected/cfaf8fd1-f241-4f4b-ab62-4d04cef718dd-kube-api-access-m48kw\") on node \"master-0\" DevicePath \"\"" Mar 08 00:52:46.083015 master-0 kubenswrapper[23041]: I0308 00:52:46.082998 23041 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfaf8fd1-f241-4f4b-ab62-4d04cef718dd-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 00:52:46.094291 master-0 kubenswrapper[23041]: I0308 00:52:46.091450 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:46.104189 master-0 kubenswrapper[23041]: I0308 00:52:46.104098 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-846459fb55-9x6r8" podStartSLOduration=5.104071474 podStartE2EDuration="5.104071474s" podCreationTimestamp="2026-03-08 00:52:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:52:46.08713685 +0000 UTC m=+1271.559973404" watchObservedRunningTime="2026-03-08 00:52:46.104071474 +0000 UTC m=+1271.576908028" Mar 08 00:52:46.111392 master-0 kubenswrapper[23041]: I0308 00:52:46.111354 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7512aa1f-2488-47af-b61f-945377082816\" (UniqueName: \"kubernetes.io/csi/topolvm.io^f5214b97-526b-4460-9399-392e4c5e0c2e\") pod \"glance-1280f-default-internal-api-0\" (UID: \"329d163a-757d-4f6d-8497-31129004711b\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:46.187665 master-0 kubenswrapper[23041]: I0308 00:52:46.187594 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/329d163a-757d-4f6d-8497-31129004711b-internal-tls-certs\") pod \"329d163a-757d-4f6d-8497-31129004711b\" (UID: \"329d163a-757d-4f6d-8497-31129004711b\") " Mar 08 00:52:46.187882 master-0 kubenswrapper[23041]: I0308 00:52:46.187856 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/329d163a-757d-4f6d-8497-31129004711b-httpd-run\") pod \"329d163a-757d-4f6d-8497-31129004711b\" (UID: \"329d163a-757d-4f6d-8497-31129004711b\") " Mar 08 00:52:46.187953 master-0 kubenswrapper[23041]: I0308 00:52:46.187911 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/329d163a-757d-4f6d-8497-31129004711b-config-data\") pod \"329d163a-757d-4f6d-8497-31129004711b\" (UID: \"329d163a-757d-4f6d-8497-31129004711b\") " Mar 08 00:52:46.187992 master-0 kubenswrapper[23041]: I0308 00:52:46.187960 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/329d163a-757d-4f6d-8497-31129004711b-logs\") pod \"329d163a-757d-4f6d-8497-31129004711b\" (UID: \"329d163a-757d-4f6d-8497-31129004711b\") " Mar 08 00:52:46.188156 master-0 kubenswrapper[23041]: I0308 00:52:46.188090 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^f5214b97-526b-4460-9399-392e4c5e0c2e\") pod \"329d163a-757d-4f6d-8497-31129004711b\" (UID: \"329d163a-757d-4f6d-8497-31129004711b\") " Mar 08 00:52:46.188219 master-0 kubenswrapper[23041]: I0308 00:52:46.188157 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/329d163a-757d-4f6d-8497-31129004711b-combined-ca-bundle\") pod \"329d163a-757d-4f6d-8497-31129004711b\" (UID: \"329d163a-757d-4f6d-8497-31129004711b\") " Mar 08 00:52:46.188219 master-0 kubenswrapper[23041]: I0308 00:52:46.188191 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/329d163a-757d-4f6d-8497-31129004711b-scripts\") pod \"329d163a-757d-4f6d-8497-31129004711b\" (UID: \"329d163a-757d-4f6d-8497-31129004711b\") " Mar 08 00:52:46.188292 master-0 kubenswrapper[23041]: I0308 00:52:46.188236 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvm49\" (UniqueName: \"kubernetes.io/projected/329d163a-757d-4f6d-8497-31129004711b-kube-api-access-dvm49\") pod \"329d163a-757d-4f6d-8497-31129004711b\" (UID: \"329d163a-757d-4f6d-8497-31129004711b\") " Mar 08 
00:52:46.189533 master-0 kubenswrapper[23041]: I0308 00:52:46.189351 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/329d163a-757d-4f6d-8497-31129004711b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "329d163a-757d-4f6d-8497-31129004711b" (UID: "329d163a-757d-4f6d-8497-31129004711b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:52:46.189871 master-0 kubenswrapper[23041]: I0308 00:52:46.189835 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/329d163a-757d-4f6d-8497-31129004711b-logs" (OuterVolumeSpecName: "logs") pod "329d163a-757d-4f6d-8497-31129004711b" (UID: "329d163a-757d-4f6d-8497-31129004711b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:52:46.193269 master-0 kubenswrapper[23041]: I0308 00:52:46.193236 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/329d163a-757d-4f6d-8497-31129004711b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "329d163a-757d-4f6d-8497-31129004711b" (UID: "329d163a-757d-4f6d-8497-31129004711b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:52:46.194522 master-0 kubenswrapper[23041]: I0308 00:52:46.194465 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/329d163a-757d-4f6d-8497-31129004711b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "329d163a-757d-4f6d-8497-31129004711b" (UID: "329d163a-757d-4f6d-8497-31129004711b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:52:46.194522 master-0 kubenswrapper[23041]: I0308 00:52:46.194499 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/329d163a-757d-4f6d-8497-31129004711b-config-data" (OuterVolumeSpecName: "config-data") pod "329d163a-757d-4f6d-8497-31129004711b" (UID: "329d163a-757d-4f6d-8497-31129004711b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:52:46.194938 master-0 kubenswrapper[23041]: I0308 00:52:46.194907 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/329d163a-757d-4f6d-8497-31129004711b-scripts" (OuterVolumeSpecName: "scripts") pod "329d163a-757d-4f6d-8497-31129004711b" (UID: "329d163a-757d-4f6d-8497-31129004711b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:52:46.196977 master-0 kubenswrapper[23041]: I0308 00:52:46.196946 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/329d163a-757d-4f6d-8497-31129004711b-kube-api-access-dvm49" (OuterVolumeSpecName: "kube-api-access-dvm49") pod "329d163a-757d-4f6d-8497-31129004711b" (UID: "329d163a-757d-4f6d-8497-31129004711b"). InnerVolumeSpecName "kube-api-access-dvm49". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:52:46.240332 master-0 kubenswrapper[23041]: I0308 00:52:46.234716 23041 trace.go:236] Trace[970655528]: "Calculate volume metrics of mysql-db for pod openstack/openstack-galera-0" (08-Mar-2026 00:52:44.730) (total time: 1504ms): Mar 08 00:52:46.240332 master-0 kubenswrapper[23041]: Trace[970655528]: [1.504106585s] [1.504106585s] END Mar 08 00:52:46.262847 master-0 kubenswrapper[23041]: I0308 00:52:46.262725 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^f5214b97-526b-4460-9399-392e4c5e0c2e" (OuterVolumeSpecName: "glance") pod "329d163a-757d-4f6d-8497-31129004711b" (UID: "329d163a-757d-4f6d-8497-31129004711b"). InnerVolumeSpecName "pvc-7512aa1f-2488-47af-b61f-945377082816". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 08 00:52:46.290739 master-0 kubenswrapper[23041]: I0308 00:52:46.290675 23041 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/329d163a-757d-4f6d-8497-31129004711b-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 00:52:46.290739 master-0 kubenswrapper[23041]: I0308 00:52:46.290719 23041 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/329d163a-757d-4f6d-8497-31129004711b-logs\") on node \"master-0\" DevicePath \"\"" Mar 08 00:52:46.290983 master-0 kubenswrapper[23041]: I0308 00:52:46.290759 23041 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-7512aa1f-2488-47af-b61f-945377082816\" (UniqueName: \"kubernetes.io/csi/topolvm.io^f5214b97-526b-4460-9399-392e4c5e0c2e\") on node \"master-0\" " Mar 08 00:52:46.290983 master-0 kubenswrapper[23041]: I0308 00:52:46.290774 23041 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/329d163a-757d-4f6d-8497-31129004711b-combined-ca-bundle\") on node \"master-0\" 
DevicePath \"\"" Mar 08 00:52:46.290983 master-0 kubenswrapper[23041]: I0308 00:52:46.290785 23041 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/329d163a-757d-4f6d-8497-31129004711b-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 00:52:46.290983 master-0 kubenswrapper[23041]: I0308 00:52:46.290795 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvm49\" (UniqueName: \"kubernetes.io/projected/329d163a-757d-4f6d-8497-31129004711b-kube-api-access-dvm49\") on node \"master-0\" DevicePath \"\"" Mar 08 00:52:46.290983 master-0 kubenswrapper[23041]: I0308 00:52:46.290805 23041 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/329d163a-757d-4f6d-8497-31129004711b-internal-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 08 00:52:46.290983 master-0 kubenswrapper[23041]: I0308 00:52:46.290816 23041 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/329d163a-757d-4f6d-8497-31129004711b-httpd-run\") on node \"master-0\" DevicePath \"\"" Mar 08 00:52:46.316441 master-0 kubenswrapper[23041]: I0308 00:52:46.316390 23041 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 08 00:52:46.316691 master-0 kubenswrapper[23041]: I0308 00:52:46.316604 23041 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-7512aa1f-2488-47af-b61f-945377082816" (UniqueName: "kubernetes.io/csi/topolvm.io^f5214b97-526b-4460-9399-392e4c5e0c2e") on node "master-0" Mar 08 00:52:46.393160 master-0 kubenswrapper[23041]: I0308 00:52:46.393099 23041 reconciler_common.go:293] "Volume detached for volume \"pvc-7512aa1f-2488-47af-b61f-945377082816\" (UniqueName: \"kubernetes.io/csi/topolvm.io^f5214b97-526b-4460-9399-392e4c5e0c2e\") on node \"master-0\" DevicePath \"\"" Mar 08 00:52:47.082707 master-0 kubenswrapper[23041]: I0308 00:52:47.082651 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1280f-default-external-api-0" event={"ID":"ee2811c7-2c79-45af-afe7-5f3b9ad28474","Type":"ContainerStarted","Data":"1e1c07e1955362252af0e37336bd3fc0f44d0401efd52227da40a38816747730"} Mar 08 00:52:47.082707 master-0 kubenswrapper[23041]: I0308 00:52:47.082711 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1280f-default-external-api-0" event={"ID":"ee2811c7-2c79-45af-afe7-5f3b9ad28474","Type":"ContainerStarted","Data":"96d5377d3e7d7bb223ab6fb583c109cc491eeb445b24a6f227adcd088e5539fb"} Mar 08 00:52:47.082707 master-0 kubenswrapper[23041]: I0308 00:52:47.082678 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:47.148381 master-0 kubenswrapper[23041]: I0308 00:52:47.146758 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-1280f-default-internal-api-0"] Mar 08 00:52:47.206375 master-0 kubenswrapper[23041]: I0308 00:52:47.204468 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-1280f-default-internal-api-0"] Mar 08 00:52:47.218634 master-0 kubenswrapper[23041]: I0308 00:52:47.218142 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-1280f-default-internal-api-0"] Mar 08 00:52:47.219534 master-0 kubenswrapper[23041]: E0308 00:52:47.218952 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfaf8fd1-f241-4f4b-ab62-4d04cef718dd" containerName="mariadb-account-create-update" Mar 08 00:52:47.219534 master-0 kubenswrapper[23041]: I0308 00:52:47.218973 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfaf8fd1-f241-4f4b-ab62-4d04cef718dd" containerName="mariadb-account-create-update" Mar 08 00:52:47.219534 master-0 kubenswrapper[23041]: E0308 00:52:47.218990 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74432df1-6a53-4258-932b-6e6aa6c23448" containerName="mariadb-database-create" Mar 08 00:52:47.219534 master-0 kubenswrapper[23041]: I0308 00:52:47.219000 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="74432df1-6a53-4258-932b-6e6aa6c23448" containerName="mariadb-database-create" Mar 08 00:52:47.220342 master-0 kubenswrapper[23041]: I0308 00:52:47.220301 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfaf8fd1-f241-4f4b-ab62-4d04cef718dd" containerName="mariadb-account-create-update" Mar 08 00:52:47.220342 master-0 kubenswrapper[23041]: I0308 00:52:47.220341 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="74432df1-6a53-4258-932b-6e6aa6c23448" containerName="mariadb-database-create" Mar 08 
00:52:47.225370 master-0 kubenswrapper[23041]: I0308 00:52:47.225278 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1280f-default-internal-api-0"] Mar 08 00:52:47.225924 master-0 kubenswrapper[23041]: I0308 00:52:47.225858 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:47.232272 master-0 kubenswrapper[23041]: I0308 00:52:47.232160 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-1280f-default-internal-config-data" Mar 08 00:52:47.243429 master-0 kubenswrapper[23041]: I0308 00:52:47.243372 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 08 00:52:47.315946 master-0 kubenswrapper[23041]: I0308 00:52:47.315872 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v9vh\" (UniqueName: \"kubernetes.io/projected/c81e602d-26e5-49ac-92d1-71fea2607868-kube-api-access-8v9vh\") pod \"glance-1280f-default-internal-api-0\" (UID: \"c81e602d-26e5-49ac-92d1-71fea2607868\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:47.315946 master-0 kubenswrapper[23041]: I0308 00:52:47.315944 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7512aa1f-2488-47af-b61f-945377082816\" (UniqueName: \"kubernetes.io/csi/topolvm.io^f5214b97-526b-4460-9399-392e4c5e0c2e\") pod \"glance-1280f-default-internal-api-0\" (UID: \"c81e602d-26e5-49ac-92d1-71fea2607868\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:47.316279 master-0 kubenswrapper[23041]: I0308 00:52:47.316046 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c81e602d-26e5-49ac-92d1-71fea2607868-config-data\") pod 
\"glance-1280f-default-internal-api-0\" (UID: \"c81e602d-26e5-49ac-92d1-71fea2607868\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:47.316279 master-0 kubenswrapper[23041]: I0308 00:52:47.316088 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c81e602d-26e5-49ac-92d1-71fea2607868-logs\") pod \"glance-1280f-default-internal-api-0\" (UID: \"c81e602d-26e5-49ac-92d1-71fea2607868\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:47.316279 master-0 kubenswrapper[23041]: I0308 00:52:47.316171 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c81e602d-26e5-49ac-92d1-71fea2607868-internal-tls-certs\") pod \"glance-1280f-default-internal-api-0\" (UID: \"c81e602d-26e5-49ac-92d1-71fea2607868\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:47.316279 master-0 kubenswrapper[23041]: I0308 00:52:47.316257 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c81e602d-26e5-49ac-92d1-71fea2607868-scripts\") pod \"glance-1280f-default-internal-api-0\" (UID: \"c81e602d-26e5-49ac-92d1-71fea2607868\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:47.316479 master-0 kubenswrapper[23041]: I0308 00:52:47.316300 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c81e602d-26e5-49ac-92d1-71fea2607868-combined-ca-bundle\") pod \"glance-1280f-default-internal-api-0\" (UID: \"c81e602d-26e5-49ac-92d1-71fea2607868\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:47.316479 master-0 kubenswrapper[23041]: I0308 00:52:47.316352 23041 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c81e602d-26e5-49ac-92d1-71fea2607868-httpd-run\") pod \"glance-1280f-default-internal-api-0\" (UID: \"c81e602d-26e5-49ac-92d1-71fea2607868\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:47.421251 master-0 kubenswrapper[23041]: I0308 00:52:47.421098 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v9vh\" (UniqueName: \"kubernetes.io/projected/c81e602d-26e5-49ac-92d1-71fea2607868-kube-api-access-8v9vh\") pod \"glance-1280f-default-internal-api-0\" (UID: \"c81e602d-26e5-49ac-92d1-71fea2607868\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:47.421251 master-0 kubenswrapper[23041]: I0308 00:52:47.421163 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7512aa1f-2488-47af-b61f-945377082816\" (UniqueName: \"kubernetes.io/csi/topolvm.io^f5214b97-526b-4460-9399-392e4c5e0c2e\") pod \"glance-1280f-default-internal-api-0\" (UID: \"c81e602d-26e5-49ac-92d1-71fea2607868\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:47.421496 master-0 kubenswrapper[23041]: I0308 00:52:47.421262 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c81e602d-26e5-49ac-92d1-71fea2607868-config-data\") pod \"glance-1280f-default-internal-api-0\" (UID: \"c81e602d-26e5-49ac-92d1-71fea2607868\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:47.421496 master-0 kubenswrapper[23041]: I0308 00:52:47.421293 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c81e602d-26e5-49ac-92d1-71fea2607868-logs\") pod \"glance-1280f-default-internal-api-0\" (UID: \"c81e602d-26e5-49ac-92d1-71fea2607868\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 
00:52:47.421496 master-0 kubenswrapper[23041]: I0308 00:52:47.421355 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c81e602d-26e5-49ac-92d1-71fea2607868-internal-tls-certs\") pod \"glance-1280f-default-internal-api-0\" (UID: \"c81e602d-26e5-49ac-92d1-71fea2607868\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:47.421496 master-0 kubenswrapper[23041]: I0308 00:52:47.421400 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c81e602d-26e5-49ac-92d1-71fea2607868-scripts\") pod \"glance-1280f-default-internal-api-0\" (UID: \"c81e602d-26e5-49ac-92d1-71fea2607868\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:47.421496 master-0 kubenswrapper[23041]: I0308 00:52:47.421430 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c81e602d-26e5-49ac-92d1-71fea2607868-combined-ca-bundle\") pod \"glance-1280f-default-internal-api-0\" (UID: \"c81e602d-26e5-49ac-92d1-71fea2607868\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:47.421496 master-0 kubenswrapper[23041]: I0308 00:52:47.421467 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c81e602d-26e5-49ac-92d1-71fea2607868-httpd-run\") pod \"glance-1280f-default-internal-api-0\" (UID: \"c81e602d-26e5-49ac-92d1-71fea2607868\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:47.422604 master-0 kubenswrapper[23041]: I0308 00:52:47.422316 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c81e602d-26e5-49ac-92d1-71fea2607868-httpd-run\") pod \"glance-1280f-default-internal-api-0\" (UID: \"c81e602d-26e5-49ac-92d1-71fea2607868\") " 
pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:47.422604 master-0 kubenswrapper[23041]: I0308 00:52:47.422360 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c81e602d-26e5-49ac-92d1-71fea2607868-logs\") pod \"glance-1280f-default-internal-api-0\" (UID: \"c81e602d-26e5-49ac-92d1-71fea2607868\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:47.425072 master-0 kubenswrapper[23041]: I0308 00:52:47.424895 23041 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 08 00:52:47.425072 master-0 kubenswrapper[23041]: I0308 00:52:47.424946 23041 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7512aa1f-2488-47af-b61f-945377082816\" (UniqueName: \"kubernetes.io/csi/topolvm.io^f5214b97-526b-4460-9399-392e4c5e0c2e\") pod \"glance-1280f-default-internal-api-0\" (UID: \"c81e602d-26e5-49ac-92d1-71fea2607868\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/7937ab9a4c8d614dfdb5fc98362cfde4f447c9044ef1b15cf0facb998bc5a885/globalmount\"" pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:47.433322 master-0 kubenswrapper[23041]: I0308 00:52:47.432983 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c81e602d-26e5-49ac-92d1-71fea2607868-scripts\") pod \"glance-1280f-default-internal-api-0\" (UID: \"c81e602d-26e5-49ac-92d1-71fea2607868\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:47.433459 master-0 kubenswrapper[23041]: I0308 00:52:47.433320 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c81e602d-26e5-49ac-92d1-71fea2607868-combined-ca-bundle\") pod \"glance-1280f-default-internal-api-0\" (UID: \"c81e602d-26e5-49ac-92d1-71fea2607868\") " 
pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:47.433459 master-0 kubenswrapper[23041]: I0308 00:52:47.433045 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c81e602d-26e5-49ac-92d1-71fea2607868-internal-tls-certs\") pod \"glance-1280f-default-internal-api-0\" (UID: \"c81e602d-26e5-49ac-92d1-71fea2607868\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:47.435059 master-0 kubenswrapper[23041]: I0308 00:52:47.435032 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c81e602d-26e5-49ac-92d1-71fea2607868-config-data\") pod \"glance-1280f-default-internal-api-0\" (UID: \"c81e602d-26e5-49ac-92d1-71fea2607868\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:47.437622 master-0 kubenswrapper[23041]: I0308 00:52:47.437580 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v9vh\" (UniqueName: \"kubernetes.io/projected/c81e602d-26e5-49ac-92d1-71fea2607868-kube-api-access-8v9vh\") pod \"glance-1280f-default-internal-api-0\" (UID: \"c81e602d-26e5-49ac-92d1-71fea2607868\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:48.778099 master-0 kubenswrapper[23041]: I0308 00:52:48.778042 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7512aa1f-2488-47af-b61f-945377082816\" (UniqueName: \"kubernetes.io/csi/topolvm.io^f5214b97-526b-4460-9399-392e4c5e0c2e\") pod \"glance-1280f-default-internal-api-0\" (UID: \"c81e602d-26e5-49ac-92d1-71fea2607868\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:48.830597 master-0 kubenswrapper[23041]: I0308 00:52:48.830456 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="329d163a-757d-4f6d-8497-31129004711b" path="/var/lib/kubelet/pods/329d163a-757d-4f6d-8497-31129004711b/volumes" Mar 08 
00:52:49.056238 master-0 kubenswrapper[23041]: I0308 00:52:49.056157 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:49.877434 master-0 kubenswrapper[23041]: I0308 00:52:49.870867 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-db-sync-hxms8"] Mar 08 00:52:49.877434 master-0 kubenswrapper[23041]: I0308 00:52:49.873316 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-sync-hxms8" Mar 08 00:52:49.877833 master-0 kubenswrapper[23041]: I0308 00:52:49.877493 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-scripts" Mar 08 00:52:49.877833 master-0 kubenswrapper[23041]: I0308 00:52:49.877727 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-config-data" Mar 08 00:52:49.885342 master-0 kubenswrapper[23041]: I0308 00:52:49.882697 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24c18a5d-ebab-491a-8bf4-f6271242e4f3-combined-ca-bundle\") pod \"ironic-db-sync-hxms8\" (UID: \"24c18a5d-ebab-491a-8bf4-f6271242e4f3\") " pod="openstack/ironic-db-sync-hxms8" Mar 08 00:52:49.885342 master-0 kubenswrapper[23041]: I0308 00:52:49.882822 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/24c18a5d-ebab-491a-8bf4-f6271242e4f3-config-data-merged\") pod \"ironic-db-sync-hxms8\" (UID: \"24c18a5d-ebab-491a-8bf4-f6271242e4f3\") " pod="openstack/ironic-db-sync-hxms8" Mar 08 00:52:49.885342 master-0 kubenswrapper[23041]: I0308 00:52:49.882903 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn2dr\" (UniqueName: 
\"kubernetes.io/projected/24c18a5d-ebab-491a-8bf4-f6271242e4f3-kube-api-access-mn2dr\") pod \"ironic-db-sync-hxms8\" (UID: \"24c18a5d-ebab-491a-8bf4-f6271242e4f3\") " pod="openstack/ironic-db-sync-hxms8" Mar 08 00:52:49.885342 master-0 kubenswrapper[23041]: I0308 00:52:49.883021 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24c18a5d-ebab-491a-8bf4-f6271242e4f3-config-data\") pod \"ironic-db-sync-hxms8\" (UID: \"24c18a5d-ebab-491a-8bf4-f6271242e4f3\") " pod="openstack/ironic-db-sync-hxms8" Mar 08 00:52:49.885342 master-0 kubenswrapper[23041]: I0308 00:52:49.883501 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/24c18a5d-ebab-491a-8bf4-f6271242e4f3-etc-podinfo\") pod \"ironic-db-sync-hxms8\" (UID: \"24c18a5d-ebab-491a-8bf4-f6271242e4f3\") " pod="openstack/ironic-db-sync-hxms8" Mar 08 00:52:49.885342 master-0 kubenswrapper[23041]: I0308 00:52:49.883857 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24c18a5d-ebab-491a-8bf4-f6271242e4f3-scripts\") pod \"ironic-db-sync-hxms8\" (UID: \"24c18a5d-ebab-491a-8bf4-f6271242e4f3\") " pod="openstack/ironic-db-sync-hxms8" Mar 08 00:52:49.899481 master-0 kubenswrapper[23041]: I0308 00:52:49.898972 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-sync-hxms8"] Mar 08 00:52:49.986459 master-0 kubenswrapper[23041]: I0308 00:52:49.986234 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24c18a5d-ebab-491a-8bf4-f6271242e4f3-config-data\") pod \"ironic-db-sync-hxms8\" (UID: \"24c18a5d-ebab-491a-8bf4-f6271242e4f3\") " pod="openstack/ironic-db-sync-hxms8" Mar 08 00:52:49.986459 master-0 kubenswrapper[23041]: 
I0308 00:52:49.986330 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/24c18a5d-ebab-491a-8bf4-f6271242e4f3-etc-podinfo\") pod \"ironic-db-sync-hxms8\" (UID: \"24c18a5d-ebab-491a-8bf4-f6271242e4f3\") " pod="openstack/ironic-db-sync-hxms8"
Mar 08 00:52:49.986459 master-0 kubenswrapper[23041]: I0308 00:52:49.986364 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24c18a5d-ebab-491a-8bf4-f6271242e4f3-scripts\") pod \"ironic-db-sync-hxms8\" (UID: \"24c18a5d-ebab-491a-8bf4-f6271242e4f3\") " pod="openstack/ironic-db-sync-hxms8"
Mar 08 00:52:49.986459 master-0 kubenswrapper[23041]: I0308 00:52:49.986403 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24c18a5d-ebab-491a-8bf4-f6271242e4f3-combined-ca-bundle\") pod \"ironic-db-sync-hxms8\" (UID: \"24c18a5d-ebab-491a-8bf4-f6271242e4f3\") " pod="openstack/ironic-db-sync-hxms8"
Mar 08 00:52:49.986756 master-0 kubenswrapper[23041]: I0308 00:52:49.986468 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/24c18a5d-ebab-491a-8bf4-f6271242e4f3-config-data-merged\") pod \"ironic-db-sync-hxms8\" (UID: \"24c18a5d-ebab-491a-8bf4-f6271242e4f3\") " pod="openstack/ironic-db-sync-hxms8"
Mar 08 00:52:49.986756 master-0 kubenswrapper[23041]: I0308 00:52:49.986508 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn2dr\" (UniqueName: \"kubernetes.io/projected/24c18a5d-ebab-491a-8bf4-f6271242e4f3-kube-api-access-mn2dr\") pod \"ironic-db-sync-hxms8\" (UID: \"24c18a5d-ebab-491a-8bf4-f6271242e4f3\") " pod="openstack/ironic-db-sync-hxms8"
Mar 08 00:52:49.988165 master-0 kubenswrapper[23041]: I0308 00:52:49.988132 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/24c18a5d-ebab-491a-8bf4-f6271242e4f3-config-data-merged\") pod \"ironic-db-sync-hxms8\" (UID: \"24c18a5d-ebab-491a-8bf4-f6271242e4f3\") " pod="openstack/ironic-db-sync-hxms8"
Mar 08 00:52:49.991694 master-0 kubenswrapper[23041]: I0308 00:52:49.991665 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24c18a5d-ebab-491a-8bf4-f6271242e4f3-config-data\") pod \"ironic-db-sync-hxms8\" (UID: \"24c18a5d-ebab-491a-8bf4-f6271242e4f3\") " pod="openstack/ironic-db-sync-hxms8"
Mar 08 00:52:50.004413 master-0 kubenswrapper[23041]: I0308 00:52:50.004356 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24c18a5d-ebab-491a-8bf4-f6271242e4f3-combined-ca-bundle\") pod \"ironic-db-sync-hxms8\" (UID: \"24c18a5d-ebab-491a-8bf4-f6271242e4f3\") " pod="openstack/ironic-db-sync-hxms8"
Mar 08 00:52:50.020621 master-0 kubenswrapper[23041]: I0308 00:52:50.020555 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1280f-default-internal-api-0"]
Mar 08 00:52:50.023651 master-0 kubenswrapper[23041]: I0308 00:52:50.023612 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn2dr\" (UniqueName: \"kubernetes.io/projected/24c18a5d-ebab-491a-8bf4-f6271242e4f3-kube-api-access-mn2dr\") pod \"ironic-db-sync-hxms8\" (UID: \"24c18a5d-ebab-491a-8bf4-f6271242e4f3\") " pod="openstack/ironic-db-sync-hxms8"
Mar 08 00:52:50.026849 master-0 kubenswrapper[23041]: I0308 00:52:50.026707 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/24c18a5d-ebab-491a-8bf4-f6271242e4f3-etc-podinfo\") pod \"ironic-db-sync-hxms8\" (UID: \"24c18a5d-ebab-491a-8bf4-f6271242e4f3\") " pod="openstack/ironic-db-sync-hxms8"
Mar 08 00:52:50.027848 master-0 kubenswrapper[23041]: I0308 00:52:50.027814 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24c18a5d-ebab-491a-8bf4-f6271242e4f3-scripts\") pod \"ironic-db-sync-hxms8\" (UID: \"24c18a5d-ebab-491a-8bf4-f6271242e4f3\") " pod="openstack/ironic-db-sync-hxms8"
Mar 08 00:52:50.031908 master-0 kubenswrapper[23041]: W0308 00:52:50.031728 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc81e602d_26e5_49ac_92d1_71fea2607868.slice/crio-335191ee8e87e94345fbe2487bf145b34d18bafafb0ac0996779eacbdf40dcdc WatchSource:0}: Error finding container 335191ee8e87e94345fbe2487bf145b34d18bafafb0ac0996779eacbdf40dcdc: Status 404 returned error can't find the container with id 335191ee8e87e94345fbe2487bf145b34d18bafafb0ac0996779eacbdf40dcdc
Mar 08 00:52:50.143753 master-0 kubenswrapper[23041]: I0308 00:52:50.143708 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8rrvj" event={"ID":"88a16b7d-3a7c-4b23-9f7c-448fea1247e1","Type":"ContainerStarted","Data":"9cb4492d48ba3f747baf51a9b4d5267fa579cb8b0df1b847cd2005bb1f238a28"}
Mar 08 00:52:50.149308 master-0 kubenswrapper[23041]: I0308 00:52:50.149255 23041 generic.go:334] "Generic (PLEG): container finished" podID="8d4a0f9d-e6ed-4676-8a30-350a4a6907ed" containerID="b61989e3130b4a8de0150f5c936dfb8c14d3db55340921bd2a521eb497b101dd" exitCode=0
Mar 08 00:52:50.149473 master-0 kubenswrapper[23041]: I0308 00:52:50.149336 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4czgd" event={"ID":"8d4a0f9d-e6ed-4676-8a30-350a4a6907ed","Type":"ContainerDied","Data":"b61989e3130b4a8de0150f5c936dfb8c14d3db55340921bd2a521eb497b101dd"}
Mar 08 00:52:50.151945 master-0 kubenswrapper[23041]: I0308 00:52:50.151909 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1280f-default-internal-api-0" event={"ID":"c81e602d-26e5-49ac-92d1-71fea2607868","Type":"ContainerStarted","Data":"335191ee8e87e94345fbe2487bf145b34d18bafafb0ac0996779eacbdf40dcdc"}
Mar 08 00:52:50.170118 master-0 kubenswrapper[23041]: I0308 00:52:50.170015 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-8rrvj" podStartSLOduration=4.490643154 podStartE2EDuration="11.169991619s" podCreationTimestamp="2026-03-08 00:52:39 +0000 UTC" firstStartedPulling="2026-03-08 00:52:42.932985186 +0000 UTC m=+1268.405821740" lastFinishedPulling="2026-03-08 00:52:49.612333651 +0000 UTC m=+1275.085170205" observedRunningTime="2026-03-08 00:52:50.165275133 +0000 UTC m=+1275.638111687" watchObservedRunningTime="2026-03-08 00:52:50.169991619 +0000 UTC m=+1275.642828183"
Mar 08 00:52:50.194666 master-0 kubenswrapper[23041]: I0308 00:52:50.194534 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-sync-hxms8"
Mar 08 00:52:51.078007 master-0 kubenswrapper[23041]: I0308 00:52:51.077957 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-sync-hxms8"]
Mar 08 00:52:51.197672 master-0 kubenswrapper[23041]: I0308 00:52:51.197420 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1280f-default-internal-api-0" event={"ID":"c81e602d-26e5-49ac-92d1-71fea2607868","Type":"ContainerStarted","Data":"952aad800122f0c5297b7769b87af95cfeabc1e5b270679a6f4cf94801ac2b3f"}
Mar 08 00:52:51.206838 master-0 kubenswrapper[23041]: I0308 00:52:51.206777 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1280f-default-external-api-0" event={"ID":"ee2811c7-2c79-45af-afe7-5f3b9ad28474","Type":"ContainerStarted","Data":"223a814ce566c5ff86a0bf900faffed8600bdc98b50feeab2c8684e3ef3bc1f7"}
Mar 08 00:52:51.207117 master-0 kubenswrapper[23041]: I0308 00:52:51.206837 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-1280f-default-external-api-0" podUID="ee2811c7-2c79-45af-afe7-5f3b9ad28474" containerName="glance-log" containerID="cri-o://1e1c07e1955362252af0e37336bd3fc0f44d0401efd52227da40a38816747730" gracePeriod=30
Mar 08 00:52:51.207445 master-0 kubenswrapper[23041]: I0308 00:52:51.207393 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-1280f-default-external-api-0" podUID="ee2811c7-2c79-45af-afe7-5f3b9ad28474" containerName="glance-httpd" containerID="cri-o://223a814ce566c5ff86a0bf900faffed8600bdc98b50feeab2c8684e3ef3bc1f7" gracePeriod=30
Mar 08 00:52:51.210106 master-0 kubenswrapper[23041]: I0308 00:52:51.210049 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-hxms8" event={"ID":"24c18a5d-ebab-491a-8bf4-f6271242e4f3","Type":"ContainerStarted","Data":"efc239c5bde71c1bf2cf30fb2848c3d9f4a95f4499c6738d97baa20a177ada96"}
Mar 08 00:52:51.280604 master-0 kubenswrapper[23041]: I0308 00:52:51.280490 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-1280f-default-external-api-0" podStartSLOduration=13.280461557 podStartE2EDuration="13.280461557s" podCreationTimestamp="2026-03-08 00:52:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:52:51.247641215 +0000 UTC m=+1276.720477769" watchObservedRunningTime="2026-03-08 00:52:51.280461557 +0000 UTC m=+1276.753298141"
Mar 08 00:52:51.810438 master-0 kubenswrapper[23041]: I0308 00:52:51.809484 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-4czgd"
Mar 08 00:52:52.024078 master-0 kubenswrapper[23041]: I0308 00:52:52.024014 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d4a0f9d-e6ed-4676-8a30-350a4a6907ed-combined-ca-bundle\") pod \"8d4a0f9d-e6ed-4676-8a30-350a4a6907ed\" (UID: \"8d4a0f9d-e6ed-4676-8a30-350a4a6907ed\") "
Mar 08 00:52:52.024078 master-0 kubenswrapper[23041]: I0308 00:52:52.024080 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8d4a0f9d-e6ed-4676-8a30-350a4a6907ed-credential-keys\") pod \"8d4a0f9d-e6ed-4676-8a30-350a4a6907ed\" (UID: \"8d4a0f9d-e6ed-4676-8a30-350a4a6907ed\") "
Mar 08 00:52:52.024278 master-0 kubenswrapper[23041]: I0308 00:52:52.024240 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8d4a0f9d-e6ed-4676-8a30-350a4a6907ed-fernet-keys\") pod \"8d4a0f9d-e6ed-4676-8a30-350a4a6907ed\" (UID: \"8d4a0f9d-e6ed-4676-8a30-350a4a6907ed\") "
Mar 08 00:52:52.024361 master-0 kubenswrapper[23041]: I0308 00:52:52.024331 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d4a0f9d-e6ed-4676-8a30-350a4a6907ed-config-data\") pod \"8d4a0f9d-e6ed-4676-8a30-350a4a6907ed\" (UID: \"8d4a0f9d-e6ed-4676-8a30-350a4a6907ed\") "
Mar 08 00:52:52.024491 master-0 kubenswrapper[23041]: I0308 00:52:52.024438 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d4a0f9d-e6ed-4676-8a30-350a4a6907ed-scripts\") pod \"8d4a0f9d-e6ed-4676-8a30-350a4a6907ed\" (UID: \"8d4a0f9d-e6ed-4676-8a30-350a4a6907ed\") "
Mar 08 00:52:52.024573 master-0 kubenswrapper[23041]: I0308 00:52:52.024551 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqtqx\" (UniqueName: \"kubernetes.io/projected/8d4a0f9d-e6ed-4676-8a30-350a4a6907ed-kube-api-access-dqtqx\") pod \"8d4a0f9d-e6ed-4676-8a30-350a4a6907ed\" (UID: \"8d4a0f9d-e6ed-4676-8a30-350a4a6907ed\") "
Mar 08 00:52:52.029530 master-0 kubenswrapper[23041]: I0308 00:52:52.029483 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d4a0f9d-e6ed-4676-8a30-350a4a6907ed-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8d4a0f9d-e6ed-4676-8a30-350a4a6907ed" (UID: "8d4a0f9d-e6ed-4676-8a30-350a4a6907ed"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:52:52.029628 master-0 kubenswrapper[23041]: I0308 00:52:52.029528 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d4a0f9d-e6ed-4676-8a30-350a4a6907ed-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "8d4a0f9d-e6ed-4676-8a30-350a4a6907ed" (UID: "8d4a0f9d-e6ed-4676-8a30-350a4a6907ed"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:52:52.030273 master-0 kubenswrapper[23041]: I0308 00:52:52.030179 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d4a0f9d-e6ed-4676-8a30-350a4a6907ed-kube-api-access-dqtqx" (OuterVolumeSpecName: "kube-api-access-dqtqx") pod "8d4a0f9d-e6ed-4676-8a30-350a4a6907ed" (UID: "8d4a0f9d-e6ed-4676-8a30-350a4a6907ed"). InnerVolumeSpecName "kube-api-access-dqtqx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:52:52.031829 master-0 kubenswrapper[23041]: I0308 00:52:52.031768 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d4a0f9d-e6ed-4676-8a30-350a4a6907ed-scripts" (OuterVolumeSpecName: "scripts") pod "8d4a0f9d-e6ed-4676-8a30-350a4a6907ed" (UID: "8d4a0f9d-e6ed-4676-8a30-350a4a6907ed"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:52:52.057836 master-0 kubenswrapper[23041]: I0308 00:52:52.057534 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d4a0f9d-e6ed-4676-8a30-350a4a6907ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d4a0f9d-e6ed-4676-8a30-350a4a6907ed" (UID: "8d4a0f9d-e6ed-4676-8a30-350a4a6907ed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:52:52.076873 master-0 kubenswrapper[23041]: I0308 00:52:52.076806 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d4a0f9d-e6ed-4676-8a30-350a4a6907ed-config-data" (OuterVolumeSpecName: "config-data") pod "8d4a0f9d-e6ed-4676-8a30-350a4a6907ed" (UID: "8d4a0f9d-e6ed-4676-8a30-350a4a6907ed"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:52:52.127810 master-0 kubenswrapper[23041]: I0308 00:52:52.127746 23041 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d4a0f9d-e6ed-4676-8a30-350a4a6907ed-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:52.127810 master-0 kubenswrapper[23041]: I0308 00:52:52.127798 23041 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8d4a0f9d-e6ed-4676-8a30-350a4a6907ed-credential-keys\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:52.127810 master-0 kubenswrapper[23041]: I0308 00:52:52.127811 23041 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8d4a0f9d-e6ed-4676-8a30-350a4a6907ed-fernet-keys\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:52.127810 master-0 kubenswrapper[23041]: I0308 00:52:52.127819 23041 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d4a0f9d-e6ed-4676-8a30-350a4a6907ed-config-data\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:52.127810 master-0 kubenswrapper[23041]: I0308 00:52:52.127828 23041 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d4a0f9d-e6ed-4676-8a30-350a4a6907ed-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:52.127810 master-0 kubenswrapper[23041]: I0308 00:52:52.127839 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqtqx\" (UniqueName: \"kubernetes.io/projected/8d4a0f9d-e6ed-4676-8a30-350a4a6907ed-kube-api-access-dqtqx\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:52.174980 master-0 kubenswrapper[23041]: I0308 00:52:52.174808 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:52:52.228749 master-0 kubenswrapper[23041]: I0308 00:52:52.228698 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzgvv\" (UniqueName: \"kubernetes.io/projected/ee2811c7-2c79-45af-afe7-5f3b9ad28474-kube-api-access-xzgvv\") pod \"ee2811c7-2c79-45af-afe7-5f3b9ad28474\" (UID: \"ee2811c7-2c79-45af-afe7-5f3b9ad28474\") "
Mar 08 00:52:52.229068 master-0 kubenswrapper[23041]: I0308 00:52:52.228780 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee2811c7-2c79-45af-afe7-5f3b9ad28474-combined-ca-bundle\") pod \"ee2811c7-2c79-45af-afe7-5f3b9ad28474\" (UID: \"ee2811c7-2c79-45af-afe7-5f3b9ad28474\") "
Mar 08 00:52:52.229068 master-0 kubenswrapper[23041]: I0308 00:52:52.228876 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ee2811c7-2c79-45af-afe7-5f3b9ad28474-httpd-run\") pod \"ee2811c7-2c79-45af-afe7-5f3b9ad28474\" (UID: \"ee2811c7-2c79-45af-afe7-5f3b9ad28474\") "
Mar 08 00:52:52.229068 master-0 kubenswrapper[23041]: I0308 00:52:52.229068 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^60a4d41a-f098-4c1e-9a4b-472c7a2f79cf\") pod \"ee2811c7-2c79-45af-afe7-5f3b9ad28474\" (UID: \"ee2811c7-2c79-45af-afe7-5f3b9ad28474\") "
Mar 08 00:52:52.234190 master-0 kubenswrapper[23041]: I0308 00:52:52.234027 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee2811c7-2c79-45af-afe7-5f3b9ad28474-config-data\") pod \"ee2811c7-2c79-45af-afe7-5f3b9ad28474\" (UID: \"ee2811c7-2c79-45af-afe7-5f3b9ad28474\") "
Mar 08 00:52:52.234190 master-0 kubenswrapper[23041]: I0308 00:52:52.234075 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee2811c7-2c79-45af-afe7-5f3b9ad28474-scripts\") pod \"ee2811c7-2c79-45af-afe7-5f3b9ad28474\" (UID: \"ee2811c7-2c79-45af-afe7-5f3b9ad28474\") "
Mar 08 00:52:52.234190 master-0 kubenswrapper[23041]: I0308 00:52:52.234109 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee2811c7-2c79-45af-afe7-5f3b9ad28474-logs\") pod \"ee2811c7-2c79-45af-afe7-5f3b9ad28474\" (UID: \"ee2811c7-2c79-45af-afe7-5f3b9ad28474\") "
Mar 08 00:52:52.234190 master-0 kubenswrapper[23041]: I0308 00:52:52.234155 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee2811c7-2c79-45af-afe7-5f3b9ad28474-public-tls-certs\") pod \"ee2811c7-2c79-45af-afe7-5f3b9ad28474\" (UID: \"ee2811c7-2c79-45af-afe7-5f3b9ad28474\") "
Mar 08 00:52:52.234429 master-0 kubenswrapper[23041]: I0308 00:52:52.234292 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee2811c7-2c79-45af-afe7-5f3b9ad28474-kube-api-access-xzgvv" (OuterVolumeSpecName: "kube-api-access-xzgvv") pod "ee2811c7-2c79-45af-afe7-5f3b9ad28474" (UID: "ee2811c7-2c79-45af-afe7-5f3b9ad28474"). InnerVolumeSpecName "kube-api-access-xzgvv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:52:52.234701 master-0 kubenswrapper[23041]: I0308 00:52:52.234676 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee2811c7-2c79-45af-afe7-5f3b9ad28474-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ee2811c7-2c79-45af-afe7-5f3b9ad28474" (UID: "ee2811c7-2c79-45af-afe7-5f3b9ad28474"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:52:52.234964 master-0 kubenswrapper[23041]: I0308 00:52:52.234927 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzgvv\" (UniqueName: \"kubernetes.io/projected/ee2811c7-2c79-45af-afe7-5f3b9ad28474-kube-api-access-xzgvv\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:52.234964 master-0 kubenswrapper[23041]: I0308 00:52:52.234951 23041 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ee2811c7-2c79-45af-afe7-5f3b9ad28474-httpd-run\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:52.235520 master-0 kubenswrapper[23041]: I0308 00:52:52.235494 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee2811c7-2c79-45af-afe7-5f3b9ad28474-logs" (OuterVolumeSpecName: "logs") pod "ee2811c7-2c79-45af-afe7-5f3b9ad28474" (UID: "ee2811c7-2c79-45af-afe7-5f3b9ad28474"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:52:52.235666 master-0 kubenswrapper[23041]: I0308 00:52:52.235614 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1280f-default-internal-api-0" event={"ID":"c81e602d-26e5-49ac-92d1-71fea2607868","Type":"ContainerStarted","Data":"052793deb444ddfa766fae4a335f25279de72c0d02a2316a2e213e56bed30759"}
Mar 08 00:52:52.251625 master-0 kubenswrapper[23041]: I0308 00:52:52.251581 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-4czgd"
Mar 08 00:52:52.253748 master-0 kubenswrapper[23041]: I0308 00:52:52.253703 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4czgd" event={"ID":"8d4a0f9d-e6ed-4676-8a30-350a4a6907ed","Type":"ContainerDied","Data":"fed90c97670ca2ad7411d626257c615e08871169fb088c56384207d15d40dd4a"}
Mar 08 00:52:52.253748 master-0 kubenswrapper[23041]: I0308 00:52:52.253741 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fed90c97670ca2ad7411d626257c615e08871169fb088c56384207d15d40dd4a"
Mar 08 00:52:52.257766 master-0 kubenswrapper[23041]: I0308 00:52:52.257715 23041 generic.go:334] "Generic (PLEG): container finished" podID="ee2811c7-2c79-45af-afe7-5f3b9ad28474" containerID="223a814ce566c5ff86a0bf900faffed8600bdc98b50feeab2c8684e3ef3bc1f7" exitCode=0
Mar 08 00:52:52.257861 master-0 kubenswrapper[23041]: I0308 00:52:52.257785 23041 generic.go:334] "Generic (PLEG): container finished" podID="ee2811c7-2c79-45af-afe7-5f3b9ad28474" containerID="1e1c07e1955362252af0e37336bd3fc0f44d0401efd52227da40a38816747730" exitCode=143
Mar 08 00:52:52.257861 master-0 kubenswrapper[23041]: I0308 00:52:52.257814 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1280f-default-external-api-0" event={"ID":"ee2811c7-2c79-45af-afe7-5f3b9ad28474","Type":"ContainerDied","Data":"223a814ce566c5ff86a0bf900faffed8600bdc98b50feeab2c8684e3ef3bc1f7"}
Mar 08 00:52:52.257861 master-0 kubenswrapper[23041]: I0308 00:52:52.257840 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1280f-default-external-api-0" event={"ID":"ee2811c7-2c79-45af-afe7-5f3b9ad28474","Type":"ContainerDied","Data":"1e1c07e1955362252af0e37336bd3fc0f44d0401efd52227da40a38816747730"}
Mar 08 00:52:52.257861 master-0 kubenswrapper[23041]: I0308 00:52:52.257854 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1280f-default-external-api-0" event={"ID":"ee2811c7-2c79-45af-afe7-5f3b9ad28474","Type":"ContainerDied","Data":"96d5377d3e7d7bb223ab6fb583c109cc491eeb445b24a6f227adcd088e5539fb"}
Mar 08 00:52:52.257861 master-0 kubenswrapper[23041]: I0308 00:52:52.257853 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:52:52.258502 master-0 kubenswrapper[23041]: I0308 00:52:52.257870 23041 scope.go:117] "RemoveContainer" containerID="223a814ce566c5ff86a0bf900faffed8600bdc98b50feeab2c8684e3ef3bc1f7"
Mar 08 00:52:52.258752 master-0 kubenswrapper[23041]: I0308 00:52:52.258531 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee2811c7-2c79-45af-afe7-5f3b9ad28474-scripts" (OuterVolumeSpecName: "scripts") pod "ee2811c7-2c79-45af-afe7-5f3b9ad28474" (UID: "ee2811c7-2c79-45af-afe7-5f3b9ad28474"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:52:52.272937 master-0 kubenswrapper[23041]: I0308 00:52:52.271550 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-1280f-default-internal-api-0" podStartSLOduration=5.271530147 podStartE2EDuration="5.271530147s" podCreationTimestamp="2026-03-08 00:52:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:52:52.264738251 +0000 UTC m=+1277.737574815" watchObservedRunningTime="2026-03-08 00:52:52.271530147 +0000 UTC m=+1277.744366691"
Mar 08 00:52:52.277769 master-0 kubenswrapper[23041]: I0308 00:52:52.277722 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^60a4d41a-f098-4c1e-9a4b-472c7a2f79cf" (OuterVolumeSpecName: "glance") pod "ee2811c7-2c79-45af-afe7-5f3b9ad28474" (UID: "ee2811c7-2c79-45af-afe7-5f3b9ad28474"). InnerVolumeSpecName "pvc-70c9925e-bbc2-47ea-836c-8b4fadf77223". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 08 00:52:52.304959 master-0 kubenswrapper[23041]: I0308 00:52:52.304769 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee2811c7-2c79-45af-afe7-5f3b9ad28474-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ee2811c7-2c79-45af-afe7-5f3b9ad28474" (UID: "ee2811c7-2c79-45af-afe7-5f3b9ad28474"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:52:52.309884 master-0 kubenswrapper[23041]: I0308 00:52:52.309832 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-4czgd"]
Mar 08 00:52:52.318347 master-0 kubenswrapper[23041]: I0308 00:52:52.317407 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee2811c7-2c79-45af-afe7-5f3b9ad28474-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee2811c7-2c79-45af-afe7-5f3b9ad28474" (UID: "ee2811c7-2c79-45af-afe7-5f3b9ad28474"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:52:52.324570 master-0 kubenswrapper[23041]: I0308 00:52:52.324354 23041 scope.go:117] "RemoveContainer" containerID="1e1c07e1955362252af0e37336bd3fc0f44d0401efd52227da40a38816747730"
Mar 08 00:52:52.332438 master-0 kubenswrapper[23041]: I0308 00:52:52.332123 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-4czgd"]
Mar 08 00:52:52.336735 master-0 kubenswrapper[23041]: I0308 00:52:52.336687 23041 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee2811c7-2c79-45af-afe7-5f3b9ad28474-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:52.336735 master-0 kubenswrapper[23041]: I0308 00:52:52.336725 23041 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-70c9925e-bbc2-47ea-836c-8b4fadf77223\" (UniqueName: \"kubernetes.io/csi/topolvm.io^60a4d41a-f098-4c1e-9a4b-472c7a2f79cf\") on node \"master-0\" "
Mar 08 00:52:52.336735 master-0 kubenswrapper[23041]: I0308 00:52:52.336740 23041 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee2811c7-2c79-45af-afe7-5f3b9ad28474-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:52.337016 master-0 kubenswrapper[23041]: I0308 00:52:52.336750 23041 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee2811c7-2c79-45af-afe7-5f3b9ad28474-logs\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:52.337016 master-0 kubenswrapper[23041]: I0308 00:52:52.336759 23041 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee2811c7-2c79-45af-afe7-5f3b9ad28474-public-tls-certs\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:52.355360 master-0 kubenswrapper[23041]: I0308 00:52:52.355249 23041 scope.go:117] "RemoveContainer" containerID="223a814ce566c5ff86a0bf900faffed8600bdc98b50feeab2c8684e3ef3bc1f7"
Mar 08 00:52:52.355656 master-0 kubenswrapper[23041]: E0308 00:52:52.355617 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"223a814ce566c5ff86a0bf900faffed8600bdc98b50feeab2c8684e3ef3bc1f7\": container with ID starting with 223a814ce566c5ff86a0bf900faffed8600bdc98b50feeab2c8684e3ef3bc1f7 not found: ID does not exist" containerID="223a814ce566c5ff86a0bf900faffed8600bdc98b50feeab2c8684e3ef3bc1f7"
Mar 08 00:52:52.355724 master-0 kubenswrapper[23041]: I0308 00:52:52.355675 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"223a814ce566c5ff86a0bf900faffed8600bdc98b50feeab2c8684e3ef3bc1f7"} err="failed to get container status \"223a814ce566c5ff86a0bf900faffed8600bdc98b50feeab2c8684e3ef3bc1f7\": rpc error: code = NotFound desc = could not find container \"223a814ce566c5ff86a0bf900faffed8600bdc98b50feeab2c8684e3ef3bc1f7\": container with ID starting with 223a814ce566c5ff86a0bf900faffed8600bdc98b50feeab2c8684e3ef3bc1f7 not found: ID does not exist"
Mar 08 00:52:52.355724 master-0 kubenswrapper[23041]: I0308 00:52:52.355699 23041 scope.go:117] "RemoveContainer" containerID="1e1c07e1955362252af0e37336bd3fc0f44d0401efd52227da40a38816747730"
Mar 08 00:52:52.360656 master-0 kubenswrapper[23041]: I0308 00:52:52.360622 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee2811c7-2c79-45af-afe7-5f3b9ad28474-config-data" (OuterVolumeSpecName: "config-data") pod "ee2811c7-2c79-45af-afe7-5f3b9ad28474" (UID: "ee2811c7-2c79-45af-afe7-5f3b9ad28474"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:52:52.362282 master-0 kubenswrapper[23041]: E0308 00:52:52.362027 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e1c07e1955362252af0e37336bd3fc0f44d0401efd52227da40a38816747730\": container with ID starting with 1e1c07e1955362252af0e37336bd3fc0f44d0401efd52227da40a38816747730 not found: ID does not exist" containerID="1e1c07e1955362252af0e37336bd3fc0f44d0401efd52227da40a38816747730"
Mar 08 00:52:52.362282 master-0 kubenswrapper[23041]: I0308 00:52:52.362195 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e1c07e1955362252af0e37336bd3fc0f44d0401efd52227da40a38816747730"} err="failed to get container status \"1e1c07e1955362252af0e37336bd3fc0f44d0401efd52227da40a38816747730\": rpc error: code = NotFound desc = could not find container \"1e1c07e1955362252af0e37336bd3fc0f44d0401efd52227da40a38816747730\": container with ID starting with 1e1c07e1955362252af0e37336bd3fc0f44d0401efd52227da40a38816747730 not found: ID does not exist"
Mar 08 00:52:52.362428 master-0 kubenswrapper[23041]: I0308 00:52:52.362328 23041 scope.go:117] "RemoveContainer" containerID="223a814ce566c5ff86a0bf900faffed8600bdc98b50feeab2c8684e3ef3bc1f7"
Mar 08 00:52:52.371632 master-0 kubenswrapper[23041]: I0308 00:52:52.366049 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"223a814ce566c5ff86a0bf900faffed8600bdc98b50feeab2c8684e3ef3bc1f7"} err="failed to get container status \"223a814ce566c5ff86a0bf900faffed8600bdc98b50feeab2c8684e3ef3bc1f7\": rpc error: code = NotFound desc = could not find container \"223a814ce566c5ff86a0bf900faffed8600bdc98b50feeab2c8684e3ef3bc1f7\": container with ID starting with 223a814ce566c5ff86a0bf900faffed8600bdc98b50feeab2c8684e3ef3bc1f7 not found: ID does not exist"
Mar 08 00:52:52.371632 master-0 kubenswrapper[23041]: I0308 00:52:52.366136 23041 scope.go:117] "RemoveContainer" containerID="1e1c07e1955362252af0e37336bd3fc0f44d0401efd52227da40a38816747730"
Mar 08 00:52:52.371632 master-0 kubenswrapper[23041]: I0308 00:52:52.366928 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e1c07e1955362252af0e37336bd3fc0f44d0401efd52227da40a38816747730"} err="failed to get container status \"1e1c07e1955362252af0e37336bd3fc0f44d0401efd52227da40a38816747730\": rpc error: code = NotFound desc = could not find container \"1e1c07e1955362252af0e37336bd3fc0f44d0401efd52227da40a38816747730\": container with ID starting with 1e1c07e1955362252af0e37336bd3fc0f44d0401efd52227da40a38816747730 not found: ID does not exist"
Mar 08 00:52:52.371632 master-0 kubenswrapper[23041]: I0308 00:52:52.370924 23041 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 08 00:52:52.371632 master-0 kubenswrapper[23041]: I0308 00:52:52.371071 23041 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-70c9925e-bbc2-47ea-836c-8b4fadf77223" (UniqueName: "kubernetes.io/csi/topolvm.io^60a4d41a-f098-4c1e-9a4b-472c7a2f79cf") on node "master-0"
Mar 08 00:52:52.372547 master-0 kubenswrapper[23041]: I0308 00:52:52.372523 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-846459fb55-9x6r8"
Mar 08 00:52:52.379394 master-0 kubenswrapper[23041]: I0308 00:52:52.379352 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-tfddb"]
Mar 08 00:52:52.379874 master-0 kubenswrapper[23041]: E0308 00:52:52.379852 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee2811c7-2c79-45af-afe7-5f3b9ad28474" containerName="glance-httpd"
Mar 08 00:52:52.379874 master-0 kubenswrapper[23041]: I0308 00:52:52.379872 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee2811c7-2c79-45af-afe7-5f3b9ad28474" containerName="glance-httpd"
Mar 08 00:52:52.379971 master-0 kubenswrapper[23041]: E0308 00:52:52.379920 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee2811c7-2c79-45af-afe7-5f3b9ad28474" containerName="glance-log"
Mar 08 00:52:52.379971 master-0 kubenswrapper[23041]: I0308 00:52:52.379928 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee2811c7-2c79-45af-afe7-5f3b9ad28474" containerName="glance-log"
Mar 08 00:52:52.379971 master-0 kubenswrapper[23041]: E0308 00:52:52.379967 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d4a0f9d-e6ed-4676-8a30-350a4a6907ed" containerName="keystone-bootstrap"
Mar 08 00:52:52.380106 master-0 kubenswrapper[23041]: I0308 00:52:52.379977 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d4a0f9d-e6ed-4676-8a30-350a4a6907ed" containerName="keystone-bootstrap"
Mar 08 00:52:52.380505 master-0 kubenswrapper[23041]: I0308 00:52:52.380454 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee2811c7-2c79-45af-afe7-5f3b9ad28474" containerName="glance-log"
Mar 08 00:52:52.380505 master-0 kubenswrapper[23041]: I0308 00:52:52.380490 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee2811c7-2c79-45af-afe7-5f3b9ad28474" containerName="glance-httpd"
Mar 08 00:52:52.380505 master-0 kubenswrapper[23041]: I0308 00:52:52.380505 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d4a0f9d-e6ed-4676-8a30-350a4a6907ed" containerName="keystone-bootstrap"
Mar 08 00:52:52.382065 master-0 kubenswrapper[23041]: I0308 00:52:52.382015 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-tfddb"
Mar 08 00:52:52.384815 master-0 kubenswrapper[23041]: I0308 00:52:52.383618 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 08 00:52:52.387958 master-0 kubenswrapper[23041]: I0308 00:52:52.387876 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 08 00:52:52.389217 master-0 kubenswrapper[23041]: I0308 00:52:52.389172 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 08 00:52:52.420975 master-0 kubenswrapper[23041]: I0308 00:52:52.419976 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-tfddb"]
Mar 08 00:52:52.447993 master-0 kubenswrapper[23041]: I0308 00:52:52.446335 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8a390e-7313-41d7-b698-590fb18c5d2d-combined-ca-bundle\") pod \"keystone-bootstrap-tfddb\" (UID: \"db8a390e-7313-41d7-b698-590fb18c5d2d\") " pod="openstack/keystone-bootstrap-tfddb"
Mar 08 00:52:52.447993 master-0 kubenswrapper[23041]: I0308 00:52:52.446547 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db8a390e-7313-41d7-b698-590fb18c5d2d-config-data\") pod \"keystone-bootstrap-tfddb\" (UID: \"db8a390e-7313-41d7-b698-590fb18c5d2d\") " pod="openstack/keystone-bootstrap-tfddb"
Mar 08 00:52:52.447993 master-0 kubenswrapper[23041]: I0308 00:52:52.446607 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db8a390e-7313-41d7-b698-590fb18c5d2d-scripts\") pod \"keystone-bootstrap-tfddb\" (UID: \"db8a390e-7313-41d7-b698-590fb18c5d2d\") " pod="openstack/keystone-bootstrap-tfddb"
Mar 08 00:52:52.447993 master-0 kubenswrapper[23041]: I0308 00:52:52.446650 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/db8a390e-7313-41d7-b698-590fb18c5d2d-credential-keys\") pod \"keystone-bootstrap-tfddb\" (UID: \"db8a390e-7313-41d7-b698-590fb18c5d2d\") " pod="openstack/keystone-bootstrap-tfddb"
Mar 08 00:52:52.447993 master-0 kubenswrapper[23041]: I0308 00:52:52.446745 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j89x2\" (UniqueName: \"kubernetes.io/projected/db8a390e-7313-41d7-b698-590fb18c5d2d-kube-api-access-j89x2\") pod \"keystone-bootstrap-tfddb\" (UID: \"db8a390e-7313-41d7-b698-590fb18c5d2d\") " pod="openstack/keystone-bootstrap-tfddb"
Mar 08 00:52:52.447993 master-0 kubenswrapper[23041]: I0308 00:52:52.446830 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/db8a390e-7313-41d7-b698-590fb18c5d2d-fernet-keys\") pod \"keystone-bootstrap-tfddb\" (UID: \"db8a390e-7313-41d7-b698-590fb18c5d2d\") " pod="openstack/keystone-bootstrap-tfddb"
Mar 08 00:52:52.447993 master-0 kubenswrapper[23041]: I0308
00:52:52.447003 23041 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee2811c7-2c79-45af-afe7-5f3b9ad28474-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 00:52:52.447993 master-0 kubenswrapper[23041]: I0308 00:52:52.447017 23041 reconciler_common.go:293] "Volume detached for volume \"pvc-70c9925e-bbc2-47ea-836c-8b4fadf77223\" (UniqueName: \"kubernetes.io/csi/topolvm.io^60a4d41a-f098-4c1e-9a4b-472c7a2f79cf\") on node \"master-0\" DevicePath \"\"" Mar 08 00:52:52.487873 master-0 kubenswrapper[23041]: I0308 00:52:52.482418 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89fcc4dcf-gml6g"] Mar 08 00:52:52.491863 master-0 kubenswrapper[23041]: I0308 00:52:52.490743 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89fcc4dcf-gml6g" podUID="98c1faca-b20d-4243-a40b-da58d311ddf6" containerName="dnsmasq-dns" containerID="cri-o://725ec18a38b4b14b393ffca0993fd4a57983922c2c8c38951656c5f15db6cf89" gracePeriod=10 Mar 08 00:52:52.553226 master-0 kubenswrapper[23041]: I0308 00:52:52.548746 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db8a390e-7313-41d7-b698-590fb18c5d2d-config-data\") pod \"keystone-bootstrap-tfddb\" (UID: \"db8a390e-7313-41d7-b698-590fb18c5d2d\") " pod="openstack/keystone-bootstrap-tfddb" Mar 08 00:52:52.553226 master-0 kubenswrapper[23041]: I0308 00:52:52.550225 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db8a390e-7313-41d7-b698-590fb18c5d2d-scripts\") pod \"keystone-bootstrap-tfddb\" (UID: \"db8a390e-7313-41d7-b698-590fb18c5d2d\") " pod="openstack/keystone-bootstrap-tfddb" Mar 08 00:52:52.553226 master-0 kubenswrapper[23041]: I0308 00:52:52.550312 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/db8a390e-7313-41d7-b698-590fb18c5d2d-credential-keys\") pod \"keystone-bootstrap-tfddb\" (UID: \"db8a390e-7313-41d7-b698-590fb18c5d2d\") " pod="openstack/keystone-bootstrap-tfddb" Mar 08 00:52:52.553226 master-0 kubenswrapper[23041]: I0308 00:52:52.550472 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j89x2\" (UniqueName: \"kubernetes.io/projected/db8a390e-7313-41d7-b698-590fb18c5d2d-kube-api-access-j89x2\") pod \"keystone-bootstrap-tfddb\" (UID: \"db8a390e-7313-41d7-b698-590fb18c5d2d\") " pod="openstack/keystone-bootstrap-tfddb" Mar 08 00:52:52.553226 master-0 kubenswrapper[23041]: I0308 00:52:52.550548 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/db8a390e-7313-41d7-b698-590fb18c5d2d-fernet-keys\") pod \"keystone-bootstrap-tfddb\" (UID: \"db8a390e-7313-41d7-b698-590fb18c5d2d\") " pod="openstack/keystone-bootstrap-tfddb" Mar 08 00:52:52.553226 master-0 kubenswrapper[23041]: I0308 00:52:52.551126 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8a390e-7313-41d7-b698-590fb18c5d2d-combined-ca-bundle\") pod \"keystone-bootstrap-tfddb\" (UID: \"db8a390e-7313-41d7-b698-590fb18c5d2d\") " pod="openstack/keystone-bootstrap-tfddb" Mar 08 00:52:52.553706 master-0 kubenswrapper[23041]: I0308 00:52:52.553511 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db8a390e-7313-41d7-b698-590fb18c5d2d-config-data\") pod \"keystone-bootstrap-tfddb\" (UID: \"db8a390e-7313-41d7-b698-590fb18c5d2d\") " pod="openstack/keystone-bootstrap-tfddb" Mar 08 00:52:52.559261 master-0 kubenswrapper[23041]: I0308 00:52:52.557030 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/db8a390e-7313-41d7-b698-590fb18c5d2d-combined-ca-bundle\") pod \"keystone-bootstrap-tfddb\" (UID: \"db8a390e-7313-41d7-b698-590fb18c5d2d\") " pod="openstack/keystone-bootstrap-tfddb"
Mar 08 00:52:52.559261 master-0 kubenswrapper[23041]: I0308 00:52:52.557095 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/db8a390e-7313-41d7-b698-590fb18c5d2d-credential-keys\") pod \"keystone-bootstrap-tfddb\" (UID: \"db8a390e-7313-41d7-b698-590fb18c5d2d\") " pod="openstack/keystone-bootstrap-tfddb"
Mar 08 00:52:52.559791 master-0 kubenswrapper[23041]: I0308 00:52:52.559775 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db8a390e-7313-41d7-b698-590fb18c5d2d-scripts\") pod \"keystone-bootstrap-tfddb\" (UID: \"db8a390e-7313-41d7-b698-590fb18c5d2d\") " pod="openstack/keystone-bootstrap-tfddb"
Mar 08 00:52:52.564280 master-0 kubenswrapper[23041]: I0308 00:52:52.564104 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/db8a390e-7313-41d7-b698-590fb18c5d2d-fernet-keys\") pod \"keystone-bootstrap-tfddb\" (UID: \"db8a390e-7313-41d7-b698-590fb18c5d2d\") " pod="openstack/keystone-bootstrap-tfddb"
Mar 08 00:52:52.572185 master-0 kubenswrapper[23041]: I0308 00:52:52.572134 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j89x2\" (UniqueName: \"kubernetes.io/projected/db8a390e-7313-41d7-b698-590fb18c5d2d-kube-api-access-j89x2\") pod \"keystone-bootstrap-tfddb\" (UID: \"db8a390e-7313-41d7-b698-590fb18c5d2d\") " pod="openstack/keystone-bootstrap-tfddb"
Mar 08 00:52:52.654298 master-0 kubenswrapper[23041]: I0308 00:52:52.653928 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-1280f-default-external-api-0"]
Mar 08 00:52:52.723907 master-0 kubenswrapper[23041]: I0308 00:52:52.709431 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-1280f-default-external-api-0"]
Mar 08 00:52:52.723907 master-0 kubenswrapper[23041]: I0308 00:52:52.717270 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-tfddb"
Mar 08 00:52:52.821829 master-0 kubenswrapper[23041]: I0308 00:52:52.821665 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d4a0f9d-e6ed-4676-8a30-350a4a6907ed" path="/var/lib/kubelet/pods/8d4a0f9d-e6ed-4676-8a30-350a4a6907ed/volumes"
Mar 08 00:52:52.822721 master-0 kubenswrapper[23041]: I0308 00:52:52.822575 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee2811c7-2c79-45af-afe7-5f3b9ad28474" path="/var/lib/kubelet/pods/ee2811c7-2c79-45af-afe7-5f3b9ad28474/volumes"
Mar 08 00:52:52.825098 master-0 kubenswrapper[23041]: I0308 00:52:52.825062 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-1280f-default-external-api-0"]
Mar 08 00:52:52.826824 master-0 kubenswrapper[23041]: I0308 00:52:52.826794 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:52:52.829386 master-0 kubenswrapper[23041]: I0308 00:52:52.829355 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-1280f-default-external-config-data"
Mar 08 00:52:52.829597 master-0 kubenswrapper[23041]: I0308 00:52:52.829544 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 08 00:52:52.917964 master-0 kubenswrapper[23041]: I0308 00:52:52.917919 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1280f-default-external-api-0"]
Mar 08 00:52:52.969041 master-0 kubenswrapper[23041]: I0308 00:52:52.968989 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c631af1a-025f-4c65-b202-678d31efbc2d-public-tls-certs\") pod \"glance-1280f-default-external-api-0\" (UID: \"c631af1a-025f-4c65-b202-678d31efbc2d\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:52:52.969293 master-0 kubenswrapper[23041]: I0308 00:52:52.969060 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-70c9925e-bbc2-47ea-836c-8b4fadf77223\" (UniqueName: \"kubernetes.io/csi/topolvm.io^60a4d41a-f098-4c1e-9a4b-472c7a2f79cf\") pod \"glance-1280f-default-external-api-0\" (UID: \"c631af1a-025f-4c65-b202-678d31efbc2d\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:52:52.969293 master-0 kubenswrapper[23041]: I0308 00:52:52.969106 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c631af1a-025f-4c65-b202-678d31efbc2d-httpd-run\") pod \"glance-1280f-default-external-api-0\" (UID: \"c631af1a-025f-4c65-b202-678d31efbc2d\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:52:52.969293 master-0 kubenswrapper[23041]: I0308 00:52:52.969140 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c631af1a-025f-4c65-b202-678d31efbc2d-scripts\") pod \"glance-1280f-default-external-api-0\" (UID: \"c631af1a-025f-4c65-b202-678d31efbc2d\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:52:52.969293 master-0 kubenswrapper[23041]: I0308 00:52:52.969165 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jpm5\" (UniqueName: \"kubernetes.io/projected/c631af1a-025f-4c65-b202-678d31efbc2d-kube-api-access-8jpm5\") pod \"glance-1280f-default-external-api-0\" (UID: \"c631af1a-025f-4c65-b202-678d31efbc2d\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:52:52.969293 master-0 kubenswrapper[23041]: I0308 00:52:52.969195 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c631af1a-025f-4c65-b202-678d31efbc2d-combined-ca-bundle\") pod \"glance-1280f-default-external-api-0\" (UID: \"c631af1a-025f-4c65-b202-678d31efbc2d\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:52:52.979870 master-0 kubenswrapper[23041]: I0308 00:52:52.979639 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c631af1a-025f-4c65-b202-678d31efbc2d-config-data\") pod \"glance-1280f-default-external-api-0\" (UID: \"c631af1a-025f-4c65-b202-678d31efbc2d\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:52:52.979870 master-0 kubenswrapper[23041]: I0308 00:52:52.979839 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c631af1a-025f-4c65-b202-678d31efbc2d-logs\") pod \"glance-1280f-default-external-api-0\" (UID: \"c631af1a-025f-4c65-b202-678d31efbc2d\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:52:53.083981 master-0 kubenswrapper[23041]: I0308 00:52:53.082687 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c631af1a-025f-4c65-b202-678d31efbc2d-public-tls-certs\") pod \"glance-1280f-default-external-api-0\" (UID: \"c631af1a-025f-4c65-b202-678d31efbc2d\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:52:53.083981 master-0 kubenswrapper[23041]: I0308 00:52:53.082787 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-70c9925e-bbc2-47ea-836c-8b4fadf77223\" (UniqueName: \"kubernetes.io/csi/topolvm.io^60a4d41a-f098-4c1e-9a4b-472c7a2f79cf\") pod \"glance-1280f-default-external-api-0\" (UID: \"c631af1a-025f-4c65-b202-678d31efbc2d\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:52:53.083981 master-0 kubenswrapper[23041]: I0308 00:52:53.083234 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c631af1a-025f-4c65-b202-678d31efbc2d-httpd-run\") pod \"glance-1280f-default-external-api-0\" (UID: \"c631af1a-025f-4c65-b202-678d31efbc2d\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:52:53.083981 master-0 kubenswrapper[23041]: I0308 00:52:53.083322 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c631af1a-025f-4c65-b202-678d31efbc2d-scripts\") pod \"glance-1280f-default-external-api-0\" (UID: \"c631af1a-025f-4c65-b202-678d31efbc2d\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:52:53.083981 master-0 kubenswrapper[23041]: I0308 00:52:53.083536 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jpm5\" (UniqueName: 
\"kubernetes.io/projected/c631af1a-025f-4c65-b202-678d31efbc2d-kube-api-access-8jpm5\") pod \"glance-1280f-default-external-api-0\" (UID: \"c631af1a-025f-4c65-b202-678d31efbc2d\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:52:53.083981 master-0 kubenswrapper[23041]: I0308 00:52:53.083630 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c631af1a-025f-4c65-b202-678d31efbc2d-combined-ca-bundle\") pod \"glance-1280f-default-external-api-0\" (UID: \"c631af1a-025f-4c65-b202-678d31efbc2d\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:52:53.083981 master-0 kubenswrapper[23041]: I0308 00:52:53.083743 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c631af1a-025f-4c65-b202-678d31efbc2d-config-data\") pod \"glance-1280f-default-external-api-0\" (UID: \"c631af1a-025f-4c65-b202-678d31efbc2d\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:52:53.083981 master-0 kubenswrapper[23041]: I0308 00:52:53.083845 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c631af1a-025f-4c65-b202-678d31efbc2d-logs\") pod \"glance-1280f-default-external-api-0\" (UID: \"c631af1a-025f-4c65-b202-678d31efbc2d\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:52:53.084942 master-0 kubenswrapper[23041]: I0308 00:52:53.084373 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c631af1a-025f-4c65-b202-678d31efbc2d-logs\") pod \"glance-1280f-default-external-api-0\" (UID: \"c631af1a-025f-4c65-b202-678d31efbc2d\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:52:53.084942 master-0 kubenswrapper[23041]: I0308 00:52:53.084560 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c631af1a-025f-4c65-b202-678d31efbc2d-httpd-run\") pod \"glance-1280f-default-external-api-0\" (UID: \"c631af1a-025f-4c65-b202-678d31efbc2d\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:52:53.087986 master-0 kubenswrapper[23041]: I0308 00:52:53.087871 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c631af1a-025f-4c65-b202-678d31efbc2d-combined-ca-bundle\") pod \"glance-1280f-default-external-api-0\" (UID: \"c631af1a-025f-4c65-b202-678d31efbc2d\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:52:53.088171 master-0 kubenswrapper[23041]: I0308 00:52:53.088133 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c631af1a-025f-4c65-b202-678d31efbc2d-scripts\") pod \"glance-1280f-default-external-api-0\" (UID: \"c631af1a-025f-4c65-b202-678d31efbc2d\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:52:53.090279 master-0 kubenswrapper[23041]: I0308 00:52:53.090252 23041 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 08 00:52:53.090279 master-0 kubenswrapper[23041]: I0308 00:52:53.090294 23041 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-70c9925e-bbc2-47ea-836c-8b4fadf77223\" (UniqueName: \"kubernetes.io/csi/topolvm.io^60a4d41a-f098-4c1e-9a4b-472c7a2f79cf\") pod \"glance-1280f-default-external-api-0\" (UID: \"c631af1a-025f-4c65-b202-678d31efbc2d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/fba811848206921d87eed675e9d53cf2e2311d13264fbd23c3492e9c4520fe29/globalmount\"" pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:52:53.100405 master-0 kubenswrapper[23041]: I0308 00:52:53.098113 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c631af1a-025f-4c65-b202-678d31efbc2d-config-data\") pod \"glance-1280f-default-external-api-0\" (UID: \"c631af1a-025f-4c65-b202-678d31efbc2d\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:52:53.100405 master-0 kubenswrapper[23041]: I0308 00:52:53.099226 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c631af1a-025f-4c65-b202-678d31efbc2d-public-tls-certs\") pod \"glance-1280f-default-external-api-0\" (UID: \"c631af1a-025f-4c65-b202-678d31efbc2d\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:52:53.110264 master-0 kubenswrapper[23041]: I0308 00:52:53.109901 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jpm5\" (UniqueName: \"kubernetes.io/projected/c631af1a-025f-4c65-b202-678d31efbc2d-kube-api-access-8jpm5\") pod \"glance-1280f-default-external-api-0\" (UID: \"c631af1a-025f-4c65-b202-678d31efbc2d\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:52:53.260412 master-0 kubenswrapper[23041]: I0308 00:52:53.259315 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89fcc4dcf-gml6g"
Mar 08 00:52:53.303324 master-0 kubenswrapper[23041]: I0308 00:52:53.301735 23041 generic.go:334] "Generic (PLEG): container finished" podID="98c1faca-b20d-4243-a40b-da58d311ddf6" containerID="725ec18a38b4b14b393ffca0993fd4a57983922c2c8c38951656c5f15db6cf89" exitCode=0
Mar 08 00:52:53.303324 master-0 kubenswrapper[23041]: I0308 00:52:53.302741 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89fcc4dcf-gml6g"
Mar 08 00:52:53.304511 master-0 kubenswrapper[23041]: I0308 00:52:53.303772 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89fcc4dcf-gml6g" event={"ID":"98c1faca-b20d-4243-a40b-da58d311ddf6","Type":"ContainerDied","Data":"725ec18a38b4b14b393ffca0993fd4a57983922c2c8c38951656c5f15db6cf89"}
Mar 08 00:52:53.304511 master-0 kubenswrapper[23041]: I0308 00:52:53.303797 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89fcc4dcf-gml6g" event={"ID":"98c1faca-b20d-4243-a40b-da58d311ddf6","Type":"ContainerDied","Data":"031278e5ce1e1295c7dcbbdb4b3968d81cf0909d7a54bbdebee8de642d5d0308"}
Mar 08 00:52:53.304511 master-0 kubenswrapper[23041]: I0308 00:52:53.303813 23041 scope.go:117] "RemoveContainer" containerID="725ec18a38b4b14b393ffca0993fd4a57983922c2c8c38951656c5f15db6cf89"
Mar 08 00:52:53.352262 master-0 kubenswrapper[23041]: I0308 00:52:53.352190 23041 scope.go:117] "RemoveContainer" containerID="89e3e6fe927b906cff829baa1c4f2c30e2d885a4008e2e5418bb395e0efb2131"
Mar 08 00:52:53.372625 master-0 kubenswrapper[23041]: I0308 00:52:53.372539 23041 scope.go:117] "RemoveContainer" containerID="725ec18a38b4b14b393ffca0993fd4a57983922c2c8c38951656c5f15db6cf89"
Mar 08 00:52:53.372922 master-0 kubenswrapper[23041]: E0308 00:52:53.372882 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"725ec18a38b4b14b393ffca0993fd4a57983922c2c8c38951656c5f15db6cf89\": container with ID starting with 725ec18a38b4b14b393ffca0993fd4a57983922c2c8c38951656c5f15db6cf89 not found: ID does not exist" containerID="725ec18a38b4b14b393ffca0993fd4a57983922c2c8c38951656c5f15db6cf89"
Mar 08 00:52:53.372964 master-0 kubenswrapper[23041]: I0308 00:52:53.372921 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"725ec18a38b4b14b393ffca0993fd4a57983922c2c8c38951656c5f15db6cf89"} err="failed to get container status \"725ec18a38b4b14b393ffca0993fd4a57983922c2c8c38951656c5f15db6cf89\": rpc error: code = NotFound desc = could not find container \"725ec18a38b4b14b393ffca0993fd4a57983922c2c8c38951656c5f15db6cf89\": container with ID starting with 725ec18a38b4b14b393ffca0993fd4a57983922c2c8c38951656c5f15db6cf89 not found: ID does not exist"
Mar 08 00:52:53.372964 master-0 kubenswrapper[23041]: I0308 00:52:53.372948 23041 scope.go:117] "RemoveContainer" containerID="89e3e6fe927b906cff829baa1c4f2c30e2d885a4008e2e5418bb395e0efb2131"
Mar 08 00:52:53.373156 master-0 kubenswrapper[23041]: E0308 00:52:53.373119 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89e3e6fe927b906cff829baa1c4f2c30e2d885a4008e2e5418bb395e0efb2131\": container with ID starting with 89e3e6fe927b906cff829baa1c4f2c30e2d885a4008e2e5418bb395e0efb2131 not found: ID does not exist" containerID="89e3e6fe927b906cff829baa1c4f2c30e2d885a4008e2e5418bb395e0efb2131"
Mar 08 00:52:53.373156 master-0 kubenswrapper[23041]: I0308 00:52:53.373146 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89e3e6fe927b906cff829baa1c4f2c30e2d885a4008e2e5418bb395e0efb2131"} err="failed to get container status \"89e3e6fe927b906cff829baa1c4f2c30e2d885a4008e2e5418bb395e0efb2131\": rpc error: code = NotFound desc = could not find container 
\"89e3e6fe927b906cff829baa1c4f2c30e2d885a4008e2e5418bb395e0efb2131\": container with ID starting with 89e3e6fe927b906cff829baa1c4f2c30e2d885a4008e2e5418bb395e0efb2131 not found: ID does not exist"
Mar 08 00:52:53.492492 master-0 kubenswrapper[23041]: I0308 00:52:53.492306 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98c1faca-b20d-4243-a40b-da58d311ddf6-ovsdbserver-nb\") pod \"98c1faca-b20d-4243-a40b-da58d311ddf6\" (UID: \"98c1faca-b20d-4243-a40b-da58d311ddf6\") "
Mar 08 00:52:53.492492 master-0 kubenswrapper[23041]: I0308 00:52:53.492418 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98c1faca-b20d-4243-a40b-da58d311ddf6-ovsdbserver-sb\") pod \"98c1faca-b20d-4243-a40b-da58d311ddf6\" (UID: \"98c1faca-b20d-4243-a40b-da58d311ddf6\") "
Mar 08 00:52:53.492492 master-0 kubenswrapper[23041]: I0308 00:52:53.492465 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98c1faca-b20d-4243-a40b-da58d311ddf6-dns-swift-storage-0\") pod \"98c1faca-b20d-4243-a40b-da58d311ddf6\" (UID: \"98c1faca-b20d-4243-a40b-da58d311ddf6\") "
Mar 08 00:52:53.492822 master-0 kubenswrapper[23041]: I0308 00:52:53.492544 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98c1faca-b20d-4243-a40b-da58d311ddf6-dns-svc\") pod \"98c1faca-b20d-4243-a40b-da58d311ddf6\" (UID: \"98c1faca-b20d-4243-a40b-da58d311ddf6\") "
Mar 08 00:52:53.492822 master-0 kubenswrapper[23041]: I0308 00:52:53.492610 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98c1faca-b20d-4243-a40b-da58d311ddf6-config\") pod \"98c1faca-b20d-4243-a40b-da58d311ddf6\" (UID: \"98c1faca-b20d-4243-a40b-da58d311ddf6\") "
Mar 08 00:52:53.492822 master-0 kubenswrapper[23041]: I0308 00:52:53.492639 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmgpt\" (UniqueName: \"kubernetes.io/projected/98c1faca-b20d-4243-a40b-da58d311ddf6-kube-api-access-rmgpt\") pod \"98c1faca-b20d-4243-a40b-da58d311ddf6\" (UID: \"98c1faca-b20d-4243-a40b-da58d311ddf6\") "
Mar 08 00:52:53.500651 master-0 kubenswrapper[23041]: I0308 00:52:53.499933 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98c1faca-b20d-4243-a40b-da58d311ddf6-kube-api-access-rmgpt" (OuterVolumeSpecName: "kube-api-access-rmgpt") pod "98c1faca-b20d-4243-a40b-da58d311ddf6" (UID: "98c1faca-b20d-4243-a40b-da58d311ddf6"). InnerVolumeSpecName "kube-api-access-rmgpt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:52:53.536616 master-0 kubenswrapper[23041]: W0308 00:52:53.536487 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb8a390e_7313_41d7_b698_590fb18c5d2d.slice/crio-b77f5b7ac28fb7915c2deee32b50276e360e108ea524700e4e7d9769717ebbda WatchSource:0}: Error finding container b77f5b7ac28fb7915c2deee32b50276e360e108ea524700e4e7d9769717ebbda: Status 404 returned error can't find the container with id b77f5b7ac28fb7915c2deee32b50276e360e108ea524700e4e7d9769717ebbda
Mar 08 00:52:53.544732 master-0 kubenswrapper[23041]: I0308 00:52:53.544682 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-tfddb"]
Mar 08 00:52:53.568070 master-0 kubenswrapper[23041]: I0308 00:52:53.568003 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98c1faca-b20d-4243-a40b-da58d311ddf6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "98c1faca-b20d-4243-a40b-da58d311ddf6" (UID: "98c1faca-b20d-4243-a40b-da58d311ddf6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:52:53.568070 master-0 kubenswrapper[23041]: I0308 00:52:53.568012 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98c1faca-b20d-4243-a40b-da58d311ddf6-config" (OuterVolumeSpecName: "config") pod "98c1faca-b20d-4243-a40b-da58d311ddf6" (UID: "98c1faca-b20d-4243-a40b-da58d311ddf6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:52:53.576216 master-0 kubenswrapper[23041]: I0308 00:52:53.576147 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98c1faca-b20d-4243-a40b-da58d311ddf6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "98c1faca-b20d-4243-a40b-da58d311ddf6" (UID: "98c1faca-b20d-4243-a40b-da58d311ddf6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:52:53.577448 master-0 kubenswrapper[23041]: I0308 00:52:53.577408 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98c1faca-b20d-4243-a40b-da58d311ddf6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "98c1faca-b20d-4243-a40b-da58d311ddf6" (UID: "98c1faca-b20d-4243-a40b-da58d311ddf6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:52:53.590518 master-0 kubenswrapper[23041]: I0308 00:52:53.590482 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98c1faca-b20d-4243-a40b-da58d311ddf6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "98c1faca-b20d-4243-a40b-da58d311ddf6" (UID: "98c1faca-b20d-4243-a40b-da58d311ddf6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:52:53.595641 master-0 kubenswrapper[23041]: I0308 00:52:53.595607 23041 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98c1faca-b20d-4243-a40b-da58d311ddf6-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:53.595641 master-0 kubenswrapper[23041]: I0308 00:52:53.595634 23041 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98c1faca-b20d-4243-a40b-da58d311ddf6-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:53.595735 master-0 kubenswrapper[23041]: I0308 00:52:53.595648 23041 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98c1faca-b20d-4243-a40b-da58d311ddf6-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:53.595735 master-0 kubenswrapper[23041]: I0308 00:52:53.595660 23041 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98c1faca-b20d-4243-a40b-da58d311ddf6-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:53.595735 master-0 kubenswrapper[23041]: I0308 00:52:53.595669 23041 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98c1faca-b20d-4243-a40b-da58d311ddf6-config\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:53.595735 master-0 kubenswrapper[23041]: I0308 00:52:53.595679 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmgpt\" (UniqueName: \"kubernetes.io/projected/98c1faca-b20d-4243-a40b-da58d311ddf6-kube-api-access-rmgpt\") on node \"master-0\" DevicePath \"\""
Mar 08 00:52:53.791026 master-0 kubenswrapper[23041]: I0308 00:52:53.790884 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89fcc4dcf-gml6g"]
Mar 08 00:52:53.802012 master-0 kubenswrapper[23041]: I0308 00:52:53.801942 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89fcc4dcf-gml6g"]
Mar 08 00:52:55.464738 master-0 kubenswrapper[23041]: I0308 00:52:55.086650 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98c1faca-b20d-4243-a40b-da58d311ddf6" path="/var/lib/kubelet/pods/98c1faca-b20d-4243-a40b-da58d311ddf6/volumes"
Mar 08 00:52:55.464738 master-0 kubenswrapper[23041]: I0308 00:52:55.089057 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tfddb" event={"ID":"db8a390e-7313-41d7-b698-590fb18c5d2d","Type":"ContainerStarted","Data":"fada75281820b21413ba62dd680d903c4fc1e783b8187e1d88837b14cad62616"}
Mar 08 00:52:55.464738 master-0 kubenswrapper[23041]: I0308 00:52:55.089085 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tfddb" event={"ID":"db8a390e-7313-41d7-b698-590fb18c5d2d","Type":"ContainerStarted","Data":"b77f5b7ac28fb7915c2deee32b50276e360e108ea524700e4e7d9769717ebbda"}
Mar 08 00:52:55.669355 master-0 kubenswrapper[23041]: I0308 00:52:55.669288 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-tfddb" podStartSLOduration=3.669195819 podStartE2EDuration="3.669195819s" podCreationTimestamp="2026-03-08 00:52:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:52:55.648715858 +0000 UTC m=+1281.121552412" watchObservedRunningTime="2026-03-08 00:52:55.669195819 +0000 UTC m=+1281.142032373"
Mar 08 00:52:56.378617 master-0 kubenswrapper[23041]: I0308 00:52:56.378551 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-70c9925e-bbc2-47ea-836c-8b4fadf77223\" (UniqueName: \"kubernetes.io/csi/topolvm.io^60a4d41a-f098-4c1e-9a4b-472c7a2f79cf\") pod \"glance-1280f-default-external-api-0\" (UID: \"c631af1a-025f-4c65-b202-678d31efbc2d\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:52:56.532990 master-0 kubenswrapper[23041]: I0308 00:52:56.532908 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:52:57.116336 master-0 kubenswrapper[23041]: I0308 00:52:57.116276 23041 generic.go:334] "Generic (PLEG): container finished" podID="88a16b7d-3a7c-4b23-9f7c-448fea1247e1" containerID="9cb4492d48ba3f747baf51a9b4d5267fa579cb8b0df1b847cd2005bb1f238a28" exitCode=0
Mar 08 00:52:57.116586 master-0 kubenswrapper[23041]: I0308 00:52:57.116352 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8rrvj" event={"ID":"88a16b7d-3a7c-4b23-9f7c-448fea1247e1","Type":"ContainerDied","Data":"9cb4492d48ba3f747baf51a9b4d5267fa579cb8b0df1b847cd2005bb1f238a28"}
Mar 08 00:52:57.121738 master-0 kubenswrapper[23041]: I0308 00:52:57.121684 23041 generic.go:334] "Generic (PLEG): container finished" podID="db8a390e-7313-41d7-b698-590fb18c5d2d" containerID="fada75281820b21413ba62dd680d903c4fc1e783b8187e1d88837b14cad62616" exitCode=0
Mar 08 00:52:57.121985 master-0 kubenswrapper[23041]: I0308 00:52:57.121750 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tfddb" event={"ID":"db8a390e-7313-41d7-b698-590fb18c5d2d","Type":"ContainerDied","Data":"fada75281820b21413ba62dd680d903c4fc1e783b8187e1d88837b14cad62616"}
Mar 08 00:52:57.147355 master-0 kubenswrapper[23041]: I0308 00:52:57.147291 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1280f-default-external-api-0"]
Mar 08 00:52:59.057422 master-0 kubenswrapper[23041]: I0308 00:52:59.057360 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-1280f-default-internal-api-0"
Mar 08 00:52:59.057422 master-0 kubenswrapper[23041]: I0308 00:52:59.057406 23041 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:59.089646 master-0 kubenswrapper[23041]: I0308 00:52:59.087255 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:59.102913 master-0 kubenswrapper[23041]: I0308 00:52:59.102152 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:59.149639 master-0 kubenswrapper[23041]: I0308 00:52:59.149575 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:52:59.149639 master-0 kubenswrapper[23041]: I0308 00:52:59.149631 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:53:01.208637 master-0 kubenswrapper[23041]: I0308 00:53:01.208465 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:53:01.208637 master-0 kubenswrapper[23041]: I0308 00:53:01.208592 23041 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 00:53:01.219672 master-0 kubenswrapper[23041]: I0308 00:53:01.219611 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:53:08.119286 master-0 kubenswrapper[23041]: I0308 00:53:08.114433 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-tfddb" Mar 08 00:53:08.121363 master-0 kubenswrapper[23041]: I0308 00:53:08.121211 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-8rrvj" Mar 08 00:53:08.184002 master-0 kubenswrapper[23041]: I0308 00:53:08.183950 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/db8a390e-7313-41d7-b698-590fb18c5d2d-fernet-keys\") pod \"db8a390e-7313-41d7-b698-590fb18c5d2d\" (UID: \"db8a390e-7313-41d7-b698-590fb18c5d2d\") " Mar 08 00:53:08.184254 master-0 kubenswrapper[23041]: I0308 00:53:08.184052 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88a16b7d-3a7c-4b23-9f7c-448fea1247e1-scripts\") pod \"88a16b7d-3a7c-4b23-9f7c-448fea1247e1\" (UID: \"88a16b7d-3a7c-4b23-9f7c-448fea1247e1\") " Mar 08 00:53:08.184254 master-0 kubenswrapper[23041]: I0308 00:53:08.184155 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88a16b7d-3a7c-4b23-9f7c-448fea1247e1-logs\") pod \"88a16b7d-3a7c-4b23-9f7c-448fea1247e1\" (UID: \"88a16b7d-3a7c-4b23-9f7c-448fea1247e1\") " Mar 08 00:53:08.184329 master-0 kubenswrapper[23041]: I0308 00:53:08.184274 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8a390e-7313-41d7-b698-590fb18c5d2d-combined-ca-bundle\") pod \"db8a390e-7313-41d7-b698-590fb18c5d2d\" (UID: \"db8a390e-7313-41d7-b698-590fb18c5d2d\") " Mar 08 00:53:08.184363 master-0 kubenswrapper[23041]: I0308 00:53:08.184352 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db8a390e-7313-41d7-b698-590fb18c5d2d-config-data\") pod \"db8a390e-7313-41d7-b698-590fb18c5d2d\" (UID: \"db8a390e-7313-41d7-b698-590fb18c5d2d\") " Mar 08 00:53:08.184412 master-0 kubenswrapper[23041]: I0308 00:53:08.184393 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88a16b7d-3a7c-4b23-9f7c-448fea1247e1-config-data\") pod \"88a16b7d-3a7c-4b23-9f7c-448fea1247e1\" (UID: \"88a16b7d-3a7c-4b23-9f7c-448fea1247e1\") " Mar 08 00:53:08.184457 master-0 kubenswrapper[23041]: I0308 00:53:08.184441 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lkv8\" (UniqueName: \"kubernetes.io/projected/88a16b7d-3a7c-4b23-9f7c-448fea1247e1-kube-api-access-9lkv8\") pod \"88a16b7d-3a7c-4b23-9f7c-448fea1247e1\" (UID: \"88a16b7d-3a7c-4b23-9f7c-448fea1247e1\") " Mar 08 00:53:08.184490 master-0 kubenswrapper[23041]: I0308 00:53:08.184469 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/db8a390e-7313-41d7-b698-590fb18c5d2d-credential-keys\") pod \"db8a390e-7313-41d7-b698-590fb18c5d2d\" (UID: \"db8a390e-7313-41d7-b698-590fb18c5d2d\") " Mar 08 00:53:08.184522 master-0 kubenswrapper[23041]: I0308 00:53:08.184510 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db8a390e-7313-41d7-b698-590fb18c5d2d-scripts\") pod \"db8a390e-7313-41d7-b698-590fb18c5d2d\" (UID: \"db8a390e-7313-41d7-b698-590fb18c5d2d\") " Mar 08 00:53:08.184554 master-0 kubenswrapper[23041]: I0308 00:53:08.184532 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a16b7d-3a7c-4b23-9f7c-448fea1247e1-combined-ca-bundle\") pod \"88a16b7d-3a7c-4b23-9f7c-448fea1247e1\" (UID: \"88a16b7d-3a7c-4b23-9f7c-448fea1247e1\") " Mar 08 00:53:08.184603 master-0 kubenswrapper[23041]: I0308 00:53:08.184584 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j89x2\" (UniqueName: \"kubernetes.io/projected/db8a390e-7313-41d7-b698-590fb18c5d2d-kube-api-access-j89x2\") pod 
\"db8a390e-7313-41d7-b698-590fb18c5d2d\" (UID: \"db8a390e-7313-41d7-b698-590fb18c5d2d\") " Mar 08 00:53:08.186512 master-0 kubenswrapper[23041]: I0308 00:53:08.186297 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88a16b7d-3a7c-4b23-9f7c-448fea1247e1-logs" (OuterVolumeSpecName: "logs") pod "88a16b7d-3a7c-4b23-9f7c-448fea1247e1" (UID: "88a16b7d-3a7c-4b23-9f7c-448fea1247e1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:53:08.189063 master-0 kubenswrapper[23041]: I0308 00:53:08.189018 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db8a390e-7313-41d7-b698-590fb18c5d2d-kube-api-access-j89x2" (OuterVolumeSpecName: "kube-api-access-j89x2") pod "db8a390e-7313-41d7-b698-590fb18c5d2d" (UID: "db8a390e-7313-41d7-b698-590fb18c5d2d"). InnerVolumeSpecName "kube-api-access-j89x2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:53:08.191468 master-0 kubenswrapper[23041]: I0308 00:53:08.191369 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db8a390e-7313-41d7-b698-590fb18c5d2d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "db8a390e-7313-41d7-b698-590fb18c5d2d" (UID: "db8a390e-7313-41d7-b698-590fb18c5d2d"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:53:08.193319 master-0 kubenswrapper[23041]: I0308 00:53:08.193254 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db8a390e-7313-41d7-b698-590fb18c5d2d-scripts" (OuterVolumeSpecName: "scripts") pod "db8a390e-7313-41d7-b698-590fb18c5d2d" (UID: "db8a390e-7313-41d7-b698-590fb18c5d2d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:53:08.193820 master-0 kubenswrapper[23041]: I0308 00:53:08.193773 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db8a390e-7313-41d7-b698-590fb18c5d2d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "db8a390e-7313-41d7-b698-590fb18c5d2d" (UID: "db8a390e-7313-41d7-b698-590fb18c5d2d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:53:08.197969 master-0 kubenswrapper[23041]: I0308 00:53:08.197905 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88a16b7d-3a7c-4b23-9f7c-448fea1247e1-kube-api-access-9lkv8" (OuterVolumeSpecName: "kube-api-access-9lkv8") pod "88a16b7d-3a7c-4b23-9f7c-448fea1247e1" (UID: "88a16b7d-3a7c-4b23-9f7c-448fea1247e1"). InnerVolumeSpecName "kube-api-access-9lkv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:53:08.207234 master-0 kubenswrapper[23041]: I0308 00:53:08.207163 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88a16b7d-3a7c-4b23-9f7c-448fea1247e1-scripts" (OuterVolumeSpecName: "scripts") pod "88a16b7d-3a7c-4b23-9f7c-448fea1247e1" (UID: "88a16b7d-3a7c-4b23-9f7c-448fea1247e1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:53:08.219354 master-0 kubenswrapper[23041]: I0308 00:53:08.219067 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88a16b7d-3a7c-4b23-9f7c-448fea1247e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88a16b7d-3a7c-4b23-9f7c-448fea1247e1" (UID: "88a16b7d-3a7c-4b23-9f7c-448fea1247e1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:53:08.219818 master-0 kubenswrapper[23041]: I0308 00:53:08.219432 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db8a390e-7313-41d7-b698-590fb18c5d2d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db8a390e-7313-41d7-b698-590fb18c5d2d" (UID: "db8a390e-7313-41d7-b698-590fb18c5d2d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:53:08.238028 master-0 kubenswrapper[23041]: I0308 00:53:08.237976 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88a16b7d-3a7c-4b23-9f7c-448fea1247e1-config-data" (OuterVolumeSpecName: "config-data") pod "88a16b7d-3a7c-4b23-9f7c-448fea1247e1" (UID: "88a16b7d-3a7c-4b23-9f7c-448fea1247e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:53:08.263448 master-0 kubenswrapper[23041]: I0308 00:53:08.263405 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1280f-default-external-api-0" event={"ID":"c631af1a-025f-4c65-b202-678d31efbc2d","Type":"ContainerStarted","Data":"cf01c813d8dc7eaac943bccc02616c5295672f014238fbc58ab73966795d3f0e"} Mar 08 00:53:08.263908 master-0 kubenswrapper[23041]: I0308 00:53:08.263855 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db8a390e-7313-41d7-b698-590fb18c5d2d-config-data" (OuterVolumeSpecName: "config-data") pod "db8a390e-7313-41d7-b698-590fb18c5d2d" (UID: "db8a390e-7313-41d7-b698-590fb18c5d2d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:53:08.265706 master-0 kubenswrapper[23041]: I0308 00:53:08.265673 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8rrvj" event={"ID":"88a16b7d-3a7c-4b23-9f7c-448fea1247e1","Type":"ContainerDied","Data":"84166c7b8a1bea673bc799b61ba0f0a536fa212c315c56d2df5d9f316f477717"} Mar 08 00:53:08.265706 master-0 kubenswrapper[23041]: I0308 00:53:08.265706 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84166c7b8a1bea673bc799b61ba0f0a536fa212c315c56d2df5d9f316f477717" Mar 08 00:53:08.265825 master-0 kubenswrapper[23041]: I0308 00:53:08.265677 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8rrvj" Mar 08 00:53:08.267907 master-0 kubenswrapper[23041]: I0308 00:53:08.267857 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-tfddb" Mar 08 00:53:08.268030 master-0 kubenswrapper[23041]: I0308 00:53:08.267987 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tfddb" event={"ID":"db8a390e-7313-41d7-b698-590fb18c5d2d","Type":"ContainerDied","Data":"b77f5b7ac28fb7915c2deee32b50276e360e108ea524700e4e7d9769717ebbda"} Mar 08 00:53:08.268070 master-0 kubenswrapper[23041]: I0308 00:53:08.268038 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b77f5b7ac28fb7915c2deee32b50276e360e108ea524700e4e7d9769717ebbda" Mar 08 00:53:08.287411 master-0 kubenswrapper[23041]: I0308 00:53:08.287370 23041 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/db8a390e-7313-41d7-b698-590fb18c5d2d-credential-keys\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:08.287650 master-0 kubenswrapper[23041]: I0308 00:53:08.287639 23041 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/db8a390e-7313-41d7-b698-590fb18c5d2d-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:08.287717 master-0 kubenswrapper[23041]: I0308 00:53:08.287707 23041 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a16b7d-3a7c-4b23-9f7c-448fea1247e1-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:08.287780 master-0 kubenswrapper[23041]: I0308 00:53:08.287770 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j89x2\" (UniqueName: \"kubernetes.io/projected/db8a390e-7313-41d7-b698-590fb18c5d2d-kube-api-access-j89x2\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:08.287850 master-0 kubenswrapper[23041]: I0308 00:53:08.287840 23041 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/db8a390e-7313-41d7-b698-590fb18c5d2d-fernet-keys\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:08.287912 master-0 kubenswrapper[23041]: I0308 00:53:08.287900 23041 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88a16b7d-3a7c-4b23-9f7c-448fea1247e1-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:08.287974 master-0 kubenswrapper[23041]: I0308 00:53:08.287963 23041 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88a16b7d-3a7c-4b23-9f7c-448fea1247e1-logs\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:08.288043 master-0 kubenswrapper[23041]: I0308 00:53:08.288033 23041 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8a390e-7313-41d7-b698-590fb18c5d2d-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:08.288104 master-0 kubenswrapper[23041]: I0308 00:53:08.288094 23041 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/db8a390e-7313-41d7-b698-590fb18c5d2d-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:08.288162 master-0 kubenswrapper[23041]: I0308 00:53:08.288152 23041 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88a16b7d-3a7c-4b23-9f7c-448fea1247e1-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:08.288241 master-0 kubenswrapper[23041]: I0308 00:53:08.288230 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lkv8\" (UniqueName: \"kubernetes.io/projected/88a16b7d-3a7c-4b23-9f7c-448fea1247e1-kube-api-access-9lkv8\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:09.868732 master-0 kubenswrapper[23041]: I0308 00:53:09.868553 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-77c9977ddd-2q2jp"] Mar 08 00:53:09.869453 master-0 kubenswrapper[23041]: E0308 00:53:09.869072 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db8a390e-7313-41d7-b698-590fb18c5d2d" containerName="keystone-bootstrap" Mar 08 00:53:09.869453 master-0 kubenswrapper[23041]: I0308 00:53:09.869095 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="db8a390e-7313-41d7-b698-590fb18c5d2d" containerName="keystone-bootstrap" Mar 08 00:53:09.869453 master-0 kubenswrapper[23041]: E0308 00:53:09.869135 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98c1faca-b20d-4243-a40b-da58d311ddf6" containerName="init" Mar 08 00:53:09.869453 master-0 kubenswrapper[23041]: I0308 00:53:09.869145 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="98c1faca-b20d-4243-a40b-da58d311ddf6" containerName="init" Mar 08 00:53:09.869453 master-0 kubenswrapper[23041]: E0308 00:53:09.869163 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88a16b7d-3a7c-4b23-9f7c-448fea1247e1" containerName="placement-db-sync" Mar 08 00:53:09.869453 master-0 kubenswrapper[23041]: I0308 00:53:09.869169 
23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="88a16b7d-3a7c-4b23-9f7c-448fea1247e1" containerName="placement-db-sync" Mar 08 00:53:09.869453 master-0 kubenswrapper[23041]: E0308 00:53:09.869226 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98c1faca-b20d-4243-a40b-da58d311ddf6" containerName="dnsmasq-dns" Mar 08 00:53:09.869453 master-0 kubenswrapper[23041]: I0308 00:53:09.869234 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="98c1faca-b20d-4243-a40b-da58d311ddf6" containerName="dnsmasq-dns" Mar 08 00:53:09.869453 master-0 kubenswrapper[23041]: I0308 00:53:09.869427 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="98c1faca-b20d-4243-a40b-da58d311ddf6" containerName="dnsmasq-dns" Mar 08 00:53:09.869740 master-0 kubenswrapper[23041]: I0308 00:53:09.869466 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="db8a390e-7313-41d7-b698-590fb18c5d2d" containerName="keystone-bootstrap" Mar 08 00:53:09.869740 master-0 kubenswrapper[23041]: I0308 00:53:09.869490 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="88a16b7d-3a7c-4b23-9f7c-448fea1247e1" containerName="placement-db-sync" Mar 08 00:53:09.870159 master-0 kubenswrapper[23041]: I0308 00:53:09.870125 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-77c9977ddd-2q2jp" Mar 08 00:53:09.873740 master-0 kubenswrapper[23041]: I0308 00:53:09.873690 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 08 00:53:09.873891 master-0 kubenswrapper[23041]: I0308 00:53:09.873853 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 08 00:53:09.874084 master-0 kubenswrapper[23041]: I0308 00:53:09.874051 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 08 00:53:09.874320 master-0 kubenswrapper[23041]: I0308 00:53:09.874182 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 08 00:53:09.875134 master-0 kubenswrapper[23041]: I0308 00:53:09.874973 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 08 00:53:09.919623 master-0 kubenswrapper[23041]: I0308 00:53:09.919542 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f03bc658-603e-4577-9e45-62c36bc250c8-internal-tls-certs\") pod \"keystone-77c9977ddd-2q2jp\" (UID: \"f03bc658-603e-4577-9e45-62c36bc250c8\") " pod="openstack/keystone-77c9977ddd-2q2jp" Mar 08 00:53:09.919866 master-0 kubenswrapper[23041]: I0308 00:53:09.919634 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f03bc658-603e-4577-9e45-62c36bc250c8-public-tls-certs\") pod \"keystone-77c9977ddd-2q2jp\" (UID: \"f03bc658-603e-4577-9e45-62c36bc250c8\") " pod="openstack/keystone-77c9977ddd-2q2jp" Mar 08 00:53:09.919866 master-0 kubenswrapper[23041]: I0308 00:53:09.919670 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" 
(UniqueName: \"kubernetes.io/secret/f03bc658-603e-4577-9e45-62c36bc250c8-credential-keys\") pod \"keystone-77c9977ddd-2q2jp\" (UID: \"f03bc658-603e-4577-9e45-62c36bc250c8\") " pod="openstack/keystone-77c9977ddd-2q2jp" Mar 08 00:53:09.919866 master-0 kubenswrapper[23041]: I0308 00:53:09.919763 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2mvs\" (UniqueName: \"kubernetes.io/projected/f03bc658-603e-4577-9e45-62c36bc250c8-kube-api-access-t2mvs\") pod \"keystone-77c9977ddd-2q2jp\" (UID: \"f03bc658-603e-4577-9e45-62c36bc250c8\") " pod="openstack/keystone-77c9977ddd-2q2jp" Mar 08 00:53:09.919866 master-0 kubenswrapper[23041]: I0308 00:53:09.919831 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f03bc658-603e-4577-9e45-62c36bc250c8-combined-ca-bundle\") pod \"keystone-77c9977ddd-2q2jp\" (UID: \"f03bc658-603e-4577-9e45-62c36bc250c8\") " pod="openstack/keystone-77c9977ddd-2q2jp" Mar 08 00:53:09.920047 master-0 kubenswrapper[23041]: I0308 00:53:09.919877 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f03bc658-603e-4577-9e45-62c36bc250c8-config-data\") pod \"keystone-77c9977ddd-2q2jp\" (UID: \"f03bc658-603e-4577-9e45-62c36bc250c8\") " pod="openstack/keystone-77c9977ddd-2q2jp" Mar 08 00:53:09.920047 master-0 kubenswrapper[23041]: I0308 00:53:09.919910 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f03bc658-603e-4577-9e45-62c36bc250c8-fernet-keys\") pod \"keystone-77c9977ddd-2q2jp\" (UID: \"f03bc658-603e-4577-9e45-62c36bc250c8\") " pod="openstack/keystone-77c9977ddd-2q2jp" Mar 08 00:53:09.920047 master-0 kubenswrapper[23041]: I0308 00:53:09.919949 23041 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f03bc658-603e-4577-9e45-62c36bc250c8-scripts\") pod \"keystone-77c9977ddd-2q2jp\" (UID: \"f03bc658-603e-4577-9e45-62c36bc250c8\") " pod="openstack/keystone-77c9977ddd-2q2jp" Mar 08 00:53:10.021557 master-0 kubenswrapper[23041]: I0308 00:53:10.021335 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f03bc658-603e-4577-9e45-62c36bc250c8-internal-tls-certs\") pod \"keystone-77c9977ddd-2q2jp\" (UID: \"f03bc658-603e-4577-9e45-62c36bc250c8\") " pod="openstack/keystone-77c9977ddd-2q2jp" Mar 08 00:53:10.021557 master-0 kubenswrapper[23041]: I0308 00:53:10.021401 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f03bc658-603e-4577-9e45-62c36bc250c8-public-tls-certs\") pod \"keystone-77c9977ddd-2q2jp\" (UID: \"f03bc658-603e-4577-9e45-62c36bc250c8\") " pod="openstack/keystone-77c9977ddd-2q2jp" Mar 08 00:53:10.021557 master-0 kubenswrapper[23041]: I0308 00:53:10.021432 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f03bc658-603e-4577-9e45-62c36bc250c8-credential-keys\") pod \"keystone-77c9977ddd-2q2jp\" (UID: \"f03bc658-603e-4577-9e45-62c36bc250c8\") " pod="openstack/keystone-77c9977ddd-2q2jp" Mar 08 00:53:10.021557 master-0 kubenswrapper[23041]: I0308 00:53:10.021477 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2mvs\" (UniqueName: \"kubernetes.io/projected/f03bc658-603e-4577-9e45-62c36bc250c8-kube-api-access-t2mvs\") pod \"keystone-77c9977ddd-2q2jp\" (UID: \"f03bc658-603e-4577-9e45-62c36bc250c8\") " pod="openstack/keystone-77c9977ddd-2q2jp" Mar 08 00:53:10.021557 master-0 kubenswrapper[23041]: I0308 00:53:10.021527 23041 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f03bc658-603e-4577-9e45-62c36bc250c8-combined-ca-bundle\") pod \"keystone-77c9977ddd-2q2jp\" (UID: \"f03bc658-603e-4577-9e45-62c36bc250c8\") " pod="openstack/keystone-77c9977ddd-2q2jp" Mar 08 00:53:10.022132 master-0 kubenswrapper[23041]: I0308 00:53:10.021705 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f03bc658-603e-4577-9e45-62c36bc250c8-config-data\") pod \"keystone-77c9977ddd-2q2jp\" (UID: \"f03bc658-603e-4577-9e45-62c36bc250c8\") " pod="openstack/keystone-77c9977ddd-2q2jp" Mar 08 00:53:10.022132 master-0 kubenswrapper[23041]: I0308 00:53:10.021729 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f03bc658-603e-4577-9e45-62c36bc250c8-fernet-keys\") pod \"keystone-77c9977ddd-2q2jp\" (UID: \"f03bc658-603e-4577-9e45-62c36bc250c8\") " pod="openstack/keystone-77c9977ddd-2q2jp" Mar 08 00:53:10.022132 master-0 kubenswrapper[23041]: I0308 00:53:10.021756 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f03bc658-603e-4577-9e45-62c36bc250c8-scripts\") pod \"keystone-77c9977ddd-2q2jp\" (UID: \"f03bc658-603e-4577-9e45-62c36bc250c8\") " pod="openstack/keystone-77c9977ddd-2q2jp" Mar 08 00:53:10.026360 master-0 kubenswrapper[23041]: I0308 00:53:10.026318 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f03bc658-603e-4577-9e45-62c36bc250c8-scripts\") pod \"keystone-77c9977ddd-2q2jp\" (UID: \"f03bc658-603e-4577-9e45-62c36bc250c8\") " pod="openstack/keystone-77c9977ddd-2q2jp" Mar 08 00:53:10.026840 master-0 kubenswrapper[23041]: I0308 00:53:10.026788 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f03bc658-603e-4577-9e45-62c36bc250c8-internal-tls-certs\") pod \"keystone-77c9977ddd-2q2jp\" (UID: \"f03bc658-603e-4577-9e45-62c36bc250c8\") " pod="openstack/keystone-77c9977ddd-2q2jp" Mar 08 00:53:10.027520 master-0 kubenswrapper[23041]: I0308 00:53:10.027365 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f03bc658-603e-4577-9e45-62c36bc250c8-combined-ca-bundle\") pod \"keystone-77c9977ddd-2q2jp\" (UID: \"f03bc658-603e-4577-9e45-62c36bc250c8\") " pod="openstack/keystone-77c9977ddd-2q2jp" Mar 08 00:53:10.028271 master-0 kubenswrapper[23041]: I0308 00:53:10.028174 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f03bc658-603e-4577-9e45-62c36bc250c8-fernet-keys\") pod \"keystone-77c9977ddd-2q2jp\" (UID: \"f03bc658-603e-4577-9e45-62c36bc250c8\") " pod="openstack/keystone-77c9977ddd-2q2jp" Mar 08 00:53:10.029510 master-0 kubenswrapper[23041]: I0308 00:53:10.029460 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f03bc658-603e-4577-9e45-62c36bc250c8-config-data\") pod \"keystone-77c9977ddd-2q2jp\" (UID: \"f03bc658-603e-4577-9e45-62c36bc250c8\") " pod="openstack/keystone-77c9977ddd-2q2jp" Mar 08 00:53:10.037153 master-0 kubenswrapper[23041]: I0308 00:53:10.036956 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f03bc658-603e-4577-9e45-62c36bc250c8-credential-keys\") pod \"keystone-77c9977ddd-2q2jp\" (UID: \"f03bc658-603e-4577-9e45-62c36bc250c8\") " pod="openstack/keystone-77c9977ddd-2q2jp" Mar 08 00:53:10.044260 master-0 kubenswrapper[23041]: I0308 00:53:10.044187 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f03bc658-603e-4577-9e45-62c36bc250c8-public-tls-certs\") pod \"keystone-77c9977ddd-2q2jp\" (UID: \"f03bc658-603e-4577-9e45-62c36bc250c8\") " pod="openstack/keystone-77c9977ddd-2q2jp" Mar 08 00:53:10.245654 master-0 kubenswrapper[23041]: I0308 00:53:10.245524 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-77c9977ddd-2q2jp"] Mar 08 00:53:10.290981 master-0 kubenswrapper[23041]: I0308 00:53:10.290917 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-hxms8" event={"ID":"24c18a5d-ebab-491a-8bf4-f6271242e4f3","Type":"ContainerStarted","Data":"8a54dabca525307204c0cc51fa2007c5a14407a66fee75ded69b720eb65069b5"} Mar 08 00:53:10.398406 master-0 kubenswrapper[23041]: I0308 00:53:10.398030 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2mvs\" (UniqueName: \"kubernetes.io/projected/f03bc658-603e-4577-9e45-62c36bc250c8-kube-api-access-t2mvs\") pod \"keystone-77c9977ddd-2q2jp\" (UID: \"f03bc658-603e-4577-9e45-62c36bc250c8\") " pod="openstack/keystone-77c9977ddd-2q2jp" Mar 08 00:53:10.449494 master-0 kubenswrapper[23041]: I0308 00:53:10.440060 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-76cc655964-lxxvl"] Mar 08 00:53:10.449494 master-0 kubenswrapper[23041]: I0308 00:53:10.442327 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-76cc655964-lxxvl" Mar 08 00:53:10.449494 master-0 kubenswrapper[23041]: I0308 00:53:10.447457 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 08 00:53:10.449494 master-0 kubenswrapper[23041]: I0308 00:53:10.448605 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 08 00:53:10.449494 master-0 kubenswrapper[23041]: I0308 00:53:10.448809 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 08 00:53:10.449494 master-0 kubenswrapper[23041]: I0308 00:53:10.449060 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 08 00:53:10.489221 master-0 kubenswrapper[23041]: I0308 00:53:10.487309 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-76cc655964-lxxvl"] Mar 08 00:53:10.511474 master-0 kubenswrapper[23041]: I0308 00:53:10.510861 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-77c9977ddd-2q2jp" Mar 08 00:53:10.536874 master-0 kubenswrapper[23041]: I0308 00:53:10.536809 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dda659a2-1e52-4b13-8b9d-401d3fcaf800-combined-ca-bundle\") pod \"placement-76cc655964-lxxvl\" (UID: \"dda659a2-1e52-4b13-8b9d-401d3fcaf800\") " pod="openstack/placement-76cc655964-lxxvl" Mar 08 00:53:10.537185 master-0 kubenswrapper[23041]: I0308 00:53:10.537094 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dda659a2-1e52-4b13-8b9d-401d3fcaf800-public-tls-certs\") pod \"placement-76cc655964-lxxvl\" (UID: \"dda659a2-1e52-4b13-8b9d-401d3fcaf800\") " pod="openstack/placement-76cc655964-lxxvl" Mar 08 00:53:10.537550 master-0 kubenswrapper[23041]: I0308 00:53:10.537408 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dda659a2-1e52-4b13-8b9d-401d3fcaf800-logs\") pod \"placement-76cc655964-lxxvl\" (UID: \"dda659a2-1e52-4b13-8b9d-401d3fcaf800\") " pod="openstack/placement-76cc655964-lxxvl" Mar 08 00:53:10.537811 master-0 kubenswrapper[23041]: I0308 00:53:10.537673 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdrq2\" (UniqueName: \"kubernetes.io/projected/dda659a2-1e52-4b13-8b9d-401d3fcaf800-kube-api-access-cdrq2\") pod \"placement-76cc655964-lxxvl\" (UID: \"dda659a2-1e52-4b13-8b9d-401d3fcaf800\") " pod="openstack/placement-76cc655964-lxxvl" Mar 08 00:53:10.537886 master-0 kubenswrapper[23041]: I0308 00:53:10.537851 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/dda659a2-1e52-4b13-8b9d-401d3fcaf800-config-data\") pod \"placement-76cc655964-lxxvl\" (UID: \"dda659a2-1e52-4b13-8b9d-401d3fcaf800\") " pod="openstack/placement-76cc655964-lxxvl" Mar 08 00:53:10.537965 master-0 kubenswrapper[23041]: I0308 00:53:10.537937 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dda659a2-1e52-4b13-8b9d-401d3fcaf800-scripts\") pod \"placement-76cc655964-lxxvl\" (UID: \"dda659a2-1e52-4b13-8b9d-401d3fcaf800\") " pod="openstack/placement-76cc655964-lxxvl" Mar 08 00:53:10.539329 master-0 kubenswrapper[23041]: I0308 00:53:10.538389 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dda659a2-1e52-4b13-8b9d-401d3fcaf800-internal-tls-certs\") pod \"placement-76cc655964-lxxvl\" (UID: \"dda659a2-1e52-4b13-8b9d-401d3fcaf800\") " pod="openstack/placement-76cc655964-lxxvl" Mar 08 00:53:10.639992 master-0 kubenswrapper[23041]: I0308 00:53:10.639927 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dda659a2-1e52-4b13-8b9d-401d3fcaf800-config-data\") pod \"placement-76cc655964-lxxvl\" (UID: \"dda659a2-1e52-4b13-8b9d-401d3fcaf800\") " pod="openstack/placement-76cc655964-lxxvl" Mar 08 00:53:10.640178 master-0 kubenswrapper[23041]: I0308 00:53:10.639999 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dda659a2-1e52-4b13-8b9d-401d3fcaf800-scripts\") pod \"placement-76cc655964-lxxvl\" (UID: \"dda659a2-1e52-4b13-8b9d-401d3fcaf800\") " pod="openstack/placement-76cc655964-lxxvl" Mar 08 00:53:10.640178 master-0 kubenswrapper[23041]: I0308 00:53:10.640104 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/dda659a2-1e52-4b13-8b9d-401d3fcaf800-internal-tls-certs\") pod \"placement-76cc655964-lxxvl\" (UID: \"dda659a2-1e52-4b13-8b9d-401d3fcaf800\") " pod="openstack/placement-76cc655964-lxxvl" Mar 08 00:53:10.640476 master-0 kubenswrapper[23041]: I0308 00:53:10.640434 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dda659a2-1e52-4b13-8b9d-401d3fcaf800-combined-ca-bundle\") pod \"placement-76cc655964-lxxvl\" (UID: \"dda659a2-1e52-4b13-8b9d-401d3fcaf800\") " pod="openstack/placement-76cc655964-lxxvl" Mar 08 00:53:10.641752 master-0 kubenswrapper[23041]: I0308 00:53:10.641710 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dda659a2-1e52-4b13-8b9d-401d3fcaf800-public-tls-certs\") pod \"placement-76cc655964-lxxvl\" (UID: \"dda659a2-1e52-4b13-8b9d-401d3fcaf800\") " pod="openstack/placement-76cc655964-lxxvl" Mar 08 00:53:10.641833 master-0 kubenswrapper[23041]: I0308 00:53:10.641777 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dda659a2-1e52-4b13-8b9d-401d3fcaf800-logs\") pod \"placement-76cc655964-lxxvl\" (UID: \"dda659a2-1e52-4b13-8b9d-401d3fcaf800\") " pod="openstack/placement-76cc655964-lxxvl" Mar 08 00:53:10.643032 master-0 kubenswrapper[23041]: I0308 00:53:10.642943 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdrq2\" (UniqueName: \"kubernetes.io/projected/dda659a2-1e52-4b13-8b9d-401d3fcaf800-kube-api-access-cdrq2\") pod \"placement-76cc655964-lxxvl\" (UID: \"dda659a2-1e52-4b13-8b9d-401d3fcaf800\") " pod="openstack/placement-76cc655964-lxxvl" Mar 08 00:53:10.643032 master-0 kubenswrapper[23041]: I0308 00:53:10.642572 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/dda659a2-1e52-4b13-8b9d-401d3fcaf800-logs\") pod \"placement-76cc655964-lxxvl\" (UID: \"dda659a2-1e52-4b13-8b9d-401d3fcaf800\") " pod="openstack/placement-76cc655964-lxxvl" Mar 08 00:53:10.643905 master-0 kubenswrapper[23041]: I0308 00:53:10.643855 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dda659a2-1e52-4b13-8b9d-401d3fcaf800-scripts\") pod \"placement-76cc655964-lxxvl\" (UID: \"dda659a2-1e52-4b13-8b9d-401d3fcaf800\") " pod="openstack/placement-76cc655964-lxxvl" Mar 08 00:53:10.645863 master-0 kubenswrapper[23041]: I0308 00:53:10.645817 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dda659a2-1e52-4b13-8b9d-401d3fcaf800-config-data\") pod \"placement-76cc655964-lxxvl\" (UID: \"dda659a2-1e52-4b13-8b9d-401d3fcaf800\") " pod="openstack/placement-76cc655964-lxxvl" Mar 08 00:53:10.647760 master-0 kubenswrapper[23041]: I0308 00:53:10.647698 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dda659a2-1e52-4b13-8b9d-401d3fcaf800-internal-tls-certs\") pod \"placement-76cc655964-lxxvl\" (UID: \"dda659a2-1e52-4b13-8b9d-401d3fcaf800\") " pod="openstack/placement-76cc655964-lxxvl" Mar 08 00:53:10.652409 master-0 kubenswrapper[23041]: I0308 00:53:10.650413 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dda659a2-1e52-4b13-8b9d-401d3fcaf800-combined-ca-bundle\") pod \"placement-76cc655964-lxxvl\" (UID: \"dda659a2-1e52-4b13-8b9d-401d3fcaf800\") " pod="openstack/placement-76cc655964-lxxvl" Mar 08 00:53:10.670668 master-0 kubenswrapper[23041]: I0308 00:53:10.670619 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdrq2\" (UniqueName: 
\"kubernetes.io/projected/dda659a2-1e52-4b13-8b9d-401d3fcaf800-kube-api-access-cdrq2\") pod \"placement-76cc655964-lxxvl\" (UID: \"dda659a2-1e52-4b13-8b9d-401d3fcaf800\") " pod="openstack/placement-76cc655964-lxxvl" Mar 08 00:53:10.676865 master-0 kubenswrapper[23041]: I0308 00:53:10.676812 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dda659a2-1e52-4b13-8b9d-401d3fcaf800-public-tls-certs\") pod \"placement-76cc655964-lxxvl\" (UID: \"dda659a2-1e52-4b13-8b9d-401d3fcaf800\") " pod="openstack/placement-76cc655964-lxxvl" Mar 08 00:53:10.783703 master-0 kubenswrapper[23041]: I0308 00:53:10.781877 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-76cc655964-lxxvl" Mar 08 00:53:10.870259 master-0 kubenswrapper[23041]: I0308 00:53:10.869081 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6fd7c7bb8d-6cc8x"] Mar 08 00:53:10.888392 master-0 kubenswrapper[23041]: I0308 00:53:10.888323 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6fd7c7bb8d-6cc8x" Mar 08 00:53:10.966853 master-0 kubenswrapper[23041]: I0308 00:53:10.966795 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5-logs\") pod \"placement-6fd7c7bb8d-6cc8x\" (UID: \"cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5\") " pod="openstack/placement-6fd7c7bb8d-6cc8x" Mar 08 00:53:10.966974 master-0 kubenswrapper[23041]: I0308 00:53:10.966903 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5-internal-tls-certs\") pod \"placement-6fd7c7bb8d-6cc8x\" (UID: \"cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5\") " pod="openstack/placement-6fd7c7bb8d-6cc8x" Mar 08 00:53:10.966974 master-0 kubenswrapper[23041]: I0308 00:53:10.966928 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9lz5\" (UniqueName: \"kubernetes.io/projected/cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5-kube-api-access-b9lz5\") pod \"placement-6fd7c7bb8d-6cc8x\" (UID: \"cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5\") " pod="openstack/placement-6fd7c7bb8d-6cc8x" Mar 08 00:53:10.966974 master-0 kubenswrapper[23041]: I0308 00:53:10.966967 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5-public-tls-certs\") pod \"placement-6fd7c7bb8d-6cc8x\" (UID: \"cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5\") " pod="openstack/placement-6fd7c7bb8d-6cc8x" Mar 08 00:53:10.967076 master-0 kubenswrapper[23041]: I0308 00:53:10.967002 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5-scripts\") pod \"placement-6fd7c7bb8d-6cc8x\" (UID: \"cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5\") " pod="openstack/placement-6fd7c7bb8d-6cc8x" Mar 08 00:53:10.967076 master-0 kubenswrapper[23041]: I0308 00:53:10.967060 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5-config-data\") pod \"placement-6fd7c7bb8d-6cc8x\" (UID: \"cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5\") " pod="openstack/placement-6fd7c7bb8d-6cc8x" Mar 08 00:53:10.967142 master-0 kubenswrapper[23041]: I0308 00:53:10.967093 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5-combined-ca-bundle\") pod \"placement-6fd7c7bb8d-6cc8x\" (UID: \"cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5\") " pod="openstack/placement-6fd7c7bb8d-6cc8x" Mar 08 00:53:10.970348 master-0 kubenswrapper[23041]: I0308 00:53:10.970305 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6fd7c7bb8d-6cc8x"] Mar 08 00:53:11.049151 master-0 kubenswrapper[23041]: I0308 00:53:11.049094 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-77c9977ddd-2q2jp"] Mar 08 00:53:11.066328 master-0 kubenswrapper[23041]: W0308 00:53:11.066215 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf03bc658_603e_4577_9e45_62c36bc250c8.slice/crio-9f71417b75481f8046e0fda94934ec461bea4cf3eb6816989434139b4ea5a0b3 WatchSource:0}: Error finding container 9f71417b75481f8046e0fda94934ec461bea4cf3eb6816989434139b4ea5a0b3: Status 404 returned error can't find the container with id 9f71417b75481f8046e0fda94934ec461bea4cf3eb6816989434139b4ea5a0b3 Mar 08 00:53:11.070301 master-0 kubenswrapper[23041]: I0308 
00:53:11.070259 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5-config-data\") pod \"placement-6fd7c7bb8d-6cc8x\" (UID: \"cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5\") " pod="openstack/placement-6fd7c7bb8d-6cc8x" Mar 08 00:53:11.070379 master-0 kubenswrapper[23041]: I0308 00:53:11.070346 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5-combined-ca-bundle\") pod \"placement-6fd7c7bb8d-6cc8x\" (UID: \"cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5\") " pod="openstack/placement-6fd7c7bb8d-6cc8x" Mar 08 00:53:11.070649 master-0 kubenswrapper[23041]: I0308 00:53:11.070435 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5-logs\") pod \"placement-6fd7c7bb8d-6cc8x\" (UID: \"cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5\") " pod="openstack/placement-6fd7c7bb8d-6cc8x" Mar 08 00:53:11.070649 master-0 kubenswrapper[23041]: I0308 00:53:11.070533 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5-internal-tls-certs\") pod \"placement-6fd7c7bb8d-6cc8x\" (UID: \"cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5\") " pod="openstack/placement-6fd7c7bb8d-6cc8x" Mar 08 00:53:11.070649 master-0 kubenswrapper[23041]: I0308 00:53:11.070560 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9lz5\" (UniqueName: \"kubernetes.io/projected/cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5-kube-api-access-b9lz5\") pod \"placement-6fd7c7bb8d-6cc8x\" (UID: \"cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5\") " pod="openstack/placement-6fd7c7bb8d-6cc8x" Mar 08 00:53:11.070649 master-0 kubenswrapper[23041]: I0308 
00:53:11.070628 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5-public-tls-certs\") pod \"placement-6fd7c7bb8d-6cc8x\" (UID: \"cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5\") " pod="openstack/placement-6fd7c7bb8d-6cc8x" Mar 08 00:53:11.070818 master-0 kubenswrapper[23041]: I0308 00:53:11.070682 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5-scripts\") pod \"placement-6fd7c7bb8d-6cc8x\" (UID: \"cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5\") " pod="openstack/placement-6fd7c7bb8d-6cc8x" Mar 08 00:53:11.071117 master-0 kubenswrapper[23041]: I0308 00:53:11.071077 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5-logs\") pod \"placement-6fd7c7bb8d-6cc8x\" (UID: \"cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5\") " pod="openstack/placement-6fd7c7bb8d-6cc8x" Mar 08 00:53:11.098283 master-0 kubenswrapper[23041]: I0308 00:53:11.098171 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5-config-data\") pod \"placement-6fd7c7bb8d-6cc8x\" (UID: \"cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5\") " pod="openstack/placement-6fd7c7bb8d-6cc8x" Mar 08 00:53:11.098283 master-0 kubenswrapper[23041]: I0308 00:53:11.098190 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5-public-tls-certs\") pod \"placement-6fd7c7bb8d-6cc8x\" (UID: \"cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5\") " pod="openstack/placement-6fd7c7bb8d-6cc8x" Mar 08 00:53:11.100014 master-0 kubenswrapper[23041]: I0308 00:53:11.099912 23041 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5-scripts\") pod \"placement-6fd7c7bb8d-6cc8x\" (UID: \"cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5\") " pod="openstack/placement-6fd7c7bb8d-6cc8x" Mar 08 00:53:11.101330 master-0 kubenswrapper[23041]: I0308 00:53:11.101156 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5-combined-ca-bundle\") pod \"placement-6fd7c7bb8d-6cc8x\" (UID: \"cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5\") " pod="openstack/placement-6fd7c7bb8d-6cc8x" Mar 08 00:53:11.101966 master-0 kubenswrapper[23041]: I0308 00:53:11.101867 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9lz5\" (UniqueName: \"kubernetes.io/projected/cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5-kube-api-access-b9lz5\") pod \"placement-6fd7c7bb8d-6cc8x\" (UID: \"cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5\") " pod="openstack/placement-6fd7c7bb8d-6cc8x" Mar 08 00:53:11.112787 master-0 kubenswrapper[23041]: I0308 00:53:11.112742 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5-internal-tls-certs\") pod \"placement-6fd7c7bb8d-6cc8x\" (UID: \"cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5\") " pod="openstack/placement-6fd7c7bb8d-6cc8x" Mar 08 00:53:11.230881 master-0 kubenswrapper[23041]: I0308 00:53:11.230761 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6fd7c7bb8d-6cc8x" Mar 08 00:53:11.332557 master-0 kubenswrapper[23041]: I0308 00:53:11.332491 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-76cc655964-lxxvl"] Mar 08 00:53:11.338404 master-0 kubenswrapper[23041]: I0308 00:53:11.337563 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1280f-default-external-api-0" event={"ID":"c631af1a-025f-4c65-b202-678d31efbc2d","Type":"ContainerStarted","Data":"fb23bcb3de29cd56ece6d3cdc15a6d92669f6ab087162b1b8b2aa35516d08ebc"} Mar 08 00:53:11.347379 master-0 kubenswrapper[23041]: I0308 00:53:11.346859 23041 generic.go:334] "Generic (PLEG): container finished" podID="24c18a5d-ebab-491a-8bf4-f6271242e4f3" containerID="8a54dabca525307204c0cc51fa2007c5a14407a66fee75ded69b720eb65069b5" exitCode=0 Mar 08 00:53:11.347379 master-0 kubenswrapper[23041]: I0308 00:53:11.346959 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-hxms8" event={"ID":"24c18a5d-ebab-491a-8bf4-f6271242e4f3","Type":"ContainerDied","Data":"8a54dabca525307204c0cc51fa2007c5a14407a66fee75ded69b720eb65069b5"} Mar 08 00:53:11.349654 master-0 kubenswrapper[23041]: I0308 00:53:11.349580 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-77c9977ddd-2q2jp" event={"ID":"f03bc658-603e-4577-9e45-62c36bc250c8","Type":"ContainerStarted","Data":"9f71417b75481f8046e0fda94934ec461bea4cf3eb6816989434139b4ea5a0b3"} Mar 08 00:53:11.781102 master-0 kubenswrapper[23041]: I0308 00:53:11.780820 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6fd7c7bb8d-6cc8x"] Mar 08 00:53:12.372250 master-0 kubenswrapper[23041]: I0308 00:53:12.371716 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-hxms8" event={"ID":"24c18a5d-ebab-491a-8bf4-f6271242e4f3","Type":"ContainerStarted","Data":"8ea27f9a76b98f282a103da574cd5f4cf64dd2d1609d9ceb48b98ebe91c7d1de"} Mar 08 
00:53:12.375628 master-0 kubenswrapper[23041]: I0308 00:53:12.375583 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6fd7c7bb8d-6cc8x" event={"ID":"cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5","Type":"ContainerStarted","Data":"7c5a9820ba6b371d24607b80d76e62414af59db01167769752941fc1e8aa620b"} Mar 08 00:53:12.375628 master-0 kubenswrapper[23041]: I0308 00:53:12.375619 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6fd7c7bb8d-6cc8x" event={"ID":"cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5","Type":"ContainerStarted","Data":"618034a1030a5526d85738fd163ac28d075d24e0990d9f0ea2dd798ccf67c23f"} Mar 08 00:53:12.375628 master-0 kubenswrapper[23041]: I0308 00:53:12.375631 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6fd7c7bb8d-6cc8x" event={"ID":"cc50d8b5-fda7-4f99-a8cb-ec20969c1ce5","Type":"ContainerStarted","Data":"4fb0ca789621294367e02d4c0f495c76d69127bb23233cd3b1a65b80d064b2d3"} Mar 08 00:53:12.376562 master-0 kubenswrapper[23041]: I0308 00:53:12.376530 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6fd7c7bb8d-6cc8x" Mar 08 00:53:12.376620 master-0 kubenswrapper[23041]: I0308 00:53:12.376564 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6fd7c7bb8d-6cc8x" Mar 08 00:53:12.383421 master-0 kubenswrapper[23041]: I0308 00:53:12.383371 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-675ba-db-sync-8zxxl" event={"ID":"309c80e9-6a3a-45cb-93c9-216d39c74f61","Type":"ContainerStarted","Data":"61f55b350bec7b6f23cc9e7373d5dcb07c7b17b7f28524333fcb7b6911059275"} Mar 08 00:53:12.395417 master-0 kubenswrapper[23041]: I0308 00:53:12.394377 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-77c9977ddd-2q2jp" 
event={"ID":"f03bc658-603e-4577-9e45-62c36bc250c8","Type":"ContainerStarted","Data":"2564d5010b41fcc2556ec59c7e2d070e97607724374ebe36d4f3995931a2b034"} Mar 08 00:53:12.395654 master-0 kubenswrapper[23041]: I0308 00:53:12.395627 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-77c9977ddd-2q2jp" Mar 08 00:53:12.397194 master-0 kubenswrapper[23041]: I0308 00:53:12.397140 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76cc655964-lxxvl" event={"ID":"dda659a2-1e52-4b13-8b9d-401d3fcaf800","Type":"ContainerStarted","Data":"9735e8ddc8805560fa3e97dc71eb29456ad289e49e08e0456728c29f4e068ccc"} Mar 08 00:53:12.397194 master-0 kubenswrapper[23041]: I0308 00:53:12.397172 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76cc655964-lxxvl" event={"ID":"dda659a2-1e52-4b13-8b9d-401d3fcaf800","Type":"ContainerStarted","Data":"360ea98e790bef4d537b7b519f38ac7d02200079fce92bb8aa2572a221740ebf"} Mar 08 00:53:12.397194 master-0 kubenswrapper[23041]: I0308 00:53:12.397184 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76cc655964-lxxvl" event={"ID":"dda659a2-1e52-4b13-8b9d-401d3fcaf800","Type":"ContainerStarted","Data":"a24542608e30dbbd4d9c68873ded34f607f87481e14cdcd9d45b5024a584b2ec"} Mar 08 00:53:12.399948 master-0 kubenswrapper[23041]: I0308 00:53:12.397801 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-76cc655964-lxxvl" Mar 08 00:53:12.399948 master-0 kubenswrapper[23041]: I0308 00:53:12.397839 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-76cc655964-lxxvl" Mar 08 00:53:12.407608 master-0 kubenswrapper[23041]: I0308 00:53:12.406177 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-db-sync-hxms8" podStartSLOduration=4.773182324 podStartE2EDuration="23.406149397s" podCreationTimestamp="2026-03-08 00:52:49 
+0000 UTC" firstStartedPulling="2026-03-08 00:52:51.064864188 +0000 UTC m=+1276.537700742" lastFinishedPulling="2026-03-08 00:53:09.697831261 +0000 UTC m=+1295.170667815" observedRunningTime="2026-03-08 00:53:12.394908352 +0000 UTC m=+1297.867744906" watchObservedRunningTime="2026-03-08 00:53:12.406149397 +0000 UTC m=+1297.878985991" Mar 08 00:53:12.420092 master-0 kubenswrapper[23041]: I0308 00:53:12.410381 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1280f-default-external-api-0" event={"ID":"c631af1a-025f-4c65-b202-678d31efbc2d","Type":"ContainerStarted","Data":"3024bab15b9ea083977d343416d78095bf57369694338550e0140f3e4baef939"} Mar 08 00:53:12.427930 master-0 kubenswrapper[23041]: I0308 00:53:12.427847 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-675ba-db-sync-8zxxl" podStartSLOduration=5.528065038 podStartE2EDuration="34.427826806s" podCreationTimestamp="2026-03-08 00:52:38 +0000 UTC" firstStartedPulling="2026-03-08 00:52:41.564661227 +0000 UTC m=+1267.037497771" lastFinishedPulling="2026-03-08 00:53:10.464422985 +0000 UTC m=+1295.937259539" observedRunningTime="2026-03-08 00:53:12.422059986 +0000 UTC m=+1297.894896540" watchObservedRunningTime="2026-03-08 00:53:12.427826806 +0000 UTC m=+1297.900663360" Mar 08 00:53:12.464232 master-0 kubenswrapper[23041]: I0308 00:53:12.460722 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6fd7c7bb8d-6cc8x" podStartSLOduration=2.460510785 podStartE2EDuration="2.460510785s" podCreationTimestamp="2026-03-08 00:53:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:53:12.447666381 +0000 UTC m=+1297.920502945" watchObservedRunningTime="2026-03-08 00:53:12.460510785 +0000 UTC m=+1297.933347339" Mar 08 00:53:12.558816 master-0 kubenswrapper[23041]: I0308 00:53:12.558642 23041 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-77c9977ddd-2q2jp" podStartSLOduration=3.5586167829999997 podStartE2EDuration="3.558616783s" podCreationTimestamp="2026-03-08 00:53:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:53:12.554167574 +0000 UTC m=+1298.027004128" watchObservedRunningTime="2026-03-08 00:53:12.558616783 +0000 UTC m=+1298.031453337"
Mar 08 00:53:12.627954 master-0 kubenswrapper[23041]: I0308 00:53:12.626050 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-76cc655964-lxxvl" podStartSLOduration=3.62602589 podStartE2EDuration="3.62602589s" podCreationTimestamp="2026-03-08 00:53:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:53:12.623726904 +0000 UTC m=+1298.096563458" watchObservedRunningTime="2026-03-08 00:53:12.62602589 +0000 UTC m=+1298.098862454"
Mar 08 00:53:12.666937 master-0 kubenswrapper[23041]: I0308 00:53:12.666792 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-1280f-default-external-api-0" podStartSLOduration=20.666766936 podStartE2EDuration="20.666766936s" podCreationTimestamp="2026-03-08 00:52:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:53:12.661722102 +0000 UTC m=+1298.134558656" watchObservedRunningTime="2026-03-08 00:53:12.666766936 +0000 UTC m=+1298.139603490"
Mar 08 00:53:15.448153 master-0 kubenswrapper[23041]: I0308 00:53:15.448087 23041 generic.go:334] "Generic (PLEG): container finished" podID="9ef69671-8e3b-456f-9764-212721fba8e0" containerID="6fa2fc35d099d15db013f4024a180e05dbdce3a40bcd31c527ded344118bf564" exitCode=0
Mar 08 00:53:15.448153 master-0 kubenswrapper[23041]: I0308 00:53:15.448154 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bxnnn" event={"ID":"9ef69671-8e3b-456f-9764-212721fba8e0","Type":"ContainerDied","Data":"6fa2fc35d099d15db013f4024a180e05dbdce3a40bcd31c527ded344118bf564"}
Mar 08 00:53:16.533901 master-0 kubenswrapper[23041]: I0308 00:53:16.533843 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:53:16.534663 master-0 kubenswrapper[23041]: I0308 00:53:16.534629 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:53:16.582615 master-0 kubenswrapper[23041]: I0308 00:53:16.582554 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:53:16.589065 master-0 kubenswrapper[23041]: I0308 00:53:16.589020 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:53:16.871773 master-0 kubenswrapper[23041]: I0308 00:53:16.871725 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-bxnnn"
Mar 08 00:53:16.917267 master-0 kubenswrapper[23041]: I0308 00:53:16.917089 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgzxg\" (UniqueName: \"kubernetes.io/projected/9ef69671-8e3b-456f-9764-212721fba8e0-kube-api-access-dgzxg\") pod \"9ef69671-8e3b-456f-9764-212721fba8e0\" (UID: \"9ef69671-8e3b-456f-9764-212721fba8e0\") "
Mar 08 00:53:16.917485 master-0 kubenswrapper[23041]: I0308 00:53:16.917336 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ef69671-8e3b-456f-9764-212721fba8e0-combined-ca-bundle\") pod \"9ef69671-8e3b-456f-9764-212721fba8e0\" (UID: \"9ef69671-8e3b-456f-9764-212721fba8e0\") "
Mar 08 00:53:16.917485 master-0 kubenswrapper[23041]: I0308 00:53:16.917443 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9ef69671-8e3b-456f-9764-212721fba8e0-config\") pod \"9ef69671-8e3b-456f-9764-212721fba8e0\" (UID: \"9ef69671-8e3b-456f-9764-212721fba8e0\") "
Mar 08 00:53:16.922766 master-0 kubenswrapper[23041]: I0308 00:53:16.922269 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ef69671-8e3b-456f-9764-212721fba8e0-kube-api-access-dgzxg" (OuterVolumeSpecName: "kube-api-access-dgzxg") pod "9ef69671-8e3b-456f-9764-212721fba8e0" (UID: "9ef69671-8e3b-456f-9764-212721fba8e0"). InnerVolumeSpecName "kube-api-access-dgzxg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:53:16.949618 master-0 kubenswrapper[23041]: I0308 00:53:16.949545 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ef69671-8e3b-456f-9764-212721fba8e0-config" (OuterVolumeSpecName: "config") pod "9ef69671-8e3b-456f-9764-212721fba8e0" (UID: "9ef69671-8e3b-456f-9764-212721fba8e0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:53:16.959530 master-0 kubenswrapper[23041]: I0308 00:53:16.955356 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ef69671-8e3b-456f-9764-212721fba8e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ef69671-8e3b-456f-9764-212721fba8e0" (UID: "9ef69671-8e3b-456f-9764-212721fba8e0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:53:17.020299 master-0 kubenswrapper[23041]: I0308 00:53:17.020253 23041 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ef69671-8e3b-456f-9764-212721fba8e0-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:17.020299 master-0 kubenswrapper[23041]: I0308 00:53:17.020294 23041 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/9ef69671-8e3b-456f-9764-212721fba8e0-config\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:17.020542 master-0 kubenswrapper[23041]: I0308 00:53:17.020309 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgzxg\" (UniqueName: \"kubernetes.io/projected/9ef69671-8e3b-456f-9764-212721fba8e0-kube-api-access-dgzxg\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:17.473222 master-0 kubenswrapper[23041]: I0308 00:53:17.473146 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bxnnn" event={"ID":"9ef69671-8e3b-456f-9764-212721fba8e0","Type":"ContainerDied","Data":"f013535cae396e733b3d30a55b68bf4daa9e22dd4f9fced915ddb064b9caf226"}
Mar 08 00:53:17.473222 master-0 kubenswrapper[23041]: I0308 00:53:17.473179 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-bxnnn"
Mar 08 00:53:17.473222 master-0 kubenswrapper[23041]: I0308 00:53:17.473219 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f013535cae396e733b3d30a55b68bf4daa9e22dd4f9fced915ddb064b9caf226"
Mar 08 00:53:17.473976 master-0 kubenswrapper[23041]: I0308 00:53:17.473438 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:53:17.473976 master-0 kubenswrapper[23041]: I0308 00:53:17.473454 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:53:17.808066 master-0 kubenswrapper[23041]: I0308 00:53:17.807981 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6456885d89-8nk8d"]
Mar 08 00:53:17.808762 master-0 kubenswrapper[23041]: E0308 00:53:17.808678 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef69671-8e3b-456f-9764-212721fba8e0" containerName="neutron-db-sync"
Mar 08 00:53:17.808762 master-0 kubenswrapper[23041]: I0308 00:53:17.808700 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef69671-8e3b-456f-9764-212721fba8e0" containerName="neutron-db-sync"
Mar 08 00:53:17.809102 master-0 kubenswrapper[23041]: I0308 00:53:17.809076 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ef69671-8e3b-456f-9764-212721fba8e0" containerName="neutron-db-sync"
Mar 08 00:53:17.811101 master-0 kubenswrapper[23041]: I0308 00:53:17.810682 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6456885d89-8nk8d"
Mar 08 00:53:17.856789 master-0 kubenswrapper[23041]: I0308 00:53:17.850486 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6456885d89-8nk8d"]
Mar 08 00:53:17.973265 master-0 kubenswrapper[23041]: I0308 00:53:17.968727 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-78756bd8-c6jzz"]
Mar 08 00:53:17.973265 master-0 kubenswrapper[23041]: I0308 00:53:17.970671 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-78756bd8-c6jzz"
Mar 08 00:53:17.974583 master-0 kubenswrapper[23041]: I0308 00:53:17.973968 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Mar 08 00:53:17.974583 master-0 kubenswrapper[23041]: I0308 00:53:17.974457 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs"
Mar 08 00:53:17.978240 master-0 kubenswrapper[23041]: I0308 00:53:17.975942 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Mar 08 00:53:18.002774 master-0 kubenswrapper[23041]: I0308 00:53:18.001847 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-78756bd8-c6jzz"]
Mar 08 00:53:18.003973 master-0 kubenswrapper[23041]: I0308 00:53:18.003908 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/230fb7a5-d46d-4b2e-b0c4-4d3f998564f4-dns-svc\") pod \"dnsmasq-dns-6456885d89-8nk8d\" (UID: \"230fb7a5-d46d-4b2e-b0c4-4d3f998564f4\") " pod="openstack/dnsmasq-dns-6456885d89-8nk8d"
Mar 08 00:53:18.004058 master-0 kubenswrapper[23041]: I0308 00:53:18.003973 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/230fb7a5-d46d-4b2e-b0c4-4d3f998564f4-config\") pod \"dnsmasq-dns-6456885d89-8nk8d\" (UID: \"230fb7a5-d46d-4b2e-b0c4-4d3f998564f4\") " pod="openstack/dnsmasq-dns-6456885d89-8nk8d"
Mar 08 00:53:18.004177 master-0 kubenswrapper[23041]: I0308 00:53:18.004139 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/230fb7a5-d46d-4b2e-b0c4-4d3f998564f4-ovsdbserver-sb\") pod \"dnsmasq-dns-6456885d89-8nk8d\" (UID: \"230fb7a5-d46d-4b2e-b0c4-4d3f998564f4\") " pod="openstack/dnsmasq-dns-6456885d89-8nk8d"
Mar 08 00:53:18.004290 master-0 kubenswrapper[23041]: I0308 00:53:18.004181 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/230fb7a5-d46d-4b2e-b0c4-4d3f998564f4-ovsdbserver-nb\") pod \"dnsmasq-dns-6456885d89-8nk8d\" (UID: \"230fb7a5-d46d-4b2e-b0c4-4d3f998564f4\") " pod="openstack/dnsmasq-dns-6456885d89-8nk8d"
Mar 08 00:53:18.004459 master-0 kubenswrapper[23041]: I0308 00:53:18.004401 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzp8h\" (UniqueName: \"kubernetes.io/projected/230fb7a5-d46d-4b2e-b0c4-4d3f998564f4-kube-api-access-rzp8h\") pod \"dnsmasq-dns-6456885d89-8nk8d\" (UID: \"230fb7a5-d46d-4b2e-b0c4-4d3f998564f4\") " pod="openstack/dnsmasq-dns-6456885d89-8nk8d"
Mar 08 00:53:18.004459 master-0 kubenswrapper[23041]: I0308 00:53:18.004439 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/230fb7a5-d46d-4b2e-b0c4-4d3f998564f4-dns-swift-storage-0\") pod \"dnsmasq-dns-6456885d89-8nk8d\" (UID: \"230fb7a5-d46d-4b2e-b0c4-4d3f998564f4\") " pod="openstack/dnsmasq-dns-6456885d89-8nk8d"
Mar 08 00:53:18.114239 master-0 kubenswrapper[23041]: I0308 00:53:18.110677 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/47c3f888-4804-49d5-859a-73983e7c5414-config\") pod \"neutron-78756bd8-c6jzz\" (UID: \"47c3f888-4804-49d5-859a-73983e7c5414\") " pod="openstack/neutron-78756bd8-c6jzz"
Mar 08 00:53:18.114239 master-0 kubenswrapper[23041]: I0308 00:53:18.110758 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf9p2\" (UniqueName: \"kubernetes.io/projected/47c3f888-4804-49d5-859a-73983e7c5414-kube-api-access-tf9p2\") pod \"neutron-78756bd8-c6jzz\" (UID: \"47c3f888-4804-49d5-859a-73983e7c5414\") " pod="openstack/neutron-78756bd8-c6jzz"
Mar 08 00:53:18.114239 master-0 kubenswrapper[23041]: I0308 00:53:18.110852 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/230fb7a5-d46d-4b2e-b0c4-4d3f998564f4-ovsdbserver-sb\") pod \"dnsmasq-dns-6456885d89-8nk8d\" (UID: \"230fb7a5-d46d-4b2e-b0c4-4d3f998564f4\") " pod="openstack/dnsmasq-dns-6456885d89-8nk8d"
Mar 08 00:53:18.114239 master-0 kubenswrapper[23041]: I0308 00:53:18.110876 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/230fb7a5-d46d-4b2e-b0c4-4d3f998564f4-ovsdbserver-nb\") pod \"dnsmasq-dns-6456885d89-8nk8d\" (UID: \"230fb7a5-d46d-4b2e-b0c4-4d3f998564f4\") " pod="openstack/dnsmasq-dns-6456885d89-8nk8d"
Mar 08 00:53:18.114239 master-0 kubenswrapper[23041]: I0308 00:53:18.110927 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47c3f888-4804-49d5-859a-73983e7c5414-combined-ca-bundle\") pod \"neutron-78756bd8-c6jzz\" (UID: \"47c3f888-4804-49d5-859a-73983e7c5414\") " pod="openstack/neutron-78756bd8-c6jzz"
Mar 08 00:53:18.114239 master-0 kubenswrapper[23041]: I0308 00:53:18.110971 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/47c3f888-4804-49d5-859a-73983e7c5414-httpd-config\") pod \"neutron-78756bd8-c6jzz\" (UID: \"47c3f888-4804-49d5-859a-73983e7c5414\") " pod="openstack/neutron-78756bd8-c6jzz"
Mar 08 00:53:18.114239 master-0 kubenswrapper[23041]: I0308 00:53:18.111011 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzp8h\" (UniqueName: \"kubernetes.io/projected/230fb7a5-d46d-4b2e-b0c4-4d3f998564f4-kube-api-access-rzp8h\") pod \"dnsmasq-dns-6456885d89-8nk8d\" (UID: \"230fb7a5-d46d-4b2e-b0c4-4d3f998564f4\") " pod="openstack/dnsmasq-dns-6456885d89-8nk8d"
Mar 08 00:53:18.114239 master-0 kubenswrapper[23041]: I0308 00:53:18.111029 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/230fb7a5-d46d-4b2e-b0c4-4d3f998564f4-dns-swift-storage-0\") pod \"dnsmasq-dns-6456885d89-8nk8d\" (UID: \"230fb7a5-d46d-4b2e-b0c4-4d3f998564f4\") " pod="openstack/dnsmasq-dns-6456885d89-8nk8d"
Mar 08 00:53:18.114239 master-0 kubenswrapper[23041]: I0308 00:53:18.111064 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/230fb7a5-d46d-4b2e-b0c4-4d3f998564f4-dns-svc\") pod \"dnsmasq-dns-6456885d89-8nk8d\" (UID: \"230fb7a5-d46d-4b2e-b0c4-4d3f998564f4\") " pod="openstack/dnsmasq-dns-6456885d89-8nk8d"
Mar 08 00:53:18.114239 master-0 kubenswrapper[23041]: I0308 00:53:18.111093 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/47c3f888-4804-49d5-859a-73983e7c5414-ovndb-tls-certs\") pod \"neutron-78756bd8-c6jzz\" (UID: \"47c3f888-4804-49d5-859a-73983e7c5414\") " pod="openstack/neutron-78756bd8-c6jzz"
Mar 08 00:53:18.114239 master-0 kubenswrapper[23041]: I0308 00:53:18.111115 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/230fb7a5-d46d-4b2e-b0c4-4d3f998564f4-config\") pod \"dnsmasq-dns-6456885d89-8nk8d\" (UID: \"230fb7a5-d46d-4b2e-b0c4-4d3f998564f4\") " pod="openstack/dnsmasq-dns-6456885d89-8nk8d"
Mar 08 00:53:18.114239 master-0 kubenswrapper[23041]: I0308 00:53:18.112167 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/230fb7a5-d46d-4b2e-b0c4-4d3f998564f4-config\") pod \"dnsmasq-dns-6456885d89-8nk8d\" (UID: \"230fb7a5-d46d-4b2e-b0c4-4d3f998564f4\") " pod="openstack/dnsmasq-dns-6456885d89-8nk8d"
Mar 08 00:53:18.114239 master-0 kubenswrapper[23041]: I0308 00:53:18.112804 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/230fb7a5-d46d-4b2e-b0c4-4d3f998564f4-ovsdbserver-sb\") pod \"dnsmasq-dns-6456885d89-8nk8d\" (UID: \"230fb7a5-d46d-4b2e-b0c4-4d3f998564f4\") " pod="openstack/dnsmasq-dns-6456885d89-8nk8d"
Mar 08 00:53:18.114239 master-0 kubenswrapper[23041]: I0308 00:53:18.113050 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/230fb7a5-d46d-4b2e-b0c4-4d3f998564f4-dns-swift-storage-0\") pod \"dnsmasq-dns-6456885d89-8nk8d\" (UID: \"230fb7a5-d46d-4b2e-b0c4-4d3f998564f4\") " pod="openstack/dnsmasq-dns-6456885d89-8nk8d"
Mar 08 00:53:18.114239 master-0 kubenswrapper[23041]: I0308 00:53:18.113548 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/230fb7a5-d46d-4b2e-b0c4-4d3f998564f4-ovsdbserver-nb\") pod \"dnsmasq-dns-6456885d89-8nk8d\" (UID: \"230fb7a5-d46d-4b2e-b0c4-4d3f998564f4\") " pod="openstack/dnsmasq-dns-6456885d89-8nk8d"
Mar 08 00:53:18.114239 master-0 kubenswrapper[23041]: I0308 00:53:18.113643 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/230fb7a5-d46d-4b2e-b0c4-4d3f998564f4-dns-svc\") pod \"dnsmasq-dns-6456885d89-8nk8d\" (UID: \"230fb7a5-d46d-4b2e-b0c4-4d3f998564f4\") " pod="openstack/dnsmasq-dns-6456885d89-8nk8d"
Mar 08 00:53:18.144242 master-0 kubenswrapper[23041]: I0308 00:53:18.143333 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzp8h\" (UniqueName: \"kubernetes.io/projected/230fb7a5-d46d-4b2e-b0c4-4d3f998564f4-kube-api-access-rzp8h\") pod \"dnsmasq-dns-6456885d89-8nk8d\" (UID: \"230fb7a5-d46d-4b2e-b0c4-4d3f998564f4\") " pod="openstack/dnsmasq-dns-6456885d89-8nk8d"
Mar 08 00:53:18.218402 master-0 kubenswrapper[23041]: I0308 00:53:18.212639 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/47c3f888-4804-49d5-859a-73983e7c5414-httpd-config\") pod \"neutron-78756bd8-c6jzz\" (UID: \"47c3f888-4804-49d5-859a-73983e7c5414\") " pod="openstack/neutron-78756bd8-c6jzz"
Mar 08 00:53:18.218402 master-0 kubenswrapper[23041]: I0308 00:53:18.212771 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/47c3f888-4804-49d5-859a-73983e7c5414-ovndb-tls-certs\") pod \"neutron-78756bd8-c6jzz\" (UID: \"47c3f888-4804-49d5-859a-73983e7c5414\") " pod="openstack/neutron-78756bd8-c6jzz"
Mar 08 00:53:18.218402 master-0 kubenswrapper[23041]: I0308 00:53:18.212825 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/47c3f888-4804-49d5-859a-73983e7c5414-config\") pod \"neutron-78756bd8-c6jzz\" (UID: \"47c3f888-4804-49d5-859a-73983e7c5414\") " pod="openstack/neutron-78756bd8-c6jzz"
Mar 08 00:53:18.218402 master-0 kubenswrapper[23041]: I0308 00:53:18.212849 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tf9p2\" (UniqueName: \"kubernetes.io/projected/47c3f888-4804-49d5-859a-73983e7c5414-kube-api-access-tf9p2\") pod \"neutron-78756bd8-c6jzz\" (UID: \"47c3f888-4804-49d5-859a-73983e7c5414\") " pod="openstack/neutron-78756bd8-c6jzz"
Mar 08 00:53:18.218402 master-0 kubenswrapper[23041]: I0308 00:53:18.212977 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47c3f888-4804-49d5-859a-73983e7c5414-combined-ca-bundle\") pod \"neutron-78756bd8-c6jzz\" (UID: \"47c3f888-4804-49d5-859a-73983e7c5414\") " pod="openstack/neutron-78756bd8-c6jzz"
Mar 08 00:53:18.223231 master-0 kubenswrapper[23041]: I0308 00:53:18.222092 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/47c3f888-4804-49d5-859a-73983e7c5414-httpd-config\") pod \"neutron-78756bd8-c6jzz\" (UID: \"47c3f888-4804-49d5-859a-73983e7c5414\") " pod="openstack/neutron-78756bd8-c6jzz"
Mar 08 00:53:18.223231 master-0 kubenswrapper[23041]: I0308 00:53:18.222301 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47c3f888-4804-49d5-859a-73983e7c5414-combined-ca-bundle\") pod \"neutron-78756bd8-c6jzz\" (UID: \"47c3f888-4804-49d5-859a-73983e7c5414\") " pod="openstack/neutron-78756bd8-c6jzz"
Mar 08 00:53:18.223231 master-0 kubenswrapper[23041]: I0308 00:53:18.222477 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/47c3f888-4804-49d5-859a-73983e7c5414-ovndb-tls-certs\") pod \"neutron-78756bd8-c6jzz\" (UID: \"47c3f888-4804-49d5-859a-73983e7c5414\") " pod="openstack/neutron-78756bd8-c6jzz"
Mar 08 00:53:18.234405 master-0 kubenswrapper[23041]: I0308 00:53:18.233469 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/47c3f888-4804-49d5-859a-73983e7c5414-config\") pod \"neutron-78756bd8-c6jzz\" (UID: \"47c3f888-4804-49d5-859a-73983e7c5414\") " pod="openstack/neutron-78756bd8-c6jzz"
Mar 08 00:53:18.263169 master-0 kubenswrapper[23041]: I0308 00:53:18.263119 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf9p2\" (UniqueName: \"kubernetes.io/projected/47c3f888-4804-49d5-859a-73983e7c5414-kube-api-access-tf9p2\") pod \"neutron-78756bd8-c6jzz\" (UID: \"47c3f888-4804-49d5-859a-73983e7c5414\") " pod="openstack/neutron-78756bd8-c6jzz"
Mar 08 00:53:18.280354 master-0 kubenswrapper[23041]: I0308 00:53:18.280277 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6456885d89-8nk8d"
Mar 08 00:53:18.369636 master-0 kubenswrapper[23041]: I0308 00:53:18.367521 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-78756bd8-c6jzz"
Mar 08 00:53:18.884304 master-0 kubenswrapper[23041]: I0308 00:53:18.884248 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6456885d89-8nk8d"]
Mar 08 00:53:19.050235 master-0 kubenswrapper[23041]: I0308 00:53:19.046016 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-78756bd8-c6jzz"]
Mar 08 00:53:19.050235 master-0 kubenswrapper[23041]: W0308 00:53:19.047369 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47c3f888_4804_49d5_859a_73983e7c5414.slice/crio-d2912b3f5fdf83b455b6dde8396ead89c07d1b8442d19b8e891ffe9e63480bd7 WatchSource:0}: Error finding container d2912b3f5fdf83b455b6dde8396ead89c07d1b8442d19b8e891ffe9e63480bd7: Status 404 returned error can't find the container with id d2912b3f5fdf83b455b6dde8396ead89c07d1b8442d19b8e891ffe9e63480bd7
Mar 08 00:53:19.518593 master-0 kubenswrapper[23041]: I0308 00:53:19.518506 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78756bd8-c6jzz" event={"ID":"47c3f888-4804-49d5-859a-73983e7c5414","Type":"ContainerStarted","Data":"755712859b90d8761fbf147c13e124a50c1f56fa9d9c90abcd9e2282903e91a4"}
Mar 08 00:53:19.519035 master-0 kubenswrapper[23041]: I0308 00:53:19.519012 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78756bd8-c6jzz" event={"ID":"47c3f888-4804-49d5-859a-73983e7c5414","Type":"ContainerStarted","Data":"d2912b3f5fdf83b455b6dde8396ead89c07d1b8442d19b8e891ffe9e63480bd7"}
Mar 08 00:53:19.521787 master-0 kubenswrapper[23041]: I0308 00:53:19.521733 23041 generic.go:334] "Generic (PLEG): container finished" podID="230fb7a5-d46d-4b2e-b0c4-4d3f998564f4" containerID="f832601f1f9584306d99adaf226e4ab085eaf5328dc618255cbcdaae284c01a6" exitCode=0
Mar 08 00:53:19.522063 master-0 kubenswrapper[23041]: I0308 00:53:19.522017 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6456885d89-8nk8d" event={"ID":"230fb7a5-d46d-4b2e-b0c4-4d3f998564f4","Type":"ContainerDied","Data":"f832601f1f9584306d99adaf226e4ab085eaf5328dc618255cbcdaae284c01a6"}
Mar 08 00:53:19.522398 master-0 kubenswrapper[23041]: I0308 00:53:19.522347 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6456885d89-8nk8d" event={"ID":"230fb7a5-d46d-4b2e-b0c4-4d3f998564f4","Type":"ContainerStarted","Data":"1f963be9d0823e1f4411cc3cb7931975ab6b5178c0f3af05e672949fe4f3f509"}
Mar 08 00:53:20.542293 master-0 kubenswrapper[23041]: I0308 00:53:20.542159 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6456885d89-8nk8d" event={"ID":"230fb7a5-d46d-4b2e-b0c4-4d3f998564f4","Type":"ContainerStarted","Data":"450bb6f5a74724b5126661c98fd0b4b100ff69a645bbd83ee463ca14ee851886"}
Mar 08 00:53:20.543980 master-0 kubenswrapper[23041]: I0308 00:53:20.543910 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6456885d89-8nk8d"
Mar 08 00:53:20.547224 master-0 kubenswrapper[23041]: I0308 00:53:20.547001 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78756bd8-c6jzz" event={"ID":"47c3f888-4804-49d5-859a-73983e7c5414","Type":"ContainerStarted","Data":"43f3d316ce033f2e3b7eb9dd2cc9c82219374cedb6454734b3b655f66bc9ce28"}
Mar 08 00:53:20.547479 master-0 kubenswrapper[23041]: I0308 00:53:20.547302 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-78756bd8-c6jzz"
Mar 08 00:53:20.671264 master-0 kubenswrapper[23041]: I0308 00:53:20.671179 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6456885d89-8nk8d" podStartSLOduration=3.671157708 podStartE2EDuration="3.671157708s" podCreationTimestamp="2026-03-08 00:53:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:53:20.668137134 +0000 UTC m=+1306.140973708" watchObservedRunningTime="2026-03-08 00:53:20.671157708 +0000 UTC m=+1306.143994262"
Mar 08 00:53:20.711400 master-0 kubenswrapper[23041]: I0308 00:53:20.711290 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-78756bd8-c6jzz" podStartSLOduration=3.711264258 podStartE2EDuration="3.711264258s" podCreationTimestamp="2026-03-08 00:53:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:53:20.709560446 +0000 UTC m=+1306.182397010" watchObservedRunningTime="2026-03-08 00:53:20.711264258 +0000 UTC m=+1306.184100812"
Mar 08 00:53:21.230520 master-0 kubenswrapper[23041]: I0308 00:53:21.230401 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:53:21.230520 master-0 kubenswrapper[23041]: I0308 00:53:21.230521 23041 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 08 00:53:21.252344 master-0 kubenswrapper[23041]: I0308 00:53:21.252282 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:53:21.309874 master-0 kubenswrapper[23041]: I0308 00:53:21.309820 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-79bd95bbf9-vglm6"]
Mar 08 00:53:21.311702 master-0 kubenswrapper[23041]: I0308 00:53:21.311679 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-79bd95bbf9-vglm6"
Mar 08 00:53:21.317788 master-0 kubenswrapper[23041]: I0308 00:53:21.317731 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Mar 08 00:53:21.318027 master-0 kubenswrapper[23041]: I0308 00:53:21.317998 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Mar 08 00:53:21.415415 master-0 kubenswrapper[23041]: I0308 00:53:21.415353 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pnjz\" (UniqueName: \"kubernetes.io/projected/9e2f358d-3251-4e23-9126-b33e9f6866c7-kube-api-access-8pnjz\") pod \"neutron-79bd95bbf9-vglm6\" (UID: \"9e2f358d-3251-4e23-9126-b33e9f6866c7\") " pod="openstack/neutron-79bd95bbf9-vglm6"
Mar 08 00:53:21.415616 master-0 kubenswrapper[23041]: I0308 00:53:21.415427 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e2f358d-3251-4e23-9126-b33e9f6866c7-internal-tls-certs\") pod \"neutron-79bd95bbf9-vglm6\" (UID: \"9e2f358d-3251-4e23-9126-b33e9f6866c7\") " pod="openstack/neutron-79bd95bbf9-vglm6"
Mar 08 00:53:21.415616 master-0 kubenswrapper[23041]: I0308 00:53:21.415485 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9e2f358d-3251-4e23-9126-b33e9f6866c7-httpd-config\") pod \"neutron-79bd95bbf9-vglm6\" (UID: \"9e2f358d-3251-4e23-9126-b33e9f6866c7\") " pod="openstack/neutron-79bd95bbf9-vglm6"
Mar 08 00:53:21.415616 master-0 kubenswrapper[23041]: I0308 00:53:21.415517 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e2f358d-3251-4e23-9126-b33e9f6866c7-public-tls-certs\") pod \"neutron-79bd95bbf9-vglm6\" (UID: \"9e2f358d-3251-4e23-9126-b33e9f6866c7\") " pod="openstack/neutron-79bd95bbf9-vglm6"
Mar 08 00:53:21.415616 master-0 kubenswrapper[23041]: I0308 00:53:21.415582 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e2f358d-3251-4e23-9126-b33e9f6866c7-ovndb-tls-certs\") pod \"neutron-79bd95bbf9-vglm6\" (UID: \"9e2f358d-3251-4e23-9126-b33e9f6866c7\") " pod="openstack/neutron-79bd95bbf9-vglm6"
Mar 08 00:53:21.415753 master-0 kubenswrapper[23041]: I0308 00:53:21.415618 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9e2f358d-3251-4e23-9126-b33e9f6866c7-config\") pod \"neutron-79bd95bbf9-vglm6\" (UID: \"9e2f358d-3251-4e23-9126-b33e9f6866c7\") " pod="openstack/neutron-79bd95bbf9-vglm6"
Mar 08 00:53:21.415753 master-0 kubenswrapper[23041]: I0308 00:53:21.415669 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e2f358d-3251-4e23-9126-b33e9f6866c7-combined-ca-bundle\") pod \"neutron-79bd95bbf9-vglm6\" (UID: \"9e2f358d-3251-4e23-9126-b33e9f6866c7\") " pod="openstack/neutron-79bd95bbf9-vglm6"
Mar 08 00:53:21.416990 master-0 kubenswrapper[23041]: I0308 00:53:21.416964 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-79bd95bbf9-vglm6"]
Mar 08 00:53:21.520387 master-0 kubenswrapper[23041]: I0308 00:53:21.517826 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9e2f358d-3251-4e23-9126-b33e9f6866c7-httpd-config\") pod \"neutron-79bd95bbf9-vglm6\" (UID: \"9e2f358d-3251-4e23-9126-b33e9f6866c7\") " pod="openstack/neutron-79bd95bbf9-vglm6"
Mar 08 00:53:21.520387 master-0 kubenswrapper[23041]: I0308 00:53:21.517923 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e2f358d-3251-4e23-9126-b33e9f6866c7-public-tls-certs\") pod \"neutron-79bd95bbf9-vglm6\" (UID: \"9e2f358d-3251-4e23-9126-b33e9f6866c7\") " pod="openstack/neutron-79bd95bbf9-vglm6"
Mar 08 00:53:21.520387 master-0 kubenswrapper[23041]: I0308 00:53:21.517988 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e2f358d-3251-4e23-9126-b33e9f6866c7-ovndb-tls-certs\") pod \"neutron-79bd95bbf9-vglm6\" (UID: \"9e2f358d-3251-4e23-9126-b33e9f6866c7\") " pod="openstack/neutron-79bd95bbf9-vglm6"
Mar 08 00:53:21.520387 master-0 kubenswrapper[23041]: I0308 00:53:21.518050 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9e2f358d-3251-4e23-9126-b33e9f6866c7-config\") pod \"neutron-79bd95bbf9-vglm6\" (UID: \"9e2f358d-3251-4e23-9126-b33e9f6866c7\") " pod="openstack/neutron-79bd95bbf9-vglm6"
Mar 08 00:53:21.520387 master-0 kubenswrapper[23041]: I0308 00:53:21.518123 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e2f358d-3251-4e23-9126-b33e9f6866c7-combined-ca-bundle\") pod \"neutron-79bd95bbf9-vglm6\" (UID: \"9e2f358d-3251-4e23-9126-b33e9f6866c7\") " pod="openstack/neutron-79bd95bbf9-vglm6"
Mar 08 00:53:21.520387 master-0 kubenswrapper[23041]: I0308 00:53:21.518250 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pnjz\" (UniqueName: \"kubernetes.io/projected/9e2f358d-3251-4e23-9126-b33e9f6866c7-kube-api-access-8pnjz\") pod \"neutron-79bd95bbf9-vglm6\" (UID: \"9e2f358d-3251-4e23-9126-b33e9f6866c7\") " pod="openstack/neutron-79bd95bbf9-vglm6"
Mar 08 00:53:21.520387 master-0 kubenswrapper[23041]: I0308 00:53:21.518299 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e2f358d-3251-4e23-9126-b33e9f6866c7-internal-tls-certs\") pod \"neutron-79bd95bbf9-vglm6\" (UID: \"9e2f358d-3251-4e23-9126-b33e9f6866c7\") " pod="openstack/neutron-79bd95bbf9-vglm6"
Mar 08 00:53:21.525585 master-0 kubenswrapper[23041]: I0308 00:53:21.525433 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e2f358d-3251-4e23-9126-b33e9f6866c7-internal-tls-certs\") pod \"neutron-79bd95bbf9-vglm6\" (UID: \"9e2f358d-3251-4e23-9126-b33e9f6866c7\") " pod="openstack/neutron-79bd95bbf9-vglm6"
Mar 08 00:53:21.526088 master-0 kubenswrapper[23041]: I0308 00:53:21.526040 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e2f358d-3251-4e23-9126-b33e9f6866c7-ovndb-tls-certs\") pod \"neutron-79bd95bbf9-vglm6\" (UID: \"9e2f358d-3251-4e23-9126-b33e9f6866c7\") " pod="openstack/neutron-79bd95bbf9-vglm6"
Mar 08 00:53:21.527743 master-0 kubenswrapper[23041]: I0308 00:53:21.527486 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e2f358d-3251-4e23-9126-b33e9f6866c7-public-tls-certs\") pod \"neutron-79bd95bbf9-vglm6\" (UID: \"9e2f358d-3251-4e23-9126-b33e9f6866c7\") " pod="openstack/neutron-79bd95bbf9-vglm6"
Mar 08 00:53:21.528184 master-0 kubenswrapper[23041]: I0308 00:53:21.528128 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e2f358d-3251-4e23-9126-b33e9f6866c7-combined-ca-bundle\") pod \"neutron-79bd95bbf9-vglm6\" (UID: \"9e2f358d-3251-4e23-9126-b33e9f6866c7\") " pod="openstack/neutron-79bd95bbf9-vglm6"
Mar 08 00:53:21.539972 master-0 kubenswrapper[23041]: I0308 00:53:21.535095 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9e2f358d-3251-4e23-9126-b33e9f6866c7-httpd-config\") pod \"neutron-79bd95bbf9-vglm6\" (UID: \"9e2f358d-3251-4e23-9126-b33e9f6866c7\") " pod="openstack/neutron-79bd95bbf9-vglm6"
Mar 08 00:53:21.539972 master-0 kubenswrapper[23041]: I0308 00:53:21.538998 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9e2f358d-3251-4e23-9126-b33e9f6866c7-config\") pod \"neutron-79bd95bbf9-vglm6\" (UID: \"9e2f358d-3251-4e23-9126-b33e9f6866c7\") " pod="openstack/neutron-79bd95bbf9-vglm6"
Mar 08 00:53:21.544768 master-0 kubenswrapper[23041]: I0308 00:53:21.544714 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pnjz\" (UniqueName: \"kubernetes.io/projected/9e2f358d-3251-4e23-9126-b33e9f6866c7-kube-api-access-8pnjz\") pod \"neutron-79bd95bbf9-vglm6\" (UID: \"9e2f358d-3251-4e23-9126-b33e9f6866c7\") " pod="openstack/neutron-79bd95bbf9-vglm6"
Mar 08 00:53:21.559582 master-0 kubenswrapper[23041]: I0308 00:53:21.559519 23041 generic.go:334] "Generic (PLEG): container finished" podID="309c80e9-6a3a-45cb-93c9-216d39c74f61" containerID="61f55b350bec7b6f23cc9e7373d5dcb07c7b17b7f28524333fcb7b6911059275" exitCode=0
Mar 08 00:53:21.559914 master-0 kubenswrapper[23041]: I0308 00:53:21.559853 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-675ba-db-sync-8zxxl" event={"ID":"309c80e9-6a3a-45cb-93c9-216d39c74f61","Type":"ContainerDied","Data":"61f55b350bec7b6f23cc9e7373d5dcb07c7b17b7f28524333fcb7b6911059275"}
Mar 08 00:53:21.633094 master-0 kubenswrapper[23041]: I0308 00:53:21.633020 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-79bd95bbf9-vglm6"
Mar 08 00:53:22.248957 master-0 kubenswrapper[23041]: I0308 00:53:22.248881 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-79bd95bbf9-vglm6"]
Mar 08 00:53:22.575295 master-0 kubenswrapper[23041]: I0308 00:53:22.575178 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79bd95bbf9-vglm6" event={"ID":"9e2f358d-3251-4e23-9126-b33e9f6866c7","Type":"ContainerStarted","Data":"4f711bba6036c6a2b2f820d8a32b9589ef0f46c9534d74f9bea7b512c8b3951a"}
Mar 08 00:53:22.575295 master-0 kubenswrapper[23041]: I0308 00:53:22.575274 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79bd95bbf9-vglm6" event={"ID":"9e2f358d-3251-4e23-9126-b33e9f6866c7","Type":"ContainerStarted","Data":"d2e631dd9aea28ad6a822f6fe5ce7795bba081420fdc95e4535a01c6f70c75b0"}
Mar 08 00:53:23.014153 master-0 kubenswrapper[23041]: I0308 00:53:23.014108 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-675ba-db-sync-8zxxl" Mar 08 00:53:23.063248 master-0 kubenswrapper[23041]: I0308 00:53:23.060742 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/309c80e9-6a3a-45cb-93c9-216d39c74f61-combined-ca-bundle\") pod \"309c80e9-6a3a-45cb-93c9-216d39c74f61\" (UID: \"309c80e9-6a3a-45cb-93c9-216d39c74f61\") " Mar 08 00:53:23.063248 master-0 kubenswrapper[23041]: I0308 00:53:23.060862 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vw2j6\" (UniqueName: \"kubernetes.io/projected/309c80e9-6a3a-45cb-93c9-216d39c74f61-kube-api-access-vw2j6\") pod \"309c80e9-6a3a-45cb-93c9-216d39c74f61\" (UID: \"309c80e9-6a3a-45cb-93c9-216d39c74f61\") " Mar 08 00:53:23.063248 master-0 kubenswrapper[23041]: I0308 00:53:23.060949 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/309c80e9-6a3a-45cb-93c9-216d39c74f61-config-data\") pod \"309c80e9-6a3a-45cb-93c9-216d39c74f61\" (UID: \"309c80e9-6a3a-45cb-93c9-216d39c74f61\") " Mar 08 00:53:23.063248 master-0 kubenswrapper[23041]: I0308 00:53:23.061089 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/309c80e9-6a3a-45cb-93c9-216d39c74f61-db-sync-config-data\") pod \"309c80e9-6a3a-45cb-93c9-216d39c74f61\" (UID: \"309c80e9-6a3a-45cb-93c9-216d39c74f61\") " Mar 08 00:53:23.063248 master-0 kubenswrapper[23041]: I0308 00:53:23.061147 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/309c80e9-6a3a-45cb-93c9-216d39c74f61-scripts\") pod \"309c80e9-6a3a-45cb-93c9-216d39c74f61\" (UID: \"309c80e9-6a3a-45cb-93c9-216d39c74f61\") " Mar 08 00:53:23.063248 master-0 kubenswrapper[23041]: I0308 00:53:23.061177 23041 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/309c80e9-6a3a-45cb-93c9-216d39c74f61-etc-machine-id\") pod \"309c80e9-6a3a-45cb-93c9-216d39c74f61\" (UID: \"309c80e9-6a3a-45cb-93c9-216d39c74f61\") " Mar 08 00:53:23.072701 master-0 kubenswrapper[23041]: I0308 00:53:23.064050 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/309c80e9-6a3a-45cb-93c9-216d39c74f61-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "309c80e9-6a3a-45cb-93c9-216d39c74f61" (UID: "309c80e9-6a3a-45cb-93c9-216d39c74f61"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:53:23.072701 master-0 kubenswrapper[23041]: I0308 00:53:23.064602 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/309c80e9-6a3a-45cb-93c9-216d39c74f61-kube-api-access-vw2j6" (OuterVolumeSpecName: "kube-api-access-vw2j6") pod "309c80e9-6a3a-45cb-93c9-216d39c74f61" (UID: "309c80e9-6a3a-45cb-93c9-216d39c74f61"). InnerVolumeSpecName "kube-api-access-vw2j6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:53:23.072701 master-0 kubenswrapper[23041]: I0308 00:53:23.070448 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/309c80e9-6a3a-45cb-93c9-216d39c74f61-scripts" (OuterVolumeSpecName: "scripts") pod "309c80e9-6a3a-45cb-93c9-216d39c74f61" (UID: "309c80e9-6a3a-45cb-93c9-216d39c74f61"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:53:23.107087 master-0 kubenswrapper[23041]: I0308 00:53:23.104698 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/309c80e9-6a3a-45cb-93c9-216d39c74f61-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "309c80e9-6a3a-45cb-93c9-216d39c74f61" (UID: "309c80e9-6a3a-45cb-93c9-216d39c74f61"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:53:23.120319 master-0 kubenswrapper[23041]: I0308 00:53:23.120261 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/309c80e9-6a3a-45cb-93c9-216d39c74f61-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "309c80e9-6a3a-45cb-93c9-216d39c74f61" (UID: "309c80e9-6a3a-45cb-93c9-216d39c74f61"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:53:23.131261 master-0 kubenswrapper[23041]: I0308 00:53:23.130553 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/309c80e9-6a3a-45cb-93c9-216d39c74f61-config-data" (OuterVolumeSpecName: "config-data") pod "309c80e9-6a3a-45cb-93c9-216d39c74f61" (UID: "309c80e9-6a3a-45cb-93c9-216d39c74f61"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:53:23.168087 master-0 kubenswrapper[23041]: I0308 00:53:23.168026 23041 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/309c80e9-6a3a-45cb-93c9-216d39c74f61-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:23.168087 master-0 kubenswrapper[23041]: I0308 00:53:23.168078 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vw2j6\" (UniqueName: \"kubernetes.io/projected/309c80e9-6a3a-45cb-93c9-216d39c74f61-kube-api-access-vw2j6\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:23.168087 master-0 kubenswrapper[23041]: I0308 00:53:23.168092 23041 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/309c80e9-6a3a-45cb-93c9-216d39c74f61-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:23.168419 master-0 kubenswrapper[23041]: I0308 00:53:23.168101 23041 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/309c80e9-6a3a-45cb-93c9-216d39c74f61-db-sync-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:23.168419 master-0 kubenswrapper[23041]: I0308 00:53:23.168109 23041 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/309c80e9-6a3a-45cb-93c9-216d39c74f61-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:23.168419 master-0 kubenswrapper[23041]: I0308 00:53:23.168117 23041 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/309c80e9-6a3a-45cb-93c9-216d39c74f61-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:23.590244 master-0 kubenswrapper[23041]: I0308 00:53:23.589659 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79bd95bbf9-vglm6" 
event={"ID":"9e2f358d-3251-4e23-9126-b33e9f6866c7","Type":"ContainerStarted","Data":"a21b4c32831b3272565d20aaf5e875bb792ed4aa099a256bfe9c22279b1a8854"} Mar 08 00:53:23.590766 master-0 kubenswrapper[23041]: I0308 00:53:23.590257 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-79bd95bbf9-vglm6" Mar 08 00:53:23.594271 master-0 kubenswrapper[23041]: I0308 00:53:23.592564 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-675ba-db-sync-8zxxl" event={"ID":"309c80e9-6a3a-45cb-93c9-216d39c74f61","Type":"ContainerDied","Data":"8badf9b3e6d5150056109623fee8844f39470c396428a26e1e11d10da87b1768"} Mar 08 00:53:23.594271 master-0 kubenswrapper[23041]: I0308 00:53:23.592604 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-675ba-db-sync-8zxxl" Mar 08 00:53:23.594271 master-0 kubenswrapper[23041]: I0308 00:53:23.592609 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8badf9b3e6d5150056109623fee8844f39470c396428a26e1e11d10da87b1768" Mar 08 00:53:23.627883 master-0 kubenswrapper[23041]: I0308 00:53:23.625590 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-79bd95bbf9-vglm6" podStartSLOduration=2.625565977 podStartE2EDuration="2.625565977s" podCreationTimestamp="2026-03-08 00:53:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:53:23.618377042 +0000 UTC m=+1309.091213616" watchObservedRunningTime="2026-03-08 00:53:23.625565977 +0000 UTC m=+1309.098402531" Mar 08 00:53:23.995447 master-0 kubenswrapper[23041]: I0308 00:53:23.992999 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-675ba-scheduler-0"] Mar 08 00:53:23.995447 master-0 kubenswrapper[23041]: E0308 00:53:23.993682 23041 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="309c80e9-6a3a-45cb-93c9-216d39c74f61" containerName="cinder-675ba-db-sync" Mar 08 00:53:23.995447 master-0 kubenswrapper[23041]: I0308 00:53:23.993697 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="309c80e9-6a3a-45cb-93c9-216d39c74f61" containerName="cinder-675ba-db-sync" Mar 08 00:53:23.995447 master-0 kubenswrapper[23041]: I0308 00:53:23.993932 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="309c80e9-6a3a-45cb-93c9-216d39c74f61" containerName="cinder-675ba-db-sync" Mar 08 00:53:24.000557 master-0 kubenswrapper[23041]: I0308 00:53:23.999529 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:24.004646 master-0 kubenswrapper[23041]: I0308 00:53:24.004521 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-675ba-scripts" Mar 08 00:53:24.004646 master-0 kubenswrapper[23041]: I0308 00:53:24.004544 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-675ba-scheduler-config-data" Mar 08 00:53:24.004850 master-0 kubenswrapper[23041]: I0308 00:53:24.004792 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-675ba-config-data" Mar 08 00:53:24.014326 master-0 kubenswrapper[23041]: I0308 00:53:24.014268 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-675ba-scheduler-0"] Mar 08 00:53:24.056337 master-0 kubenswrapper[23041]: I0308 00:53:24.047777 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-675ba-volume-lvm-iscsi-0"] Mar 08 00:53:24.076232 master-0 kubenswrapper[23041]: I0308 00:53:24.070899 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.086757 master-0 kubenswrapper[23041]: I0308 00:53:24.086151 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-675ba-volume-lvm-iscsi-config-data" Mar 08 00:53:24.106065 master-0 kubenswrapper[23041]: I0308 00:53:24.098517 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4aef1ca-0703-4433-84e6-a926cea94033-config-data\") pod \"cinder-675ba-scheduler-0\" (UID: \"a4aef1ca-0703-4433-84e6-a926cea94033\") " pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:24.106065 master-0 kubenswrapper[23041]: I0308 00:53:24.098581 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a4aef1ca-0703-4433-84e6-a926cea94033-etc-machine-id\") pod \"cinder-675ba-scheduler-0\" (UID: \"a4aef1ca-0703-4433-84e6-a926cea94033\") " pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:24.106065 master-0 kubenswrapper[23041]: I0308 00:53:24.098663 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4aef1ca-0703-4433-84e6-a926cea94033-config-data-custom\") pod \"cinder-675ba-scheduler-0\" (UID: \"a4aef1ca-0703-4433-84e6-a926cea94033\") " pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:24.106065 master-0 kubenswrapper[23041]: I0308 00:53:24.098718 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4aef1ca-0703-4433-84e6-a926cea94033-combined-ca-bundle\") pod \"cinder-675ba-scheduler-0\" (UID: \"a4aef1ca-0703-4433-84e6-a926cea94033\") " pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:24.106065 master-0 kubenswrapper[23041]: I0308 
00:53:24.098735 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4aef1ca-0703-4433-84e6-a926cea94033-scripts\") pod \"cinder-675ba-scheduler-0\" (UID: \"a4aef1ca-0703-4433-84e6-a926cea94033\") " pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:24.106065 master-0 kubenswrapper[23041]: I0308 00:53:24.098793 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhg2r\" (UniqueName: \"kubernetes.io/projected/a4aef1ca-0703-4433-84e6-a926cea94033-kube-api-access-jhg2r\") pod \"cinder-675ba-scheduler-0\" (UID: \"a4aef1ca-0703-4433-84e6-a926cea94033\") " pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:24.264312 master-0 kubenswrapper[23041]: I0308 00:53:24.262139 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhg2r\" (UniqueName: \"kubernetes.io/projected/a4aef1ca-0703-4433-84e6-a926cea94033-kube-api-access-jhg2r\") pod \"cinder-675ba-scheduler-0\" (UID: \"a4aef1ca-0703-4433-84e6-a926cea94033\") " pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:24.264312 master-0 kubenswrapper[23041]: I0308 00:53:24.262437 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4aef1ca-0703-4433-84e6-a926cea94033-config-data\") pod \"cinder-675ba-scheduler-0\" (UID: \"a4aef1ca-0703-4433-84e6-a926cea94033\") " pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:24.264312 master-0 kubenswrapper[23041]: I0308 00:53:24.262467 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a4aef1ca-0703-4433-84e6-a926cea94033-etc-machine-id\") pod \"cinder-675ba-scheduler-0\" (UID: \"a4aef1ca-0703-4433-84e6-a926cea94033\") " pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:24.264312 master-0 
kubenswrapper[23041]: I0308 00:53:24.262582 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4aef1ca-0703-4433-84e6-a926cea94033-config-data-custom\") pod \"cinder-675ba-scheduler-0\" (UID: \"a4aef1ca-0703-4433-84e6-a926cea94033\") " pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:24.264312 master-0 kubenswrapper[23041]: I0308 00:53:24.262678 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a4aef1ca-0703-4433-84e6-a926cea94033-etc-machine-id\") pod \"cinder-675ba-scheduler-0\" (UID: \"a4aef1ca-0703-4433-84e6-a926cea94033\") " pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:24.264312 master-0 kubenswrapper[23041]: I0308 00:53:24.262709 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4aef1ca-0703-4433-84e6-a926cea94033-combined-ca-bundle\") pod \"cinder-675ba-scheduler-0\" (UID: \"a4aef1ca-0703-4433-84e6-a926cea94033\") " pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:24.264312 master-0 kubenswrapper[23041]: I0308 00:53:24.262726 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4aef1ca-0703-4433-84e6-a926cea94033-scripts\") pod \"cinder-675ba-scheduler-0\" (UID: \"a4aef1ca-0703-4433-84e6-a926cea94033\") " pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:24.265592 master-0 kubenswrapper[23041]: I0308 00:53:24.265561 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4aef1ca-0703-4433-84e6-a926cea94033-scripts\") pod \"cinder-675ba-scheduler-0\" (UID: \"a4aef1ca-0703-4433-84e6-a926cea94033\") " pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:24.272136 master-0 kubenswrapper[23041]: I0308 00:53:24.272095 23041 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4aef1ca-0703-4433-84e6-a926cea94033-combined-ca-bundle\") pod \"cinder-675ba-scheduler-0\" (UID: \"a4aef1ca-0703-4433-84e6-a926cea94033\") " pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:24.274774 master-0 kubenswrapper[23041]: I0308 00:53:24.274735 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4aef1ca-0703-4433-84e6-a926cea94033-config-data-custom\") pod \"cinder-675ba-scheduler-0\" (UID: \"a4aef1ca-0703-4433-84e6-a926cea94033\") " pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:24.275859 master-0 kubenswrapper[23041]: I0308 00:53:24.275751 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-675ba-volume-lvm-iscsi-0"] Mar 08 00:53:24.277090 master-0 kubenswrapper[23041]: I0308 00:53:24.277063 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4aef1ca-0703-4433-84e6-a926cea94033-config-data\") pod \"cinder-675ba-scheduler-0\" (UID: \"a4aef1ca-0703-4433-84e6-a926cea94033\") " pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:24.284610 master-0 kubenswrapper[23041]: I0308 00:53:24.284561 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhg2r\" (UniqueName: \"kubernetes.io/projected/a4aef1ca-0703-4433-84e6-a926cea94033-kube-api-access-jhg2r\") pod \"cinder-675ba-scheduler-0\" (UID: \"a4aef1ca-0703-4433-84e6-a926cea94033\") " pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:24.311257 master-0 kubenswrapper[23041]: I0308 00:53:24.311166 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-675ba-backup-0"] Mar 08 00:53:24.348412 master-0 kubenswrapper[23041]: I0308 00:53:24.348103 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:24.348623 master-0 kubenswrapper[23041]: I0308 00:53:24.348451 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-675ba-backup-0"] Mar 08 00:53:24.350607 master-0 kubenswrapper[23041]: I0308 00:53:24.350427 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.357249 master-0 kubenswrapper[23041]: I0308 00:53:24.356663 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-675ba-backup-config-data" Mar 08 00:53:24.362452 master-0 kubenswrapper[23041]: I0308 00:53:24.362240 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6456885d89-8nk8d"] Mar 08 00:53:24.362637 master-0 kubenswrapper[23041]: I0308 00:53:24.362509 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6456885d89-8nk8d" podUID="230fb7a5-d46d-4b2e-b0c4-4d3f998564f4" containerName="dnsmasq-dns" containerID="cri-o://450bb6f5a74724b5126661c98fd0b4b100ff69a645bbd83ee463ca14ee851886" gracePeriod=10 Mar 08 00:53:24.363871 master-0 kubenswrapper[23041]: I0308 00:53:24.363847 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6456885d89-8nk8d" Mar 08 00:53:24.365289 master-0 kubenswrapper[23041]: I0308 00:53:24.365253 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-scripts\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.365363 master-0 kubenswrapper[23041]: I0308 00:53:24.365289 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-config-data-custom\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.365363 master-0 kubenswrapper[23041]: I0308 00:53:24.365330 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-lib-modules\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.365363 master-0 kubenswrapper[23041]: I0308 00:53:24.365351 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-config-data\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.365506 master-0 kubenswrapper[23041]: I0308 00:53:24.365366 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-etc-machine-id\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.365506 master-0 kubenswrapper[23041]: I0308 00:53:24.365422 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h44kl\" (UniqueName: \"kubernetes.io/projected/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-kube-api-access-h44kl\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.365506 master-0 kubenswrapper[23041]: I0308 
00:53:24.365472 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-dev\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.365600 master-0 kubenswrapper[23041]: I0308 00:53:24.365507 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-var-locks-brick\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.365600 master-0 kubenswrapper[23041]: I0308 00:53:24.365555 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-sys\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.365600 master-0 kubenswrapper[23041]: I0308 00:53:24.365584 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-etc-nvme\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.365689 master-0 kubenswrapper[23041]: I0308 00:53:24.365607 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-etc-iscsi\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " 
pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.365689 master-0 kubenswrapper[23041]: I0308 00:53:24.365652 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-var-locks-cinder\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.365689 master-0 kubenswrapper[23041]: I0308 00:53:24.365678 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-var-lib-cinder\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.365796 master-0 kubenswrapper[23041]: I0308 00:53:24.365694 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-run\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.365796 master-0 kubenswrapper[23041]: I0308 00:53:24.365714 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-combined-ca-bundle\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.474438 master-0 kubenswrapper[23041]: I0308 00:53:24.471614 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-lib-modules\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.474438 master-0 kubenswrapper[23041]: I0308 00:53:24.471669 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-config-data\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.474438 master-0 kubenswrapper[23041]: I0308 00:53:24.471687 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-etc-machine-id\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.474438 master-0 kubenswrapper[23041]: I0308 00:53:24.471723 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-etc-machine-id\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.474438 master-0 kubenswrapper[23041]: I0308 00:53:24.471743 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h44kl\" (UniqueName: \"kubernetes.io/projected/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-kube-api-access-h44kl\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.474438 master-0 kubenswrapper[23041]: I0308 00:53:24.471766 23041 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-etc-iscsi\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.474438 master-0 kubenswrapper[23041]: I0308 00:53:24.471792 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-var-locks-brick\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.474438 master-0 kubenswrapper[23041]: I0308 00:53:24.471816 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-dev\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.474438 master-0 kubenswrapper[23041]: I0308 00:53:24.471843 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-lib-modules\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.474438 master-0 kubenswrapper[23041]: I0308 00:53:24.471862 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-var-locks-brick\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.474438 master-0 kubenswrapper[23041]: I0308 00:53:24.471888 23041 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-run\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.474438 master-0 kubenswrapper[23041]: I0308 00:53:24.471917 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/854d6a39-df63-4aa0-85db-c8cd640dad73-scripts\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.474438 master-0 kubenswrapper[23041]: I0308 00:53:24.471934 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-sys\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.474438 master-0 kubenswrapper[23041]: I0308 00:53:24.471960 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-etc-nvme\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.474438 master-0 kubenswrapper[23041]: I0308 00:53:24.471983 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-var-locks-cinder\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.474438 master-0 kubenswrapper[23041]: I0308 00:53:24.472002 23041 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/854d6a39-df63-4aa0-85db-c8cd640dad73-combined-ca-bundle\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.474438 master-0 kubenswrapper[23041]: I0308 00:53:24.472019 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-etc-iscsi\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.474438 master-0 kubenswrapper[23041]: I0308 00:53:24.472041 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-dev\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.474438 master-0 kubenswrapper[23041]: I0308 00:53:24.472077 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-var-lib-cinder\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.474438 master-0 kubenswrapper[23041]: I0308 00:53:24.472106 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-var-locks-cinder\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.474438 master-0 kubenswrapper[23041]: I0308 00:53:24.472129 23041 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kdw7\" (UniqueName: \"kubernetes.io/projected/854d6a39-df63-4aa0-85db-c8cd640dad73-kube-api-access-6kdw7\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.474438 master-0 kubenswrapper[23041]: I0308 00:53:24.472159 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-var-lib-cinder\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.474438 master-0 kubenswrapper[23041]: I0308 00:53:24.472179 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-run\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.474438 master-0 kubenswrapper[23041]: I0308 00:53:24.472225 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/854d6a39-df63-4aa0-85db-c8cd640dad73-config-data-custom\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.474438 master-0 kubenswrapper[23041]: I0308 00:53:24.472245 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-combined-ca-bundle\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.474438 master-0 
kubenswrapper[23041]: I0308 00:53:24.472267 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-sys\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.474438 master-0 kubenswrapper[23041]: I0308 00:53:24.472299 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-scripts\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.474438 master-0 kubenswrapper[23041]: I0308 00:53:24.472317 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-config-data-custom\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.474438 master-0 kubenswrapper[23041]: I0308 00:53:24.472347 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/854d6a39-df63-4aa0-85db-c8cd640dad73-config-data\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.474438 master-0 kubenswrapper[23041]: I0308 00:53:24.472365 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-etc-nvme\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.474438 master-0 
kubenswrapper[23041]: I0308 00:53:24.472460 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-lib-modules\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.474438 master-0 kubenswrapper[23041]: I0308 00:53:24.473138 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-etc-machine-id\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.474438 master-0 kubenswrapper[23041]: I0308 00:53:24.473504 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-dev\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.474438 master-0 kubenswrapper[23041]: I0308 00:53:24.473716 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-var-locks-brick\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.474438 master-0 kubenswrapper[23041]: I0308 00:53:24.473754 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-sys\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.474438 master-0 kubenswrapper[23041]: I0308 
00:53:24.473817 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-etc-nvme\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.474438 master-0 kubenswrapper[23041]: I0308 00:53:24.473852 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-run\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.474438 master-0 kubenswrapper[23041]: I0308 00:53:24.473875 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-etc-iscsi\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.474438 master-0 kubenswrapper[23041]: I0308 00:53:24.473950 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-var-locks-cinder\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.474438 master-0 kubenswrapper[23041]: I0308 00:53:24.474045 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-var-lib-cinder\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.488923 master-0 kubenswrapper[23041]: I0308 00:53:24.478696 23041 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-config-data\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.488923 master-0 kubenswrapper[23041]: I0308 00:53:24.483561 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-config-data-custom\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.488923 master-0 kubenswrapper[23041]: I0308 00:53:24.483632 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78fdb4cf6c-nxlpt"] Mar 08 00:53:24.488923 master-0 kubenswrapper[23041]: I0308 00:53:24.485443 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78fdb4cf6c-nxlpt" Mar 08 00:53:24.488923 master-0 kubenswrapper[23041]: I0308 00:53:24.488865 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-combined-ca-bundle\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.495343 master-0 kubenswrapper[23041]: I0308 00:53:24.495276 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78fdb4cf6c-nxlpt"] Mar 08 00:53:24.502130 master-0 kubenswrapper[23041]: I0308 00:53:24.501779 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h44kl\" (UniqueName: \"kubernetes.io/projected/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-kube-api-access-h44kl\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.508456 master-0 kubenswrapper[23041]: I0308 00:53:24.506978 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-675ba-api-0"] Mar 08 00:53:24.514224 master-0 kubenswrapper[23041]: I0308 00:53:24.509005 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-675ba-api-0" Mar 08 00:53:24.523296 master-0 kubenswrapper[23041]: I0308 00:53:24.515456 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-675ba-api-config-data" Mar 08 00:53:24.524272 master-0 kubenswrapper[23041]: I0308 00:53:24.524101 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-scripts\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:24.538455 master-0 kubenswrapper[23041]: I0308 00:53:24.533472 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-675ba-api-0"] Mar 08 00:53:24.576796 master-0 kubenswrapper[23041]: I0308 00:53:24.575795 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-sys\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.576796 master-0 kubenswrapper[23041]: I0308 00:53:24.575858 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e016b3b-b18e-428e-8dd9-88c2e106d04e-config\") pod \"dnsmasq-dns-78fdb4cf6c-nxlpt\" (UID: \"0e016b3b-b18e-428e-8dd9-88c2e106d04e\") " pod="openstack/dnsmasq-dns-78fdb4cf6c-nxlpt" Mar 08 00:53:24.576796 master-0 kubenswrapper[23041]: I0308 00:53:24.575908 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/854d6a39-df63-4aa0-85db-c8cd640dad73-config-data\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.576796 master-0 
kubenswrapper[23041]: I0308 00:53:24.575929 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-etc-nvme\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.576796 master-0 kubenswrapper[23041]: I0308 00:53:24.575968 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-etc-machine-id\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.576796 master-0 kubenswrapper[23041]: I0308 00:53:24.575991 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-etc-iscsi\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.576796 master-0 kubenswrapper[23041]: I0308 00:53:24.576017 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-var-locks-brick\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.576796 master-0 kubenswrapper[23041]: I0308 00:53:24.576050 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-lib-modules\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.576796 master-0 kubenswrapper[23041]: I0308 00:53:24.576070 23041 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j6pm\" (UniqueName: \"kubernetes.io/projected/0e016b3b-b18e-428e-8dd9-88c2e106d04e-kube-api-access-4j6pm\") pod \"dnsmasq-dns-78fdb4cf6c-nxlpt\" (UID: \"0e016b3b-b18e-428e-8dd9-88c2e106d04e\") " pod="openstack/dnsmasq-dns-78fdb4cf6c-nxlpt" Mar 08 00:53:24.576796 master-0 kubenswrapper[23041]: I0308 00:53:24.576096 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0e016b3b-b18e-428e-8dd9-88c2e106d04e-dns-svc\") pod \"dnsmasq-dns-78fdb4cf6c-nxlpt\" (UID: \"0e016b3b-b18e-428e-8dd9-88c2e106d04e\") " pod="openstack/dnsmasq-dns-78fdb4cf6c-nxlpt" Mar 08 00:53:24.576796 master-0 kubenswrapper[23041]: I0308 00:53:24.576115 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0e016b3b-b18e-428e-8dd9-88c2e106d04e-dns-swift-storage-0\") pod \"dnsmasq-dns-78fdb4cf6c-nxlpt\" (UID: \"0e016b3b-b18e-428e-8dd9-88c2e106d04e\") " pod="openstack/dnsmasq-dns-78fdb4cf6c-nxlpt" Mar 08 00:53:24.576796 master-0 kubenswrapper[23041]: I0308 00:53:24.576136 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-run\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.576796 master-0 kubenswrapper[23041]: I0308 00:53:24.576159 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/854d6a39-df63-4aa0-85db-c8cd640dad73-scripts\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.576796 master-0 kubenswrapper[23041]: I0308 00:53:24.576185 23041 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0e016b3b-b18e-428e-8dd9-88c2e106d04e-ovsdbserver-sb\") pod \"dnsmasq-dns-78fdb4cf6c-nxlpt\" (UID: \"0e016b3b-b18e-428e-8dd9-88c2e106d04e\") " pod="openstack/dnsmasq-dns-78fdb4cf6c-nxlpt" Mar 08 00:53:24.576796 master-0 kubenswrapper[23041]: I0308 00:53:24.576474 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-lib-modules\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.576796 master-0 kubenswrapper[23041]: I0308 00:53:24.576500 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-var-locks-brick\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.576796 master-0 kubenswrapper[23041]: I0308 00:53:24.576572 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-etc-iscsi\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.576796 master-0 kubenswrapper[23041]: I0308 00:53:24.576600 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-sys\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.576796 master-0 kubenswrapper[23041]: I0308 00:53:24.576640 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" 
(UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-etc-nvme\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.578239 master-0 kubenswrapper[23041]: I0308 00:53:24.577570 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-etc-machine-id\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.578239 master-0 kubenswrapper[23041]: I0308 00:53:24.577621 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-run\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.578239 master-0 kubenswrapper[23041]: I0308 00:53:24.577650 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-var-locks-cinder\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.578239 master-0 kubenswrapper[23041]: I0308 00:53:24.577676 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/854d6a39-df63-4aa0-85db-c8cd640dad73-combined-ca-bundle\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.578239 master-0 kubenswrapper[23041]: I0308 00:53:24.577712 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-dev\") pod 
\"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.578239 master-0 kubenswrapper[23041]: I0308 00:53:24.577739 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-var-lib-cinder\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.578239 master-0 kubenswrapper[23041]: I0308 00:53:24.577772 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kdw7\" (UniqueName: \"kubernetes.io/projected/854d6a39-df63-4aa0-85db-c8cd640dad73-kube-api-access-6kdw7\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.578239 master-0 kubenswrapper[23041]: I0308 00:53:24.577808 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0e016b3b-b18e-428e-8dd9-88c2e106d04e-ovsdbserver-nb\") pod \"dnsmasq-dns-78fdb4cf6c-nxlpt\" (UID: \"0e016b3b-b18e-428e-8dd9-88c2e106d04e\") " pod="openstack/dnsmasq-dns-78fdb4cf6c-nxlpt" Mar 08 00:53:24.578239 master-0 kubenswrapper[23041]: I0308 00:53:24.577831 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/854d6a39-df63-4aa0-85db-c8cd640dad73-config-data-custom\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.578573 master-0 kubenswrapper[23041]: I0308 00:53:24.578322 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-dev\") pod \"cinder-675ba-backup-0\" (UID: 
\"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.578573 master-0 kubenswrapper[23041]: I0308 00:53:24.578366 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-var-locks-cinder\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.585378 master-0 kubenswrapper[23041]: I0308 00:53:24.578852 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-var-lib-cinder\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.585378 master-0 kubenswrapper[23041]: I0308 00:53:24.581316 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/854d6a39-df63-4aa0-85db-c8cd640dad73-config-data\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.585569 master-0 kubenswrapper[23041]: I0308 00:53:24.585497 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/854d6a39-df63-4aa0-85db-c8cd640dad73-combined-ca-bundle\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:24.592223 master-0 kubenswrapper[23041]: I0308 00:53:24.585976 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/854d6a39-df63-4aa0-85db-c8cd640dad73-config-data-custom\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0" Mar 08 
00:53:24.592223 master-0 kubenswrapper[23041]: I0308 00:53:24.586997 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/854d6a39-df63-4aa0-85db-c8cd640dad73-scripts\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0"
Mar 08 00:53:24.609253 master-0 kubenswrapper[23041]: I0308 00:53:24.602400 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kdw7\" (UniqueName: \"kubernetes.io/projected/854d6a39-df63-4aa0-85db-c8cd640dad73-kube-api-access-6kdw7\") pod \"cinder-675ba-backup-0\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") " pod="openstack/cinder-675ba-backup-0"
Mar 08 00:53:24.661801 master-0 kubenswrapper[23041]: I0308 00:53:24.660622 23041 generic.go:334] "Generic (PLEG): container finished" podID="230fb7a5-d46d-4b2e-b0c4-4d3f998564f4" containerID="450bb6f5a74724b5126661c98fd0b4b100ff69a645bbd83ee463ca14ee851886" exitCode=0
Mar 08 00:53:24.661801 master-0 kubenswrapper[23041]: I0308 00:53:24.660705 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6456885d89-8nk8d" event={"ID":"230fb7a5-d46d-4b2e-b0c4-4d3f998564f4","Type":"ContainerDied","Data":"450bb6f5a74724b5126661c98fd0b4b100ff69a645bbd83ee463ca14ee851886"}
Mar 08 00:53:24.685082 master-0 kubenswrapper[23041]: I0308 00:53:24.680191 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgkm9\" (UniqueName: \"kubernetes.io/projected/ce6c3b7e-5ea8-4629-8322-64431d8138c2-kube-api-access-wgkm9\") pod \"cinder-675ba-api-0\" (UID: \"ce6c3b7e-5ea8-4629-8322-64431d8138c2\") " pod="openstack/cinder-675ba-api-0"
Mar 08 00:53:24.685082 master-0 kubenswrapper[23041]: I0308 00:53:24.680294 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0e016b3b-b18e-428e-8dd9-88c2e106d04e-ovsdbserver-nb\") pod \"dnsmasq-dns-78fdb4cf6c-nxlpt\" (UID: \"0e016b3b-b18e-428e-8dd9-88c2e106d04e\") " pod="openstack/dnsmasq-dns-78fdb4cf6c-nxlpt"
Mar 08 00:53:24.685082 master-0 kubenswrapper[23041]: I0308 00:53:24.680347 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e016b3b-b18e-428e-8dd9-88c2e106d04e-config\") pod \"dnsmasq-dns-78fdb4cf6c-nxlpt\" (UID: \"0e016b3b-b18e-428e-8dd9-88c2e106d04e\") " pod="openstack/dnsmasq-dns-78fdb4cf6c-nxlpt"
Mar 08 00:53:24.685082 master-0 kubenswrapper[23041]: I0308 00:53:24.680394 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce6c3b7e-5ea8-4629-8322-64431d8138c2-combined-ca-bundle\") pod \"cinder-675ba-api-0\" (UID: \"ce6c3b7e-5ea8-4629-8322-64431d8138c2\") " pod="openstack/cinder-675ba-api-0"
Mar 08 00:53:24.685082 master-0 kubenswrapper[23041]: I0308 00:53:24.680412 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce6c3b7e-5ea8-4629-8322-64431d8138c2-config-data\") pod \"cinder-675ba-api-0\" (UID: \"ce6c3b7e-5ea8-4629-8322-64431d8138c2\") " pod="openstack/cinder-675ba-api-0"
Mar 08 00:53:24.685082 master-0 kubenswrapper[23041]: I0308 00:53:24.680485 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce6c3b7e-5ea8-4629-8322-64431d8138c2-config-data-custom\") pod \"cinder-675ba-api-0\" (UID: \"ce6c3b7e-5ea8-4629-8322-64431d8138c2\") " pod="openstack/cinder-675ba-api-0"
Mar 08 00:53:24.685082 master-0 kubenswrapper[23041]: I0308 00:53:24.680518 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce6c3b7e-5ea8-4629-8322-64431d8138c2-scripts\") pod \"cinder-675ba-api-0\" (UID: \"ce6c3b7e-5ea8-4629-8322-64431d8138c2\") " pod="openstack/cinder-675ba-api-0"
Mar 08 00:53:24.685082 master-0 kubenswrapper[23041]: I0308 00:53:24.680542 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4j6pm\" (UniqueName: \"kubernetes.io/projected/0e016b3b-b18e-428e-8dd9-88c2e106d04e-kube-api-access-4j6pm\") pod \"dnsmasq-dns-78fdb4cf6c-nxlpt\" (UID: \"0e016b3b-b18e-428e-8dd9-88c2e106d04e\") " pod="openstack/dnsmasq-dns-78fdb4cf6c-nxlpt"
Mar 08 00:53:24.685082 master-0 kubenswrapper[23041]: I0308 00:53:24.680578 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0e016b3b-b18e-428e-8dd9-88c2e106d04e-dns-svc\") pod \"dnsmasq-dns-78fdb4cf6c-nxlpt\" (UID: \"0e016b3b-b18e-428e-8dd9-88c2e106d04e\") " pod="openstack/dnsmasq-dns-78fdb4cf6c-nxlpt"
Mar 08 00:53:24.685082 master-0 kubenswrapper[23041]: I0308 00:53:24.680603 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0e016b3b-b18e-428e-8dd9-88c2e106d04e-dns-swift-storage-0\") pod \"dnsmasq-dns-78fdb4cf6c-nxlpt\" (UID: \"0e016b3b-b18e-428e-8dd9-88c2e106d04e\") " pod="openstack/dnsmasq-dns-78fdb4cf6c-nxlpt"
Mar 08 00:53:24.685082 master-0 kubenswrapper[23041]: I0308 00:53:24.680660 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce6c3b7e-5ea8-4629-8322-64431d8138c2-logs\") pod \"cinder-675ba-api-0\" (UID: \"ce6c3b7e-5ea8-4629-8322-64431d8138c2\") " pod="openstack/cinder-675ba-api-0"
Mar 08 00:53:24.685082 master-0 kubenswrapper[23041]: I0308 00:53:24.680696 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce6c3b7e-5ea8-4629-8322-64431d8138c2-etc-machine-id\") pod \"cinder-675ba-api-0\" (UID: \"ce6c3b7e-5ea8-4629-8322-64431d8138c2\") " pod="openstack/cinder-675ba-api-0"
Mar 08 00:53:24.685082 master-0 kubenswrapper[23041]: I0308 00:53:24.680725 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0e016b3b-b18e-428e-8dd9-88c2e106d04e-ovsdbserver-sb\") pod \"dnsmasq-dns-78fdb4cf6c-nxlpt\" (UID: \"0e016b3b-b18e-428e-8dd9-88c2e106d04e\") " pod="openstack/dnsmasq-dns-78fdb4cf6c-nxlpt"
Mar 08 00:53:24.685082 master-0 kubenswrapper[23041]: I0308 00:53:24.681549 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e016b3b-b18e-428e-8dd9-88c2e106d04e-config\") pod \"dnsmasq-dns-78fdb4cf6c-nxlpt\" (UID: \"0e016b3b-b18e-428e-8dd9-88c2e106d04e\") " pod="openstack/dnsmasq-dns-78fdb4cf6c-nxlpt"
Mar 08 00:53:24.685082 master-0 kubenswrapper[23041]: I0308 00:53:24.682723 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0e016b3b-b18e-428e-8dd9-88c2e106d04e-dns-svc\") pod \"dnsmasq-dns-78fdb4cf6c-nxlpt\" (UID: \"0e016b3b-b18e-428e-8dd9-88c2e106d04e\") " pod="openstack/dnsmasq-dns-78fdb4cf6c-nxlpt"
Mar 08 00:53:24.685082 master-0 kubenswrapper[23041]: I0308 00:53:24.682821 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0e016b3b-b18e-428e-8dd9-88c2e106d04e-ovsdbserver-sb\") pod \"dnsmasq-dns-78fdb4cf6c-nxlpt\" (UID: \"0e016b3b-b18e-428e-8dd9-88c2e106d04e\") " pod="openstack/dnsmasq-dns-78fdb4cf6c-nxlpt"
Mar 08 00:53:24.685082 master-0 kubenswrapper[23041]: I0308 00:53:24.683751 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0e016b3b-b18e-428e-8dd9-88c2e106d04e-dns-swift-storage-0\") pod \"dnsmasq-dns-78fdb4cf6c-nxlpt\" (UID: \"0e016b3b-b18e-428e-8dd9-88c2e106d04e\") " pod="openstack/dnsmasq-dns-78fdb4cf6c-nxlpt"
Mar 08 00:53:24.685082 master-0 kubenswrapper[23041]: I0308 00:53:24.684095 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0e016b3b-b18e-428e-8dd9-88c2e106d04e-ovsdbserver-nb\") pod \"dnsmasq-dns-78fdb4cf6c-nxlpt\" (UID: \"0e016b3b-b18e-428e-8dd9-88c2e106d04e\") " pod="openstack/dnsmasq-dns-78fdb4cf6c-nxlpt"
Mar 08 00:53:24.697283 master-0 kubenswrapper[23041]: I0308 00:53:24.695429 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-675ba-backup-0"
Mar 08 00:53:24.713383 master-0 kubenswrapper[23041]: I0308 00:53:24.709519 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j6pm\" (UniqueName: \"kubernetes.io/projected/0e016b3b-b18e-428e-8dd9-88c2e106d04e-kube-api-access-4j6pm\") pod \"dnsmasq-dns-78fdb4cf6c-nxlpt\" (UID: \"0e016b3b-b18e-428e-8dd9-88c2e106d04e\") " pod="openstack/dnsmasq-dns-78fdb4cf6c-nxlpt"
Mar 08 00:53:24.735913 master-0 kubenswrapper[23041]: I0308 00:53:24.735274 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-675ba-volume-lvm-iscsi-0"
Mar 08 00:53:24.786885 master-0 kubenswrapper[23041]: I0308 00:53:24.786609 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce6c3b7e-5ea8-4629-8322-64431d8138c2-config-data-custom\") pod \"cinder-675ba-api-0\" (UID: \"ce6c3b7e-5ea8-4629-8322-64431d8138c2\") " pod="openstack/cinder-675ba-api-0"
Mar 08 00:53:24.786885 master-0 kubenswrapper[23041]: I0308 00:53:24.786671 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce6c3b7e-5ea8-4629-8322-64431d8138c2-scripts\") pod \"cinder-675ba-api-0\" (UID: \"ce6c3b7e-5ea8-4629-8322-64431d8138c2\") " pod="openstack/cinder-675ba-api-0"
Mar 08 00:53:24.786885 master-0 kubenswrapper[23041]: I0308 00:53:24.786734 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce6c3b7e-5ea8-4629-8322-64431d8138c2-logs\") pod \"cinder-675ba-api-0\" (UID: \"ce6c3b7e-5ea8-4629-8322-64431d8138c2\") " pod="openstack/cinder-675ba-api-0"
Mar 08 00:53:24.786885 master-0 kubenswrapper[23041]: I0308 00:53:24.786759 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce6c3b7e-5ea8-4629-8322-64431d8138c2-etc-machine-id\") pod \"cinder-675ba-api-0\" (UID: \"ce6c3b7e-5ea8-4629-8322-64431d8138c2\") " pod="openstack/cinder-675ba-api-0"
Mar 08 00:53:24.786885 master-0 kubenswrapper[23041]: I0308 00:53:24.786863 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgkm9\" (UniqueName: \"kubernetes.io/projected/ce6c3b7e-5ea8-4629-8322-64431d8138c2-kube-api-access-wgkm9\") pod \"cinder-675ba-api-0\" (UID: \"ce6c3b7e-5ea8-4629-8322-64431d8138c2\") " pod="openstack/cinder-675ba-api-0"
Mar 08 00:53:24.787104 master-0 kubenswrapper[23041]: I0308 00:53:24.786948 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce6c3b7e-5ea8-4629-8322-64431d8138c2-config-data\") pod \"cinder-675ba-api-0\" (UID: \"ce6c3b7e-5ea8-4629-8322-64431d8138c2\") " pod="openstack/cinder-675ba-api-0"
Mar 08 00:53:24.787104 master-0 kubenswrapper[23041]: I0308 00:53:24.786964 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce6c3b7e-5ea8-4629-8322-64431d8138c2-combined-ca-bundle\") pod \"cinder-675ba-api-0\" (UID: \"ce6c3b7e-5ea8-4629-8322-64431d8138c2\") " pod="openstack/cinder-675ba-api-0"
Mar 08 00:53:24.795981 master-0 kubenswrapper[23041]: I0308 00:53:24.790645 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce6c3b7e-5ea8-4629-8322-64431d8138c2-etc-machine-id\") pod \"cinder-675ba-api-0\" (UID: \"ce6c3b7e-5ea8-4629-8322-64431d8138c2\") " pod="openstack/cinder-675ba-api-0"
Mar 08 00:53:24.795981 master-0 kubenswrapper[23041]: I0308 00:53:24.791075 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce6c3b7e-5ea8-4629-8322-64431d8138c2-logs\") pod \"cinder-675ba-api-0\" (UID: \"ce6c3b7e-5ea8-4629-8322-64431d8138c2\") " pod="openstack/cinder-675ba-api-0"
Mar 08 00:53:24.799792 master-0 kubenswrapper[23041]: I0308 00:53:24.799435 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce6c3b7e-5ea8-4629-8322-64431d8138c2-config-data-custom\") pod \"cinder-675ba-api-0\" (UID: \"ce6c3b7e-5ea8-4629-8322-64431d8138c2\") " pod="openstack/cinder-675ba-api-0"
Mar 08 00:53:24.806874 master-0 kubenswrapper[23041]: I0308 00:53:24.803364 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce6c3b7e-5ea8-4629-8322-64431d8138c2-config-data\") pod \"cinder-675ba-api-0\" (UID: \"ce6c3b7e-5ea8-4629-8322-64431d8138c2\") " pod="openstack/cinder-675ba-api-0"
Mar 08 00:53:24.811217 master-0 kubenswrapper[23041]: I0308 00:53:24.811157 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgkm9\" (UniqueName: \"kubernetes.io/projected/ce6c3b7e-5ea8-4629-8322-64431d8138c2-kube-api-access-wgkm9\") pod \"cinder-675ba-api-0\" (UID: \"ce6c3b7e-5ea8-4629-8322-64431d8138c2\") " pod="openstack/cinder-675ba-api-0"
Mar 08 00:53:24.814690 master-0 kubenswrapper[23041]: I0308 00:53:24.813731 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce6c3b7e-5ea8-4629-8322-64431d8138c2-combined-ca-bundle\") pod \"cinder-675ba-api-0\" (UID: \"ce6c3b7e-5ea8-4629-8322-64431d8138c2\") " pod="openstack/cinder-675ba-api-0"
Mar 08 00:53:24.814690 master-0 kubenswrapper[23041]: I0308 00:53:24.814631 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce6c3b7e-5ea8-4629-8322-64431d8138c2-scripts\") pod \"cinder-675ba-api-0\" (UID: \"ce6c3b7e-5ea8-4629-8322-64431d8138c2\") " pod="openstack/cinder-675ba-api-0"
Mar 08 00:53:24.868506 master-0 kubenswrapper[23041]: I0308 00:53:24.856445 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-675ba-api-0"
Mar 08 00:53:24.988573 master-0 kubenswrapper[23041]: I0308 00:53:24.987153 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-675ba-scheduler-0"]
Mar 08 00:53:25.011290 master-0 kubenswrapper[23041]: I0308 00:53:25.010701 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78fdb4cf6c-nxlpt"
Mar 08 00:53:25.127106 master-0 kubenswrapper[23041]: I0308 00:53:25.126662 23041 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 08 00:53:25.423437 master-0 kubenswrapper[23041]: I0308 00:53:25.421376 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6456885d89-8nk8d"
Mar 08 00:53:25.549969 master-0 kubenswrapper[23041]: I0308 00:53:25.549613 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/230fb7a5-d46d-4b2e-b0c4-4d3f998564f4-ovsdbserver-sb\") pod \"230fb7a5-d46d-4b2e-b0c4-4d3f998564f4\" (UID: \"230fb7a5-d46d-4b2e-b0c4-4d3f998564f4\") "
Mar 08 00:53:25.549969 master-0 kubenswrapper[23041]: I0308 00:53:25.549718 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/230fb7a5-d46d-4b2e-b0c4-4d3f998564f4-dns-swift-storage-0\") pod \"230fb7a5-d46d-4b2e-b0c4-4d3f998564f4\" (UID: \"230fb7a5-d46d-4b2e-b0c4-4d3f998564f4\") "
Mar 08 00:53:25.549969 master-0 kubenswrapper[23041]: I0308 00:53:25.549813 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/230fb7a5-d46d-4b2e-b0c4-4d3f998564f4-dns-svc\") pod \"230fb7a5-d46d-4b2e-b0c4-4d3f998564f4\" (UID: \"230fb7a5-d46d-4b2e-b0c4-4d3f998564f4\") "
Mar 08 00:53:25.549969 master-0 kubenswrapper[23041]: I0308 00:53:25.549909 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzp8h\" (UniqueName: \"kubernetes.io/projected/230fb7a5-d46d-4b2e-b0c4-4d3f998564f4-kube-api-access-rzp8h\") pod \"230fb7a5-d46d-4b2e-b0c4-4d3f998564f4\" (UID: \"230fb7a5-d46d-4b2e-b0c4-4d3f998564f4\") "
Mar 08 00:53:25.549969 master-0 kubenswrapper[23041]: I0308 00:53:25.549943 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/230fb7a5-d46d-4b2e-b0c4-4d3f998564f4-config\") pod \"230fb7a5-d46d-4b2e-b0c4-4d3f998564f4\" (UID: \"230fb7a5-d46d-4b2e-b0c4-4d3f998564f4\") "
Mar 08 00:53:25.550185 master-0 kubenswrapper[23041]: I0308 00:53:25.549993 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/230fb7a5-d46d-4b2e-b0c4-4d3f998564f4-ovsdbserver-nb\") pod \"230fb7a5-d46d-4b2e-b0c4-4d3f998564f4\" (UID: \"230fb7a5-d46d-4b2e-b0c4-4d3f998564f4\") "
Mar 08 00:53:25.566998 master-0 kubenswrapper[23041]: I0308 00:53:25.566581 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-675ba-volume-lvm-iscsi-0"]
Mar 08 00:53:25.566998 master-0 kubenswrapper[23041]: I0308 00:53:25.566802 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/230fb7a5-d46d-4b2e-b0c4-4d3f998564f4-kube-api-access-rzp8h" (OuterVolumeSpecName: "kube-api-access-rzp8h") pod "230fb7a5-d46d-4b2e-b0c4-4d3f998564f4" (UID: "230fb7a5-d46d-4b2e-b0c4-4d3f998564f4"). InnerVolumeSpecName "kube-api-access-rzp8h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:53:25.617939 master-0 kubenswrapper[23041]: I0308 00:53:25.617882 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/230fb7a5-d46d-4b2e-b0c4-4d3f998564f4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "230fb7a5-d46d-4b2e-b0c4-4d3f998564f4" (UID: "230fb7a5-d46d-4b2e-b0c4-4d3f998564f4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:53:25.630188 master-0 kubenswrapper[23041]: I0308 00:53:25.629300 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/230fb7a5-d46d-4b2e-b0c4-4d3f998564f4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "230fb7a5-d46d-4b2e-b0c4-4d3f998564f4" (UID: "230fb7a5-d46d-4b2e-b0c4-4d3f998564f4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:53:25.632934 master-0 kubenswrapper[23041]: I0308 00:53:25.632184 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/230fb7a5-d46d-4b2e-b0c4-4d3f998564f4-config" (OuterVolumeSpecName: "config") pod "230fb7a5-d46d-4b2e-b0c4-4d3f998564f4" (UID: "230fb7a5-d46d-4b2e-b0c4-4d3f998564f4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:53:25.637225 master-0 kubenswrapper[23041]: I0308 00:53:25.636269 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/230fb7a5-d46d-4b2e-b0c4-4d3f998564f4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "230fb7a5-d46d-4b2e-b0c4-4d3f998564f4" (UID: "230fb7a5-d46d-4b2e-b0c4-4d3f998564f4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:53:25.637225 master-0 kubenswrapper[23041]: I0308 00:53:25.636774 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/230fb7a5-d46d-4b2e-b0c4-4d3f998564f4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "230fb7a5-d46d-4b2e-b0c4-4d3f998564f4" (UID: "230fb7a5-d46d-4b2e-b0c4-4d3f998564f4"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:53:25.653623 master-0 kubenswrapper[23041]: I0308 00:53:25.652731 23041 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/230fb7a5-d46d-4b2e-b0c4-4d3f998564f4-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:25.653623 master-0 kubenswrapper[23041]: I0308 00:53:25.652771 23041 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/230fb7a5-d46d-4b2e-b0c4-4d3f998564f4-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:25.653623 master-0 kubenswrapper[23041]: I0308 00:53:25.652784 23041 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/230fb7a5-d46d-4b2e-b0c4-4d3f998564f4-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:25.653623 master-0 kubenswrapper[23041]: I0308 00:53:25.652796 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzp8h\" (UniqueName: \"kubernetes.io/projected/230fb7a5-d46d-4b2e-b0c4-4d3f998564f4-kube-api-access-rzp8h\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:25.653623 master-0 kubenswrapper[23041]: I0308 00:53:25.652806 23041 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/230fb7a5-d46d-4b2e-b0c4-4d3f998564f4-config\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:25.653623 master-0 kubenswrapper[23041]: I0308 00:53:25.652814 23041 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/230fb7a5-d46d-4b2e-b0c4-4d3f998564f4-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:25.666001 master-0 kubenswrapper[23041]: I0308 00:53:25.665228 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-675ba-backup-0"]
Mar 08 00:53:25.683650 master-0 kubenswrapper[23041]: W0308 00:53:25.682773 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod854d6a39_df63_4aa0_85db_c8cd640dad73.slice/crio-68e35b56d047967e1039717426e363235c97b2d3e6a153bd2c625504ad520dac WatchSource:0}: Error finding container 68e35b56d047967e1039717426e363235c97b2d3e6a153bd2c625504ad520dac: Status 404 returned error can't find the container with id 68e35b56d047967e1039717426e363235c97b2d3e6a153bd2c625504ad520dac
Mar 08 00:53:25.705808 master-0 kubenswrapper[23041]: I0308 00:53:25.705576 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-675ba-volume-lvm-iscsi-0" event={"ID":"4c3dcf6e-e826-483b-ae9c-465cb6d2d326","Type":"ContainerStarted","Data":"5e97e5c6fe4d443233c9b82654b811d8075c5bdd4d4ea6d58c7a1097085facf6"}
Mar 08 00:53:25.708980 master-0 kubenswrapper[23041]: I0308 00:53:25.708777 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6456885d89-8nk8d" event={"ID":"230fb7a5-d46d-4b2e-b0c4-4d3f998564f4","Type":"ContainerDied","Data":"1f963be9d0823e1f4411cc3cb7931975ab6b5178c0f3af05e672949fe4f3f509"}
Mar 08 00:53:25.708980 master-0 kubenswrapper[23041]: I0308 00:53:25.708831 23041 scope.go:117] "RemoveContainer" containerID="450bb6f5a74724b5126661c98fd0b4b100ff69a645bbd83ee463ca14ee851886"
Mar 08 00:53:25.709086 master-0 kubenswrapper[23041]: I0308 00:53:25.708994 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6456885d89-8nk8d"
Mar 08 00:53:25.750813 master-0 kubenswrapper[23041]: I0308 00:53:25.750760 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-675ba-scheduler-0" event={"ID":"a4aef1ca-0703-4433-84e6-a926cea94033","Type":"ContainerStarted","Data":"1feb921500e02a0006b019bc147e55e9d606e278c5c4e42f15399606585660af"}
Mar 08 00:53:25.836850 master-0 kubenswrapper[23041]: I0308 00:53:25.836792 23041 scope.go:117] "RemoveContainer" containerID="f832601f1f9584306d99adaf226e4ab085eaf5328dc618255cbcdaae284c01a6"
Mar 08 00:53:25.847387 master-0 kubenswrapper[23041]: W0308 00:53:25.847332 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce6c3b7e_5ea8_4629_8322_64431d8138c2.slice/crio-198e3ae32281ae1e35805b660c5b3230ca7bbaa432b75d9065471b6b32fbf4b9 WatchSource:0}: Error finding container 198e3ae32281ae1e35805b660c5b3230ca7bbaa432b75d9065471b6b32fbf4b9: Status 404 returned error can't find the container with id 198e3ae32281ae1e35805b660c5b3230ca7bbaa432b75d9065471b6b32fbf4b9
Mar 08 00:53:25.847474 master-0 kubenswrapper[23041]: I0308 00:53:25.847394 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-675ba-api-0"]
Mar 08 00:53:25.899092 master-0 kubenswrapper[23041]: I0308 00:53:25.899028 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6456885d89-8nk8d"]
Mar 08 00:53:25.930229 master-0 kubenswrapper[23041]: I0308 00:53:25.928434 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78fdb4cf6c-nxlpt"]
Mar 08 00:53:25.945221 master-0 kubenswrapper[23041]: I0308 00:53:25.940582 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6456885d89-8nk8d"]
Mar 08 00:53:26.775711 master-0 kubenswrapper[23041]: I0308 00:53:26.774739 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-675ba-backup-0" event={"ID":"854d6a39-df63-4aa0-85db-c8cd640dad73","Type":"ContainerStarted","Data":"68e35b56d047967e1039717426e363235c97b2d3e6a153bd2c625504ad520dac"}
Mar 08 00:53:26.781968 master-0 kubenswrapper[23041]: I0308 00:53:26.776992 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-675ba-api-0" event={"ID":"ce6c3b7e-5ea8-4629-8322-64431d8138c2","Type":"ContainerStarted","Data":"198e3ae32281ae1e35805b660c5b3230ca7bbaa432b75d9065471b6b32fbf4b9"}
Mar 08 00:53:26.781968 master-0 kubenswrapper[23041]: I0308 00:53:26.780227 23041 generic.go:334] "Generic (PLEG): container finished" podID="0e016b3b-b18e-428e-8dd9-88c2e106d04e" containerID="32d865f83725fbb6bb86890318c0103679277255b7de0374f00442dd156a0cbf" exitCode=0
Mar 08 00:53:26.781968 master-0 kubenswrapper[23041]: I0308 00:53:26.780261 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78fdb4cf6c-nxlpt" event={"ID":"0e016b3b-b18e-428e-8dd9-88c2e106d04e","Type":"ContainerDied","Data":"32d865f83725fbb6bb86890318c0103679277255b7de0374f00442dd156a0cbf"}
Mar 08 00:53:26.781968 master-0 kubenswrapper[23041]: I0308 00:53:26.780281 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78fdb4cf6c-nxlpt" event={"ID":"0e016b3b-b18e-428e-8dd9-88c2e106d04e","Type":"ContainerStarted","Data":"63af3d933e99bdf2cf37dbd29d00839fd89bb9b9d129d475d969b6992ce3307b"}
Mar 08 00:53:26.847759 master-0 kubenswrapper[23041]: I0308 00:53:26.846758 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="230fb7a5-d46d-4b2e-b0c4-4d3f998564f4" path="/var/lib/kubelet/pods/230fb7a5-d46d-4b2e-b0c4-4d3f998564f4/volumes"
Mar 08 00:53:27.258235 master-0 kubenswrapper[23041]: I0308 00:53:27.255326 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-675ba-api-0"]
Mar 08 00:53:27.808255 master-0 kubenswrapper[23041]: I0308 00:53:27.806047 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-675ba-backup-0" event={"ID":"854d6a39-df63-4aa0-85db-c8cd640dad73","Type":"ContainerStarted","Data":"a63f3c746658e03a4e6a6cf0c07748b530f0c8e00b2e9e0de2a4023609da55b9"}
Mar 08 00:53:27.808255 master-0 kubenswrapper[23041]: I0308 00:53:27.806103 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-675ba-backup-0" event={"ID":"854d6a39-df63-4aa0-85db-c8cd640dad73","Type":"ContainerStarted","Data":"b1c509d3ee7a59378ecd438172682e742ea0bd453f239e6d7b13d5d4718bf09c"}
Mar 08 00:53:27.836609 master-0 kubenswrapper[23041]: I0308 00:53:27.835541 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-675ba-volume-lvm-iscsi-0" event={"ID":"4c3dcf6e-e826-483b-ae9c-465cb6d2d326","Type":"ContainerStarted","Data":"1cca5bbda645178e609a6794a74a552fb2326c7e85251edfc181a52f0fa34e4e"}
Mar 08 00:53:27.836609 master-0 kubenswrapper[23041]: I0308 00:53:27.835612 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-675ba-volume-lvm-iscsi-0" event={"ID":"4c3dcf6e-e826-483b-ae9c-465cb6d2d326","Type":"ContainerStarted","Data":"f2fc0cfd89ae39b165eaf29691bde0b893f56650ecd7d3f8a4de6208711a2dbd"}
Mar 08 00:53:27.845801 master-0 kubenswrapper[23041]: I0308 00:53:27.845725 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-675ba-backup-0" podStartSLOduration=2.964854513 podStartE2EDuration="3.845701749s" podCreationTimestamp="2026-03-08 00:53:24 +0000 UTC" firstStartedPulling="2026-03-08 00:53:25.696495287 +0000 UTC m=+1311.169331841" lastFinishedPulling="2026-03-08 00:53:26.577342523 +0000 UTC m=+1312.050179077" observedRunningTime="2026-03-08 00:53:27.83755552 +0000 UTC m=+1313.310392084" watchObservedRunningTime="2026-03-08 00:53:27.845701749 +0000 UTC m=+1313.318538303"
Mar 08 00:53:27.902708 master-0 kubenswrapper[23041]: I0308 00:53:27.902570 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-675ba-api-0" event={"ID":"ce6c3b7e-5ea8-4629-8322-64431d8138c2","Type":"ContainerStarted","Data":"b7c1ff43b4e419fa2575eaa3ac135ce1a4a1fa302c2bd920411020980cd31576"}
Mar 08 00:53:27.922870 master-0 kubenswrapper[23041]: I0308 00:53:27.922722 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-675ba-volume-lvm-iscsi-0" podStartSLOduration=3.889474372 podStartE2EDuration="4.922698661s" podCreationTimestamp="2026-03-08 00:53:23 +0000 UTC" firstStartedPulling="2026-03-08 00:53:25.544182415 +0000 UTC m=+1311.017018969" lastFinishedPulling="2026-03-08 00:53:26.577406704 +0000 UTC m=+1312.050243258" observedRunningTime="2026-03-08 00:53:27.919436451 +0000 UTC m=+1313.392273035" watchObservedRunningTime="2026-03-08 00:53:27.922698661 +0000 UTC m=+1313.395535215"
Mar 08 00:53:27.925762 master-0 kubenswrapper[23041]: I0308 00:53:27.925715 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78fdb4cf6c-nxlpt" event={"ID":"0e016b3b-b18e-428e-8dd9-88c2e106d04e","Type":"ContainerStarted","Data":"85068cdc6c66ca2ca86b0f23a6c863036e56acd0114d0e99e9172213bbc2eb52"}
Mar 08 00:53:27.927443 master-0 kubenswrapper[23041]: I0308 00:53:27.927022 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78fdb4cf6c-nxlpt"
Mar 08 00:53:27.938002 master-0 kubenswrapper[23041]: I0308 00:53:27.937947 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-675ba-scheduler-0" event={"ID":"a4aef1ca-0703-4433-84e6-a926cea94033","Type":"ContainerStarted","Data":"7f66ff667ebf8020e0e9fae1948e26561a4a3243fca8f100ab3bead266695768"}
Mar 08 00:53:27.962720 master-0 kubenswrapper[23041]: I0308 00:53:27.962648 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78fdb4cf6c-nxlpt" podStartSLOduration=3.962621847 podStartE2EDuration="3.962621847s" podCreationTimestamp="2026-03-08 00:53:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:53:27.941612523 +0000 UTC m=+1313.414449087" watchObservedRunningTime="2026-03-08 00:53:27.962621847 +0000 UTC m=+1313.435458401"
Mar 08 00:53:28.957779 master-0 kubenswrapper[23041]: I0308 00:53:28.957718 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-675ba-scheduler-0" event={"ID":"a4aef1ca-0703-4433-84e6-a926cea94033","Type":"ContainerStarted","Data":"689728ade1555da9aa7ab0a54753eece523deee9ae9728509194f7b404cdf1d5"}
Mar 08 00:53:28.976336 master-0 kubenswrapper[23041]: I0308 00:53:28.976195 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-675ba-api-0" podUID="ce6c3b7e-5ea8-4629-8322-64431d8138c2" containerName="cinder-675ba-api-log" containerID="cri-o://b7c1ff43b4e419fa2575eaa3ac135ce1a4a1fa302c2bd920411020980cd31576" gracePeriod=30
Mar 08 00:53:28.976643 master-0 kubenswrapper[23041]: I0308 00:53:28.976606 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-675ba-api-0" event={"ID":"ce6c3b7e-5ea8-4629-8322-64431d8138c2","Type":"ContainerStarted","Data":"e01ac1f3c8cd0734a08b2446c31337e81f4a664916d2775c41306266cb1e8cd6"}
Mar 08 00:53:28.978306 master-0 kubenswrapper[23041]: I0308 00:53:28.978248 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-675ba-api-0" podUID="ce6c3b7e-5ea8-4629-8322-64431d8138c2" containerName="cinder-api" containerID="cri-o://e01ac1f3c8cd0734a08b2446c31337e81f4a664916d2775c41306266cb1e8cd6" gracePeriod=30
Mar 08 00:53:28.978485 master-0 kubenswrapper[23041]: I0308 00:53:28.978430 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-675ba-api-0"
Mar 08 00:53:28.993144 master-0 kubenswrapper[23041]: I0308 00:53:28.992520 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-675ba-scheduler-0" podStartSLOduration=5.045294088 podStartE2EDuration="5.992498815s" podCreationTimestamp="2026-03-08 00:53:23 +0000 UTC" firstStartedPulling="2026-03-08 00:53:25.126624461 +0000 UTC m=+1310.599461015" lastFinishedPulling="2026-03-08 00:53:26.073829188 +0000 UTC m=+1311.546665742" observedRunningTime="2026-03-08 00:53:28.98205633 +0000 UTC m=+1314.454892894" watchObservedRunningTime="2026-03-08 00:53:28.992498815 +0000 UTC m=+1314.465335369"
Mar 08 00:53:29.033241 master-0 kubenswrapper[23041]: I0308 00:53:29.029332 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-675ba-api-0" podStartSLOduration=5.029306235 podStartE2EDuration="5.029306235s" podCreationTimestamp="2026-03-08 00:53:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:53:29.012594246 +0000 UTC m=+1314.485430820" watchObservedRunningTime="2026-03-08 00:53:29.029306235 +0000 UTC m=+1314.502142809"
Mar 08 00:53:29.351394 master-0 kubenswrapper[23041]: I0308 00:53:29.349181 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-675ba-scheduler-0"
Mar 08 00:53:29.697577 master-0 kubenswrapper[23041]: I0308 00:53:29.696106 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-675ba-backup-0"
Mar 08 00:53:29.737606 master-0 kubenswrapper[23041]: I0308 00:53:29.737483 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-675ba-volume-lvm-iscsi-0"
Mar 08 00:53:29.755089 master-0 kubenswrapper[23041]: I0308 00:53:29.755061 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-675ba-api-0"
Mar 08 00:53:29.835764 master-0 kubenswrapper[23041]: I0308 00:53:29.835661 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce6c3b7e-5ea8-4629-8322-64431d8138c2-etc-machine-id\") pod \"ce6c3b7e-5ea8-4629-8322-64431d8138c2\" (UID: \"ce6c3b7e-5ea8-4629-8322-64431d8138c2\") "
Mar 08 00:53:29.836088 master-0 kubenswrapper[23041]: I0308 00:53:29.836067 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgkm9\" (UniqueName: \"kubernetes.io/projected/ce6c3b7e-5ea8-4629-8322-64431d8138c2-kube-api-access-wgkm9\") pod \"ce6c3b7e-5ea8-4629-8322-64431d8138c2\" (UID: \"ce6c3b7e-5ea8-4629-8322-64431d8138c2\") "
Mar 08 00:53:29.836507 master-0 kubenswrapper[23041]: I0308 00:53:29.836490 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce6c3b7e-5ea8-4629-8322-64431d8138c2-config-data\") pod \"ce6c3b7e-5ea8-4629-8322-64431d8138c2\" (UID: \"ce6c3b7e-5ea8-4629-8322-64431d8138c2\") "
Mar 08 00:53:29.836625 master-0 kubenswrapper[23041]: I0308 00:53:29.836607 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce6c3b7e-5ea8-4629-8322-64431d8138c2-config-data-custom\") pod \"ce6c3b7e-5ea8-4629-8322-64431d8138c2\" (UID: \"ce6c3b7e-5ea8-4629-8322-64431d8138c2\") "
Mar 08 00:53:29.836848 master-0 kubenswrapper[23041]: I0308 00:53:29.836833 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce6c3b7e-5ea8-4629-8322-64431d8138c2-combined-ca-bundle\") pod \"ce6c3b7e-5ea8-4629-8322-64431d8138c2\" (UID: \"ce6c3b7e-5ea8-4629-8322-64431d8138c2\") "
Mar 08 00:53:29.836972 master-0 kubenswrapper[23041]: I0308 00:53:29.836958 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce6c3b7e-5ea8-4629-8322-64431d8138c2-logs\") pod \"ce6c3b7e-5ea8-4629-8322-64431d8138c2\" (UID: \"ce6c3b7e-5ea8-4629-8322-64431d8138c2\") "
Mar 08 00:53:29.837308 master-0 kubenswrapper[23041]: I0308 00:53:29.837245 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce6c3b7e-5ea8-4629-8322-64431d8138c2-scripts\") pod \"ce6c3b7e-5ea8-4629-8322-64431d8138c2\" (UID: \"ce6c3b7e-5ea8-4629-8322-64431d8138c2\") "
Mar 08 00:53:29.838351 master-0 kubenswrapper[23041]: I0308 00:53:29.838322 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce6c3b7e-5ea8-4629-8322-64431d8138c2-logs" (OuterVolumeSpecName: "logs") pod "ce6c3b7e-5ea8-4629-8322-64431d8138c2" (UID: "ce6c3b7e-5ea8-4629-8322-64431d8138c2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:53:29.838444 master-0 kubenswrapper[23041]: I0308 00:53:29.838348 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ce6c3b7e-5ea8-4629-8322-64431d8138c2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ce6c3b7e-5ea8-4629-8322-64431d8138c2" (UID: "ce6c3b7e-5ea8-4629-8322-64431d8138c2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:53:29.846462 master-0 kubenswrapper[23041]: I0308 00:53:29.846406 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce6c3b7e-5ea8-4629-8322-64431d8138c2-kube-api-access-wgkm9" (OuterVolumeSpecName: "kube-api-access-wgkm9") pod "ce6c3b7e-5ea8-4629-8322-64431d8138c2" (UID: "ce6c3b7e-5ea8-4629-8322-64431d8138c2"). InnerVolumeSpecName "kube-api-access-wgkm9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:53:29.847491 master-0 kubenswrapper[23041]: I0308 00:53:29.847450 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce6c3b7e-5ea8-4629-8322-64431d8138c2-scripts" (OuterVolumeSpecName: "scripts") pod "ce6c3b7e-5ea8-4629-8322-64431d8138c2" (UID: "ce6c3b7e-5ea8-4629-8322-64431d8138c2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:53:29.847553 master-0 kubenswrapper[23041]: I0308 00:53:29.847497 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce6c3b7e-5ea8-4629-8322-64431d8138c2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ce6c3b7e-5ea8-4629-8322-64431d8138c2" (UID: "ce6c3b7e-5ea8-4629-8322-64431d8138c2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:53:29.865719 master-0 kubenswrapper[23041]: I0308 00:53:29.865622 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce6c3b7e-5ea8-4629-8322-64431d8138c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce6c3b7e-5ea8-4629-8322-64431d8138c2" (UID: "ce6c3b7e-5ea8-4629-8322-64431d8138c2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:53:29.892498 master-0 kubenswrapper[23041]: I0308 00:53:29.892447 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce6c3b7e-5ea8-4629-8322-64431d8138c2-config-data" (OuterVolumeSpecName: "config-data") pod "ce6c3b7e-5ea8-4629-8322-64431d8138c2" (UID: "ce6c3b7e-5ea8-4629-8322-64431d8138c2"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:53:29.939477 master-0 kubenswrapper[23041]: I0308 00:53:29.939399 23041 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce6c3b7e-5ea8-4629-8322-64431d8138c2-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:29.939477 master-0 kubenswrapper[23041]: I0308 00:53:29.939447 23041 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce6c3b7e-5ea8-4629-8322-64431d8138c2-config-data-custom\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:29.939477 master-0 kubenswrapper[23041]: I0308 00:53:29.939458 23041 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce6c3b7e-5ea8-4629-8322-64431d8138c2-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:29.939477 master-0 kubenswrapper[23041]: I0308 00:53:29.939465 23041 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce6c3b7e-5ea8-4629-8322-64431d8138c2-logs\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:29.939477 master-0 kubenswrapper[23041]: I0308 00:53:29.939476 23041 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce6c3b7e-5ea8-4629-8322-64431d8138c2-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:29.939477 master-0 kubenswrapper[23041]: I0308 00:53:29.939484 23041 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce6c3b7e-5ea8-4629-8322-64431d8138c2-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:29.939477 master-0 kubenswrapper[23041]: I0308 00:53:29.939496 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgkm9\" (UniqueName: 
\"kubernetes.io/projected/ce6c3b7e-5ea8-4629-8322-64431d8138c2-kube-api-access-wgkm9\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:29.989109 master-0 kubenswrapper[23041]: I0308 00:53:29.988989 23041 generic.go:334] "Generic (PLEG): container finished" podID="ce6c3b7e-5ea8-4629-8322-64431d8138c2" containerID="e01ac1f3c8cd0734a08b2446c31337e81f4a664916d2775c41306266cb1e8cd6" exitCode=0 Mar 08 00:53:29.989109 master-0 kubenswrapper[23041]: I0308 00:53:29.989029 23041 generic.go:334] "Generic (PLEG): container finished" podID="ce6c3b7e-5ea8-4629-8322-64431d8138c2" containerID="b7c1ff43b4e419fa2575eaa3ac135ce1a4a1fa302c2bd920411020980cd31576" exitCode=143 Mar 08 00:53:29.989109 master-0 kubenswrapper[23041]: I0308 00:53:29.989045 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-675ba-api-0" Mar 08 00:53:29.989649 master-0 kubenswrapper[23041]: I0308 00:53:29.989111 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-675ba-api-0" event={"ID":"ce6c3b7e-5ea8-4629-8322-64431d8138c2","Type":"ContainerDied","Data":"e01ac1f3c8cd0734a08b2446c31337e81f4a664916d2775c41306266cb1e8cd6"} Mar 08 00:53:29.989649 master-0 kubenswrapper[23041]: I0308 00:53:29.989141 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-675ba-api-0" event={"ID":"ce6c3b7e-5ea8-4629-8322-64431d8138c2","Type":"ContainerDied","Data":"b7c1ff43b4e419fa2575eaa3ac135ce1a4a1fa302c2bd920411020980cd31576"} Mar 08 00:53:29.989649 master-0 kubenswrapper[23041]: I0308 00:53:29.989154 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-675ba-api-0" event={"ID":"ce6c3b7e-5ea8-4629-8322-64431d8138c2","Type":"ContainerDied","Data":"198e3ae32281ae1e35805b660c5b3230ca7bbaa432b75d9065471b6b32fbf4b9"} Mar 08 00:53:29.989649 master-0 kubenswrapper[23041]: I0308 00:53:29.989171 23041 scope.go:117] "RemoveContainer" 
containerID="e01ac1f3c8cd0734a08b2446c31337e81f4a664916d2775c41306266cb1e8cd6" Mar 08 00:53:29.993409 master-0 kubenswrapper[23041]: I0308 00:53:29.993374 23041 generic.go:334] "Generic (PLEG): container finished" podID="24c18a5d-ebab-491a-8bf4-f6271242e4f3" containerID="8ea27f9a76b98f282a103da574cd5f4cf64dd2d1609d9ceb48b98ebe91c7d1de" exitCode=0 Mar 08 00:53:29.993469 master-0 kubenswrapper[23041]: I0308 00:53:29.993411 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-hxms8" event={"ID":"24c18a5d-ebab-491a-8bf4-f6271242e4f3","Type":"ContainerDied","Data":"8ea27f9a76b98f282a103da574cd5f4cf64dd2d1609d9ceb48b98ebe91c7d1de"} Mar 08 00:53:30.017099 master-0 kubenswrapper[23041]: I0308 00:53:30.015047 23041 scope.go:117] "RemoveContainer" containerID="b7c1ff43b4e419fa2575eaa3ac135ce1a4a1fa302c2bd920411020980cd31576" Mar 08 00:53:30.044326 master-0 kubenswrapper[23041]: I0308 00:53:30.044281 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-675ba-api-0"] Mar 08 00:53:30.071673 master-0 kubenswrapper[23041]: I0308 00:53:30.071604 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-675ba-api-0"] Mar 08 00:53:30.092785 master-0 kubenswrapper[23041]: I0308 00:53:30.092624 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-675ba-api-0"] Mar 08 00:53:30.093792 master-0 kubenswrapper[23041]: E0308 00:53:30.093183 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="230fb7a5-d46d-4b2e-b0c4-4d3f998564f4" containerName="dnsmasq-dns" Mar 08 00:53:30.093792 master-0 kubenswrapper[23041]: I0308 00:53:30.093214 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="230fb7a5-d46d-4b2e-b0c4-4d3f998564f4" containerName="dnsmasq-dns" Mar 08 00:53:30.093792 master-0 kubenswrapper[23041]: E0308 00:53:30.093231 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce6c3b7e-5ea8-4629-8322-64431d8138c2" 
containerName="cinder-675ba-api-log" Mar 08 00:53:30.093792 master-0 kubenswrapper[23041]: I0308 00:53:30.093239 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce6c3b7e-5ea8-4629-8322-64431d8138c2" containerName="cinder-675ba-api-log" Mar 08 00:53:30.093792 master-0 kubenswrapper[23041]: E0308 00:53:30.093259 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce6c3b7e-5ea8-4629-8322-64431d8138c2" containerName="cinder-api" Mar 08 00:53:30.093792 master-0 kubenswrapper[23041]: I0308 00:53:30.093265 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce6c3b7e-5ea8-4629-8322-64431d8138c2" containerName="cinder-api" Mar 08 00:53:30.093792 master-0 kubenswrapper[23041]: E0308 00:53:30.093310 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="230fb7a5-d46d-4b2e-b0c4-4d3f998564f4" containerName="init" Mar 08 00:53:30.093792 master-0 kubenswrapper[23041]: I0308 00:53:30.093319 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="230fb7a5-d46d-4b2e-b0c4-4d3f998564f4" containerName="init" Mar 08 00:53:30.093792 master-0 kubenswrapper[23041]: I0308 00:53:30.093529 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce6c3b7e-5ea8-4629-8322-64431d8138c2" containerName="cinder-675ba-api-log" Mar 08 00:53:30.093792 master-0 kubenswrapper[23041]: I0308 00:53:30.093560 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce6c3b7e-5ea8-4629-8322-64431d8138c2" containerName="cinder-api" Mar 08 00:53:30.093792 master-0 kubenswrapper[23041]: I0308 00:53:30.093585 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="230fb7a5-d46d-4b2e-b0c4-4d3f998564f4" containerName="dnsmasq-dns" Mar 08 00:53:30.095570 master-0 kubenswrapper[23041]: I0308 00:53:30.094752 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-675ba-api-0" Mar 08 00:53:30.112386 master-0 kubenswrapper[23041]: I0308 00:53:30.102845 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 08 00:53:30.112386 master-0 kubenswrapper[23041]: I0308 00:53:30.102916 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-675ba-api-config-data" Mar 08 00:53:30.112386 master-0 kubenswrapper[23041]: I0308 00:53:30.103067 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 08 00:53:30.114440 master-0 kubenswrapper[23041]: I0308 00:53:30.113606 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-675ba-api-0"] Mar 08 00:53:30.143633 master-0 kubenswrapper[23041]: I0308 00:53:30.143559 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/946a447d-964c-4693-8923-b712bcc9904c-etc-machine-id\") pod \"cinder-675ba-api-0\" (UID: \"946a447d-964c-4693-8923-b712bcc9904c\") " pod="openstack/cinder-675ba-api-0" Mar 08 00:53:30.143943 master-0 kubenswrapper[23041]: I0308 00:53:30.143711 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/946a447d-964c-4693-8923-b712bcc9904c-logs\") pod \"cinder-675ba-api-0\" (UID: \"946a447d-964c-4693-8923-b712bcc9904c\") " pod="openstack/cinder-675ba-api-0" Mar 08 00:53:30.143943 master-0 kubenswrapper[23041]: I0308 00:53:30.143740 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/946a447d-964c-4693-8923-b712bcc9904c-internal-tls-certs\") pod \"cinder-675ba-api-0\" (UID: \"946a447d-964c-4693-8923-b712bcc9904c\") " pod="openstack/cinder-675ba-api-0" Mar 08 00:53:30.143943 
master-0 kubenswrapper[23041]: I0308 00:53:30.143859 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/946a447d-964c-4693-8923-b712bcc9904c-scripts\") pod \"cinder-675ba-api-0\" (UID: \"946a447d-964c-4693-8923-b712bcc9904c\") " pod="openstack/cinder-675ba-api-0" Mar 08 00:53:30.144128 master-0 kubenswrapper[23041]: I0308 00:53:30.144089 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh4lf\" (UniqueName: \"kubernetes.io/projected/946a447d-964c-4693-8923-b712bcc9904c-kube-api-access-bh4lf\") pod \"cinder-675ba-api-0\" (UID: \"946a447d-964c-4693-8923-b712bcc9904c\") " pod="openstack/cinder-675ba-api-0" Mar 08 00:53:30.144238 master-0 kubenswrapper[23041]: I0308 00:53:30.144176 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/946a447d-964c-4693-8923-b712bcc9904c-config-data-custom\") pod \"cinder-675ba-api-0\" (UID: \"946a447d-964c-4693-8923-b712bcc9904c\") " pod="openstack/cinder-675ba-api-0" Mar 08 00:53:30.144388 master-0 kubenswrapper[23041]: I0308 00:53:30.144356 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/946a447d-964c-4693-8923-b712bcc9904c-public-tls-certs\") pod \"cinder-675ba-api-0\" (UID: \"946a447d-964c-4693-8923-b712bcc9904c\") " pod="openstack/cinder-675ba-api-0" Mar 08 00:53:30.144450 master-0 kubenswrapper[23041]: I0308 00:53:30.144398 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/946a447d-964c-4693-8923-b712bcc9904c-combined-ca-bundle\") pod \"cinder-675ba-api-0\" (UID: \"946a447d-964c-4693-8923-b712bcc9904c\") " pod="openstack/cinder-675ba-api-0" Mar 
08 00:53:30.144518 master-0 kubenswrapper[23041]: I0308 00:53:30.144498 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/946a447d-964c-4693-8923-b712bcc9904c-config-data\") pod \"cinder-675ba-api-0\" (UID: \"946a447d-964c-4693-8923-b712bcc9904c\") " pod="openstack/cinder-675ba-api-0" Mar 08 00:53:30.151612 master-0 kubenswrapper[23041]: I0308 00:53:30.151545 23041 scope.go:117] "RemoveContainer" containerID="e01ac1f3c8cd0734a08b2446c31337e81f4a664916d2775c41306266cb1e8cd6" Mar 08 00:53:30.155770 master-0 kubenswrapper[23041]: E0308 00:53:30.155720 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e01ac1f3c8cd0734a08b2446c31337e81f4a664916d2775c41306266cb1e8cd6\": container with ID starting with e01ac1f3c8cd0734a08b2446c31337e81f4a664916d2775c41306266cb1e8cd6 not found: ID does not exist" containerID="e01ac1f3c8cd0734a08b2446c31337e81f4a664916d2775c41306266cb1e8cd6" Mar 08 00:53:30.155916 master-0 kubenswrapper[23041]: I0308 00:53:30.155786 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e01ac1f3c8cd0734a08b2446c31337e81f4a664916d2775c41306266cb1e8cd6"} err="failed to get container status \"e01ac1f3c8cd0734a08b2446c31337e81f4a664916d2775c41306266cb1e8cd6\": rpc error: code = NotFound desc = could not find container \"e01ac1f3c8cd0734a08b2446c31337e81f4a664916d2775c41306266cb1e8cd6\": container with ID starting with e01ac1f3c8cd0734a08b2446c31337e81f4a664916d2775c41306266cb1e8cd6 not found: ID does not exist" Mar 08 00:53:30.155916 master-0 kubenswrapper[23041]: I0308 00:53:30.155815 23041 scope.go:117] "RemoveContainer" containerID="b7c1ff43b4e419fa2575eaa3ac135ce1a4a1fa302c2bd920411020980cd31576" Mar 08 00:53:30.156456 master-0 kubenswrapper[23041]: E0308 00:53:30.156406 23041 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"b7c1ff43b4e419fa2575eaa3ac135ce1a4a1fa302c2bd920411020980cd31576\": container with ID starting with b7c1ff43b4e419fa2575eaa3ac135ce1a4a1fa302c2bd920411020980cd31576 not found: ID does not exist" containerID="b7c1ff43b4e419fa2575eaa3ac135ce1a4a1fa302c2bd920411020980cd31576" Mar 08 00:53:30.156507 master-0 kubenswrapper[23041]: I0308 00:53:30.156474 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7c1ff43b4e419fa2575eaa3ac135ce1a4a1fa302c2bd920411020980cd31576"} err="failed to get container status \"b7c1ff43b4e419fa2575eaa3ac135ce1a4a1fa302c2bd920411020980cd31576\": rpc error: code = NotFound desc = could not find container \"b7c1ff43b4e419fa2575eaa3ac135ce1a4a1fa302c2bd920411020980cd31576\": container with ID starting with b7c1ff43b4e419fa2575eaa3ac135ce1a4a1fa302c2bd920411020980cd31576 not found: ID does not exist" Mar 08 00:53:30.156507 master-0 kubenswrapper[23041]: I0308 00:53:30.156500 23041 scope.go:117] "RemoveContainer" containerID="e01ac1f3c8cd0734a08b2446c31337e81f4a664916d2775c41306266cb1e8cd6" Mar 08 00:53:30.156934 master-0 kubenswrapper[23041]: I0308 00:53:30.156895 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e01ac1f3c8cd0734a08b2446c31337e81f4a664916d2775c41306266cb1e8cd6"} err="failed to get container status \"e01ac1f3c8cd0734a08b2446c31337e81f4a664916d2775c41306266cb1e8cd6\": rpc error: code = NotFound desc = could not find container \"e01ac1f3c8cd0734a08b2446c31337e81f4a664916d2775c41306266cb1e8cd6\": container with ID starting with e01ac1f3c8cd0734a08b2446c31337e81f4a664916d2775c41306266cb1e8cd6 not found: ID does not exist" Mar 08 00:53:30.156934 master-0 kubenswrapper[23041]: I0308 00:53:30.156921 23041 scope.go:117] "RemoveContainer" containerID="b7c1ff43b4e419fa2575eaa3ac135ce1a4a1fa302c2bd920411020980cd31576" Mar 08 00:53:30.157313 master-0 kubenswrapper[23041]: I0308 
00:53:30.157244 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7c1ff43b4e419fa2575eaa3ac135ce1a4a1fa302c2bd920411020980cd31576"} err="failed to get container status \"b7c1ff43b4e419fa2575eaa3ac135ce1a4a1fa302c2bd920411020980cd31576\": rpc error: code = NotFound desc = could not find container \"b7c1ff43b4e419fa2575eaa3ac135ce1a4a1fa302c2bd920411020980cd31576\": container with ID starting with b7c1ff43b4e419fa2575eaa3ac135ce1a4a1fa302c2bd920411020980cd31576 not found: ID does not exist" Mar 08 00:53:30.245901 master-0 kubenswrapper[23041]: I0308 00:53:30.245846 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/946a447d-964c-4693-8923-b712bcc9904c-etc-machine-id\") pod \"cinder-675ba-api-0\" (UID: \"946a447d-964c-4693-8923-b712bcc9904c\") " pod="openstack/cinder-675ba-api-0" Mar 08 00:53:30.246093 master-0 kubenswrapper[23041]: I0308 00:53:30.245943 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/946a447d-964c-4693-8923-b712bcc9904c-logs\") pod \"cinder-675ba-api-0\" (UID: \"946a447d-964c-4693-8923-b712bcc9904c\") " pod="openstack/cinder-675ba-api-0" Mar 08 00:53:30.246093 master-0 kubenswrapper[23041]: I0308 00:53:30.246034 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/946a447d-964c-4693-8923-b712bcc9904c-internal-tls-certs\") pod \"cinder-675ba-api-0\" (UID: \"946a447d-964c-4693-8923-b712bcc9904c\") " pod="openstack/cinder-675ba-api-0" Mar 08 00:53:30.246093 master-0 kubenswrapper[23041]: I0308 00:53:30.246072 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/946a447d-964c-4693-8923-b712bcc9904c-scripts\") pod \"cinder-675ba-api-0\" (UID: 
\"946a447d-964c-4693-8923-b712bcc9904c\") " pod="openstack/cinder-675ba-api-0" Mar 08 00:53:30.246240 master-0 kubenswrapper[23041]: I0308 00:53:30.246061 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/946a447d-964c-4693-8923-b712bcc9904c-etc-machine-id\") pod \"cinder-675ba-api-0\" (UID: \"946a447d-964c-4693-8923-b712bcc9904c\") " pod="openstack/cinder-675ba-api-0" Mar 08 00:53:30.246240 master-0 kubenswrapper[23041]: I0308 00:53:30.246117 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bh4lf\" (UniqueName: \"kubernetes.io/projected/946a447d-964c-4693-8923-b712bcc9904c-kube-api-access-bh4lf\") pod \"cinder-675ba-api-0\" (UID: \"946a447d-964c-4693-8923-b712bcc9904c\") " pod="openstack/cinder-675ba-api-0" Mar 08 00:53:30.246452 master-0 kubenswrapper[23041]: I0308 00:53:30.246421 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/946a447d-964c-4693-8923-b712bcc9904c-config-data-custom\") pod \"cinder-675ba-api-0\" (UID: \"946a447d-964c-4693-8923-b712bcc9904c\") " pod="openstack/cinder-675ba-api-0" Mar 08 00:53:30.246497 master-0 kubenswrapper[23041]: I0308 00:53:30.246484 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/946a447d-964c-4693-8923-b712bcc9904c-public-tls-certs\") pod \"cinder-675ba-api-0\" (UID: \"946a447d-964c-4693-8923-b712bcc9904c\") " pod="openstack/cinder-675ba-api-0" Mar 08 00:53:30.246532 master-0 kubenswrapper[23041]: I0308 00:53:30.246510 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/946a447d-964c-4693-8923-b712bcc9904c-combined-ca-bundle\") pod \"cinder-675ba-api-0\" (UID: \"946a447d-964c-4693-8923-b712bcc9904c\") " 
pod="openstack/cinder-675ba-api-0" Mar 08 00:53:30.246567 master-0 kubenswrapper[23041]: I0308 00:53:30.246551 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/946a447d-964c-4693-8923-b712bcc9904c-config-data\") pod \"cinder-675ba-api-0\" (UID: \"946a447d-964c-4693-8923-b712bcc9904c\") " pod="openstack/cinder-675ba-api-0" Mar 08 00:53:30.246623 master-0 kubenswrapper[23041]: I0308 00:53:30.246587 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/946a447d-964c-4693-8923-b712bcc9904c-logs\") pod \"cinder-675ba-api-0\" (UID: \"946a447d-964c-4693-8923-b712bcc9904c\") " pod="openstack/cinder-675ba-api-0" Mar 08 00:53:30.249531 master-0 kubenswrapper[23041]: I0308 00:53:30.249504 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/946a447d-964c-4693-8923-b712bcc9904c-scripts\") pod \"cinder-675ba-api-0\" (UID: \"946a447d-964c-4693-8923-b712bcc9904c\") " pod="openstack/cinder-675ba-api-0" Mar 08 00:53:30.261355 master-0 kubenswrapper[23041]: I0308 00:53:30.250526 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/946a447d-964c-4693-8923-b712bcc9904c-config-data\") pod \"cinder-675ba-api-0\" (UID: \"946a447d-964c-4693-8923-b712bcc9904c\") " pod="openstack/cinder-675ba-api-0" Mar 08 00:53:30.261355 master-0 kubenswrapper[23041]: I0308 00:53:30.251156 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/946a447d-964c-4693-8923-b712bcc9904c-public-tls-certs\") pod \"cinder-675ba-api-0\" (UID: \"946a447d-964c-4693-8923-b712bcc9904c\") " pod="openstack/cinder-675ba-api-0" Mar 08 00:53:30.261355 master-0 kubenswrapper[23041]: I0308 00:53:30.251540 23041 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/946a447d-964c-4693-8923-b712bcc9904c-combined-ca-bundle\") pod \"cinder-675ba-api-0\" (UID: \"946a447d-964c-4693-8923-b712bcc9904c\") " pod="openstack/cinder-675ba-api-0" Mar 08 00:53:30.261355 master-0 kubenswrapper[23041]: I0308 00:53:30.252015 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/946a447d-964c-4693-8923-b712bcc9904c-internal-tls-certs\") pod \"cinder-675ba-api-0\" (UID: \"946a447d-964c-4693-8923-b712bcc9904c\") " pod="openstack/cinder-675ba-api-0" Mar 08 00:53:30.261355 master-0 kubenswrapper[23041]: I0308 00:53:30.253726 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/946a447d-964c-4693-8923-b712bcc9904c-config-data-custom\") pod \"cinder-675ba-api-0\" (UID: \"946a447d-964c-4693-8923-b712bcc9904c\") " pod="openstack/cinder-675ba-api-0" Mar 08 00:53:30.268583 master-0 kubenswrapper[23041]: I0308 00:53:30.262440 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh4lf\" (UniqueName: \"kubernetes.io/projected/946a447d-964c-4693-8923-b712bcc9904c-kube-api-access-bh4lf\") pod \"cinder-675ba-api-0\" (UID: \"946a447d-964c-4693-8923-b712bcc9904c\") " pod="openstack/cinder-675ba-api-0" Mar 08 00:53:30.435656 master-0 kubenswrapper[23041]: I0308 00:53:30.435574 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-675ba-api-0" Mar 08 00:53:30.852273 master-0 kubenswrapper[23041]: I0308 00:53:30.852094 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce6c3b7e-5ea8-4629-8322-64431d8138c2" path="/var/lib/kubelet/pods/ce6c3b7e-5ea8-4629-8322-64431d8138c2/volumes" Mar 08 00:53:30.919936 master-0 kubenswrapper[23041]: I0308 00:53:30.919858 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-675ba-api-0"] Mar 08 00:53:31.065277 master-0 kubenswrapper[23041]: I0308 00:53:31.063984 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-675ba-api-0" event={"ID":"946a447d-964c-4693-8923-b712bcc9904c","Type":"ContainerStarted","Data":"de7ba8bbcaf7ec943f3266d2b3eec4ce7b481ba5ea9de24091348094ae151469"} Mar 08 00:53:31.599097 master-0 kubenswrapper[23041]: I0308 00:53:31.599046 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-sync-hxms8" Mar 08 00:53:31.684276 master-0 kubenswrapper[23041]: I0308 00:53:31.682934 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24c18a5d-ebab-491a-8bf4-f6271242e4f3-config-data\") pod \"24c18a5d-ebab-491a-8bf4-f6271242e4f3\" (UID: \"24c18a5d-ebab-491a-8bf4-f6271242e4f3\") " Mar 08 00:53:31.684276 master-0 kubenswrapper[23041]: I0308 00:53:31.682986 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24c18a5d-ebab-491a-8bf4-f6271242e4f3-combined-ca-bundle\") pod \"24c18a5d-ebab-491a-8bf4-f6271242e4f3\" (UID: \"24c18a5d-ebab-491a-8bf4-f6271242e4f3\") " Mar 08 00:53:31.684276 master-0 kubenswrapper[23041]: I0308 00:53:31.683070 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24c18a5d-ebab-491a-8bf4-f6271242e4f3-scripts\") pod 
\"24c18a5d-ebab-491a-8bf4-f6271242e4f3\" (UID: \"24c18a5d-ebab-491a-8bf4-f6271242e4f3\") " Mar 08 00:53:31.684276 master-0 kubenswrapper[23041]: I0308 00:53:31.683166 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/24c18a5d-ebab-491a-8bf4-f6271242e4f3-etc-podinfo\") pod \"24c18a5d-ebab-491a-8bf4-f6271242e4f3\" (UID: \"24c18a5d-ebab-491a-8bf4-f6271242e4f3\") " Mar 08 00:53:31.684276 master-0 kubenswrapper[23041]: I0308 00:53:31.683227 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/24c18a5d-ebab-491a-8bf4-f6271242e4f3-config-data-merged\") pod \"24c18a5d-ebab-491a-8bf4-f6271242e4f3\" (UID: \"24c18a5d-ebab-491a-8bf4-f6271242e4f3\") " Mar 08 00:53:31.684276 master-0 kubenswrapper[23041]: I0308 00:53:31.683337 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mn2dr\" (UniqueName: \"kubernetes.io/projected/24c18a5d-ebab-491a-8bf4-f6271242e4f3-kube-api-access-mn2dr\") pod \"24c18a5d-ebab-491a-8bf4-f6271242e4f3\" (UID: \"24c18a5d-ebab-491a-8bf4-f6271242e4f3\") " Mar 08 00:53:31.686844 master-0 kubenswrapper[23041]: I0308 00:53:31.686294 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/24c18a5d-ebab-491a-8bf4-f6271242e4f3-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "24c18a5d-ebab-491a-8bf4-f6271242e4f3" (UID: "24c18a5d-ebab-491a-8bf4-f6271242e4f3"). InnerVolumeSpecName "etc-podinfo". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 08 00:53:31.686844 master-0 kubenswrapper[23041]: I0308 00:53:31.686533 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24c18a5d-ebab-491a-8bf4-f6271242e4f3-scripts" (OuterVolumeSpecName: "scripts") pod "24c18a5d-ebab-491a-8bf4-f6271242e4f3" (UID: "24c18a5d-ebab-491a-8bf4-f6271242e4f3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:53:31.686844 master-0 kubenswrapper[23041]: I0308 00:53:31.686611 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24c18a5d-ebab-491a-8bf4-f6271242e4f3-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "24c18a5d-ebab-491a-8bf4-f6271242e4f3" (UID: "24c18a5d-ebab-491a-8bf4-f6271242e4f3"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:53:31.690358 master-0 kubenswrapper[23041]: I0308 00:53:31.689434 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24c18a5d-ebab-491a-8bf4-f6271242e4f3-kube-api-access-mn2dr" (OuterVolumeSpecName: "kube-api-access-mn2dr") pod "24c18a5d-ebab-491a-8bf4-f6271242e4f3" (UID: "24c18a5d-ebab-491a-8bf4-f6271242e4f3"). InnerVolumeSpecName "kube-api-access-mn2dr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:53:31.736804 master-0 kubenswrapper[23041]: I0308 00:53:31.736746 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24c18a5d-ebab-491a-8bf4-f6271242e4f3-config-data" (OuterVolumeSpecName: "config-data") pod "24c18a5d-ebab-491a-8bf4-f6271242e4f3" (UID: "24c18a5d-ebab-491a-8bf4-f6271242e4f3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:53:31.776447 master-0 kubenswrapper[23041]: I0308 00:53:31.776386 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24c18a5d-ebab-491a-8bf4-f6271242e4f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24c18a5d-ebab-491a-8bf4-f6271242e4f3" (UID: "24c18a5d-ebab-491a-8bf4-f6271242e4f3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:53:31.786563 master-0 kubenswrapper[23041]: I0308 00:53:31.786505 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mn2dr\" (UniqueName: \"kubernetes.io/projected/24c18a5d-ebab-491a-8bf4-f6271242e4f3-kube-api-access-mn2dr\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:31.786563 master-0 kubenswrapper[23041]: I0308 00:53:31.786549 23041 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24c18a5d-ebab-491a-8bf4-f6271242e4f3-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:31.786563 master-0 kubenswrapper[23041]: I0308 00:53:31.786562 23041 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24c18a5d-ebab-491a-8bf4-f6271242e4f3-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:31.786563 master-0 kubenswrapper[23041]: I0308 00:53:31.786571 23041 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24c18a5d-ebab-491a-8bf4-f6271242e4f3-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:31.786852 master-0 kubenswrapper[23041]: I0308 00:53:31.786581 23041 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/24c18a5d-ebab-491a-8bf4-f6271242e4f3-etc-podinfo\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:31.786852 master-0 kubenswrapper[23041]: I0308 
00:53:31.786590 23041 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/24c18a5d-ebab-491a-8bf4-f6271242e4f3-config-data-merged\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:32.101909 master-0 kubenswrapper[23041]: I0308 00:53:32.101795 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-675ba-api-0" event={"ID":"946a447d-964c-4693-8923-b712bcc9904c","Type":"ContainerStarted","Data":"60d0c3a1d2ab96189446799e17654dfeb712831686f611900a14414cd05db81c"} Mar 08 00:53:32.105412 master-0 kubenswrapper[23041]: I0308 00:53:32.105359 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-hxms8" event={"ID":"24c18a5d-ebab-491a-8bf4-f6271242e4f3","Type":"ContainerDied","Data":"efc239c5bde71c1bf2cf30fb2848c3d9f4a95f4499c6738d97baa20a177ada96"} Mar 08 00:53:32.105497 master-0 kubenswrapper[23041]: I0308 00:53:32.105418 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efc239c5bde71c1bf2cf30fb2848c3d9f4a95f4499c6738d97baa20a177ada96" Mar 08 00:53:32.105497 master-0 kubenswrapper[23041]: I0308 00:53:32.105424 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-sync-hxms8" Mar 08 00:53:32.515230 master-0 kubenswrapper[23041]: I0308 00:53:32.513315 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-db-create-hzm2x"] Mar 08 00:53:32.515230 master-0 kubenswrapper[23041]: E0308 00:53:32.513838 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24c18a5d-ebab-491a-8bf4-f6271242e4f3" containerName="ironic-db-sync" Mar 08 00:53:32.515230 master-0 kubenswrapper[23041]: I0308 00:53:32.513851 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="24c18a5d-ebab-491a-8bf4-f6271242e4f3" containerName="ironic-db-sync" Mar 08 00:53:32.515230 master-0 kubenswrapper[23041]: E0308 00:53:32.513881 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24c18a5d-ebab-491a-8bf4-f6271242e4f3" containerName="init" Mar 08 00:53:32.515230 master-0 kubenswrapper[23041]: I0308 00:53:32.513887 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="24c18a5d-ebab-491a-8bf4-f6271242e4f3" containerName="init" Mar 08 00:53:32.515230 master-0 kubenswrapper[23041]: I0308 00:53:32.514090 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="24c18a5d-ebab-491a-8bf4-f6271242e4f3" containerName="ironic-db-sync" Mar 08 00:53:32.515230 master-0 kubenswrapper[23041]: I0308 00:53:32.514886 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-db-create-hzm2x" Mar 08 00:53:32.572336 master-0 kubenswrapper[23041]: I0308 00:53:32.566325 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-create-hzm2x"] Mar 08 00:53:32.623117 master-0 kubenswrapper[23041]: I0308 00:53:32.623039 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5944b058-fd8b-419b-ba55-b61f85254dec-operator-scripts\") pod \"ironic-inspector-db-create-hzm2x\" (UID: \"5944b058-fd8b-419b-ba55-b61f85254dec\") " pod="openstack/ironic-inspector-db-create-hzm2x" Mar 08 00:53:32.631227 master-0 kubenswrapper[23041]: I0308 00:53:32.629993 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dptq\" (UniqueName: \"kubernetes.io/projected/5944b058-fd8b-419b-ba55-b61f85254dec-kube-api-access-8dptq\") pod \"ironic-inspector-db-create-hzm2x\" (UID: \"5944b058-fd8b-419b-ba55-b61f85254dec\") " pod="openstack/ironic-inspector-db-create-hzm2x" Mar 08 00:53:32.647294 master-0 kubenswrapper[23041]: I0308 00:53:32.631098 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-neutron-agent-7dffdc6989-dw4bq"] Mar 08 00:53:32.657233 master-0 kubenswrapper[23041]: I0308 00:53:32.653240 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-neutron-agent-7dffdc6989-dw4bq" Mar 08 00:53:32.666226 master-0 kubenswrapper[23041]: I0308 00:53:32.659566 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-ironic-neutron-agent-config-data" Mar 08 00:53:32.735344 master-0 kubenswrapper[23041]: I0308 00:53:32.735295 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5944b058-fd8b-419b-ba55-b61f85254dec-operator-scripts\") pod \"ironic-inspector-db-create-hzm2x\" (UID: \"5944b058-fd8b-419b-ba55-b61f85254dec\") " pod="openstack/ironic-inspector-db-create-hzm2x" Mar 08 00:53:32.735569 master-0 kubenswrapper[23041]: I0308 00:53:32.735426 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dptq\" (UniqueName: \"kubernetes.io/projected/5944b058-fd8b-419b-ba55-b61f85254dec-kube-api-access-8dptq\") pod \"ironic-inspector-db-create-hzm2x\" (UID: \"5944b058-fd8b-419b-ba55-b61f85254dec\") " pod="openstack/ironic-inspector-db-create-hzm2x" Mar 08 00:53:32.736841 master-0 kubenswrapper[23041]: I0308 00:53:32.736632 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5944b058-fd8b-419b-ba55-b61f85254dec-operator-scripts\") pod \"ironic-inspector-db-create-hzm2x\" (UID: \"5944b058-fd8b-419b-ba55-b61f85254dec\") " pod="openstack/ironic-inspector-db-create-hzm2x" Mar 08 00:53:32.751713 master-0 kubenswrapper[23041]: I0308 00:53:32.751661 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-c3c2-account-create-update-w6k86"] Mar 08 00:53:32.759229 master-0 kubenswrapper[23041]: I0308 00:53:32.756536 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-c3c2-account-create-update-w6k86" Mar 08 00:53:32.767814 master-0 kubenswrapper[23041]: I0308 00:53:32.767767 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-db-secret" Mar 08 00:53:32.799834 master-0 kubenswrapper[23041]: I0308 00:53:32.799784 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dptq\" (UniqueName: \"kubernetes.io/projected/5944b058-fd8b-419b-ba55-b61f85254dec-kube-api-access-8dptq\") pod \"ironic-inspector-db-create-hzm2x\" (UID: \"5944b058-fd8b-419b-ba55-b61f85254dec\") " pod="openstack/ironic-inspector-db-create-hzm2x" Mar 08 00:53:32.872356 master-0 kubenswrapper[23041]: I0308 00:53:32.871669 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a94dba9c-1e25-42ed-b30a-d278979d1de9-combined-ca-bundle\") pod \"ironic-neutron-agent-7dffdc6989-dw4bq\" (UID: \"a94dba9c-1e25-42ed-b30a-d278979d1de9\") " pod="openstack/ironic-neutron-agent-7dffdc6989-dw4bq" Mar 08 00:53:32.872356 master-0 kubenswrapper[23041]: I0308 00:53:32.871971 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-db-create-hzm2x" Mar 08 00:53:32.872356 master-0 kubenswrapper[23041]: I0308 00:53:32.872128 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a94dba9c-1e25-42ed-b30a-d278979d1de9-config\") pod \"ironic-neutron-agent-7dffdc6989-dw4bq\" (UID: \"a94dba9c-1e25-42ed-b30a-d278979d1de9\") " pod="openstack/ironic-neutron-agent-7dffdc6989-dw4bq" Mar 08 00:53:32.872356 master-0 kubenswrapper[23041]: I0308 00:53:32.872153 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk8xq\" (UniqueName: \"kubernetes.io/projected/a94dba9c-1e25-42ed-b30a-d278979d1de9-kube-api-access-dk8xq\") pod \"ironic-neutron-agent-7dffdc6989-dw4bq\" (UID: \"a94dba9c-1e25-42ed-b30a-d278979d1de9\") " pod="openstack/ironic-neutron-agent-7dffdc6989-dw4bq" Mar 08 00:53:32.876404 master-0 kubenswrapper[23041]: I0308 00:53:32.875824 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-neutron-agent-7dffdc6989-dw4bq"] Mar 08 00:53:32.900012 master-0 kubenswrapper[23041]: I0308 00:53:32.899539 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-c3c2-account-create-update-w6k86"] Mar 08 00:53:32.900012 master-0 kubenswrapper[23041]: I0308 00:53:32.899594 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78fdb4cf6c-nxlpt"] Mar 08 00:53:32.900334 master-0 kubenswrapper[23041]: I0308 00:53:32.900261 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-78fdb4cf6c-nxlpt" podUID="0e016b3b-b18e-428e-8dd9-88c2e106d04e" containerName="dnsmasq-dns" containerID="cri-o://85068cdc6c66ca2ca86b0f23a6c863036e56acd0114d0e99e9172213bbc2eb52" gracePeriod=10 Mar 08 00:53:32.907480 master-0 kubenswrapper[23041]: I0308 00:53:32.907410 23041 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78fdb4cf6c-nxlpt" Mar 08 00:53:32.952229 master-0 kubenswrapper[23041]: I0308 00:53:32.941439 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-657ddbd5bb-fdfgw"] Mar 08 00:53:32.976690 master-0 kubenswrapper[23041]: I0308 00:53:32.968509 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-657ddbd5bb-fdfgw" Mar 08 00:53:32.976690 master-0 kubenswrapper[23041]: I0308 00:53:32.974931 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c957f6fb-9546-4811-9246-6a1bfa49492e-operator-scripts\") pod \"ironic-inspector-c3c2-account-create-update-w6k86\" (UID: \"c957f6fb-9546-4811-9246-6a1bfa49492e\") " pod="openstack/ironic-inspector-c3c2-account-create-update-w6k86" Mar 08 00:53:32.976690 master-0 kubenswrapper[23041]: I0308 00:53:32.975003 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a94dba9c-1e25-42ed-b30a-d278979d1de9-combined-ca-bundle\") pod \"ironic-neutron-agent-7dffdc6989-dw4bq\" (UID: \"a94dba9c-1e25-42ed-b30a-d278979d1de9\") " pod="openstack/ironic-neutron-agent-7dffdc6989-dw4bq" Mar 08 00:53:32.976690 master-0 kubenswrapper[23041]: I0308 00:53:32.975051 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm2xp\" (UniqueName: \"kubernetes.io/projected/c957f6fb-9546-4811-9246-6a1bfa49492e-kube-api-access-jm2xp\") pod \"ironic-inspector-c3c2-account-create-update-w6k86\" (UID: \"c957f6fb-9546-4811-9246-6a1bfa49492e\") " pod="openstack/ironic-inspector-c3c2-account-create-update-w6k86" Mar 08 00:53:32.976690 master-0 kubenswrapper[23041]: I0308 00:53:32.975158 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/secret/a94dba9c-1e25-42ed-b30a-d278979d1de9-config\") pod \"ironic-neutron-agent-7dffdc6989-dw4bq\" (UID: \"a94dba9c-1e25-42ed-b30a-d278979d1de9\") " pod="openstack/ironic-neutron-agent-7dffdc6989-dw4bq" Mar 08 00:53:32.976690 master-0 kubenswrapper[23041]: I0308 00:53:32.975180 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk8xq\" (UniqueName: \"kubernetes.io/projected/a94dba9c-1e25-42ed-b30a-d278979d1de9-kube-api-access-dk8xq\") pod \"ironic-neutron-agent-7dffdc6989-dw4bq\" (UID: \"a94dba9c-1e25-42ed-b30a-d278979d1de9\") " pod="openstack/ironic-neutron-agent-7dffdc6989-dw4bq" Mar 08 00:53:32.982142 master-0 kubenswrapper[23041]: I0308 00:53:32.981005 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a94dba9c-1e25-42ed-b30a-d278979d1de9-combined-ca-bundle\") pod \"ironic-neutron-agent-7dffdc6989-dw4bq\" (UID: \"a94dba9c-1e25-42ed-b30a-d278979d1de9\") " pod="openstack/ironic-neutron-agent-7dffdc6989-dw4bq" Mar 08 00:53:32.986868 master-0 kubenswrapper[23041]: I0308 00:53:32.986775 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a94dba9c-1e25-42ed-b30a-d278979d1de9-config\") pod \"ironic-neutron-agent-7dffdc6989-dw4bq\" (UID: \"a94dba9c-1e25-42ed-b30a-d278979d1de9\") " pod="openstack/ironic-neutron-agent-7dffdc6989-dw4bq" Mar 08 00:53:32.987227 master-0 kubenswrapper[23041]: I0308 00:53:32.987190 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-transport-url-ironic-transport" Mar 08 00:53:32.987408 master-0 kubenswrapper[23041]: I0308 00:53:32.987388 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 08 00:53:32.987525 master-0 kubenswrapper[23041]: I0308 00:53:32.987507 23041 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"ironic-api-scripts" Mar 08 00:53:32.987687 master-0 kubenswrapper[23041]: I0308 00:53:32.987668 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-config-data" Mar 08 00:53:32.987887 master-0 kubenswrapper[23041]: I0308 00:53:32.987867 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-api-config-data" Mar 08 00:53:32.992398 master-0 kubenswrapper[23041]: I0308 00:53:32.992344 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-657ddbd5bb-fdfgw"] Mar 08 00:53:33.004461 master-0 kubenswrapper[23041]: I0308 00:53:33.004409 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bf78b7-cqc9l"] Mar 08 00:53:33.006486 master-0 kubenswrapper[23041]: I0308 00:53:33.006435 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bf78b7-cqc9l" Mar 08 00:53:33.042336 master-0 kubenswrapper[23041]: I0308 00:53:33.036762 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bf78b7-cqc9l"] Mar 08 00:53:33.077848 master-0 kubenswrapper[23041]: I0308 00:53:33.077791 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-config-data\") pod \"ironic-657ddbd5bb-fdfgw\" (UID: \"85f7cb75-9466-47eb-bd3a-da17df2b5c2a\") " pod="openstack/ironic-657ddbd5bb-fdfgw" Mar 08 00:53:33.077848 master-0 kubenswrapper[23041]: I0308 00:53:33.077849 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-logs\") pod \"ironic-657ddbd5bb-fdfgw\" (UID: \"85f7cb75-9466-47eb-bd3a-da17df2b5c2a\") " pod="openstack/ironic-657ddbd5bb-fdfgw" Mar 08 00:53:33.078083 master-0 kubenswrapper[23041]: I0308 00:53:33.077901 23041 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-combined-ca-bundle\") pod \"ironic-657ddbd5bb-fdfgw\" (UID: \"85f7cb75-9466-47eb-bd3a-da17df2b5c2a\") " pod="openstack/ironic-657ddbd5bb-fdfgw" Mar 08 00:53:33.078083 master-0 kubenswrapper[23041]: I0308 00:53:33.077932 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-config-data-merged\") pod \"ironic-657ddbd5bb-fdfgw\" (UID: \"85f7cb75-9466-47eb-bd3a-da17df2b5c2a\") " pod="openstack/ironic-657ddbd5bb-fdfgw" Mar 08 00:53:33.078083 master-0 kubenswrapper[23041]: I0308 00:53:33.077970 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c957f6fb-9546-4811-9246-6a1bfa49492e-operator-scripts\") pod \"ironic-inspector-c3c2-account-create-update-w6k86\" (UID: \"c957f6fb-9546-4811-9246-6a1bfa49492e\") " pod="openstack/ironic-inspector-c3c2-account-create-update-w6k86" Mar 08 00:53:33.078083 master-0 kubenswrapper[23041]: I0308 00:53:33.078050 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glqxz\" (UniqueName: \"kubernetes.io/projected/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-kube-api-access-glqxz\") pod \"ironic-657ddbd5bb-fdfgw\" (UID: \"85f7cb75-9466-47eb-bd3a-da17df2b5c2a\") " pod="openstack/ironic-657ddbd5bb-fdfgw" Mar 08 00:53:33.078083 master-0 kubenswrapper[23041]: I0308 00:53:33.078077 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jm2xp\" (UniqueName: \"kubernetes.io/projected/c957f6fb-9546-4811-9246-6a1bfa49492e-kube-api-access-jm2xp\") pod \"ironic-inspector-c3c2-account-create-update-w6k86\" (UID: 
\"c957f6fb-9546-4811-9246-6a1bfa49492e\") " pod="openstack/ironic-inspector-c3c2-account-create-update-w6k86" Mar 08 00:53:33.078322 master-0 kubenswrapper[23041]: I0308 00:53:33.078108 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-config-data-custom\") pod \"ironic-657ddbd5bb-fdfgw\" (UID: \"85f7cb75-9466-47eb-bd3a-da17df2b5c2a\") " pod="openstack/ironic-657ddbd5bb-fdfgw" Mar 08 00:53:33.078322 master-0 kubenswrapper[23041]: I0308 00:53:33.078176 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-scripts\") pod \"ironic-657ddbd5bb-fdfgw\" (UID: \"85f7cb75-9466-47eb-bd3a-da17df2b5c2a\") " pod="openstack/ironic-657ddbd5bb-fdfgw" Mar 08 00:53:33.078322 master-0 kubenswrapper[23041]: I0308 00:53:33.078258 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-etc-podinfo\") pod \"ironic-657ddbd5bb-fdfgw\" (UID: \"85f7cb75-9466-47eb-bd3a-da17df2b5c2a\") " pod="openstack/ironic-657ddbd5bb-fdfgw" Mar 08 00:53:33.085317 master-0 kubenswrapper[23041]: I0308 00:53:33.081694 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c957f6fb-9546-4811-9246-6a1bfa49492e-operator-scripts\") pod \"ironic-inspector-c3c2-account-create-update-w6k86\" (UID: \"c957f6fb-9546-4811-9246-6a1bfa49492e\") " pod="openstack/ironic-inspector-c3c2-account-create-update-w6k86" Mar 08 00:53:33.135880 master-0 kubenswrapper[23041]: I0308 00:53:33.135825 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-675ba-api-0" 
event={"ID":"946a447d-964c-4693-8923-b712bcc9904c","Type":"ContainerStarted","Data":"6a72261080093faf7123c538f3adc537064cf09f704e23981dee91be99c346ea"} Mar 08 00:53:33.136808 master-0 kubenswrapper[23041]: I0308 00:53:33.136769 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-675ba-api-0" Mar 08 00:53:33.179695 master-0 kubenswrapper[23041]: I0308 00:53:33.179540 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-combined-ca-bundle\") pod \"ironic-657ddbd5bb-fdfgw\" (UID: \"85f7cb75-9466-47eb-bd3a-da17df2b5c2a\") " pod="openstack/ironic-657ddbd5bb-fdfgw" Mar 08 00:53:33.179695 master-0 kubenswrapper[23041]: I0308 00:53:33.179600 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-config-data-merged\") pod \"ironic-657ddbd5bb-fdfgw\" (UID: \"85f7cb75-9466-47eb-bd3a-da17df2b5c2a\") " pod="openstack/ironic-657ddbd5bb-fdfgw" Mar 08 00:53:33.179958 master-0 kubenswrapper[23041]: I0308 00:53:33.179767 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a83bcf0-62ae-4284-b870-14ba623be2e1-ovsdbserver-sb\") pod \"dnsmasq-dns-6bf78b7-cqc9l\" (UID: \"9a83bcf0-62ae-4284-b870-14ba623be2e1\") " pod="openstack/dnsmasq-dns-6bf78b7-cqc9l" Mar 08 00:53:33.179958 master-0 kubenswrapper[23041]: I0308 00:53:33.179815 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a83bcf0-62ae-4284-b870-14ba623be2e1-ovsdbserver-nb\") pod \"dnsmasq-dns-6bf78b7-cqc9l\" (UID: \"9a83bcf0-62ae-4284-b870-14ba623be2e1\") " pod="openstack/dnsmasq-dns-6bf78b7-cqc9l" Mar 08 00:53:33.179958 master-0 
kubenswrapper[23041]: I0308 00:53:33.179849 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glqxz\" (UniqueName: \"kubernetes.io/projected/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-kube-api-access-glqxz\") pod \"ironic-657ddbd5bb-fdfgw\" (UID: \"85f7cb75-9466-47eb-bd3a-da17df2b5c2a\") " pod="openstack/ironic-657ddbd5bb-fdfgw" Mar 08 00:53:33.179958 master-0 kubenswrapper[23041]: I0308 00:53:33.179877 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-config-data-custom\") pod \"ironic-657ddbd5bb-fdfgw\" (UID: \"85f7cb75-9466-47eb-bd3a-da17df2b5c2a\") " pod="openstack/ironic-657ddbd5bb-fdfgw" Mar 08 00:53:33.179958 master-0 kubenswrapper[23041]: I0308 00:53:33.179911 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a83bcf0-62ae-4284-b870-14ba623be2e1-config\") pod \"dnsmasq-dns-6bf78b7-cqc9l\" (UID: \"9a83bcf0-62ae-4284-b870-14ba623be2e1\") " pod="openstack/dnsmasq-dns-6bf78b7-cqc9l" Mar 08 00:53:33.179958 master-0 kubenswrapper[23041]: I0308 00:53:33.179935 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a83bcf0-62ae-4284-b870-14ba623be2e1-dns-swift-storage-0\") pod \"dnsmasq-dns-6bf78b7-cqc9l\" (UID: \"9a83bcf0-62ae-4284-b870-14ba623be2e1\") " pod="openstack/dnsmasq-dns-6bf78b7-cqc9l" Mar 08 00:53:33.179958 master-0 kubenswrapper[23041]: I0308 00:53:33.179961 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-scripts\") pod \"ironic-657ddbd5bb-fdfgw\" (UID: \"85f7cb75-9466-47eb-bd3a-da17df2b5c2a\") " pod="openstack/ironic-657ddbd5bb-fdfgw" Mar 08 00:53:33.180230 
master-0 kubenswrapper[23041]: I0308 00:53:33.179989 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glrwh\" (UniqueName: \"kubernetes.io/projected/9a83bcf0-62ae-4284-b870-14ba623be2e1-kube-api-access-glrwh\") pod \"dnsmasq-dns-6bf78b7-cqc9l\" (UID: \"9a83bcf0-62ae-4284-b870-14ba623be2e1\") " pod="openstack/dnsmasq-dns-6bf78b7-cqc9l" Mar 08 00:53:33.180230 master-0 kubenswrapper[23041]: I0308 00:53:33.180007 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a83bcf0-62ae-4284-b870-14ba623be2e1-dns-svc\") pod \"dnsmasq-dns-6bf78b7-cqc9l\" (UID: \"9a83bcf0-62ae-4284-b870-14ba623be2e1\") " pod="openstack/dnsmasq-dns-6bf78b7-cqc9l" Mar 08 00:53:33.180230 master-0 kubenswrapper[23041]: I0308 00:53:33.180035 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-etc-podinfo\") pod \"ironic-657ddbd5bb-fdfgw\" (UID: \"85f7cb75-9466-47eb-bd3a-da17df2b5c2a\") " pod="openstack/ironic-657ddbd5bb-fdfgw" Mar 08 00:53:33.180326 master-0 kubenswrapper[23041]: I0308 00:53:33.180280 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-config-data-merged\") pod \"ironic-657ddbd5bb-fdfgw\" (UID: \"85f7cb75-9466-47eb-bd3a-da17df2b5c2a\") " pod="openstack/ironic-657ddbd5bb-fdfgw" Mar 08 00:53:33.180894 master-0 kubenswrapper[23041]: I0308 00:53:33.180862 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-config-data\") pod \"ironic-657ddbd5bb-fdfgw\" (UID: \"85f7cb75-9466-47eb-bd3a-da17df2b5c2a\") " pod="openstack/ironic-657ddbd5bb-fdfgw" Mar 08 00:53:33.180970 
master-0 kubenswrapper[23041]: I0308 00:53:33.180903 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-logs\") pod \"ironic-657ddbd5bb-fdfgw\" (UID: \"85f7cb75-9466-47eb-bd3a-da17df2b5c2a\") " pod="openstack/ironic-657ddbd5bb-fdfgw" Mar 08 00:53:33.181518 master-0 kubenswrapper[23041]: I0308 00:53:33.181167 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-logs\") pod \"ironic-657ddbd5bb-fdfgw\" (UID: \"85f7cb75-9466-47eb-bd3a-da17df2b5c2a\") " pod="openstack/ironic-657ddbd5bb-fdfgw" Mar 08 00:53:33.183112 master-0 kubenswrapper[23041]: I0308 00:53:33.183069 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-combined-ca-bundle\") pod \"ironic-657ddbd5bb-fdfgw\" (UID: \"85f7cb75-9466-47eb-bd3a-da17df2b5c2a\") " pod="openstack/ironic-657ddbd5bb-fdfgw" Mar 08 00:53:33.184040 master-0 kubenswrapper[23041]: I0308 00:53:33.184011 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-scripts\") pod \"ironic-657ddbd5bb-fdfgw\" (UID: \"85f7cb75-9466-47eb-bd3a-da17df2b5c2a\") " pod="openstack/ironic-657ddbd5bb-fdfgw" Mar 08 00:53:33.184849 master-0 kubenswrapper[23041]: I0308 00:53:33.184359 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-etc-podinfo\") pod \"ironic-657ddbd5bb-fdfgw\" (UID: \"85f7cb75-9466-47eb-bd3a-da17df2b5c2a\") " pod="openstack/ironic-657ddbd5bb-fdfgw" Mar 08 00:53:33.184849 master-0 kubenswrapper[23041]: I0308 00:53:33.184794 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-config-data\") pod \"ironic-657ddbd5bb-fdfgw\" (UID: \"85f7cb75-9466-47eb-bd3a-da17df2b5c2a\") " pod="openstack/ironic-657ddbd5bb-fdfgw"
Mar 08 00:53:33.185606 master-0 kubenswrapper[23041]: I0308 00:53:33.185575 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-config-data-custom\") pod \"ironic-657ddbd5bb-fdfgw\" (UID: \"85f7cb75-9466-47eb-bd3a-da17df2b5c2a\") " pod="openstack/ironic-657ddbd5bb-fdfgw"
Mar 08 00:53:33.283023 master-0 kubenswrapper[23041]: I0308 00:53:33.282899 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a83bcf0-62ae-4284-b870-14ba623be2e1-ovsdbserver-nb\") pod \"dnsmasq-dns-6bf78b7-cqc9l\" (UID: \"9a83bcf0-62ae-4284-b870-14ba623be2e1\") " pod="openstack/dnsmasq-dns-6bf78b7-cqc9l"
Mar 08 00:53:33.283324 master-0 kubenswrapper[23041]: I0308 00:53:33.283077 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a83bcf0-62ae-4284-b870-14ba623be2e1-config\") pod \"dnsmasq-dns-6bf78b7-cqc9l\" (UID: \"9a83bcf0-62ae-4284-b870-14ba623be2e1\") " pod="openstack/dnsmasq-dns-6bf78b7-cqc9l"
Mar 08 00:53:33.283324 master-0 kubenswrapper[23041]: I0308 00:53:33.283182 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a83bcf0-62ae-4284-b870-14ba623be2e1-dns-swift-storage-0\") pod \"dnsmasq-dns-6bf78b7-cqc9l\" (UID: \"9a83bcf0-62ae-4284-b870-14ba623be2e1\") " pod="openstack/dnsmasq-dns-6bf78b7-cqc9l"
Mar 08 00:53:33.283424 master-0 kubenswrapper[23041]: I0308 00:53:33.283361 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glrwh\" (UniqueName: \"kubernetes.io/projected/9a83bcf0-62ae-4284-b870-14ba623be2e1-kube-api-access-glrwh\") pod \"dnsmasq-dns-6bf78b7-cqc9l\" (UID: \"9a83bcf0-62ae-4284-b870-14ba623be2e1\") " pod="openstack/dnsmasq-dns-6bf78b7-cqc9l"
Mar 08 00:53:33.283424 master-0 kubenswrapper[23041]: I0308 00:53:33.283384 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a83bcf0-62ae-4284-b870-14ba623be2e1-dns-svc\") pod \"dnsmasq-dns-6bf78b7-cqc9l\" (UID: \"9a83bcf0-62ae-4284-b870-14ba623be2e1\") " pod="openstack/dnsmasq-dns-6bf78b7-cqc9l"
Mar 08 00:53:33.284755 master-0 kubenswrapper[23041]: I0308 00:53:33.283818 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a83bcf0-62ae-4284-b870-14ba623be2e1-ovsdbserver-sb\") pod \"dnsmasq-dns-6bf78b7-cqc9l\" (UID: \"9a83bcf0-62ae-4284-b870-14ba623be2e1\") " pod="openstack/dnsmasq-dns-6bf78b7-cqc9l"
Mar 08 00:53:33.284755 master-0 kubenswrapper[23041]: I0308 00:53:33.284700 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a83bcf0-62ae-4284-b870-14ba623be2e1-ovsdbserver-nb\") pod \"dnsmasq-dns-6bf78b7-cqc9l\" (UID: \"9a83bcf0-62ae-4284-b870-14ba623be2e1\") " pod="openstack/dnsmasq-dns-6bf78b7-cqc9l"
Mar 08 00:53:33.286364 master-0 kubenswrapper[23041]: I0308 00:53:33.286323 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a83bcf0-62ae-4284-b870-14ba623be2e1-dns-swift-storage-0\") pod \"dnsmasq-dns-6bf78b7-cqc9l\" (UID: \"9a83bcf0-62ae-4284-b870-14ba623be2e1\") " pod="openstack/dnsmasq-dns-6bf78b7-cqc9l"
Mar 08 00:53:33.287048 master-0 kubenswrapper[23041]: I0308 00:53:33.287016 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a83bcf0-62ae-4284-b870-14ba623be2e1-ovsdbserver-sb\") pod \"dnsmasq-dns-6bf78b7-cqc9l\" (UID: \"9a83bcf0-62ae-4284-b870-14ba623be2e1\") " pod="openstack/dnsmasq-dns-6bf78b7-cqc9l"
Mar 08 00:53:33.287111 master-0 kubenswrapper[23041]: I0308 00:53:33.287050 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a83bcf0-62ae-4284-b870-14ba623be2e1-dns-svc\") pod \"dnsmasq-dns-6bf78b7-cqc9l\" (UID: \"9a83bcf0-62ae-4284-b870-14ba623be2e1\") " pod="openstack/dnsmasq-dns-6bf78b7-cqc9l"
Mar 08 00:53:33.287111 master-0 kubenswrapper[23041]: I0308 00:53:33.287072 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a83bcf0-62ae-4284-b870-14ba623be2e1-config\") pod \"dnsmasq-dns-6bf78b7-cqc9l\" (UID: \"9a83bcf0-62ae-4284-b870-14ba623be2e1\") " pod="openstack/dnsmasq-dns-6bf78b7-cqc9l"
Mar 08 00:53:33.363487 master-0 kubenswrapper[23041]: I0308 00:53:33.358054 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glrwh\" (UniqueName: \"kubernetes.io/projected/9a83bcf0-62ae-4284-b870-14ba623be2e1-kube-api-access-glrwh\") pod \"dnsmasq-dns-6bf78b7-cqc9l\" (UID: \"9a83bcf0-62ae-4284-b870-14ba623be2e1\") " pod="openstack/dnsmasq-dns-6bf78b7-cqc9l"
Mar 08 00:53:33.369302 master-0 kubenswrapper[23041]: I0308 00:53:33.369001 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glqxz\" (UniqueName: \"kubernetes.io/projected/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-kube-api-access-glqxz\") pod \"ironic-657ddbd5bb-fdfgw\" (UID: \"85f7cb75-9466-47eb-bd3a-da17df2b5c2a\") " pod="openstack/ironic-657ddbd5bb-fdfgw"
Mar 08 00:53:33.375367 master-0 kubenswrapper[23041]: I0308 00:53:33.371841 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk8xq\" (UniqueName: \"kubernetes.io/projected/a94dba9c-1e25-42ed-b30a-d278979d1de9-kube-api-access-dk8xq\") pod \"ironic-neutron-agent-7dffdc6989-dw4bq\" (UID: \"a94dba9c-1e25-42ed-b30a-d278979d1de9\") " pod="openstack/ironic-neutron-agent-7dffdc6989-dw4bq"
Mar 08 00:53:33.378017 master-0 kubenswrapper[23041]: I0308 00:53:33.377971 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm2xp\" (UniqueName: \"kubernetes.io/projected/c957f6fb-9546-4811-9246-6a1bfa49492e-kube-api-access-jm2xp\") pod \"ironic-inspector-c3c2-account-create-update-w6k86\" (UID: \"c957f6fb-9546-4811-9246-6a1bfa49492e\") " pod="openstack/ironic-inspector-c3c2-account-create-update-w6k86"
Mar 08 00:53:33.409722 master-0 kubenswrapper[23041]: I0308 00:53:33.404678 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-675ba-api-0" podStartSLOduration=3.404657809 podStartE2EDuration="3.404657809s" podCreationTimestamp="2026-03-08 00:53:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:53:33.397973896 +0000 UTC m=+1318.870810450" watchObservedRunningTime="2026-03-08 00:53:33.404657809 +0000 UTC m=+1318.877494363"
Mar 08 00:53:33.447304 master-0 kubenswrapper[23041]: I0308 00:53:33.442231 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-657ddbd5bb-fdfgw"
Mar 08 00:53:33.456621 master-0 kubenswrapper[23041]: I0308 00:53:33.455579 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bf78b7-cqc9l"
Mar 08 00:53:33.514167 master-0 kubenswrapper[23041]: I0308 00:53:33.506785 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-c3c2-account-create-update-w6k86"
Mar 08 00:53:33.685652 master-0 kubenswrapper[23041]: I0308 00:53:33.668873 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-neutron-agent-7dffdc6989-dw4bq"
Mar 08 00:53:33.879260 master-0 kubenswrapper[23041]: I0308 00:53:33.875274 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-create-hzm2x"]
Mar 08 00:53:34.291365 master-0 kubenswrapper[23041]: I0308 00:53:34.285536 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-create-hzm2x" event={"ID":"5944b058-fd8b-419b-ba55-b61f85254dec","Type":"ContainerStarted","Data":"8482eaf6946769d9181392138942ce81af3fd9032d5d457e55bafb75c41a0d17"}
Mar 08 00:53:34.311690 master-0 kubenswrapper[23041]: I0308 00:53:34.306627 23041 generic.go:334] "Generic (PLEG): container finished" podID="0e016b3b-b18e-428e-8dd9-88c2e106d04e" containerID="85068cdc6c66ca2ca86b0f23a6c863036e56acd0114d0e99e9172213bbc2eb52" exitCode=0
Mar 08 00:53:34.311690 master-0 kubenswrapper[23041]: I0308 00:53:34.306699 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78fdb4cf6c-nxlpt" event={"ID":"0e016b3b-b18e-428e-8dd9-88c2e106d04e","Type":"ContainerDied","Data":"85068cdc6c66ca2ca86b0f23a6c863036e56acd0114d0e99e9172213bbc2eb52"}
Mar 08 00:53:34.590724 master-0 kubenswrapper[23041]: I0308 00:53:34.589564 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-657ddbd5bb-fdfgw"]
Mar 08 00:53:34.760333 master-0 kubenswrapper[23041]: I0308 00:53:34.755001 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bf78b7-cqc9l"]
Mar 08 00:53:34.772052 master-0 kubenswrapper[23041]: I0308 00:53:34.772006 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-675ba-scheduler-0"
Mar 08 00:53:34.804391 master-0 kubenswrapper[23041]: I0308 00:53:34.803464 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78fdb4cf6c-nxlpt"
Mar 08 00:53:34.884084 master-0 kubenswrapper[23041]: I0308 00:53:34.884018 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0e016b3b-b18e-428e-8dd9-88c2e106d04e-dns-swift-storage-0\") pod \"0e016b3b-b18e-428e-8dd9-88c2e106d04e\" (UID: \"0e016b3b-b18e-428e-8dd9-88c2e106d04e\") "
Mar 08 00:53:34.884294 master-0 kubenswrapper[23041]: I0308 00:53:34.884132 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0e016b3b-b18e-428e-8dd9-88c2e106d04e-ovsdbserver-sb\") pod \"0e016b3b-b18e-428e-8dd9-88c2e106d04e\" (UID: \"0e016b3b-b18e-428e-8dd9-88c2e106d04e\") "
Mar 08 00:53:34.884294 master-0 kubenswrapper[23041]: I0308 00:53:34.884154 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4j6pm\" (UniqueName: \"kubernetes.io/projected/0e016b3b-b18e-428e-8dd9-88c2e106d04e-kube-api-access-4j6pm\") pod \"0e016b3b-b18e-428e-8dd9-88c2e106d04e\" (UID: \"0e016b3b-b18e-428e-8dd9-88c2e106d04e\") "
Mar 08 00:53:34.939698 master-0 kubenswrapper[23041]: I0308 00:53:34.939617 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e016b3b-b18e-428e-8dd9-88c2e106d04e-kube-api-access-4j6pm" (OuterVolumeSpecName: "kube-api-access-4j6pm") pod "0e016b3b-b18e-428e-8dd9-88c2e106d04e" (UID: "0e016b3b-b18e-428e-8dd9-88c2e106d04e"). InnerVolumeSpecName "kube-api-access-4j6pm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:53:34.946483 master-0 kubenswrapper[23041]: I0308 00:53:34.946394 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-c3c2-account-create-update-w6k86"]
Mar 08 00:53:34.946483 master-0 kubenswrapper[23041]: I0308 00:53:34.946435 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-675ba-scheduler-0"]
Mar 08 00:53:34.977427 master-0 kubenswrapper[23041]: I0308 00:53:34.977261 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e016b3b-b18e-428e-8dd9-88c2e106d04e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0e016b3b-b18e-428e-8dd9-88c2e106d04e" (UID: "0e016b3b-b18e-428e-8dd9-88c2e106d04e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:53:34.990991 master-0 kubenswrapper[23041]: I0308 00:53:34.989622 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0e016b3b-b18e-428e-8dd9-88c2e106d04e-ovsdbserver-nb\") pod \"0e016b3b-b18e-428e-8dd9-88c2e106d04e\" (UID: \"0e016b3b-b18e-428e-8dd9-88c2e106d04e\") "
Mar 08 00:53:34.990991 master-0 kubenswrapper[23041]: I0308 00:53:34.989733 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e016b3b-b18e-428e-8dd9-88c2e106d04e-config\") pod \"0e016b3b-b18e-428e-8dd9-88c2e106d04e\" (UID: \"0e016b3b-b18e-428e-8dd9-88c2e106d04e\") "
Mar 08 00:53:34.990991 master-0 kubenswrapper[23041]: I0308 00:53:34.989773 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0e016b3b-b18e-428e-8dd9-88c2e106d04e-dns-svc\") pod \"0e016b3b-b18e-428e-8dd9-88c2e106d04e\" (UID: \"0e016b3b-b18e-428e-8dd9-88c2e106d04e\") "
Mar 08 00:53:34.990991 master-0 kubenswrapper[23041]: I0308 00:53:34.990289 23041 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0e016b3b-b18e-428e-8dd9-88c2e106d04e-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:34.990991 master-0 kubenswrapper[23041]: I0308 00:53:34.990306 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4j6pm\" (UniqueName: \"kubernetes.io/projected/0e016b3b-b18e-428e-8dd9-88c2e106d04e-kube-api-access-4j6pm\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:35.091795 master-0 kubenswrapper[23041]: I0308 00:53:35.090796 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e016b3b-b18e-428e-8dd9-88c2e106d04e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0e016b3b-b18e-428e-8dd9-88c2e106d04e" (UID: "0e016b3b-b18e-428e-8dd9-88c2e106d04e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:53:35.091795 master-0 kubenswrapper[23041]: I0308 00:53:35.091457 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0e016b3b-b18e-428e-8dd9-88c2e106d04e-ovsdbserver-sb\") pod \"0e016b3b-b18e-428e-8dd9-88c2e106d04e\" (UID: \"0e016b3b-b18e-428e-8dd9-88c2e106d04e\") "
Mar 08 00:53:35.091795 master-0 kubenswrapper[23041]: W0308 00:53:35.091718 23041 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/0e016b3b-b18e-428e-8dd9-88c2e106d04e/volumes/kubernetes.io~configmap/ovsdbserver-sb
Mar 08 00:53:35.091795 master-0 kubenswrapper[23041]: I0308 00:53:35.091756 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e016b3b-b18e-428e-8dd9-88c2e106d04e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0e016b3b-b18e-428e-8dd9-88c2e106d04e" (UID: "0e016b3b-b18e-428e-8dd9-88c2e106d04e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:53:35.092457 master-0 kubenswrapper[23041]: I0308 00:53:35.092401 23041 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0e016b3b-b18e-428e-8dd9-88c2e106d04e-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:35.110089 master-0 kubenswrapper[23041]: I0308 00:53:35.110004 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-neutron-agent-7dffdc6989-dw4bq"]
Mar 08 00:53:35.113832 master-0 kubenswrapper[23041]: I0308 00:53:35.112787 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-675ba-volume-lvm-iscsi-0"
Mar 08 00:53:35.176911 master-0 kubenswrapper[23041]: I0308 00:53:35.176457 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-675ba-backup-0"
Mar 08 00:53:35.262111 master-0 kubenswrapper[23041]: I0308 00:53:35.260929 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-675ba-volume-lvm-iscsi-0"]
Mar 08 00:53:35.288602 master-0 kubenswrapper[23041]: I0308 00:53:35.288454 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e016b3b-b18e-428e-8dd9-88c2e106d04e-config" (OuterVolumeSpecName: "config") pod "0e016b3b-b18e-428e-8dd9-88c2e106d04e" (UID: "0e016b3b-b18e-428e-8dd9-88c2e106d04e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:53:35.312702 master-0 kubenswrapper[23041]: I0308 00:53:35.312622 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-675ba-backup-0"]
Mar 08 00:53:35.316656 master-0 kubenswrapper[23041]: I0308 00:53:35.316596 23041 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e016b3b-b18e-428e-8dd9-88c2e106d04e-config\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:35.343469 master-0 kubenswrapper[23041]: I0308 00:53:35.340663 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78fdb4cf6c-nxlpt" event={"ID":"0e016b3b-b18e-428e-8dd9-88c2e106d04e","Type":"ContainerDied","Data":"63af3d933e99bdf2cf37dbd29d00839fd89bb9b9d129d475d969b6992ce3307b"}
Mar 08 00:53:35.343469 master-0 kubenswrapper[23041]: I0308 00:53:35.340729 23041 scope.go:117] "RemoveContainer" containerID="85068cdc6c66ca2ca86b0f23a6c863036e56acd0114d0e99e9172213bbc2eb52"
Mar 08 00:53:35.343469 master-0 kubenswrapper[23041]: I0308 00:53:35.340859 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78fdb4cf6c-nxlpt"
Mar 08 00:53:35.362087 master-0 kubenswrapper[23041]: I0308 00:53:35.361350 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-657ddbd5bb-fdfgw" event={"ID":"85f7cb75-9466-47eb-bd3a-da17df2b5c2a","Type":"ContainerStarted","Data":"54701ef6952f63b967319cc4bc8a773790e8135f554f1520eea7fccbb70bdcf4"}
Mar 08 00:53:35.383437 master-0 kubenswrapper[23041]: I0308 00:53:35.382779 23041 generic.go:334] "Generic (PLEG): container finished" podID="5944b058-fd8b-419b-ba55-b61f85254dec" containerID="563a865423a1bd0c30be7208a748f13ef1b942095bffe87483ce3bde705bfbc0" exitCode=0
Mar 08 00:53:35.383437 master-0 kubenswrapper[23041]: I0308 00:53:35.382861 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-create-hzm2x" event={"ID":"5944b058-fd8b-419b-ba55-b61f85254dec","Type":"ContainerDied","Data":"563a865423a1bd0c30be7208a748f13ef1b942095bffe87483ce3bde705bfbc0"}
Mar 08 00:53:35.432056 master-0 kubenswrapper[23041]: I0308 00:53:35.431520 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-7dffdc6989-dw4bq" event={"ID":"a94dba9c-1e25-42ed-b30a-d278979d1de9","Type":"ContainerStarted","Data":"91510b54e27b310630ef3406f0f3c1acedd137907e22cac56d65797ad2050f69"}
Mar 08 00:53:35.442879 master-0 kubenswrapper[23041]: I0308 00:53:35.442588 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bf78b7-cqc9l" event={"ID":"9a83bcf0-62ae-4284-b870-14ba623be2e1","Type":"ContainerStarted","Data":"b733f536418a0c47d4085a9e572557c0b9c02c718e16ac29e9a233f891167c60"}
Mar 08 00:53:35.447012 master-0 kubenswrapper[23041]: I0308 00:53:35.446837 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-675ba-scheduler-0" podUID="a4aef1ca-0703-4433-84e6-a926cea94033" containerName="cinder-scheduler" containerID="cri-o://7f66ff667ebf8020e0e9fae1948e26561a4a3243fca8f100ab3bead266695768" gracePeriod=30
Mar 08 00:53:35.447012 master-0 kubenswrapper[23041]: I0308 00:53:35.446960 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-c3c2-account-create-update-w6k86" event={"ID":"c957f6fb-9546-4811-9246-6a1bfa49492e","Type":"ContainerStarted","Data":"b29bb134531228243433c1a1ee141edca7ad196a354b913a4a900166c1a6443c"}
Mar 08 00:53:35.447692 master-0 kubenswrapper[23041]: I0308 00:53:35.447664 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-675ba-volume-lvm-iscsi-0" podUID="4c3dcf6e-e826-483b-ae9c-465cb6d2d326" containerName="cinder-volume" containerID="cri-o://f2fc0cfd89ae39b165eaf29691bde0b893f56650ecd7d3f8a4de6208711a2dbd" gracePeriod=30
Mar 08 00:53:35.447951 master-0 kubenswrapper[23041]: I0308 00:53:35.447924 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-675ba-backup-0" podUID="854d6a39-df63-4aa0-85db-c8cd640dad73" containerName="cinder-backup" containerID="cri-o://b1c509d3ee7a59378ecd438172682e742ea0bd453f239e6d7b13d5d4718bf09c" gracePeriod=30
Mar 08 00:53:35.448409 master-0 kubenswrapper[23041]: I0308 00:53:35.448372 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-675ba-scheduler-0" podUID="a4aef1ca-0703-4433-84e6-a926cea94033" containerName="probe" containerID="cri-o://689728ade1555da9aa7ab0a54753eece523deee9ae9728509194f7b404cdf1d5" gracePeriod=30
Mar 08 00:53:35.448577 master-0 kubenswrapper[23041]: I0308 00:53:35.448554 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-675ba-volume-lvm-iscsi-0" podUID="4c3dcf6e-e826-483b-ae9c-465cb6d2d326" containerName="probe" containerID="cri-o://1cca5bbda645178e609a6794a74a552fb2326c7e85251edfc181a52f0fa34e4e" gracePeriod=30
Mar 08 00:53:35.448640 master-0 kubenswrapper[23041]: I0308 00:53:35.448620 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-675ba-backup-0" podUID="854d6a39-df63-4aa0-85db-c8cd640dad73" containerName="probe" containerID="cri-o://a63f3c746658e03a4e6a6cf0c07748b530f0c8e00b2e9e0de2a4023609da55b9" gracePeriod=30
Mar 08 00:53:35.528658 master-0 kubenswrapper[23041]: I0308 00:53:35.528508 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e016b3b-b18e-428e-8dd9-88c2e106d04e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0e016b3b-b18e-428e-8dd9-88c2e106d04e" (UID: "0e016b3b-b18e-428e-8dd9-88c2e106d04e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:53:35.541104 master-0 kubenswrapper[23041]: I0308 00:53:35.541045 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e016b3b-b18e-428e-8dd9-88c2e106d04e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0e016b3b-b18e-428e-8dd9-88c2e106d04e" (UID: "0e016b3b-b18e-428e-8dd9-88c2e106d04e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:53:35.543099 master-0 kubenswrapper[23041]: I0308 00:53:35.543044 23041 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0e016b3b-b18e-428e-8dd9-88c2e106d04e-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:35.649281 master-0 kubenswrapper[23041]: I0308 00:53:35.645872 23041 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0e016b3b-b18e-428e-8dd9-88c2e106d04e-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:35.743330 master-0 kubenswrapper[23041]: I0308 00:53:35.743283 23041 scope.go:117] "RemoveContainer" containerID="32d865f83725fbb6bb86890318c0103679277255b7de0374f00442dd156a0cbf"
Mar 08 00:53:35.778303 master-0 kubenswrapper[23041]: I0308 00:53:35.774480 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78fdb4cf6c-nxlpt"]
Mar 08 00:53:35.798716 master-0 kubenswrapper[23041]: I0308 00:53:35.796844 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78fdb4cf6c-nxlpt"]
Mar 08 00:53:36.492225 master-0 kubenswrapper[23041]: I0308 00:53:36.490427 23041 generic.go:334] "Generic (PLEG): container finished" podID="a4aef1ca-0703-4433-84e6-a926cea94033" containerID="689728ade1555da9aa7ab0a54753eece523deee9ae9728509194f7b404cdf1d5" exitCode=0
Mar 08 00:53:36.492225 master-0 kubenswrapper[23041]: I0308 00:53:36.490466 23041 generic.go:334] "Generic (PLEG): container finished" podID="a4aef1ca-0703-4433-84e6-a926cea94033" containerID="7f66ff667ebf8020e0e9fae1948e26561a4a3243fca8f100ab3bead266695768" exitCode=0
Mar 08 00:53:36.492225 master-0 kubenswrapper[23041]: I0308 00:53:36.490505 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-675ba-scheduler-0" event={"ID":"a4aef1ca-0703-4433-84e6-a926cea94033","Type":"ContainerDied","Data":"689728ade1555da9aa7ab0a54753eece523deee9ae9728509194f7b404cdf1d5"}
Mar 08 00:53:36.492225 master-0 kubenswrapper[23041]: I0308 00:53:36.490530 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-675ba-scheduler-0" event={"ID":"a4aef1ca-0703-4433-84e6-a926cea94033","Type":"ContainerDied","Data":"7f66ff667ebf8020e0e9fae1948e26561a4a3243fca8f100ab3bead266695768"}
Mar 08 00:53:36.496224 master-0 kubenswrapper[23041]: I0308 00:53:36.493818 23041 generic.go:334] "Generic (PLEG): container finished" podID="4c3dcf6e-e826-483b-ae9c-465cb6d2d326" containerID="1cca5bbda645178e609a6794a74a552fb2326c7e85251edfc181a52f0fa34e4e" exitCode=0
Mar 08 00:53:36.496224 master-0 kubenswrapper[23041]: I0308 00:53:36.493841 23041 generic.go:334] "Generic (PLEG): container finished" podID="4c3dcf6e-e826-483b-ae9c-465cb6d2d326" containerID="f2fc0cfd89ae39b165eaf29691bde0b893f56650ecd7d3f8a4de6208711a2dbd" exitCode=0
Mar 08 00:53:36.496224 master-0 kubenswrapper[23041]: I0308 00:53:36.493874 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-675ba-volume-lvm-iscsi-0" event={"ID":"4c3dcf6e-e826-483b-ae9c-465cb6d2d326","Type":"ContainerDied","Data":"1cca5bbda645178e609a6794a74a552fb2326c7e85251edfc181a52f0fa34e4e"}
Mar 08 00:53:36.496224 master-0 kubenswrapper[23041]: I0308 00:53:36.493893 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-675ba-volume-lvm-iscsi-0" event={"ID":"4c3dcf6e-e826-483b-ae9c-465cb6d2d326","Type":"ContainerDied","Data":"f2fc0cfd89ae39b165eaf29691bde0b893f56650ecd7d3f8a4de6208711a2dbd"}
Mar 08 00:53:36.496224 master-0 kubenswrapper[23041]: I0308 00:53:36.495431 23041 generic.go:334] "Generic (PLEG): container finished" podID="9a83bcf0-62ae-4284-b870-14ba623be2e1" containerID="2908f761aa1e9d490c9c35d272e8c239ec37bf7e2bd973e5b3d3a3ba3e72c66d" exitCode=0
Mar 08 00:53:36.496224 master-0 kubenswrapper[23041]: I0308 00:53:36.495465 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bf78b7-cqc9l" event={"ID":"9a83bcf0-62ae-4284-b870-14ba623be2e1","Type":"ContainerDied","Data":"2908f761aa1e9d490c9c35d272e8c239ec37bf7e2bd973e5b3d3a3ba3e72c66d"}
Mar 08 00:53:36.503111 master-0 kubenswrapper[23041]: I0308 00:53:36.502317 23041 generic.go:334] "Generic (PLEG): container finished" podID="854d6a39-df63-4aa0-85db-c8cd640dad73" containerID="a63f3c746658e03a4e6a6cf0c07748b530f0c8e00b2e9e0de2a4023609da55b9" exitCode=0
Mar 08 00:53:36.503111 master-0 kubenswrapper[23041]: I0308 00:53:36.502378 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-675ba-backup-0" event={"ID":"854d6a39-df63-4aa0-85db-c8cd640dad73","Type":"ContainerDied","Data":"a63f3c746658e03a4e6a6cf0c07748b530f0c8e00b2e9e0de2a4023609da55b9"}
Mar 08 00:53:36.507282 master-0 kubenswrapper[23041]: I0308 00:53:36.504773 23041 generic.go:334] "Generic (PLEG): container finished" podID="c957f6fb-9546-4811-9246-6a1bfa49492e" containerID="80c7b46207dd5e70c18cbc6c2185c2b01907c97e9f930c730a89d9561aa77a89" exitCode=0
Mar 08 00:53:36.507282 master-0 kubenswrapper[23041]: I0308 00:53:36.504821 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-c3c2-account-create-update-w6k86" event={"ID":"c957f6fb-9546-4811-9246-6a1bfa49492e","Type":"ContainerDied","Data":"80c7b46207dd5e70c18cbc6c2185c2b01907c97e9f930c730a89d9561aa77a89"}
Mar 08 00:53:36.729932 master-0 kubenswrapper[23041]: I0308 00:53:36.726687 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-conductor-0"]
Mar 08 00:53:36.729932 master-0 kubenswrapper[23041]: E0308 00:53:36.727224 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e016b3b-b18e-428e-8dd9-88c2e106d04e" containerName="init"
Mar 08 00:53:36.729932 master-0 kubenswrapper[23041]: I0308 00:53:36.727239 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e016b3b-b18e-428e-8dd9-88c2e106d04e" containerName="init"
Mar 08 00:53:36.729932 master-0 kubenswrapper[23041]: E0308 00:53:36.727256 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e016b3b-b18e-428e-8dd9-88c2e106d04e" containerName="dnsmasq-dns"
Mar 08 00:53:36.729932 master-0 kubenswrapper[23041]: I0308 00:53:36.727264 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e016b3b-b18e-428e-8dd9-88c2e106d04e" containerName="dnsmasq-dns"
Mar 08 00:53:36.729932 master-0 kubenswrapper[23041]: I0308 00:53:36.727484 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e016b3b-b18e-428e-8dd9-88c2e106d04e" containerName="dnsmasq-dns"
Mar 08 00:53:36.732436 master-0 kubenswrapper[23041]: I0308 00:53:36.730423 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-conductor-0"
Mar 08 00:53:36.733552 master-0 kubenswrapper[23041]: I0308 00:53:36.733495 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-conductor-scripts"
Mar 08 00:53:36.733868 master-0 kubenswrapper[23041]: I0308 00:53:36.733827 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-conductor-config-data"
Mar 08 00:53:36.790936 master-0 kubenswrapper[23041]: I0308 00:53:36.790808 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-conductor-0"]
Mar 08 00:53:36.886388 master-0 kubenswrapper[23041]: I0308 00:53:36.874347 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e016b3b-b18e-428e-8dd9-88c2e106d04e" path="/var/lib/kubelet/pods/0e016b3b-b18e-428e-8dd9-88c2e106d04e/volumes"
Mar 08 00:53:36.895685 master-0 kubenswrapper[23041]: I0308 00:53:36.895595 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-565c7fbf46-lqmmt"]
Mar 08 00:53:36.906037 master-0 kubenswrapper[23041]: I0308 00:53:36.902706 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-565c7fbf46-lqmmt"
Mar 08 00:53:36.906037 master-0 kubenswrapper[23041]: I0308 00:53:36.905368 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g4tc\" (UniqueName: \"kubernetes.io/projected/5fd31740-3478-41e5-8295-d4b50f40db04-kube-api-access-6g4tc\") pod \"ironic-conductor-0\" (UID: \"5fd31740-3478-41e5-8295-d4b50f40db04\") " pod="openstack/ironic-conductor-0"
Mar 08 00:53:36.906037 master-0 kubenswrapper[23041]: I0308 00:53:36.905437 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/5fd31740-3478-41e5-8295-d4b50f40db04-config-data-merged\") pod \"ironic-conductor-0\" (UID: \"5fd31740-3478-41e5-8295-d4b50f40db04\") " pod="openstack/ironic-conductor-0"
Mar 08 00:53:36.906037 master-0 kubenswrapper[23041]: I0308 00:53:36.905460 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8f10eb9b-d44c-4f28-b4e5-ca4c08dc4418\" (UniqueName: \"kubernetes.io/csi/topolvm.io^5967da22-8080-487b-87c8-62fbe8cc2711\") pod \"ironic-conductor-0\" (UID: \"5fd31740-3478-41e5-8295-d4b50f40db04\") " pod="openstack/ironic-conductor-0"
Mar 08 00:53:36.906037 master-0 kubenswrapper[23041]: I0308 00:53:36.905488 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5fd31740-3478-41e5-8295-d4b50f40db04-config-data-custom\") pod \"ironic-conductor-0\" (UID: \"5fd31740-3478-41e5-8295-d4b50f40db04\") " pod="openstack/ironic-conductor-0"
Mar 08 00:53:36.906037 master-0 kubenswrapper[23041]: I0308 00:53:36.905506 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fd31740-3478-41e5-8295-d4b50f40db04-combined-ca-bundle\") pod \"ironic-conductor-0\" (UID: \"5fd31740-3478-41e5-8295-d4b50f40db04\") " pod="openstack/ironic-conductor-0"
Mar 08 00:53:36.906037 master-0 kubenswrapper[23041]: I0308 00:53:36.905547 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/5fd31740-3478-41e5-8295-d4b50f40db04-etc-podinfo\") pod \"ironic-conductor-0\" (UID: \"5fd31740-3478-41e5-8295-d4b50f40db04\") " pod="openstack/ironic-conductor-0"
Mar 08 00:53:36.906037 master-0 kubenswrapper[23041]: I0308 00:53:36.905570 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fd31740-3478-41e5-8295-d4b50f40db04-scripts\") pod \"ironic-conductor-0\" (UID: \"5fd31740-3478-41e5-8295-d4b50f40db04\") " pod="openstack/ironic-conductor-0"
Mar 08 00:53:36.906037 master-0 kubenswrapper[23041]: I0308 00:53:36.905670 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fd31740-3478-41e5-8295-d4b50f40db04-config-data\") pod \"ironic-conductor-0\" (UID: \"5fd31740-3478-41e5-8295-d4b50f40db04\") " pod="openstack/ironic-conductor-0"
Mar 08 00:53:36.920437 master-0 kubenswrapper[23041]: I0308 00:53:36.916004 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-public-svc"
Mar 08 00:53:36.920437 master-0 kubenswrapper[23041]: I0308 00:53:36.916111 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-internal-svc"
Mar 08 00:53:36.937012 master-0 kubenswrapper[23041]: I0308 00:53:36.936260 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-565c7fbf46-lqmmt"]
Mar 08 00:53:37.010286 master-0 kubenswrapper[23041]: I0308 00:53:37.007500 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2898832-7b8c-416b-8a21-04c00f4b188d-logs\") pod \"ironic-565c7fbf46-lqmmt\" (UID: \"d2898832-7b8c-416b-8a21-04c00f4b188d\") " pod="openstack/ironic-565c7fbf46-lqmmt"
Mar 08 00:53:37.010286 master-0 kubenswrapper[23041]: I0308 00:53:37.007581 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2898832-7b8c-416b-8a21-04c00f4b188d-scripts\") pod \"ironic-565c7fbf46-lqmmt\" (UID: \"d2898832-7b8c-416b-8a21-04c00f4b188d\") " pod="openstack/ironic-565c7fbf46-lqmmt"
Mar 08 00:53:37.010286 master-0 kubenswrapper[23041]: I0308 00:53:37.007639 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d2898832-7b8c-416b-8a21-04c00f4b188d-config-data-merged\") pod \"ironic-565c7fbf46-lqmmt\" (UID: \"d2898832-7b8c-416b-8a21-04c00f4b188d\") " pod="openstack/ironic-565c7fbf46-lqmmt"
Mar 08 00:53:37.010286 master-0 kubenswrapper[23041]: I0308 00:53:37.007662 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2898832-7b8c-416b-8a21-04c00f4b188d-config-data\") pod \"ironic-565c7fbf46-lqmmt\" (UID: \"d2898832-7b8c-416b-8a21-04c00f4b188d\") " pod="openstack/ironic-565c7fbf46-lqmmt"
Mar 08 00:53:37.010286 master-0 kubenswrapper[23041]: I0308 00:53:37.007728 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fd31740-3478-41e5-8295-d4b50f40db04-config-data\") pod \"ironic-conductor-0\" (UID: \"5fd31740-3478-41e5-8295-d4b50f40db04\") " pod="openstack/ironic-conductor-0"
Mar 08 00:53:37.010286 master-0 kubenswrapper[23041]: I0308 00:53:37.007757 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2898832-7b8c-416b-8a21-04c00f4b188d-config-data-custom\") pod \"ironic-565c7fbf46-lqmmt\" (UID: \"d2898832-7b8c-416b-8a21-04c00f4b188d\") " pod="openstack/ironic-565c7fbf46-lqmmt"
Mar 08 00:53:37.010286 master-0 kubenswrapper[23041]: I0308 00:53:37.007785 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g4tc\" (UniqueName: \"kubernetes.io/projected/5fd31740-3478-41e5-8295-d4b50f40db04-kube-api-access-6g4tc\") pod \"ironic-conductor-0\" (UID: \"5fd31740-3478-41e5-8295-d4b50f40db04\") " pod="openstack/ironic-conductor-0"
Mar 08 00:53:37.010286 master-0 kubenswrapper[23041]: I0308 00:53:37.007806 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2898832-7b8c-416b-8a21-04c00f4b188d-internal-tls-certs\") pod \"ironic-565c7fbf46-lqmmt\" (UID: \"d2898832-7b8c-416b-8a21-04c00f4b188d\") " pod="openstack/ironic-565c7fbf46-lqmmt"
Mar 08 00:53:37.010286 master-0 kubenswrapper[23041]: I0308 00:53:37.007971 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/d2898832-7b8c-416b-8a21-04c00f4b188d-etc-podinfo\") pod \"ironic-565c7fbf46-lqmmt\" (UID: \"d2898832-7b8c-416b-8a21-04c00f4b188d\") " pod="openstack/ironic-565c7fbf46-lqmmt"
Mar 08 00:53:37.010286 master-0 kubenswrapper[23041]: I0308 00:53:37.008033 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2898832-7b8c-416b-8a21-04c00f4b188d-public-tls-certs\") pod \"ironic-565c7fbf46-lqmmt\" (UID: \"d2898832-7b8c-416b-8a21-04c00f4b188d\") " pod="openstack/ironic-565c7fbf46-lqmmt"
Mar 08 00:53:37.010286 master-0 kubenswrapper[23041]: I0308 00:53:37.008067 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/5fd31740-3478-41e5-8295-d4b50f40db04-config-data-merged\") pod \"ironic-conductor-0\" (UID: \"5fd31740-3478-41e5-8295-d4b50f40db04\") " pod="openstack/ironic-conductor-0"
Mar 08 00:53:37.010286 master-0 kubenswrapper[23041]: I0308 00:53:37.008091 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8f10eb9b-d44c-4f28-b4e5-ca4c08dc4418\" (UniqueName: \"kubernetes.io/csi/topolvm.io^5967da22-8080-487b-87c8-62fbe8cc2711\") pod \"ironic-conductor-0\" (UID: \"5fd31740-3478-41e5-8295-d4b50f40db04\") " pod="openstack/ironic-conductor-0"
Mar 08 00:53:37.010286 master-0 kubenswrapper[23041]: I0308 00:53:37.008112 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2898832-7b8c-416b-8a21-04c00f4b188d-combined-ca-bundle\") pod \"ironic-565c7fbf46-lqmmt\" (UID: \"d2898832-7b8c-416b-8a21-04c00f4b188d\") " pod="openstack/ironic-565c7fbf46-lqmmt"
Mar 08 00:53:37.010286 master-0 kubenswrapper[23041]: I0308 00:53:37.008137 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5fd31740-3478-41e5-8295-d4b50f40db04-config-data-custom\") pod \"ironic-conductor-0\" (UID: \"5fd31740-3478-41e5-8295-d4b50f40db04\") " pod="openstack/ironic-conductor-0"
Mar 08 00:53:37.010286 master-0 kubenswrapper[23041]: I0308 00:53:37.008155 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fd31740-3478-41e5-8295-d4b50f40db04-combined-ca-bundle\") pod \"ironic-conductor-0\" (UID: \"5fd31740-3478-41e5-8295-d4b50f40db04\") " pod="openstack/ironic-conductor-0"
Mar 08 00:53:37.010286 master-0
kubenswrapper[23041]: I0308 00:53:37.008177 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-955q7\" (UniqueName: \"kubernetes.io/projected/d2898832-7b8c-416b-8a21-04c00f4b188d-kube-api-access-955q7\") pod \"ironic-565c7fbf46-lqmmt\" (UID: \"d2898832-7b8c-416b-8a21-04c00f4b188d\") " pod="openstack/ironic-565c7fbf46-lqmmt" Mar 08 00:53:37.010286 master-0 kubenswrapper[23041]: I0308 00:53:37.008269 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/5fd31740-3478-41e5-8295-d4b50f40db04-etc-podinfo\") pod \"ironic-conductor-0\" (UID: \"5fd31740-3478-41e5-8295-d4b50f40db04\") " pod="openstack/ironic-conductor-0" Mar 08 00:53:37.010286 master-0 kubenswrapper[23041]: I0308 00:53:37.008292 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fd31740-3478-41e5-8295-d4b50f40db04-scripts\") pod \"ironic-conductor-0\" (UID: \"5fd31740-3478-41e5-8295-d4b50f40db04\") " pod="openstack/ironic-conductor-0" Mar 08 00:53:37.012244 master-0 kubenswrapper[23041]: I0308 00:53:37.012172 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/5fd31740-3478-41e5-8295-d4b50f40db04-config-data-merged\") pod \"ironic-conductor-0\" (UID: \"5fd31740-3478-41e5-8295-d4b50f40db04\") " pod="openstack/ironic-conductor-0" Mar 08 00:53:37.020135 master-0 kubenswrapper[23041]: I0308 00:53:37.020097 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/5fd31740-3478-41e5-8295-d4b50f40db04-etc-podinfo\") pod \"ironic-conductor-0\" (UID: \"5fd31740-3478-41e5-8295-d4b50f40db04\") " pod="openstack/ironic-conductor-0" Mar 08 00:53:37.020520 master-0 kubenswrapper[23041]: I0308 00:53:37.020487 23041 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5fd31740-3478-41e5-8295-d4b50f40db04-config-data-custom\") pod \"ironic-conductor-0\" (UID: \"5fd31740-3478-41e5-8295-d4b50f40db04\") " pod="openstack/ironic-conductor-0" Mar 08 00:53:37.021165 master-0 kubenswrapper[23041]: I0308 00:53:37.020963 23041 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 08 00:53:37.021165 master-0 kubenswrapper[23041]: I0308 00:53:37.020990 23041 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8f10eb9b-d44c-4f28-b4e5-ca4c08dc4418\" (UniqueName: \"kubernetes.io/csi/topolvm.io^5967da22-8080-487b-87c8-62fbe8cc2711\") pod \"ironic-conductor-0\" (UID: \"5fd31740-3478-41e5-8295-d4b50f40db04\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/a935c1ad8b16ba6e0d9ee1cc71ed89ae0fedfb904207a69229b486873f05c013/globalmount\"" pod="openstack/ironic-conductor-0" Mar 08 00:53:37.037735 master-0 kubenswrapper[23041]: I0308 00:53:37.037277 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g4tc\" (UniqueName: \"kubernetes.io/projected/5fd31740-3478-41e5-8295-d4b50f40db04-kube-api-access-6g4tc\") pod \"ironic-conductor-0\" (UID: \"5fd31740-3478-41e5-8295-d4b50f40db04\") " pod="openstack/ironic-conductor-0" Mar 08 00:53:37.037735 master-0 kubenswrapper[23041]: I0308 00:53:37.037489 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fd31740-3478-41e5-8295-d4b50f40db04-combined-ca-bundle\") pod \"ironic-conductor-0\" (UID: \"5fd31740-3478-41e5-8295-d4b50f40db04\") " pod="openstack/ironic-conductor-0" Mar 08 00:53:37.050783 master-0 kubenswrapper[23041]: I0308 00:53:37.050665 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/5fd31740-3478-41e5-8295-d4b50f40db04-scripts\") pod \"ironic-conductor-0\" (UID: \"5fd31740-3478-41e5-8295-d4b50f40db04\") " pod="openstack/ironic-conductor-0" Mar 08 00:53:37.065107 master-0 kubenswrapper[23041]: I0308 00:53:37.065055 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fd31740-3478-41e5-8295-d4b50f40db04-config-data\") pod \"ironic-conductor-0\" (UID: \"5fd31740-3478-41e5-8295-d4b50f40db04\") " pod="openstack/ironic-conductor-0" Mar 08 00:53:37.110772 master-0 kubenswrapper[23041]: I0308 00:53:37.110700 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2898832-7b8c-416b-8a21-04c00f4b188d-internal-tls-certs\") pod \"ironic-565c7fbf46-lqmmt\" (UID: \"d2898832-7b8c-416b-8a21-04c00f4b188d\") " pod="openstack/ironic-565c7fbf46-lqmmt" Mar 08 00:53:37.110772 master-0 kubenswrapper[23041]: I0308 00:53:37.110748 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/d2898832-7b8c-416b-8a21-04c00f4b188d-etc-podinfo\") pod \"ironic-565c7fbf46-lqmmt\" (UID: \"d2898832-7b8c-416b-8a21-04c00f4b188d\") " pod="openstack/ironic-565c7fbf46-lqmmt" Mar 08 00:53:37.110872 master-0 kubenswrapper[23041]: I0308 00:53:37.110792 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2898832-7b8c-416b-8a21-04c00f4b188d-public-tls-certs\") pod \"ironic-565c7fbf46-lqmmt\" (UID: \"d2898832-7b8c-416b-8a21-04c00f4b188d\") " pod="openstack/ironic-565c7fbf46-lqmmt" Mar 08 00:53:37.110872 master-0 kubenswrapper[23041]: I0308 00:53:37.110843 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d2898832-7b8c-416b-8a21-04c00f4b188d-combined-ca-bundle\") pod \"ironic-565c7fbf46-lqmmt\" (UID: \"d2898832-7b8c-416b-8a21-04c00f4b188d\") " pod="openstack/ironic-565c7fbf46-lqmmt" Mar 08 00:53:37.110975 master-0 kubenswrapper[23041]: I0308 00:53:37.110873 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-955q7\" (UniqueName: \"kubernetes.io/projected/d2898832-7b8c-416b-8a21-04c00f4b188d-kube-api-access-955q7\") pod \"ironic-565c7fbf46-lqmmt\" (UID: \"d2898832-7b8c-416b-8a21-04c00f4b188d\") " pod="openstack/ironic-565c7fbf46-lqmmt" Mar 08 00:53:37.110975 master-0 kubenswrapper[23041]: I0308 00:53:37.110947 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2898832-7b8c-416b-8a21-04c00f4b188d-logs\") pod \"ironic-565c7fbf46-lqmmt\" (UID: \"d2898832-7b8c-416b-8a21-04c00f4b188d\") " pod="openstack/ironic-565c7fbf46-lqmmt" Mar 08 00:53:37.111040 master-0 kubenswrapper[23041]: I0308 00:53:37.110976 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2898832-7b8c-416b-8a21-04c00f4b188d-scripts\") pod \"ironic-565c7fbf46-lqmmt\" (UID: \"d2898832-7b8c-416b-8a21-04c00f4b188d\") " pod="openstack/ironic-565c7fbf46-lqmmt" Mar 08 00:53:37.111040 master-0 kubenswrapper[23041]: I0308 00:53:37.111004 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d2898832-7b8c-416b-8a21-04c00f4b188d-config-data-merged\") pod \"ironic-565c7fbf46-lqmmt\" (UID: \"d2898832-7b8c-416b-8a21-04c00f4b188d\") " pod="openstack/ironic-565c7fbf46-lqmmt" Mar 08 00:53:37.112154 master-0 kubenswrapper[23041]: I0308 00:53:37.111311 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d2898832-7b8c-416b-8a21-04c00f4b188d-config-data\") pod \"ironic-565c7fbf46-lqmmt\" (UID: \"d2898832-7b8c-416b-8a21-04c00f4b188d\") " pod="openstack/ironic-565c7fbf46-lqmmt" Mar 08 00:53:37.112154 master-0 kubenswrapper[23041]: I0308 00:53:37.111394 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2898832-7b8c-416b-8a21-04c00f4b188d-config-data-custom\") pod \"ironic-565c7fbf46-lqmmt\" (UID: \"d2898832-7b8c-416b-8a21-04c00f4b188d\") " pod="openstack/ironic-565c7fbf46-lqmmt" Mar 08 00:53:37.112154 master-0 kubenswrapper[23041]: I0308 00:53:37.111637 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d2898832-7b8c-416b-8a21-04c00f4b188d-config-data-merged\") pod \"ironic-565c7fbf46-lqmmt\" (UID: \"d2898832-7b8c-416b-8a21-04c00f4b188d\") " pod="openstack/ironic-565c7fbf46-lqmmt" Mar 08 00:53:37.112154 master-0 kubenswrapper[23041]: I0308 00:53:37.112081 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2898832-7b8c-416b-8a21-04c00f4b188d-logs\") pod \"ironic-565c7fbf46-lqmmt\" (UID: \"d2898832-7b8c-416b-8a21-04c00f4b188d\") " pod="openstack/ironic-565c7fbf46-lqmmt" Mar 08 00:53:37.114445 master-0 kubenswrapper[23041]: I0308 00:53:37.114401 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2898832-7b8c-416b-8a21-04c00f4b188d-combined-ca-bundle\") pod \"ironic-565c7fbf46-lqmmt\" (UID: \"d2898832-7b8c-416b-8a21-04c00f4b188d\") " pod="openstack/ironic-565c7fbf46-lqmmt" Mar 08 00:53:37.117054 master-0 kubenswrapper[23041]: I0308 00:53:37.117019 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/d2898832-7b8c-416b-8a21-04c00f4b188d-config-data-custom\") pod \"ironic-565c7fbf46-lqmmt\" (UID: \"d2898832-7b8c-416b-8a21-04c00f4b188d\") " pod="openstack/ironic-565c7fbf46-lqmmt" Mar 08 00:53:37.124587 master-0 kubenswrapper[23041]: I0308 00:53:37.124515 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2898832-7b8c-416b-8a21-04c00f4b188d-public-tls-certs\") pod \"ironic-565c7fbf46-lqmmt\" (UID: \"d2898832-7b8c-416b-8a21-04c00f4b188d\") " pod="openstack/ironic-565c7fbf46-lqmmt" Mar 08 00:53:37.125358 master-0 kubenswrapper[23041]: I0308 00:53:37.125322 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2898832-7b8c-416b-8a21-04c00f4b188d-scripts\") pod \"ironic-565c7fbf46-lqmmt\" (UID: \"d2898832-7b8c-416b-8a21-04c00f4b188d\") " pod="openstack/ironic-565c7fbf46-lqmmt" Mar 08 00:53:37.129487 master-0 kubenswrapper[23041]: I0308 00:53:37.129443 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-955q7\" (UniqueName: \"kubernetes.io/projected/d2898832-7b8c-416b-8a21-04c00f4b188d-kube-api-access-955q7\") pod \"ironic-565c7fbf46-lqmmt\" (UID: \"d2898832-7b8c-416b-8a21-04c00f4b188d\") " pod="openstack/ironic-565c7fbf46-lqmmt" Mar 08 00:53:37.132968 master-0 kubenswrapper[23041]: I0308 00:53:37.132699 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/d2898832-7b8c-416b-8a21-04c00f4b188d-etc-podinfo\") pod \"ironic-565c7fbf46-lqmmt\" (UID: \"d2898832-7b8c-416b-8a21-04c00f4b188d\") " pod="openstack/ironic-565c7fbf46-lqmmt" Mar 08 00:53:37.132968 master-0 kubenswrapper[23041]: I0308 00:53:37.132906 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d2898832-7b8c-416b-8a21-04c00f4b188d-internal-tls-certs\") pod \"ironic-565c7fbf46-lqmmt\" (UID: \"d2898832-7b8c-416b-8a21-04c00f4b188d\") " pod="openstack/ironic-565c7fbf46-lqmmt" Mar 08 00:53:37.135542 master-0 kubenswrapper[23041]: I0308 00:53:37.135499 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2898832-7b8c-416b-8a21-04c00f4b188d-config-data\") pod \"ironic-565c7fbf46-lqmmt\" (UID: \"d2898832-7b8c-416b-8a21-04c00f4b188d\") " pod="openstack/ironic-565c7fbf46-lqmmt" Mar 08 00:53:37.292799 master-0 kubenswrapper[23041]: I0308 00:53:37.292290 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-565c7fbf46-lqmmt" Mar 08 00:53:37.520665 master-0 kubenswrapper[23041]: I0308 00:53:37.520608 23041 generic.go:334] "Generic (PLEG): container finished" podID="854d6a39-df63-4aa0-85db-c8cd640dad73" containerID="b1c509d3ee7a59378ecd438172682e742ea0bd453f239e6d7b13d5d4718bf09c" exitCode=0 Mar 08 00:53:37.521166 master-0 kubenswrapper[23041]: I0308 00:53:37.520844 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-675ba-backup-0" event={"ID":"854d6a39-df63-4aa0-85db-c8cd640dad73","Type":"ContainerDied","Data":"b1c509d3ee7a59378ecd438172682e742ea0bd453f239e6d7b13d5d4718bf09c"} Mar 08 00:53:38.029404 master-0 kubenswrapper[23041]: I0308 00:53:38.028863 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:38.040098 master-0 kubenswrapper[23041]: I0308 00:53:38.039974 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:38.067263 master-0 kubenswrapper[23041]: I0308 00:53:38.063904 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-db-create-hzm2x" Mar 08 00:53:38.172355 master-0 kubenswrapper[23041]: I0308 00:53:38.171109 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4aef1ca-0703-4433-84e6-a926cea94033-combined-ca-bundle\") pod \"a4aef1ca-0703-4433-84e6-a926cea94033\" (UID: \"a4aef1ca-0703-4433-84e6-a926cea94033\") " Mar 08 00:53:38.172355 master-0 kubenswrapper[23041]: I0308 00:53:38.171195 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-scripts\") pod \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " Mar 08 00:53:38.172355 master-0 kubenswrapper[23041]: I0308 00:53:38.171253 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-config-data-custom\") pod \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " Mar 08 00:53:38.172355 master-0 kubenswrapper[23041]: I0308 00:53:38.171312 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-sys\") pod \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " Mar 08 00:53:38.172355 master-0 kubenswrapper[23041]: I0308 00:53:38.171332 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-run\") pod \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " Mar 08 00:53:38.172355 master-0 kubenswrapper[23041]: I0308 00:53:38.171355 23041 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-var-lib-cinder\") pod \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " Mar 08 00:53:38.172355 master-0 kubenswrapper[23041]: I0308 00:53:38.171390 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a4aef1ca-0703-4433-84e6-a926cea94033-etc-machine-id\") pod \"a4aef1ca-0703-4433-84e6-a926cea94033\" (UID: \"a4aef1ca-0703-4433-84e6-a926cea94033\") " Mar 08 00:53:38.172355 master-0 kubenswrapper[23041]: I0308 00:53:38.171415 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhg2r\" (UniqueName: \"kubernetes.io/projected/a4aef1ca-0703-4433-84e6-a926cea94033-kube-api-access-jhg2r\") pod \"a4aef1ca-0703-4433-84e6-a926cea94033\" (UID: \"a4aef1ca-0703-4433-84e6-a926cea94033\") " Mar 08 00:53:38.172355 master-0 kubenswrapper[23041]: I0308 00:53:38.171443 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-etc-machine-id\") pod \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " Mar 08 00:53:38.172355 master-0 kubenswrapper[23041]: I0308 00:53:38.171475 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4aef1ca-0703-4433-84e6-a926cea94033-config-data\") pod \"a4aef1ca-0703-4433-84e6-a926cea94033\" (UID: \"a4aef1ca-0703-4433-84e6-a926cea94033\") " Mar 08 00:53:38.172355 master-0 kubenswrapper[23041]: I0308 00:53:38.171500 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-etc-nvme\") 
pod \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " Mar 08 00:53:38.172355 master-0 kubenswrapper[23041]: I0308 00:53:38.171582 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-dev\") pod \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " Mar 08 00:53:38.172355 master-0 kubenswrapper[23041]: I0308 00:53:38.171612 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-config-data\") pod \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " Mar 08 00:53:38.172355 master-0 kubenswrapper[23041]: I0308 00:53:38.171648 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-var-locks-brick\") pod \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " Mar 08 00:53:38.172355 master-0 kubenswrapper[23041]: I0308 00:53:38.171679 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h44kl\" (UniqueName: \"kubernetes.io/projected/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-kube-api-access-h44kl\") pod \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " Mar 08 00:53:38.172355 master-0 kubenswrapper[23041]: I0308 00:53:38.171684 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4aef1ca-0703-4433-84e6-a926cea94033-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a4aef1ca-0703-4433-84e6-a926cea94033" (UID: "a4aef1ca-0703-4433-84e6-a926cea94033"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:53:38.172355 master-0 kubenswrapper[23041]: I0308 00:53:38.172194 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-sys" (OuterVolumeSpecName: "sys") pod "4c3dcf6e-e826-483b-ae9c-465cb6d2d326" (UID: "4c3dcf6e-e826-483b-ae9c-465cb6d2d326"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:53:38.172355 master-0 kubenswrapper[23041]: I0308 00:53:38.172266 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-run" (OuterVolumeSpecName: "run") pod "4c3dcf6e-e826-483b-ae9c-465cb6d2d326" (UID: "4c3dcf6e-e826-483b-ae9c-465cb6d2d326"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:53:38.172355 master-0 kubenswrapper[23041]: I0308 00:53:38.172271 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "4c3dcf6e-e826-483b-ae9c-465cb6d2d326" (UID: "4c3dcf6e-e826-483b-ae9c-465cb6d2d326"). InnerVolumeSpecName "var-lib-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:53:38.172355 master-0 kubenswrapper[23041]: I0308 00:53:38.172300 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4c3dcf6e-e826-483b-ae9c-465cb6d2d326" (UID: "4c3dcf6e-e826-483b-ae9c-465cb6d2d326"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:53:38.173067 master-0 kubenswrapper[23041]: I0308 00:53:38.172381 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "4c3dcf6e-e826-483b-ae9c-465cb6d2d326" (UID: "4c3dcf6e-e826-483b-ae9c-465cb6d2d326"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:53:38.176395 master-0 kubenswrapper[23041]: I0308 00:53:38.174102 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5944b058-fd8b-419b-ba55-b61f85254dec-operator-scripts\") pod \"5944b058-fd8b-419b-ba55-b61f85254dec\" (UID: \"5944b058-fd8b-419b-ba55-b61f85254dec\") " Mar 08 00:53:38.176395 master-0 kubenswrapper[23041]: I0308 00:53:38.174184 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4aef1ca-0703-4433-84e6-a926cea94033-scripts\") pod \"a4aef1ca-0703-4433-84e6-a926cea94033\" (UID: \"a4aef1ca-0703-4433-84e6-a926cea94033\") " Mar 08 00:53:38.176395 master-0 kubenswrapper[23041]: I0308 00:53:38.174240 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-lib-modules\") pod \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " Mar 08 00:53:38.176395 master-0 kubenswrapper[23041]: I0308 00:53:38.174260 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-var-locks-cinder\") pod \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " Mar 08 00:53:38.176395 master-0 
kubenswrapper[23041]: I0308 00:53:38.174365 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-combined-ca-bundle\") pod \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " Mar 08 00:53:38.176395 master-0 kubenswrapper[23041]: I0308 00:53:38.174419 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-etc-iscsi\") pod \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\" (UID: \"4c3dcf6e-e826-483b-ae9c-465cb6d2d326\") " Mar 08 00:53:38.176395 master-0 kubenswrapper[23041]: I0308 00:53:38.174440 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4aef1ca-0703-4433-84e6-a926cea94033-config-data-custom\") pod \"a4aef1ca-0703-4433-84e6-a926cea94033\" (UID: \"a4aef1ca-0703-4433-84e6-a926cea94033\") " Mar 08 00:53:38.176395 master-0 kubenswrapper[23041]: I0308 00:53:38.174466 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dptq\" (UniqueName: \"kubernetes.io/projected/5944b058-fd8b-419b-ba55-b61f85254dec-kube-api-access-8dptq\") pod \"5944b058-fd8b-419b-ba55-b61f85254dec\" (UID: \"5944b058-fd8b-419b-ba55-b61f85254dec\") " Mar 08 00:53:38.176395 master-0 kubenswrapper[23041]: I0308 00:53:38.175351 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "4c3dcf6e-e826-483b-ae9c-465cb6d2d326" (UID: "4c3dcf6e-e826-483b-ae9c-465cb6d2d326"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:53:38.176395 master-0 kubenswrapper[23041]: I0308 00:53:38.175377 23041 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-sys\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:38.176395 master-0 kubenswrapper[23041]: I0308 00:53:38.175394 23041 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-run\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:38.176395 master-0 kubenswrapper[23041]: I0308 00:53:38.175405 23041 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-var-lib-cinder\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:38.176395 master-0 kubenswrapper[23041]: I0308 00:53:38.175420 23041 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a4aef1ca-0703-4433-84e6-a926cea94033-etc-machine-id\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:38.176395 master-0 kubenswrapper[23041]: I0308 00:53:38.175429 23041 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-etc-machine-id\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:38.176395 master-0 kubenswrapper[23041]: I0308 00:53:38.175438 23041 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-etc-nvme\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:38.176395 master-0 kubenswrapper[23041]: I0308 00:53:38.175406 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "4c3dcf6e-e826-483b-ae9c-465cb6d2d326" (UID: "4c3dcf6e-e826-483b-ae9c-465cb6d2d326"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:53:38.176395 master-0 kubenswrapper[23041]: I0308 00:53:38.175517 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "4c3dcf6e-e826-483b-ae9c-465cb6d2d326" (UID: "4c3dcf6e-e826-483b-ae9c-465cb6d2d326"). InnerVolumeSpecName "var-locks-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:53:38.176913 master-0 kubenswrapper[23041]: I0308 00:53:38.176642 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-dev" (OuterVolumeSpecName: "dev") pod "4c3dcf6e-e826-483b-ae9c-465cb6d2d326" (UID: "4c3dcf6e-e826-483b-ae9c-465cb6d2d326"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:53:38.177858 master-0 kubenswrapper[23041]: I0308 00:53:38.177382 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5944b058-fd8b-419b-ba55-b61f85254dec-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5944b058-fd8b-419b-ba55-b61f85254dec" (UID: "5944b058-fd8b-419b-ba55-b61f85254dec"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:53:38.177858 master-0 kubenswrapper[23041]: I0308 00:53:38.177451 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "4c3dcf6e-e826-483b-ae9c-465cb6d2d326" (UID: "4c3dcf6e-e826-483b-ae9c-465cb6d2d326"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:53:38.177858 master-0 kubenswrapper[23041]: I0308 00:53:38.177600 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-scripts" (OuterVolumeSpecName: "scripts") pod "4c3dcf6e-e826-483b-ae9c-465cb6d2d326" (UID: "4c3dcf6e-e826-483b-ae9c-465cb6d2d326"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:53:38.182898 master-0 kubenswrapper[23041]: I0308 00:53:38.178930 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4aef1ca-0703-4433-84e6-a926cea94033-kube-api-access-jhg2r" (OuterVolumeSpecName: "kube-api-access-jhg2r") pod "a4aef1ca-0703-4433-84e6-a926cea94033" (UID: "a4aef1ca-0703-4433-84e6-a926cea94033"). InnerVolumeSpecName "kube-api-access-jhg2r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:53:38.182898 master-0 kubenswrapper[23041]: I0308 00:53:38.181739 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4aef1ca-0703-4433-84e6-a926cea94033-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a4aef1ca-0703-4433-84e6-a926cea94033" (UID: "a4aef1ca-0703-4433-84e6-a926cea94033"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:53:38.193366 master-0 kubenswrapper[23041]: I0308 00:53:38.193311 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4c3dcf6e-e826-483b-ae9c-465cb6d2d326" (UID: "4c3dcf6e-e826-483b-ae9c-465cb6d2d326"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:53:38.193507 master-0 kubenswrapper[23041]: I0308 00:53:38.193413 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5944b058-fd8b-419b-ba55-b61f85254dec-kube-api-access-8dptq" (OuterVolumeSpecName: "kube-api-access-8dptq") pod "5944b058-fd8b-419b-ba55-b61f85254dec" (UID: "5944b058-fd8b-419b-ba55-b61f85254dec"). InnerVolumeSpecName "kube-api-access-8dptq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:53:38.195357 master-0 kubenswrapper[23041]: I0308 00:53:38.194410 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-kube-api-access-h44kl" (OuterVolumeSpecName: "kube-api-access-h44kl") pod "4c3dcf6e-e826-483b-ae9c-465cb6d2d326" (UID: "4c3dcf6e-e826-483b-ae9c-465cb6d2d326"). InnerVolumeSpecName "kube-api-access-h44kl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:53:38.195587 master-0 kubenswrapper[23041]: I0308 00:53:38.195563 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4aef1ca-0703-4433-84e6-a926cea94033-scripts" (OuterVolumeSpecName: "scripts") pod "a4aef1ca-0703-4433-84e6-a926cea94033" (UID: "a4aef1ca-0703-4433-84e6-a926cea94033"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:53:38.266772 master-0 kubenswrapper[23041]: I0308 00:53:38.266700 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4aef1ca-0703-4433-84e6-a926cea94033-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4aef1ca-0703-4433-84e6-a926cea94033" (UID: "a4aef1ca-0703-4433-84e6-a926cea94033"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:53:38.272283 master-0 kubenswrapper[23041]: I0308 00:53:38.272237 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4c3dcf6e-e826-483b-ae9c-465cb6d2d326" (UID: "4c3dcf6e-e826-483b-ae9c-465cb6d2d326"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:53:38.278017 master-0 kubenswrapper[23041]: I0308 00:53:38.277822 23041 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-dev\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:38.278017 master-0 kubenswrapper[23041]: I0308 00:53:38.277859 23041 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-var-locks-brick\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:38.278017 master-0 kubenswrapper[23041]: I0308 00:53:38.277872 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h44kl\" (UniqueName: \"kubernetes.io/projected/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-kube-api-access-h44kl\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:38.278017 master-0 kubenswrapper[23041]: I0308 00:53:38.277882 23041 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5944b058-fd8b-419b-ba55-b61f85254dec-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:38.278017 master-0 kubenswrapper[23041]: I0308 00:53:38.277891 23041 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4aef1ca-0703-4433-84e6-a926cea94033-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:38.278017 master-0 kubenswrapper[23041]: I0308 00:53:38.277901 23041 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-lib-modules\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:38.278017 master-0 kubenswrapper[23041]: I0308 00:53:38.277910 23041 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-var-locks-cinder\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:38.278017 master-0 kubenswrapper[23041]: I0308 00:53:38.277918 23041 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:38.278017 master-0 kubenswrapper[23041]: I0308 00:53:38.277926 23041 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-etc-iscsi\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:38.278017 master-0 kubenswrapper[23041]: I0308 00:53:38.277934 23041 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4aef1ca-0703-4433-84e6-a926cea94033-config-data-custom\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:38.278017 master-0 kubenswrapper[23041]: I0308 00:53:38.277942 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dptq\" (UniqueName: \"kubernetes.io/projected/5944b058-fd8b-419b-ba55-b61f85254dec-kube-api-access-8dptq\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:38.278017 master-0 kubenswrapper[23041]: I0308 00:53:38.277950 23041 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4aef1ca-0703-4433-84e6-a926cea94033-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:38.278017 master-0 kubenswrapper[23041]: I0308 00:53:38.277958 23041 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:38.278017 master-0 kubenswrapper[23041]: I0308 00:53:38.277966 23041 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-config-data-custom\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:38.278905 master-0 kubenswrapper[23041]: I0308 00:53:38.278880 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhg2r\" (UniqueName: \"kubernetes.io/projected/a4aef1ca-0703-4433-84e6-a926cea94033-kube-api-access-jhg2r\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:38.363808 master-0 kubenswrapper[23041]: I0308 00:53:38.363685 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4aef1ca-0703-4433-84e6-a926cea94033-config-data" (OuterVolumeSpecName: "config-data") pod "a4aef1ca-0703-4433-84e6-a926cea94033" (UID: "a4aef1ca-0703-4433-84e6-a926cea94033"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:53:38.380704 master-0 kubenswrapper[23041]: I0308 00:53:38.380654 23041 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4aef1ca-0703-4433-84e6-a926cea94033-config-data\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:38.397643 master-0 kubenswrapper[23041]: I0308 00:53:38.397566 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-config-data" (OuterVolumeSpecName: "config-data") pod "4c3dcf6e-e826-483b-ae9c-465cb6d2d326" (UID: "4c3dcf6e-e826-483b-ae9c-465cb6d2d326"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:53:38.412675 master-0 kubenswrapper[23041]: I0308 00:53:38.412628 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8f10eb9b-d44c-4f28-b4e5-ca4c08dc4418\" (UniqueName: \"kubernetes.io/csi/topolvm.io^5967da22-8080-487b-87c8-62fbe8cc2711\") pod \"ironic-conductor-0\" (UID: \"5fd31740-3478-41e5-8295-d4b50f40db04\") " pod="openstack/ironic-conductor-0"
Mar 08 00:53:38.483022 master-0 kubenswrapper[23041]: I0308 00:53:38.482853 23041 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c3dcf6e-e826-483b-ae9c-465cb6d2d326-config-data\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:38.522395 master-0 kubenswrapper[23041]: I0308 00:53:38.522319 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-675ba-backup-0"
Mar 08 00:53:38.543707 master-0 kubenswrapper[23041]: I0308 00:53:38.540561 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-create-hzm2x"
Mar 08 00:53:38.543707 master-0 kubenswrapper[23041]: I0308 00:53:38.540672 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-create-hzm2x" event={"ID":"5944b058-fd8b-419b-ba55-b61f85254dec","Type":"ContainerDied","Data":"8482eaf6946769d9181392138942ce81af3fd9032d5d457e55bafb75c41a0d17"}
Mar 08 00:53:38.543707 master-0 kubenswrapper[23041]: I0308 00:53:38.540753 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8482eaf6946769d9181392138942ce81af3fd9032d5d457e55bafb75c41a0d17"
Mar 08 00:53:38.552079 master-0 kubenswrapper[23041]: I0308 00:53:38.549921 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bf78b7-cqc9l" event={"ID":"9a83bcf0-62ae-4284-b870-14ba623be2e1","Type":"ContainerStarted","Data":"0d9413b60d51154485a6e003c7e9fbc007d394072de70e65f8c8a2c6a295ce84"}
Mar 08 00:53:38.552079 master-0 kubenswrapper[23041]: I0308 00:53:38.551592 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bf78b7-cqc9l"
Mar 08 00:53:38.561080 master-0 kubenswrapper[23041]: I0308 00:53:38.561049 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-conductor-0"
Mar 08 00:53:38.566158 master-0 kubenswrapper[23041]: I0308 00:53:38.566100 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-675ba-backup-0" event={"ID":"854d6a39-df63-4aa0-85db-c8cd640dad73","Type":"ContainerDied","Data":"68e35b56d047967e1039717426e363235c97b2d3e6a153bd2c625504ad520dac"}
Mar 08 00:53:38.566274 master-0 kubenswrapper[23041]: I0308 00:53:38.566174 23041 scope.go:117] "RemoveContainer" containerID="a63f3c746658e03a4e6a6cf0c07748b530f0c8e00b2e9e0de2a4023609da55b9"
Mar 08 00:53:38.567358 master-0 kubenswrapper[23041]: I0308 00:53:38.566141 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-675ba-backup-0"
Mar 08 00:53:38.571538 master-0 kubenswrapper[23041]: I0308 00:53:38.571485 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-675ba-scheduler-0" event={"ID":"a4aef1ca-0703-4433-84e6-a926cea94033","Type":"ContainerDied","Data":"1feb921500e02a0006b019bc147e55e9d606e278c5c4e42f15399606585660af"}
Mar 08 00:53:38.571702 master-0 kubenswrapper[23041]: I0308 00:53:38.571543 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-675ba-scheduler-0"
Mar 08 00:53:38.579264 master-0 kubenswrapper[23041]: I0308 00:53:38.579176 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-675ba-volume-lvm-iscsi-0" event={"ID":"4c3dcf6e-e826-483b-ae9c-465cb6d2d326","Type":"ContainerDied","Data":"5e97e5c6fe4d443233c9b82654b811d8075c5bdd4d4ea6d58c7a1097085facf6"}
Mar 08 00:53:38.579264 master-0 kubenswrapper[23041]: I0308 00:53:38.579244 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-675ba-volume-lvm-iscsi-0"
Mar 08 00:53:38.628051 master-0 kubenswrapper[23041]: I0308 00:53:38.610339 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-c3c2-account-create-update-w6k86"
Mar 08 00:53:38.671747 master-0 kubenswrapper[23041]: I0308 00:53:38.671669 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bf78b7-cqc9l" podStartSLOduration=6.671642494 podStartE2EDuration="6.671642494s" podCreationTimestamp="2026-03-08 00:53:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:53:38.655371597 +0000 UTC m=+1324.128208161" watchObservedRunningTime="2026-03-08 00:53:38.671642494 +0000 UTC m=+1324.144479048"
Mar 08 00:53:38.681612 master-0 kubenswrapper[23041]: I0308 00:53:38.672106 23041 scope.go:117] "RemoveContainer" containerID="b1c509d3ee7a59378ecd438172682e742ea0bd453f239e6d7b13d5d4718bf09c"
Mar 08 00:53:38.682902 master-0 kubenswrapper[23041]: I0308 00:53:38.682862 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Mar 08 00:53:38.686832 master-0 kubenswrapper[23041]: I0308 00:53:38.686515 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-transport-url-ironic-transport"
Mar 08 00:53:38.693253 master-0 kubenswrapper[23041]: I0308 00:53:38.687701 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-etc-machine-id\") pod \"854d6a39-df63-4aa0-85db-c8cd640dad73\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") "
Mar 08 00:53:38.693253 master-0 kubenswrapper[23041]: I0308 00:53:38.687790 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kdw7\" (UniqueName: \"kubernetes.io/projected/854d6a39-df63-4aa0-85db-c8cd640dad73-kube-api-access-6kdw7\") pod \"854d6a39-df63-4aa0-85db-c8cd640dad73\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") "
Mar 08 00:53:38.693253 master-0 kubenswrapper[23041]: I0308 00:53:38.687928 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-var-locks-cinder\") pod \"854d6a39-df63-4aa0-85db-c8cd640dad73\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") "
Mar 08 00:53:38.693253 master-0 kubenswrapper[23041]: I0308 00:53:38.688120 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "854d6a39-df63-4aa0-85db-c8cd640dad73" (UID: "854d6a39-df63-4aa0-85db-c8cd640dad73"). InnerVolumeSpecName "var-locks-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:53:38.693253 master-0 kubenswrapper[23041]: I0308 00:53:38.688169 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-var-lib-cinder\") pod \"854d6a39-df63-4aa0-85db-c8cd640dad73\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") "
Mar 08 00:53:38.693253 master-0 kubenswrapper[23041]: I0308 00:53:38.688236 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "854d6a39-df63-4aa0-85db-c8cd640dad73" (UID: "854d6a39-df63-4aa0-85db-c8cd640dad73"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:53:38.693253 master-0 kubenswrapper[23041]: I0308 00:53:38.688247 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jm2xp\" (UniqueName: \"kubernetes.io/projected/c957f6fb-9546-4811-9246-6a1bfa49492e-kube-api-access-jm2xp\") pod \"c957f6fb-9546-4811-9246-6a1bfa49492e\" (UID: \"c957f6fb-9546-4811-9246-6a1bfa49492e\") "
Mar 08 00:53:38.693253 master-0 kubenswrapper[23041]: I0308 00:53:38.688491 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/854d6a39-df63-4aa0-85db-c8cd640dad73-combined-ca-bundle\") pod \"854d6a39-df63-4aa0-85db-c8cd640dad73\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") "
Mar 08 00:53:38.693253 master-0 kubenswrapper[23041]: I0308 00:53:38.688586 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c957f6fb-9546-4811-9246-6a1bfa49492e-operator-scripts\") pod \"c957f6fb-9546-4811-9246-6a1bfa49492e\" (UID: \"c957f6fb-9546-4811-9246-6a1bfa49492e\") "
Mar 08 00:53:38.693253 master-0 kubenswrapper[23041]: I0308 00:53:38.688688 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-dev\") pod \"854d6a39-df63-4aa0-85db-c8cd640dad73\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") "
Mar 08 00:53:38.693253 master-0 kubenswrapper[23041]: I0308 00:53:38.688715 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-run\") pod \"854d6a39-df63-4aa0-85db-c8cd640dad73\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") "
Mar 08 00:53:38.693253 master-0 kubenswrapper[23041]: I0308 00:53:38.688740 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-var-locks-brick\") pod \"854d6a39-df63-4aa0-85db-c8cd640dad73\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") "
Mar 08 00:53:38.693253 master-0 kubenswrapper[23041]: I0308 00:53:38.688781 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-lib-modules\") pod \"854d6a39-df63-4aa0-85db-c8cd640dad73\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") "
Mar 08 00:53:38.693253 master-0 kubenswrapper[23041]: I0308 00:53:38.689031 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-etc-iscsi\") pod \"854d6a39-df63-4aa0-85db-c8cd640dad73\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") "
Mar 08 00:53:38.693253 master-0 kubenswrapper[23041]: I0308 00:53:38.689081 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/854d6a39-df63-4aa0-85db-c8cd640dad73-config-data\") pod \"854d6a39-df63-4aa0-85db-c8cd640dad73\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") "
Mar 08 00:53:38.693253 master-0 kubenswrapper[23041]: I0308 00:53:38.689107 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/854d6a39-df63-4aa0-85db-c8cd640dad73-config-data-custom\") pod \"854d6a39-df63-4aa0-85db-c8cd640dad73\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") "
Mar 08 00:53:38.693253 master-0 kubenswrapper[23041]: I0308 00:53:38.689189 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-sys\") pod \"854d6a39-df63-4aa0-85db-c8cd640dad73\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") "
Mar 08 00:53:38.693253 master-0 kubenswrapper[23041]: I0308 00:53:38.689302 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/854d6a39-df63-4aa0-85db-c8cd640dad73-scripts\") pod \"854d6a39-df63-4aa0-85db-c8cd640dad73\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") "
Mar 08 00:53:38.693253 master-0 kubenswrapper[23041]: I0308 00:53:38.689364 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-etc-nvme\") pod \"854d6a39-df63-4aa0-85db-c8cd640dad73\" (UID: \"854d6a39-df63-4aa0-85db-c8cd640dad73\") "
Mar 08 00:53:38.693253 master-0 kubenswrapper[23041]: I0308 00:53:38.691081 23041 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-etc-machine-id\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:38.693253 master-0 kubenswrapper[23041]: I0308 00:53:38.691102 23041 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-var-locks-cinder\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:38.708520 master-0 kubenswrapper[23041]: I0308 00:53:38.695007 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c957f6fb-9546-4811-9246-6a1bfa49492e-kube-api-access-jm2xp" (OuterVolumeSpecName: "kube-api-access-jm2xp") pod "c957f6fb-9546-4811-9246-6a1bfa49492e" (UID: "c957f6fb-9546-4811-9246-6a1bfa49492e"). InnerVolumeSpecName "kube-api-access-jm2xp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:53:38.708520 master-0 kubenswrapper[23041]: I0308 00:53:38.695076 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "854d6a39-df63-4aa0-85db-c8cd640dad73" (UID: "854d6a39-df63-4aa0-85db-c8cd640dad73"). InnerVolumeSpecName "var-lib-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:53:38.708520 master-0 kubenswrapper[23041]: I0308 00:53:38.699281 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/854d6a39-df63-4aa0-85db-c8cd640dad73-kube-api-access-6kdw7" (OuterVolumeSpecName: "kube-api-access-6kdw7") pod "854d6a39-df63-4aa0-85db-c8cd640dad73" (UID: "854d6a39-df63-4aa0-85db-c8cd640dad73"). InnerVolumeSpecName "kube-api-access-6kdw7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:53:38.708520 master-0 kubenswrapper[23041]: I0308 00:53:38.699866 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "854d6a39-df63-4aa0-85db-c8cd640dad73" (UID: "854d6a39-df63-4aa0-85db-c8cd640dad73"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:53:38.708520 master-0 kubenswrapper[23041]: I0308 00:53:38.700270 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c957f6fb-9546-4811-9246-6a1bfa49492e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c957f6fb-9546-4811-9246-6a1bfa49492e" (UID: "c957f6fb-9546-4811-9246-6a1bfa49492e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:53:38.708520 master-0 kubenswrapper[23041]: I0308 00:53:38.700301 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-dev" (OuterVolumeSpecName: "dev") pod "854d6a39-df63-4aa0-85db-c8cd640dad73" (UID: "854d6a39-df63-4aa0-85db-c8cd640dad73"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:53:38.708520 master-0 kubenswrapper[23041]: I0308 00:53:38.700322 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-run" (OuterVolumeSpecName: "run") pod "854d6a39-df63-4aa0-85db-c8cd640dad73" (UID: "854d6a39-df63-4aa0-85db-c8cd640dad73"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:53:38.708520 master-0 kubenswrapper[23041]: I0308 00:53:38.700528 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-sys" (OuterVolumeSpecName: "sys") pod "854d6a39-df63-4aa0-85db-c8cd640dad73" (UID: "854d6a39-df63-4aa0-85db-c8cd640dad73"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:53:38.708520 master-0 kubenswrapper[23041]: I0308 00:53:38.702116 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "854d6a39-df63-4aa0-85db-c8cd640dad73" (UID: "854d6a39-df63-4aa0-85db-c8cd640dad73"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:53:38.708520 master-0 kubenswrapper[23041]: I0308 00:53:38.702177 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "854d6a39-df63-4aa0-85db-c8cd640dad73" (UID: "854d6a39-df63-4aa0-85db-c8cd640dad73"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:53:38.708520 master-0 kubenswrapper[23041]: I0308 00:53:38.703105 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "854d6a39-df63-4aa0-85db-c8cd640dad73" (UID: "854d6a39-df63-4aa0-85db-c8cd640dad73"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 00:53:38.708520 master-0 kubenswrapper[23041]: I0308 00:53:38.706682 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/854d6a39-df63-4aa0-85db-c8cd640dad73-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "854d6a39-df63-4aa0-85db-c8cd640dad73" (UID: "854d6a39-df63-4aa0-85db-c8cd640dad73"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:53:38.723775 master-0 kubenswrapper[23041]: I0308 00:53:38.709061 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/854d6a39-df63-4aa0-85db-c8cd640dad73-scripts" (OuterVolumeSpecName: "scripts") pod "854d6a39-df63-4aa0-85db-c8cd640dad73" (UID: "854d6a39-df63-4aa0-85db-c8cd640dad73"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:53:38.725218 master-0 kubenswrapper[23041]: I0308 00:53:38.725073 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-675ba-scheduler-0"]
Mar 08 00:53:38.759225 master-0 kubenswrapper[23041]: I0308 00:53:38.750460 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-675ba-scheduler-0"]
Mar 08 00:53:38.788869 master-0 kubenswrapper[23041]: I0308 00:53:38.788566 23041 scope.go:117] "RemoveContainer" containerID="689728ade1555da9aa7ab0a54753eece523deee9ae9728509194f7b404cdf1d5"
Mar 08 00:53:38.794301 master-0 kubenswrapper[23041]: I0308 00:53:38.793965 23041 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-etc-iscsi\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:38.794301 master-0 kubenswrapper[23041]: I0308 00:53:38.794003 23041 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/854d6a39-df63-4aa0-85db-c8cd640dad73-config-data-custom\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:38.794301 master-0 kubenswrapper[23041]: I0308 00:53:38.794013 23041 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-sys\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:38.794301 master-0 kubenswrapper[23041]: I0308 00:53:38.794023 23041 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/854d6a39-df63-4aa0-85db-c8cd640dad73-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:38.794301 master-0 kubenswrapper[23041]: I0308 00:53:38.794031 23041 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-etc-nvme\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:38.794301 master-0 kubenswrapper[23041]: I0308 00:53:38.794040 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kdw7\" (UniqueName: \"kubernetes.io/projected/854d6a39-df63-4aa0-85db-c8cd640dad73-kube-api-access-6kdw7\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:38.794301 master-0 kubenswrapper[23041]: I0308 00:53:38.794051 23041 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-var-lib-cinder\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:38.794301 master-0 kubenswrapper[23041]: I0308 00:53:38.794086 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jm2xp\" (UniqueName: \"kubernetes.io/projected/c957f6fb-9546-4811-9246-6a1bfa49492e-kube-api-access-jm2xp\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:38.794301 master-0 kubenswrapper[23041]: I0308 00:53:38.794096 23041 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c957f6fb-9546-4811-9246-6a1bfa49492e-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:38.794301 master-0 kubenswrapper[23041]: I0308 00:53:38.794104 23041 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-dev\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:38.794301 master-0 kubenswrapper[23041]: I0308 00:53:38.794111 23041 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-run\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:38.794301 master-0 kubenswrapper[23041]: I0308 00:53:38.794119 23041 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-var-locks-brick\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:38.794301 master-0 kubenswrapper[23041]: I0308 00:53:38.794127 23041 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/854d6a39-df63-4aa0-85db-c8cd640dad73-lib-modules\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:38.804290 master-0 kubenswrapper[23041]: I0308 00:53:38.803278 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-675ba-volume-lvm-iscsi-0"]
Mar 08 00:53:38.833291 master-0 kubenswrapper[23041]: I0308 00:53:38.828121 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4aef1ca-0703-4433-84e6-a926cea94033" path="/var/lib/kubelet/pods/a4aef1ca-0703-4433-84e6-a926cea94033/volumes"
Mar 08 00:53:38.840261 master-0 kubenswrapper[23041]: I0308 00:53:38.840159 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-675ba-volume-lvm-iscsi-0"]
Mar 08 00:53:38.857227 master-0 kubenswrapper[23041]: I0308 00:53:38.855561 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-675ba-scheduler-0"]
Mar 08 00:53:38.859870 master-0 kubenswrapper[23041]: E0308 00:53:38.859825 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c3dcf6e-e826-483b-ae9c-465cb6d2d326" containerName="cinder-volume"
Mar 08 00:53:38.859870 master-0 kubenswrapper[23041]: I0308 00:53:38.859861 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c3dcf6e-e826-483b-ae9c-465cb6d2d326" containerName="cinder-volume"
Mar 08 00:53:38.860013 master-0 kubenswrapper[23041]: E0308 00:53:38.859884 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5944b058-fd8b-419b-ba55-b61f85254dec" containerName="mariadb-database-create"
Mar 08 00:53:38.860013 master-0 kubenswrapper[23041]: I0308 00:53:38.859891 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="5944b058-fd8b-419b-ba55-b61f85254dec" containerName="mariadb-database-create"
Mar 08 00:53:38.860013 master-0 kubenswrapper[23041]: E0308 00:53:38.859906 23041
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4aef1ca-0703-4433-84e6-a926cea94033" containerName="probe" Mar 08 00:53:38.860013 master-0 kubenswrapper[23041]: I0308 00:53:38.859914 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4aef1ca-0703-4433-84e6-a926cea94033" containerName="probe" Mar 08 00:53:38.860013 master-0 kubenswrapper[23041]: E0308 00:53:38.859931 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="854d6a39-df63-4aa0-85db-c8cd640dad73" containerName="cinder-backup" Mar 08 00:53:38.860013 master-0 kubenswrapper[23041]: I0308 00:53:38.859937 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="854d6a39-df63-4aa0-85db-c8cd640dad73" containerName="cinder-backup" Mar 08 00:53:38.860013 master-0 kubenswrapper[23041]: E0308 00:53:38.859958 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4aef1ca-0703-4433-84e6-a926cea94033" containerName="cinder-scheduler" Mar 08 00:53:38.860013 master-0 kubenswrapper[23041]: I0308 00:53:38.859967 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4aef1ca-0703-4433-84e6-a926cea94033" containerName="cinder-scheduler" Mar 08 00:53:38.860013 master-0 kubenswrapper[23041]: E0308 00:53:38.859982 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c3dcf6e-e826-483b-ae9c-465cb6d2d326" containerName="probe" Mar 08 00:53:38.860013 master-0 kubenswrapper[23041]: I0308 00:53:38.859989 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c3dcf6e-e826-483b-ae9c-465cb6d2d326" containerName="probe" Mar 08 00:53:38.860013 master-0 kubenswrapper[23041]: E0308 00:53:38.860004 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="854d6a39-df63-4aa0-85db-c8cd640dad73" containerName="probe" Mar 08 00:53:38.860013 master-0 kubenswrapper[23041]: I0308 00:53:38.860010 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="854d6a39-df63-4aa0-85db-c8cd640dad73" containerName="probe" Mar 08 
00:53:38.860532 master-0 kubenswrapper[23041]: E0308 00:53:38.860030 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c957f6fb-9546-4811-9246-6a1bfa49492e" containerName="mariadb-account-create-update" Mar 08 00:53:38.860532 master-0 kubenswrapper[23041]: I0308 00:53:38.860037 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="c957f6fb-9546-4811-9246-6a1bfa49492e" containerName="mariadb-account-create-update" Mar 08 00:53:38.860532 master-0 kubenswrapper[23041]: I0308 00:53:38.860259 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="c957f6fb-9546-4811-9246-6a1bfa49492e" containerName="mariadb-account-create-update" Mar 08 00:53:38.860532 master-0 kubenswrapper[23041]: I0308 00:53:38.860280 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c3dcf6e-e826-483b-ae9c-465cb6d2d326" containerName="cinder-volume" Mar 08 00:53:38.860532 master-0 kubenswrapper[23041]: I0308 00:53:38.860296 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4aef1ca-0703-4433-84e6-a926cea94033" containerName="probe" Mar 08 00:53:38.860532 master-0 kubenswrapper[23041]: I0308 00:53:38.860323 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4aef1ca-0703-4433-84e6-a926cea94033" containerName="cinder-scheduler" Mar 08 00:53:38.860532 master-0 kubenswrapper[23041]: I0308 00:53:38.860336 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="5944b058-fd8b-419b-ba55-b61f85254dec" containerName="mariadb-database-create" Mar 08 00:53:38.860532 master-0 kubenswrapper[23041]: I0308 00:53:38.860346 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="854d6a39-df63-4aa0-85db-c8cd640dad73" containerName="cinder-backup" Mar 08 00:53:38.860532 master-0 kubenswrapper[23041]: I0308 00:53:38.860357 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c3dcf6e-e826-483b-ae9c-465cb6d2d326" containerName="probe" Mar 08 00:53:38.860532 master-0 
kubenswrapper[23041]: I0308 00:53:38.860364 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="854d6a39-df63-4aa0-85db-c8cd640dad73" containerName="probe" Mar 08 00:53:38.866934 master-0 kubenswrapper[23041]: I0308 00:53:38.861525 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:38.866934 master-0 kubenswrapper[23041]: I0308 00:53:38.864548 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/854d6a39-df63-4aa0-85db-c8cd640dad73-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "854d6a39-df63-4aa0-85db-c8cd640dad73" (UID: "854d6a39-df63-4aa0-85db-c8cd640dad73"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:53:38.866934 master-0 kubenswrapper[23041]: I0308 00:53:38.866668 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-675ba-scheduler-config-data" Mar 08 00:53:38.875262 master-0 kubenswrapper[23041]: I0308 00:53:38.875151 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-675ba-scheduler-0"] Mar 08 00:53:38.909619 master-0 kubenswrapper[23041]: I0308 00:53:38.907387 23041 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/854d6a39-df63-4aa0-85db-c8cd640dad73-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:38.950740 master-0 kubenswrapper[23041]: I0308 00:53:38.950691 23041 scope.go:117] "RemoveContainer" containerID="7f66ff667ebf8020e0e9fae1948e26561a4a3243fca8f100ab3bead266695768" Mar 08 00:53:38.951710 master-0 kubenswrapper[23041]: I0308 00:53:38.951666 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-675ba-volume-lvm-iscsi-0"] Mar 08 00:53:38.960234 master-0 kubenswrapper[23041]: I0308 00:53:38.960182 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:38.966023 master-0 kubenswrapper[23041]: I0308 00:53:38.965976 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-675ba-volume-lvm-iscsi-config-data" Mar 08 00:53:38.973466 master-0 kubenswrapper[23041]: I0308 00:53:38.971992 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-675ba-volume-lvm-iscsi-0"] Mar 08 00:53:39.010668 master-0 kubenswrapper[23041]: I0308 00:53:39.010547 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aded6dac-190b-4b4a-89d2-4c7e20488782-combined-ca-bundle\") pod \"cinder-675ba-scheduler-0\" (UID: \"aded6dac-190b-4b4a-89d2-4c7e20488782\") " pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:39.010668 master-0 kubenswrapper[23041]: I0308 00:53:39.010650 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aded6dac-190b-4b4a-89d2-4c7e20488782-config-data-custom\") pod \"cinder-675ba-scheduler-0\" (UID: \"aded6dac-190b-4b4a-89d2-4c7e20488782\") " pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:39.010668 master-0 kubenswrapper[23041]: I0308 00:53:39.010671 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aded6dac-190b-4b4a-89d2-4c7e20488782-scripts\") pod \"cinder-675ba-scheduler-0\" (UID: \"aded6dac-190b-4b4a-89d2-4c7e20488782\") " pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:39.011013 master-0 kubenswrapper[23041]: I0308 00:53:39.010758 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aded6dac-190b-4b4a-89d2-4c7e20488782-config-data\") pod 
\"cinder-675ba-scheduler-0\" (UID: \"aded6dac-190b-4b4a-89d2-4c7e20488782\") " pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:39.011013 master-0 kubenswrapper[23041]: I0308 00:53:39.010791 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p5kr\" (UniqueName: \"kubernetes.io/projected/aded6dac-190b-4b4a-89d2-4c7e20488782-kube-api-access-7p5kr\") pod \"cinder-675ba-scheduler-0\" (UID: \"aded6dac-190b-4b4a-89d2-4c7e20488782\") " pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:39.011013 master-0 kubenswrapper[23041]: I0308 00:53:39.010844 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aded6dac-190b-4b4a-89d2-4c7e20488782-etc-machine-id\") pod \"cinder-675ba-scheduler-0\" (UID: \"aded6dac-190b-4b4a-89d2-4c7e20488782\") " pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:39.028195 master-0 kubenswrapper[23041]: I0308 00:53:39.027516 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/854d6a39-df63-4aa0-85db-c8cd640dad73-config-data" (OuterVolumeSpecName: "config-data") pod "854d6a39-df63-4aa0-85db-c8cd640dad73" (UID: "854d6a39-df63-4aa0-85db-c8cd640dad73"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:53:39.041174 master-0 kubenswrapper[23041]: I0308 00:53:39.041035 23041 scope.go:117] "RemoveContainer" containerID="1cca5bbda645178e609a6794a74a552fb2326c7e85251edfc181a52f0fa34e4e" Mar 08 00:53:39.084556 master-0 kubenswrapper[23041]: I0308 00:53:39.076175 23041 scope.go:117] "RemoveContainer" containerID="f2fc0cfd89ae39b165eaf29691bde0b893f56650ecd7d3f8a4de6208711a2dbd" Mar 08 00:53:39.144509 master-0 kubenswrapper[23041]: I0308 00:53:39.144454 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-combined-ca-bundle\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.144509 master-0 kubenswrapper[23041]: I0308 00:53:39.144517 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-scripts\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.144745 master-0 kubenswrapper[23041]: I0308 00:53:39.144535 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-dev\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.144745 master-0 kubenswrapper[23041]: I0308 00:53:39.144627 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aded6dac-190b-4b4a-89d2-4c7e20488782-config-data-custom\") pod \"cinder-675ba-scheduler-0\" 
(UID: \"aded6dac-190b-4b4a-89d2-4c7e20488782\") " pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:39.144745 master-0 kubenswrapper[23041]: I0308 00:53:39.144708 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aded6dac-190b-4b4a-89d2-4c7e20488782-scripts\") pod \"cinder-675ba-scheduler-0\" (UID: \"aded6dac-190b-4b4a-89d2-4c7e20488782\") " pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:39.144850 master-0 kubenswrapper[23041]: I0308 00:53:39.144742 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-var-locks-brick\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.144979 master-0 kubenswrapper[23041]: I0308 00:53:39.144951 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-config-data-custom\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.145042 master-0 kubenswrapper[23041]: I0308 00:53:39.145024 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-etc-iscsi\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.145566 master-0 kubenswrapper[23041]: I0308 00:53:39.145493 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/aded6dac-190b-4b4a-89d2-4c7e20488782-config-data\") pod \"cinder-675ba-scheduler-0\" (UID: \"aded6dac-190b-4b4a-89d2-4c7e20488782\") " pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:39.145700 master-0 kubenswrapper[23041]: I0308 00:53:39.145650 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-etc-nvme\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.145795 master-0 kubenswrapper[23041]: I0308 00:53:39.145763 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p5kr\" (UniqueName: \"kubernetes.io/projected/aded6dac-190b-4b4a-89d2-4c7e20488782-kube-api-access-7p5kr\") pod \"cinder-675ba-scheduler-0\" (UID: \"aded6dac-190b-4b4a-89d2-4c7e20488782\") " pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:39.145951 master-0 kubenswrapper[23041]: I0308 00:53:39.145916 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-run\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.146102 master-0 kubenswrapper[23041]: I0308 00:53:39.146021 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-var-lib-cinder\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.146102 master-0 kubenswrapper[23041]: I0308 00:53:39.146081 23041 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aded6dac-190b-4b4a-89d2-4c7e20488782-etc-machine-id\") pod \"cinder-675ba-scheduler-0\" (UID: \"aded6dac-190b-4b4a-89d2-4c7e20488782\") " pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:39.146256 master-0 kubenswrapper[23041]: I0308 00:53:39.146219 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aded6dac-190b-4b4a-89d2-4c7e20488782-etc-machine-id\") pod \"cinder-675ba-scheduler-0\" (UID: \"aded6dac-190b-4b4a-89d2-4c7e20488782\") " pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:39.146256 master-0 kubenswrapper[23041]: I0308 00:53:39.146230 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-lib-modules\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.146329 master-0 kubenswrapper[23041]: I0308 00:53:39.146268 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-sys\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.146329 master-0 kubenswrapper[23041]: I0308 00:53:39.146322 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-etc-machine-id\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.146481 master-0 kubenswrapper[23041]: I0308 00:53:39.146432 23041 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-var-locks-cinder\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.146661 master-0 kubenswrapper[23041]: I0308 00:53:39.146587 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aded6dac-190b-4b4a-89d2-4c7e20488782-combined-ca-bundle\") pod \"cinder-675ba-scheduler-0\" (UID: \"aded6dac-190b-4b4a-89d2-4c7e20488782\") " pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:39.154078 master-0 kubenswrapper[23041]: I0308 00:53:39.146740 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-config-data\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.154078 master-0 kubenswrapper[23041]: I0308 00:53:39.146809 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg2zz\" (UniqueName: \"kubernetes.io/projected/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-kube-api-access-dg2zz\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.154078 master-0 kubenswrapper[23041]: I0308 00:53:39.147145 23041 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/854d6a39-df63-4aa0-85db-c8cd640dad73-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:39.154078 master-0 kubenswrapper[23041]: I0308 00:53:39.152097 23041 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aded6dac-190b-4b4a-89d2-4c7e20488782-scripts\") pod \"cinder-675ba-scheduler-0\" (UID: \"aded6dac-190b-4b4a-89d2-4c7e20488782\") " pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:39.154606 master-0 kubenswrapper[23041]: I0308 00:53:39.154569 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aded6dac-190b-4b4a-89d2-4c7e20488782-config-data\") pod \"cinder-675ba-scheduler-0\" (UID: \"aded6dac-190b-4b4a-89d2-4c7e20488782\") " pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:39.178082 master-0 kubenswrapper[23041]: I0308 00:53:39.174610 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aded6dac-190b-4b4a-89d2-4c7e20488782-combined-ca-bundle\") pod \"cinder-675ba-scheduler-0\" (UID: \"aded6dac-190b-4b4a-89d2-4c7e20488782\") " pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:39.179390 master-0 kubenswrapper[23041]: I0308 00:53:39.179301 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aded6dac-190b-4b4a-89d2-4c7e20488782-config-data-custom\") pod \"cinder-675ba-scheduler-0\" (UID: \"aded6dac-190b-4b4a-89d2-4c7e20488782\") " pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:39.179987 master-0 kubenswrapper[23041]: I0308 00:53:39.179961 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p5kr\" (UniqueName: \"kubernetes.io/projected/aded6dac-190b-4b4a-89d2-4c7e20488782-kube-api-access-7p5kr\") pod \"cinder-675ba-scheduler-0\" (UID: \"aded6dac-190b-4b4a-89d2-4c7e20488782\") " pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:39.188833 master-0 kubenswrapper[23041]: W0308 00:53:39.188783 23041 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2898832_7b8c_416b_8a21_04c00f4b188d.slice/crio-7fe9a7188995cd6ae2114e0b3e966b8840dc9a9e261e3eb6a22a73721c9d6a2a WatchSource:0}: Error finding container 7fe9a7188995cd6ae2114e0b3e966b8840dc9a9e261e3eb6a22a73721c9d6a2a: Status 404 returned error can't find the container with id 7fe9a7188995cd6ae2114e0b3e966b8840dc9a9e261e3eb6a22a73721c9d6a2a Mar 08 00:53:39.190280 master-0 kubenswrapper[23041]: I0308 00:53:39.190226 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-565c7fbf46-lqmmt"] Mar 08 00:53:39.217821 master-0 kubenswrapper[23041]: I0308 00:53:39.217770 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:39.255953 master-0 kubenswrapper[23041]: I0308 00:53:39.255812 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-scripts\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.259099 master-0 kubenswrapper[23041]: I0308 00:53:39.259035 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-dev\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.259268 master-0 kubenswrapper[23041]: I0308 00:53:39.259143 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-var-locks-brick\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.259342 
master-0 kubenswrapper[23041]: I0308 00:53:39.259317 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-config-data-custom\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.259405 master-0 kubenswrapper[23041]: I0308 00:53:39.259393 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-etc-iscsi\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.259534 master-0 kubenswrapper[23041]: I0308 00:53:39.259502 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-etc-nvme\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.259630 master-0 kubenswrapper[23041]: I0308 00:53:39.259558 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-run\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.259701 master-0 kubenswrapper[23041]: I0308 00:53:39.259630 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-var-lib-cinder\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 
00:53:39.259820 master-0 kubenswrapper[23041]: I0308 00:53:39.259759 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-lib-modules\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.259820 master-0 kubenswrapper[23041]: I0308 00:53:39.259806 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-sys\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.259948 master-0 kubenswrapper[23041]: I0308 00:53:39.259846 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-etc-machine-id\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.259948 master-0 kubenswrapper[23041]: I0308 00:53:39.259876 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-var-locks-cinder\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.259948 master-0 kubenswrapper[23041]: I0308 00:53:39.259919 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-config-data\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " 
pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.259948 master-0 kubenswrapper[23041]: I0308 00:53:39.259946 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg2zz\" (UniqueName: \"kubernetes.io/projected/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-kube-api-access-dg2zz\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.260126 master-0 kubenswrapper[23041]: I0308 00:53:39.259976 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-combined-ca-bundle\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.261530 master-0 kubenswrapper[23041]: I0308 00:53:39.261475 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-var-lib-cinder\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.261617 master-0 kubenswrapper[23041]: I0308 00:53:39.261551 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-dev\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.266393 master-0 kubenswrapper[23041]: I0308 00:53:39.262931 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-sys\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " 
pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.266393 master-0 kubenswrapper[23041]: I0308 00:53:39.262987 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-var-locks-brick\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.266393 master-0 kubenswrapper[23041]: I0308 00:53:39.262988 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-run\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.266393 master-0 kubenswrapper[23041]: I0308 00:53:39.261486 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-etc-machine-id\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.266393 master-0 kubenswrapper[23041]: I0308 00:53:39.263574 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-lib-modules\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.266393 master-0 kubenswrapper[23041]: I0308 00:53:39.263619 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-var-locks-cinder\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " 
pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.266393 master-0 kubenswrapper[23041]: I0308 00:53:39.263645 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-etc-iscsi\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.266393 master-0 kubenswrapper[23041]: I0308 00:53:39.263680 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-etc-nvme\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.278398 master-0 kubenswrapper[23041]: I0308 00:53:39.278186 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-scripts\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.285102 master-0 kubenswrapper[23041]: I0308 00:53:39.282828 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-combined-ca-bundle\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.285102 master-0 kubenswrapper[23041]: I0308 00:53:39.283891 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-config-data\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " 
pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.285102 master-0 kubenswrapper[23041]: I0308 00:53:39.285019 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg2zz\" (UniqueName: \"kubernetes.io/projected/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-kube-api-access-dg2zz\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.293130 master-0 kubenswrapper[23041]: I0308 00:53:39.292977 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2a901bb-d3c4-4ca3-a40c-1de98eb519ad-config-data-custom\") pod \"cinder-675ba-volume-lvm-iscsi-0\" (UID: \"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad\") " pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.373082 master-0 kubenswrapper[23041]: I0308 00:53:39.372279 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:39.427586 master-0 kubenswrapper[23041]: I0308 00:53:39.427155 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-conductor-0"] Mar 08 00:53:39.464626 master-0 kubenswrapper[23041]: W0308 00:53:39.447283 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5fd31740_3478_41e5_8295_d4b50f40db04.slice/crio-17b42cfbec6e40ebb5deb36cfddd45fc902b9931f7589fcfea27cb9c62aff09a WatchSource:0}: Error finding container 17b42cfbec6e40ebb5deb36cfddd45fc902b9931f7589fcfea27cb9c62aff09a: Status 404 returned error can't find the container with id 17b42cfbec6e40ebb5deb36cfddd45fc902b9931f7589fcfea27cb9c62aff09a Mar 08 00:53:39.500921 master-0 kubenswrapper[23041]: I0308 00:53:39.500799 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-675ba-backup-0"] Mar 08 00:53:39.528368 master-0 
kubenswrapper[23041]: I0308 00:53:39.525581 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-675ba-backup-0"] Mar 08 00:53:39.568297 master-0 kubenswrapper[23041]: I0308 00:53:39.568258 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-675ba-backup-0"] Mar 08 00:53:39.570649 master-0 kubenswrapper[23041]: I0308 00:53:39.570618 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.573665 master-0 kubenswrapper[23041]: I0308 00:53:39.573623 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ac5776c8-7160-4f5e-a858-62ad58cde104-run\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.573795 master-0 kubenswrapper[23041]: I0308 00:53:39.573697 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ac5776c8-7160-4f5e-a858-62ad58cde104-lib-modules\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.573850 master-0 kubenswrapper[23041]: I0308 00:53:39.573776 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/ac5776c8-7160-4f5e-a858-62ad58cde104-var-lib-cinder\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.573850 master-0 kubenswrapper[23041]: I0308 00:53:39.573826 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac5776c8-7160-4f5e-a858-62ad58cde104-config-data-custom\") pod 
\"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.573946 master-0 kubenswrapper[23041]: I0308 00:53:39.573862 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ac5776c8-7160-4f5e-a858-62ad58cde104-dev\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.573946 master-0 kubenswrapper[23041]: I0308 00:53:39.573893 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac5776c8-7160-4f5e-a858-62ad58cde104-scripts\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.574033 master-0 kubenswrapper[23041]: I0308 00:53:39.573944 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac5776c8-7160-4f5e-a858-62ad58cde104-config-data\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.574033 master-0 kubenswrapper[23041]: I0308 00:53:39.573968 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac5776c8-7160-4f5e-a858-62ad58cde104-combined-ca-bundle\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.574033 master-0 kubenswrapper[23041]: I0308 00:53:39.573995 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ac5776c8-7160-4f5e-a858-62ad58cde104-var-locks-brick\") pod 
\"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.574033 master-0 kubenswrapper[23041]: I0308 00:53:39.574029 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/ac5776c8-7160-4f5e-a858-62ad58cde104-var-locks-cinder\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.574225 master-0 kubenswrapper[23041]: I0308 00:53:39.574054 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ac5776c8-7160-4f5e-a858-62ad58cde104-etc-iscsi\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.574225 master-0 kubenswrapper[23041]: I0308 00:53:39.574147 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ac5776c8-7160-4f5e-a858-62ad58cde104-sys\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.579410 master-0 kubenswrapper[23041]: I0308 00:53:39.574194 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ac5776c8-7160-4f5e-a858-62ad58cde104-etc-machine-id\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.587402 master-0 kubenswrapper[23041]: I0308 00:53:39.583184 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ac5776c8-7160-4f5e-a858-62ad58cde104-etc-nvme\") pod 
\"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.587402 master-0 kubenswrapper[23041]: I0308 00:53:39.583281 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdnh9\" (UniqueName: \"kubernetes.io/projected/ac5776c8-7160-4f5e-a858-62ad58cde104-kube-api-access-fdnh9\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.587402 master-0 kubenswrapper[23041]: I0308 00:53:39.582163 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-675ba-backup-config-data" Mar 08 00:53:39.592093 master-0 kubenswrapper[23041]: I0308 00:53:39.592038 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-675ba-backup-0"] Mar 08 00:53:39.685611 master-0 kubenswrapper[23041]: I0308 00:53:39.681845 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-565c7fbf46-lqmmt" event={"ID":"d2898832-7b8c-416b-8a21-04c00f4b188d","Type":"ContainerStarted","Data":"7fe9a7188995cd6ae2114e0b3e966b8840dc9a9e261e3eb6a22a73721c9d6a2a"} Mar 08 00:53:39.689907 master-0 kubenswrapper[23041]: I0308 00:53:39.687759 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ac5776c8-7160-4f5e-a858-62ad58cde104-sys\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.689907 master-0 kubenswrapper[23041]: I0308 00:53:39.687838 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ac5776c8-7160-4f5e-a858-62ad58cde104-etc-machine-id\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.689907 
master-0 kubenswrapper[23041]: I0308 00:53:39.687879 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ac5776c8-7160-4f5e-a858-62ad58cde104-etc-nvme\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.689907 master-0 kubenswrapper[23041]: I0308 00:53:39.687910 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdnh9\" (UniqueName: \"kubernetes.io/projected/ac5776c8-7160-4f5e-a858-62ad58cde104-kube-api-access-fdnh9\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.689907 master-0 kubenswrapper[23041]: I0308 00:53:39.687999 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ac5776c8-7160-4f5e-a858-62ad58cde104-run\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.689907 master-0 kubenswrapper[23041]: I0308 00:53:39.688044 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ac5776c8-7160-4f5e-a858-62ad58cde104-lib-modules\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.689907 master-0 kubenswrapper[23041]: I0308 00:53:39.688111 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/ac5776c8-7160-4f5e-a858-62ad58cde104-var-lib-cinder\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.689907 master-0 kubenswrapper[23041]: I0308 00:53:39.688155 23041 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac5776c8-7160-4f5e-a858-62ad58cde104-config-data-custom\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.689907 master-0 kubenswrapper[23041]: I0308 00:53:39.688191 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ac5776c8-7160-4f5e-a858-62ad58cde104-dev\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.689907 master-0 kubenswrapper[23041]: I0308 00:53:39.688255 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac5776c8-7160-4f5e-a858-62ad58cde104-scripts\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.689907 master-0 kubenswrapper[23041]: I0308 00:53:39.688672 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac5776c8-7160-4f5e-a858-62ad58cde104-config-data\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.689907 master-0 kubenswrapper[23041]: I0308 00:53:39.689311 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac5776c8-7160-4f5e-a858-62ad58cde104-combined-ca-bundle\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.689907 master-0 kubenswrapper[23041]: I0308 00:53:39.689361 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ac5776c8-7160-4f5e-a858-62ad58cde104-var-locks-brick\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.689907 master-0 kubenswrapper[23041]: I0308 00:53:39.689416 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/ac5776c8-7160-4f5e-a858-62ad58cde104-var-locks-cinder\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.689907 master-0 kubenswrapper[23041]: I0308 00:53:39.689459 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ac5776c8-7160-4f5e-a858-62ad58cde104-etc-iscsi\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.689907 master-0 kubenswrapper[23041]: I0308 00:53:39.689687 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ac5776c8-7160-4f5e-a858-62ad58cde104-etc-iscsi\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.690566 master-0 kubenswrapper[23041]: I0308 00:53:39.690196 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ac5776c8-7160-4f5e-a858-62ad58cde104-dev\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.690566 master-0 kubenswrapper[23041]: I0308 00:53:39.690438 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: 
\"kubernetes.io/host-path/ac5776c8-7160-4f5e-a858-62ad58cde104-var-locks-cinder\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.690566 master-0 kubenswrapper[23041]: I0308 00:53:39.690463 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"5fd31740-3478-41e5-8295-d4b50f40db04","Type":"ContainerStarted","Data":"17b42cfbec6e40ebb5deb36cfddd45fc902b9931f7589fcfea27cb9c62aff09a"} Mar 08 00:53:39.690566 master-0 kubenswrapper[23041]: I0308 00:53:39.690470 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ac5776c8-7160-4f5e-a858-62ad58cde104-var-locks-brick\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.690566 master-0 kubenswrapper[23041]: I0308 00:53:39.690507 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ac5776c8-7160-4f5e-a858-62ad58cde104-run\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.690711 master-0 kubenswrapper[23041]: I0308 00:53:39.690645 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/ac5776c8-7160-4f5e-a858-62ad58cde104-var-lib-cinder\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.690711 master-0 kubenswrapper[23041]: I0308 00:53:39.690675 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ac5776c8-7160-4f5e-a858-62ad58cde104-lib-modules\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " 
pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.690711 master-0 kubenswrapper[23041]: I0308 00:53:39.690699 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ac5776c8-7160-4f5e-a858-62ad58cde104-sys\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.718116 master-0 kubenswrapper[23041]: I0308 00:53:39.690877 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ac5776c8-7160-4f5e-a858-62ad58cde104-etc-nvme\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.718116 master-0 kubenswrapper[23041]: I0308 00:53:39.690962 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ac5776c8-7160-4f5e-a858-62ad58cde104-etc-machine-id\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.718116 master-0 kubenswrapper[23041]: I0308 00:53:39.701624 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-7dffdc6989-dw4bq" event={"ID":"a94dba9c-1e25-42ed-b30a-d278979d1de9","Type":"ContainerStarted","Data":"980e0193fca57a50a8a4c6a02259bd0293fe297a9746c1f73921a999809f92ba"} Mar 08 00:53:39.718116 master-0 kubenswrapper[23041]: I0308 00:53:39.701902 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-7dffdc6989-dw4bq" Mar 08 00:53:39.748715 master-0 kubenswrapper[23041]: I0308 00:53:39.748627 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-c3c2-account-create-update-w6k86" 
event={"ID":"c957f6fb-9546-4811-9246-6a1bfa49492e","Type":"ContainerDied","Data":"b29bb134531228243433c1a1ee141edca7ad196a354b913a4a900166c1a6443c"} Mar 08 00:53:39.748918 master-0 kubenswrapper[23041]: I0308 00:53:39.748744 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b29bb134531228243433c1a1ee141edca7ad196a354b913a4a900166c1a6443c" Mar 08 00:53:39.748918 master-0 kubenswrapper[23041]: I0308 00:53:39.748867 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-c3c2-account-create-update-w6k86" Mar 08 00:53:39.762766 master-0 kubenswrapper[23041]: I0308 00:53:39.758074 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-657ddbd5bb-fdfgw" event={"ID":"85f7cb75-9466-47eb-bd3a-da17df2b5c2a","Type":"ContainerStarted","Data":"fd9fab216de9d30e4f9aa2c8e841efc44475740d2c6c11e428cd72a928c01e9e"} Mar 08 00:53:39.762766 master-0 kubenswrapper[23041]: I0308 00:53:39.761972 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-neutron-agent-7dffdc6989-dw4bq" podStartSLOduration=4.129605001 podStartE2EDuration="7.761939699s" podCreationTimestamp="2026-03-08 00:53:32 +0000 UTC" firstStartedPulling="2026-03-08 00:53:35.072339974 +0000 UTC m=+1320.545176528" lastFinishedPulling="2026-03-08 00:53:38.704674682 +0000 UTC m=+1324.177511226" observedRunningTime="2026-03-08 00:53:39.736865757 +0000 UTC m=+1325.209702331" watchObservedRunningTime="2026-03-08 00:53:39.761939699 +0000 UTC m=+1325.234776253" Mar 08 00:53:39.785089 master-0 kubenswrapper[23041]: I0308 00:53:39.781168 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac5776c8-7160-4f5e-a858-62ad58cde104-combined-ca-bundle\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.785089 master-0 
kubenswrapper[23041]: I0308 00:53:39.781374 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac5776c8-7160-4f5e-a858-62ad58cde104-config-data\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.785089 master-0 kubenswrapper[23041]: I0308 00:53:39.782514 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac5776c8-7160-4f5e-a858-62ad58cde104-scripts\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.785089 master-0 kubenswrapper[23041]: I0308 00:53:39.783056 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdnh9\" (UniqueName: \"kubernetes.io/projected/ac5776c8-7160-4f5e-a858-62ad58cde104-kube-api-access-fdnh9\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.812488 master-0 kubenswrapper[23041]: I0308 00:53:39.799672 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac5776c8-7160-4f5e-a858-62ad58cde104-config-data-custom\") pod \"cinder-675ba-backup-0\" (UID: \"ac5776c8-7160-4f5e-a858-62ad58cde104\") " pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:39.822261 master-0 kubenswrapper[23041]: W0308 00:53:39.818779 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaded6dac_190b_4b4a_89d2_4c7e20488782.slice/crio-2cd7808060093038f8be1a14566432e0d7d3960730d1840df5f0b8a25e0bdbb9 WatchSource:0}: Error finding container 2cd7808060093038f8be1a14566432e0d7d3960730d1840df5f0b8a25e0bdbb9: Status 404 returned error can't find the container with id 
2cd7808060093038f8be1a14566432e0d7d3960730d1840df5f0b8a25e0bdbb9 Mar 08 00:53:39.827226 master-0 kubenswrapper[23041]: I0308 00:53:39.826999 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-675ba-scheduler-0"] Mar 08 00:53:39.959789 master-0 kubenswrapper[23041]: I0308 00:53:39.959357 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-675ba-backup-0" Mar 08 00:53:40.095687 master-0 kubenswrapper[23041]: I0308 00:53:40.095624 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-675ba-volume-lvm-iscsi-0"] Mar 08 00:53:40.101951 master-0 kubenswrapper[23041]: W0308 00:53:40.101387 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2a901bb_d3c4_4ca3_a40c_1de98eb519ad.slice/crio-f6fa388c8b30c6e98bb40f7d79c235cbfa1227ae13a53518a27a9a0119f013c6 WatchSource:0}: Error finding container f6fa388c8b30c6e98bb40f7d79c235cbfa1227ae13a53518a27a9a0119f013c6: Status 404 returned error can't find the container with id f6fa388c8b30c6e98bb40f7d79c235cbfa1227ae13a53518a27a9a0119f013c6 Mar 08 00:53:40.639330 master-0 kubenswrapper[23041]: I0308 00:53:40.639082 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-675ba-backup-0"] Mar 08 00:53:40.802471 master-0 kubenswrapper[23041]: I0308 00:53:40.802386 23041 generic.go:334] "Generic (PLEG): container finished" podID="d2898832-7b8c-416b-8a21-04c00f4b188d" containerID="dfdd186766ea84904a489d31bd7c7ccced66b028d5adad1e777fdfbd93d78165" exitCode=0 Mar 08 00:53:40.802674 master-0 kubenswrapper[23041]: I0308 00:53:40.802502 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-565c7fbf46-lqmmt" event={"ID":"d2898832-7b8c-416b-8a21-04c00f4b188d","Type":"ContainerDied","Data":"dfdd186766ea84904a489d31bd7c7ccced66b028d5adad1e777fdfbd93d78165"} Mar 08 00:53:40.832768 master-0 kubenswrapper[23041]: I0308 
00:53:40.832703 23041 generic.go:334] "Generic (PLEG): container finished" podID="85f7cb75-9466-47eb-bd3a-da17df2b5c2a" containerID="fd9fab216de9d30e4f9aa2c8e841efc44475740d2c6c11e428cd72a928c01e9e" exitCode=0 Mar 08 00:53:40.838011 master-0 kubenswrapper[23041]: I0308 00:53:40.837255 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c3dcf6e-e826-483b-ae9c-465cb6d2d326" path="/var/lib/kubelet/pods/4c3dcf6e-e826-483b-ae9c-465cb6d2d326/volumes" Mar 08 00:53:40.839172 master-0 kubenswrapper[23041]: I0308 00:53:40.839144 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="854d6a39-df63-4aa0-85db-c8cd640dad73" path="/var/lib/kubelet/pods/854d6a39-df63-4aa0-85db-c8cd640dad73/volumes" Mar 08 00:53:40.841861 master-0 kubenswrapper[23041]: I0308 00:53:40.839923 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-675ba-scheduler-0" event={"ID":"aded6dac-190b-4b4a-89d2-4c7e20488782","Type":"ContainerStarted","Data":"2cd7808060093038f8be1a14566432e0d7d3960730d1840df5f0b8a25e0bdbb9"} Mar 08 00:53:40.841861 master-0 kubenswrapper[23041]: I0308 00:53:40.839957 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-675ba-volume-lvm-iscsi-0" event={"ID":"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad","Type":"ContainerStarted","Data":"f6fa388c8b30c6e98bb40f7d79c235cbfa1227ae13a53518a27a9a0119f013c6"} Mar 08 00:53:40.841861 master-0 kubenswrapper[23041]: I0308 00:53:40.840066 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-657ddbd5bb-fdfgw" event={"ID":"85f7cb75-9466-47eb-bd3a-da17df2b5c2a","Type":"ContainerDied","Data":"fd9fab216de9d30e4f9aa2c8e841efc44475740d2c6c11e428cd72a928c01e9e"} Mar 08 00:53:40.841861 master-0 kubenswrapper[23041]: I0308 00:53:40.840083 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-675ba-backup-0" 
event={"ID":"ac5776c8-7160-4f5e-a858-62ad58cde104","Type":"ContainerStarted","Data":"0438e94ffd2d0363972e4e0794df56aa54c76c2b3278340b192c0806d2464ea6"} Mar 08 00:53:41.883234 master-0 kubenswrapper[23041]: I0308 00:53:41.870547 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"5fd31740-3478-41e5-8295-d4b50f40db04","Type":"ContainerStarted","Data":"ec3fec2d0dc741640034e8a245e707a0ac2ee728609cdd2bafad4e52abf0f483"} Mar 08 00:53:41.906467 master-0 kubenswrapper[23041]: I0308 00:53:41.894861 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-675ba-scheduler-0" event={"ID":"aded6dac-190b-4b4a-89d2-4c7e20488782","Type":"ContainerStarted","Data":"943bcaac3273eca25e1820202a71cecce1512828252ff2dd20ab280e3a3c6fd4"} Mar 08 00:53:41.906467 master-0 kubenswrapper[23041]: I0308 00:53:41.897904 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-675ba-volume-lvm-iscsi-0" event={"ID":"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad","Type":"ContainerStarted","Data":"5b614ef896114abdd732567b2483a3956179eb0890b816b1f489bb46305d29c8"} Mar 08 00:53:41.932231 master-0 kubenswrapper[23041]: I0308 00:53:41.931718 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-657ddbd5bb-fdfgw" event={"ID":"85f7cb75-9466-47eb-bd3a-da17df2b5c2a","Type":"ContainerStarted","Data":"0b73dc3da2facaad884834e4e3d982f5a7e048f7f84409b02726bbee41d64a4f"} Mar 08 00:53:42.743650 master-0 kubenswrapper[23041]: I0308 00:53:42.743574 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6fd7c7bb8d-6cc8x" Mar 08 00:53:42.764228 master-0 kubenswrapper[23041]: I0308 00:53:42.760769 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6fd7c7bb8d-6cc8x" Mar 08 00:53:43.000231 master-0 kubenswrapper[23041]: I0308 00:53:42.998459 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/placement-76cc655964-lxxvl"] Mar 08 00:53:43.000231 master-0 kubenswrapper[23041]: I0308 00:53:42.998798 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-76cc655964-lxxvl" podUID="dda659a2-1e52-4b13-8b9d-401d3fcaf800" containerName="placement-log" containerID="cri-o://360ea98e790bef4d537b7b519f38ac7d02200079fce92bb8aa2572a221740ebf" gracePeriod=30 Mar 08 00:53:43.000231 master-0 kubenswrapper[23041]: I0308 00:53:42.999849 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-76cc655964-lxxvl" podUID="dda659a2-1e52-4b13-8b9d-401d3fcaf800" containerName="placement-api" containerID="cri-o://9735e8ddc8805560fa3e97dc71eb29456ad289e49e08e0456728c29f4e068ccc" gracePeriod=30 Mar 08 00:53:43.022167 master-0 kubenswrapper[23041]: I0308 00:53:43.022110 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-675ba-scheduler-0" event={"ID":"aded6dac-190b-4b4a-89d2-4c7e20488782","Type":"ContainerStarted","Data":"8de8b45f06e1c257e862b9e3eaee4d7a8674faa618c377a52323ec6b81858433"} Mar 08 00:53:43.042535 master-0 kubenswrapper[23041]: I0308 00:53:43.036307 23041 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-76cc655964-lxxvl" podUID="dda659a2-1e52-4b13-8b9d-401d3fcaf800" containerName="placement-log" probeResult="failure" output="Get \"https://10.128.0.220:8778/\": EOF" Mar 08 00:53:43.042535 master-0 kubenswrapper[23041]: I0308 00:53:43.037658 23041 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-76cc655964-lxxvl" podUID="dda659a2-1e52-4b13-8b9d-401d3fcaf800" containerName="placement-api" probeResult="failure" output="Get \"https://10.128.0.220:8778/\": EOF" Mar 08 00:53:43.042535 master-0 kubenswrapper[23041]: I0308 00:53:43.038323 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-675ba-volume-lvm-iscsi-0" 
event={"ID":"d2a901bb-d3c4-4ca3-a40c-1de98eb519ad","Type":"ContainerStarted","Data":"fa012cdd078376de0954aa24b208a8dfd341efdb668b068aa91eccd81b94c1c4"} Mar 08 00:53:43.056258 master-0 kubenswrapper[23041]: I0308 00:53:43.055643 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-675ba-backup-0" event={"ID":"ac5776c8-7160-4f5e-a858-62ad58cde104","Type":"ContainerStarted","Data":"da4b805b96490a77336717a274ce7286f0a34c671c97a2caaef205cfcb237212"} Mar 08 00:53:43.098277 master-0 kubenswrapper[23041]: I0308 00:53:43.098152 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-675ba-scheduler-0" podStartSLOduration=5.098130939 podStartE2EDuration="5.098130939s" podCreationTimestamp="2026-03-08 00:53:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:53:43.089619001 +0000 UTC m=+1328.562455555" watchObservedRunningTime="2026-03-08 00:53:43.098130939 +0000 UTC m=+1328.570967513" Mar 08 00:53:43.143346 master-0 kubenswrapper[23041]: I0308 00:53:43.136157 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-565c7fbf46-lqmmt" event={"ID":"d2898832-7b8c-416b-8a21-04c00f4b188d","Type":"ContainerStarted","Data":"e6887778d93e00edd1b282a48908f2cbb5e191d0ea3beb5a4019061bd675a2a2"} Mar 08 00:53:43.177229 master-0 kubenswrapper[23041]: I0308 00:53:43.176998 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-675ba-volume-lvm-iscsi-0" podStartSLOduration=5.176977626 podStartE2EDuration="5.176977626s" podCreationTimestamp="2026-03-08 00:53:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:53:43.151305799 +0000 UTC m=+1328.624142373" watchObservedRunningTime="2026-03-08 00:53:43.176977626 +0000 UTC m=+1328.649814180" Mar 08 00:53:43.552917 master-0 
kubenswrapper[23041]: I0308 00:53:43.486403 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bf78b7-cqc9l" Mar 08 00:53:43.845244 master-0 kubenswrapper[23041]: I0308 00:53:43.834953 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-846459fb55-9x6r8"] Mar 08 00:53:43.918848 master-0 kubenswrapper[23041]: I0308 00:53:43.916509 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-846459fb55-9x6r8" podUID="4528bfc0-76dc-47be-b4e0-cfddeb378c94" containerName="dnsmasq-dns" containerID="cri-o://d2ef1dc98f27539292343db5442637e3e74e48cd923adeb67381f39e1874f2d9" gracePeriod=10 Mar 08 00:53:44.149736 master-0 kubenswrapper[23041]: I0308 00:53:44.149687 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-neutron-agent-7dffdc6989-dw4bq" Mar 08 00:53:44.155492 master-0 kubenswrapper[23041]: I0308 00:53:44.155442 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-657ddbd5bb-fdfgw" event={"ID":"85f7cb75-9466-47eb-bd3a-da17df2b5c2a","Type":"ContainerStarted","Data":"a2890c067aa2cff161b9ee73fb8460cb85ce51beb20450049a1cf9e6debd7b96"} Mar 08 00:53:44.156487 master-0 kubenswrapper[23041]: I0308 00:53:44.156459 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-657ddbd5bb-fdfgw" Mar 08 00:53:44.159912 master-0 kubenswrapper[23041]: I0308 00:53:44.159876 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-565c7fbf46-lqmmt" event={"ID":"d2898832-7b8c-416b-8a21-04c00f4b188d","Type":"ContainerStarted","Data":"d9e41c44237d4a3a0c44f1e52234b3a15e3d480c7d886d386d37592e273a04e1"} Mar 08 00:53:44.160070 master-0 kubenswrapper[23041]: I0308 00:53:44.160057 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-565c7fbf46-lqmmt" Mar 08 00:53:44.172777 master-0 kubenswrapper[23041]: I0308 
00:53:44.171667 23041 generic.go:334] "Generic (PLEG): container finished" podID="dda659a2-1e52-4b13-8b9d-401d3fcaf800" containerID="360ea98e790bef4d537b7b519f38ac7d02200079fce92bb8aa2572a221740ebf" exitCode=143 Mar 08 00:53:44.172777 master-0 kubenswrapper[23041]: I0308 00:53:44.172720 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76cc655964-lxxvl" event={"ID":"dda659a2-1e52-4b13-8b9d-401d3fcaf800","Type":"ContainerDied","Data":"360ea98e790bef4d537b7b519f38ac7d02200079fce92bb8aa2572a221740ebf"} Mar 08 00:53:44.197226 master-0 kubenswrapper[23041]: I0308 00:53:44.197156 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-77c9977ddd-2q2jp" Mar 08 00:53:44.219530 master-0 kubenswrapper[23041]: I0308 00:53:44.218737 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-675ba-scheduler-0" Mar 08 00:53:44.288760 master-0 kubenswrapper[23041]: I0308 00:53:44.288688 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-657ddbd5bb-fdfgw" podStartSLOduration=8.218579498 podStartE2EDuration="12.288662943s" podCreationTimestamp="2026-03-08 00:53:32 +0000 UTC" firstStartedPulling="2026-03-08 00:53:34.60199937 +0000 UTC m=+1320.074835924" lastFinishedPulling="2026-03-08 00:53:38.672082815 +0000 UTC m=+1324.144919369" observedRunningTime="2026-03-08 00:53:44.243828207 +0000 UTC m=+1329.716664771" watchObservedRunningTime="2026-03-08 00:53:44.288662943 +0000 UTC m=+1329.761499497" Mar 08 00:53:44.389053 master-0 kubenswrapper[23041]: I0308 00:53:44.388000 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-675ba-volume-lvm-iscsi-0" Mar 08 00:53:44.514238 master-0 kubenswrapper[23041]: I0308 00:53:44.508630 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-565c7fbf46-lqmmt" podStartSLOduration=8.508602248 
podStartE2EDuration="8.508602248s" podCreationTimestamp="2026-03-08 00:53:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:53:44.386996126 +0000 UTC m=+1329.859832690" watchObservedRunningTime="2026-03-08 00:53:44.508602248 +0000 UTC m=+1329.981438822" Mar 08 00:53:45.209011 master-0 kubenswrapper[23041]: I0308 00:53:45.208945 23041 generic.go:334] "Generic (PLEG): container finished" podID="4528bfc0-76dc-47be-b4e0-cfddeb378c94" containerID="d2ef1dc98f27539292343db5442637e3e74e48cd923adeb67381f39e1874f2d9" exitCode=0 Mar 08 00:53:45.209769 master-0 kubenswrapper[23041]: I0308 00:53:45.209040 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-846459fb55-9x6r8" event={"ID":"4528bfc0-76dc-47be-b4e0-cfddeb378c94","Type":"ContainerDied","Data":"d2ef1dc98f27539292343db5442637e3e74e48cd923adeb67381f39e1874f2d9"} Mar 08 00:53:45.252734 master-0 kubenswrapper[23041]: I0308 00:53:45.252490 23041 generic.go:334] "Generic (PLEG): container finished" podID="a94dba9c-1e25-42ed-b30a-d278979d1de9" containerID="980e0193fca57a50a8a4c6a02259bd0293fe297a9746c1f73921a999809f92ba" exitCode=1 Mar 08 00:53:45.252734 master-0 kubenswrapper[23041]: I0308 00:53:45.252611 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-7dffdc6989-dw4bq" event={"ID":"a94dba9c-1e25-42ed-b30a-d278979d1de9","Type":"ContainerDied","Data":"980e0193fca57a50a8a4c6a02259bd0293fe297a9746c1f73921a999809f92ba"} Mar 08 00:53:45.253516 master-0 kubenswrapper[23041]: I0308 00:53:45.253441 23041 scope.go:117] "RemoveContainer" containerID="980e0193fca57a50a8a4c6a02259bd0293fe297a9746c1f73921a999809f92ba" Mar 08 00:53:45.282266 master-0 kubenswrapper[23041]: I0308 00:53:45.273251 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-675ba-backup-0" 
event={"ID":"ac5776c8-7160-4f5e-a858-62ad58cde104","Type":"ContainerStarted","Data":"1c193faf2872380337c4504b69e5bd438daec191b46eaa030e7fa7dc48fe4d63"} Mar 08 00:53:45.294523 master-0 kubenswrapper[23041]: I0308 00:53:45.294084 23041 generic.go:334] "Generic (PLEG): container finished" podID="5fd31740-3478-41e5-8295-d4b50f40db04" containerID="ec3fec2d0dc741640034e8a245e707a0ac2ee728609cdd2bafad4e52abf0f483" exitCode=0 Mar 08 00:53:45.297467 master-0 kubenswrapper[23041]: I0308 00:53:45.296370 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"5fd31740-3478-41e5-8295-d4b50f40db04","Type":"ContainerDied","Data":"ec3fec2d0dc741640034e8a245e707a0ac2ee728609cdd2bafad4e52abf0f483"} Mar 08 00:53:45.378268 master-0 kubenswrapper[23041]: I0308 00:53:45.374302 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-675ba-backup-0" podStartSLOduration=6.374281094 podStartE2EDuration="6.374281094s" podCreationTimestamp="2026-03-08 00:53:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:53:45.309411138 +0000 UTC m=+1330.782247682" watchObservedRunningTime="2026-03-08 00:53:45.374281094 +0000 UTC m=+1330.847117648" Mar 08 00:53:45.450228 master-0 kubenswrapper[23041]: I0308 00:53:45.446479 23041 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-675ba-api-0" podUID="946a447d-964c-4693-8923-b712bcc9904c" containerName="cinder-api" probeResult="failure" output="Get \"https://10.128.0.230:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 00:53:45.458229 master-0 kubenswrapper[23041]: I0308 00:53:45.450914 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-846459fb55-9x6r8" Mar 08 00:53:45.561309 master-0 kubenswrapper[23041]: I0308 00:53:45.544479 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4528bfc0-76dc-47be-b4e0-cfddeb378c94-config\") pod \"4528bfc0-76dc-47be-b4e0-cfddeb378c94\" (UID: \"4528bfc0-76dc-47be-b4e0-cfddeb378c94\") " Mar 08 00:53:45.561309 master-0 kubenswrapper[23041]: I0308 00:53:45.544568 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqhws\" (UniqueName: \"kubernetes.io/projected/4528bfc0-76dc-47be-b4e0-cfddeb378c94-kube-api-access-rqhws\") pod \"4528bfc0-76dc-47be-b4e0-cfddeb378c94\" (UID: \"4528bfc0-76dc-47be-b4e0-cfddeb378c94\") " Mar 08 00:53:45.561309 master-0 kubenswrapper[23041]: I0308 00:53:45.544681 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4528bfc0-76dc-47be-b4e0-cfddeb378c94-dns-swift-storage-0\") pod \"4528bfc0-76dc-47be-b4e0-cfddeb378c94\" (UID: \"4528bfc0-76dc-47be-b4e0-cfddeb378c94\") " Mar 08 00:53:45.561309 master-0 kubenswrapper[23041]: I0308 00:53:45.544726 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4528bfc0-76dc-47be-b4e0-cfddeb378c94-ovsdbserver-nb\") pod \"4528bfc0-76dc-47be-b4e0-cfddeb378c94\" (UID: \"4528bfc0-76dc-47be-b4e0-cfddeb378c94\") " Mar 08 00:53:45.561309 master-0 kubenswrapper[23041]: I0308 00:53:45.544803 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4528bfc0-76dc-47be-b4e0-cfddeb378c94-ovsdbserver-sb\") pod \"4528bfc0-76dc-47be-b4e0-cfddeb378c94\" (UID: \"4528bfc0-76dc-47be-b4e0-cfddeb378c94\") " Mar 08 00:53:45.561309 master-0 kubenswrapper[23041]: I0308 00:53:45.544922 23041 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4528bfc0-76dc-47be-b4e0-cfddeb378c94-dns-svc\") pod \"4528bfc0-76dc-47be-b4e0-cfddeb378c94\" (UID: \"4528bfc0-76dc-47be-b4e0-cfddeb378c94\") " Mar 08 00:53:45.574502 master-0 kubenswrapper[23041]: I0308 00:53:45.568635 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4528bfc0-76dc-47be-b4e0-cfddeb378c94-kube-api-access-rqhws" (OuterVolumeSpecName: "kube-api-access-rqhws") pod "4528bfc0-76dc-47be-b4e0-cfddeb378c94" (UID: "4528bfc0-76dc-47be-b4e0-cfddeb378c94"). InnerVolumeSpecName "kube-api-access-rqhws". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:53:45.656800 master-0 kubenswrapper[23041]: I0308 00:53:45.648054 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqhws\" (UniqueName: \"kubernetes.io/projected/4528bfc0-76dc-47be-b4e0-cfddeb378c94-kube-api-access-rqhws\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:45.669821 master-0 kubenswrapper[23041]: I0308 00:53:45.669759 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4528bfc0-76dc-47be-b4e0-cfddeb378c94-config" (OuterVolumeSpecName: "config") pod "4528bfc0-76dc-47be-b4e0-cfddeb378c94" (UID: "4528bfc0-76dc-47be-b4e0-cfddeb378c94"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:53:45.711177 master-0 kubenswrapper[23041]: I0308 00:53:45.711124 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4528bfc0-76dc-47be-b4e0-cfddeb378c94-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4528bfc0-76dc-47be-b4e0-cfddeb378c94" (UID: "4528bfc0-76dc-47be-b4e0-cfddeb378c94"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:53:45.762227 master-0 kubenswrapper[23041]: I0308 00:53:45.753614 23041 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4528bfc0-76dc-47be-b4e0-cfddeb378c94-config\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:45.762227 master-0 kubenswrapper[23041]: I0308 00:53:45.753653 23041 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4528bfc0-76dc-47be-b4e0-cfddeb378c94-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:45.778245 master-0 kubenswrapper[23041]: I0308 00:53:45.770215 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4528bfc0-76dc-47be-b4e0-cfddeb378c94-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4528bfc0-76dc-47be-b4e0-cfddeb378c94" (UID: "4528bfc0-76dc-47be-b4e0-cfddeb378c94"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:53:45.794228 master-0 kubenswrapper[23041]: I0308 00:53:45.792441 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 08 00:53:45.794228 master-0 kubenswrapper[23041]: E0308 00:53:45.793046 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4528bfc0-76dc-47be-b4e0-cfddeb378c94" containerName="init" Mar 08 00:53:45.794228 master-0 kubenswrapper[23041]: I0308 00:53:45.793060 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="4528bfc0-76dc-47be-b4e0-cfddeb378c94" containerName="init" Mar 08 00:53:45.794228 master-0 kubenswrapper[23041]: E0308 00:53:45.793076 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4528bfc0-76dc-47be-b4e0-cfddeb378c94" containerName="dnsmasq-dns" Mar 08 00:53:45.794228 master-0 kubenswrapper[23041]: I0308 00:53:45.793082 23041 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4528bfc0-76dc-47be-b4e0-cfddeb378c94" containerName="dnsmasq-dns" Mar 08 00:53:45.794228 master-0 kubenswrapper[23041]: I0308 00:53:45.793331 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="4528bfc0-76dc-47be-b4e0-cfddeb378c94" containerName="dnsmasq-dns" Mar 08 00:53:45.831705 master-0 kubenswrapper[23041]: I0308 00:53:45.827028 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 08 00:53:45.831705 master-0 kubenswrapper[23041]: I0308 00:53:45.828337 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 08 00:53:45.832363 master-0 kubenswrapper[23041]: I0308 00:53:45.832319 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 08 00:53:45.832420 master-0 kubenswrapper[23041]: I0308 00:53:45.832379 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 08 00:53:45.856232 master-0 kubenswrapper[23041]: I0308 00:53:45.855539 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhm9d\" (UniqueName: \"kubernetes.io/projected/7de414ad-fdb9-4bfb-a953-439cbf3fb81e-kube-api-access-mhm9d\") pod \"openstackclient\" (UID: \"7de414ad-fdb9-4bfb-a953-439cbf3fb81e\") " pod="openstack/openstackclient" Mar 08 00:53:45.856232 master-0 kubenswrapper[23041]: I0308 00:53:45.855594 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7de414ad-fdb9-4bfb-a953-439cbf3fb81e-openstack-config-secret\") pod \"openstackclient\" (UID: \"7de414ad-fdb9-4bfb-a953-439cbf3fb81e\") " pod="openstack/openstackclient" Mar 08 00:53:45.856232 master-0 kubenswrapper[23041]: I0308 00:53:45.855665 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7de414ad-fdb9-4bfb-a953-439cbf3fb81e-openstack-config\") pod \"openstackclient\" (UID: \"7de414ad-fdb9-4bfb-a953-439cbf3fb81e\") " pod="openstack/openstackclient" Mar 08 00:53:45.856232 master-0 kubenswrapper[23041]: I0308 00:53:45.855750 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7de414ad-fdb9-4bfb-a953-439cbf3fb81e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7de414ad-fdb9-4bfb-a953-439cbf3fb81e\") " pod="openstack/openstackclient" Mar 08 00:53:45.856232 master-0 kubenswrapper[23041]: I0308 00:53:45.855910 23041 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4528bfc0-76dc-47be-b4e0-cfddeb378c94-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:45.864139 master-0 kubenswrapper[23041]: I0308 00:53:45.863746 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4528bfc0-76dc-47be-b4e0-cfddeb378c94-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4528bfc0-76dc-47be-b4e0-cfddeb378c94" (UID: "4528bfc0-76dc-47be-b4e0-cfddeb378c94"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:53:45.892499 master-0 kubenswrapper[23041]: I0308 00:53:45.891351 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4528bfc0-76dc-47be-b4e0-cfddeb378c94-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4528bfc0-76dc-47be-b4e0-cfddeb378c94" (UID: "4528bfc0-76dc-47be-b4e0-cfddeb378c94"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:53:45.959339 master-0 kubenswrapper[23041]: I0308 00:53:45.957624 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhm9d\" (UniqueName: \"kubernetes.io/projected/7de414ad-fdb9-4bfb-a953-439cbf3fb81e-kube-api-access-mhm9d\") pod \"openstackclient\" (UID: \"7de414ad-fdb9-4bfb-a953-439cbf3fb81e\") " pod="openstack/openstackclient" Mar 08 00:53:45.959339 master-0 kubenswrapper[23041]: I0308 00:53:45.957708 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7de414ad-fdb9-4bfb-a953-439cbf3fb81e-openstack-config-secret\") pod \"openstackclient\" (UID: \"7de414ad-fdb9-4bfb-a953-439cbf3fb81e\") " pod="openstack/openstackclient" Mar 08 00:53:45.959339 master-0 kubenswrapper[23041]: I0308 00:53:45.957780 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7de414ad-fdb9-4bfb-a953-439cbf3fb81e-openstack-config\") pod \"openstackclient\" (UID: \"7de414ad-fdb9-4bfb-a953-439cbf3fb81e\") " pod="openstack/openstackclient" Mar 08 00:53:45.959339 master-0 kubenswrapper[23041]: I0308 00:53:45.957867 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7de414ad-fdb9-4bfb-a953-439cbf3fb81e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7de414ad-fdb9-4bfb-a953-439cbf3fb81e\") " pod="openstack/openstackclient" Mar 08 00:53:45.959339 master-0 kubenswrapper[23041]: I0308 00:53:45.958026 23041 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4528bfc0-76dc-47be-b4e0-cfddeb378c94-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:45.959339 master-0 kubenswrapper[23041]: I0308 00:53:45.958042 23041 reconciler_common.go:293] 
"Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4528bfc0-76dc-47be-b4e0-cfddeb378c94-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:45.961445 master-0 kubenswrapper[23041]: I0308 00:53:45.961395 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7de414ad-fdb9-4bfb-a953-439cbf3fb81e-openstack-config-secret\") pod \"openstackclient\" (UID: \"7de414ad-fdb9-4bfb-a953-439cbf3fb81e\") " pod="openstack/openstackclient" Mar 08 00:53:45.961841 master-0 kubenswrapper[23041]: I0308 00:53:45.961807 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7de414ad-fdb9-4bfb-a953-439cbf3fb81e-openstack-config\") pod \"openstackclient\" (UID: \"7de414ad-fdb9-4bfb-a953-439cbf3fb81e\") " pod="openstack/openstackclient" Mar 08 00:53:45.980226 master-0 kubenswrapper[23041]: I0308 00:53:45.978683 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7de414ad-fdb9-4bfb-a953-439cbf3fb81e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7de414ad-fdb9-4bfb-a953-439cbf3fb81e\") " pod="openstack/openstackclient" Mar 08 00:53:46.011017 master-0 kubenswrapper[23041]: I0308 00:53:46.010940 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhm9d\" (UniqueName: \"kubernetes.io/projected/7de414ad-fdb9-4bfb-a953-439cbf3fb81e-kube-api-access-mhm9d\") pod \"openstackclient\" (UID: \"7de414ad-fdb9-4bfb-a953-439cbf3fb81e\") " pod="openstack/openstackclient" Mar 08 00:53:46.155726 master-0 kubenswrapper[23041]: I0308 00:53:46.152356 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 08 00:53:46.355075 master-0 kubenswrapper[23041]: I0308 00:53:46.355020 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-846459fb55-9x6r8" event={"ID":"4528bfc0-76dc-47be-b4e0-cfddeb378c94","Type":"ContainerDied","Data":"d517b3c8ecd709447d94315b13c29e02c13f772acaeeeaafcb6cacac4846b82f"} Mar 08 00:53:46.355075 master-0 kubenswrapper[23041]: I0308 00:53:46.355079 23041 scope.go:117] "RemoveContainer" containerID="d2ef1dc98f27539292343db5442637e3e74e48cd923adeb67381f39e1874f2d9" Mar 08 00:53:46.355739 master-0 kubenswrapper[23041]: I0308 00:53:46.355208 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-846459fb55-9x6r8" Mar 08 00:53:46.395157 master-0 kubenswrapper[23041]: I0308 00:53:46.386144 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-7dffdc6989-dw4bq" event={"ID":"a94dba9c-1e25-42ed-b30a-d278979d1de9","Type":"ContainerStarted","Data":"4792fdf65f1907ff7e2565afb3a964e9ef62317dd71687a74905d3610b602a64"} Mar 08 00:53:46.395157 master-0 kubenswrapper[23041]: I0308 00:53:46.387524 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-7dffdc6989-dw4bq" Mar 08 00:53:46.412644 master-0 kubenswrapper[23041]: I0308 00:53:46.411817 23041 generic.go:334] "Generic (PLEG): container finished" podID="85f7cb75-9466-47eb-bd3a-da17df2b5c2a" containerID="a2890c067aa2cff161b9ee73fb8460cb85ce51beb20450049a1cf9e6debd7b96" exitCode=1 Mar 08 00:53:46.427353 master-0 kubenswrapper[23041]: I0308 00:53:46.413138 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-657ddbd5bb-fdfgw" event={"ID":"85f7cb75-9466-47eb-bd3a-da17df2b5c2a","Type":"ContainerDied","Data":"a2890c067aa2cff161b9ee73fb8460cb85ce51beb20450049a1cf9e6debd7b96"} Mar 08 00:53:46.427353 master-0 kubenswrapper[23041]: I0308 00:53:46.414041 23041 
scope.go:117] "RemoveContainer" containerID="a2890c067aa2cff161b9ee73fb8460cb85ce51beb20450049a1cf9e6debd7b96" Mar 08 00:53:46.494234 master-0 kubenswrapper[23041]: I0308 00:53:46.486233 23041 scope.go:117] "RemoveContainer" containerID="6378cbc250d89163b19065cb86a7c55b962ad4b72fb574257d448043fb441b78" Mar 08 00:53:46.649011 master-0 kubenswrapper[23041]: I0308 00:53:46.648816 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-846459fb55-9x6r8"] Mar 08 00:53:46.707303 master-0 kubenswrapper[23041]: I0308 00:53:46.686772 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-846459fb55-9x6r8"] Mar 08 00:53:46.830314 master-0 kubenswrapper[23041]: I0308 00:53:46.830224 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4528bfc0-76dc-47be-b4e0-cfddeb378c94" path="/var/lib/kubelet/pods/4528bfc0-76dc-47be-b4e0-cfddeb378c94/volumes" Mar 08 00:53:46.833831 master-0 kubenswrapper[23041]: I0308 00:53:46.830947 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 08 00:53:47.466674 master-0 kubenswrapper[23041]: I0308 00:53:47.465509 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"7de414ad-fdb9-4bfb-a953-439cbf3fb81e","Type":"ContainerStarted","Data":"98bb4a84303af7ccd77775a8b23520cde1ca911733f73324e09aaead19bde656"} Mar 08 00:53:47.472495 master-0 kubenswrapper[23041]: I0308 00:53:47.470176 23041 generic.go:334] "Generic (PLEG): container finished" podID="85f7cb75-9466-47eb-bd3a-da17df2b5c2a" containerID="ee23f3e7fd35bb7254573240107509f760785253c3278ef6d8405718c503d038" exitCode=1 Mar 08 00:53:47.472760 master-0 kubenswrapper[23041]: I0308 00:53:47.471600 23041 scope.go:117] "RemoveContainer" containerID="ee23f3e7fd35bb7254573240107509f760785253c3278ef6d8405718c503d038" Mar 08 00:53:47.472760 master-0 kubenswrapper[23041]: I0308 00:53:47.472434 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ironic-657ddbd5bb-fdfgw" event={"ID":"85f7cb75-9466-47eb-bd3a-da17df2b5c2a","Type":"ContainerDied","Data":"ee23f3e7fd35bb7254573240107509f760785253c3278ef6d8405718c503d038"} Mar 08 00:53:47.472760 master-0 kubenswrapper[23041]: I0308 00:53:47.472736 23041 scope.go:117] "RemoveContainer" containerID="a2890c067aa2cff161b9ee73fb8460cb85ce51beb20450049a1cf9e6debd7b96" Mar 08 00:53:47.474484 master-0 kubenswrapper[23041]: E0308 00:53:47.473431 23041 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-api pod=ironic-657ddbd5bb-fdfgw_openstack(85f7cb75-9466-47eb-bd3a-da17df2b5c2a)\"" pod="openstack/ironic-657ddbd5bb-fdfgw" podUID="85f7cb75-9466-47eb-bd3a-da17df2b5c2a" Mar 08 00:53:47.634915 master-0 kubenswrapper[23041]: I0308 00:53:47.634365 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-db-sync-sksvm"] Mar 08 00:53:47.637883 master-0 kubenswrapper[23041]: I0308 00:53:47.637830 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-db-sync-sksvm"
Mar 08 00:53:47.640916 master-0 kubenswrapper[23041]: I0308 00:53:47.640865 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-scripts"
Mar 08 00:53:47.642757 master-0 kubenswrapper[23041]: I0308 00:53:47.641430 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-config-data"
Mar 08 00:53:47.675614 master-0 kubenswrapper[23041]: I0308 00:53:47.675557 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-sync-sksvm"]
Mar 08 00:53:47.736115 master-0 kubenswrapper[23041]: I0308 00:53:47.735632 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/33d52628-a63b-48e6-ac86-d8df7b20a8e9-config\") pod \"ironic-inspector-db-sync-sksvm\" (UID: \"33d52628-a63b-48e6-ac86-d8df7b20a8e9\") " pod="openstack/ironic-inspector-db-sync-sksvm"
Mar 08 00:53:47.736115 master-0 kubenswrapper[23041]: I0308 00:53:47.735744 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zg9z\" (UniqueName: \"kubernetes.io/projected/33d52628-a63b-48e6-ac86-d8df7b20a8e9-kube-api-access-7zg9z\") pod \"ironic-inspector-db-sync-sksvm\" (UID: \"33d52628-a63b-48e6-ac86-d8df7b20a8e9\") " pod="openstack/ironic-inspector-db-sync-sksvm"
Mar 08 00:53:47.736115 master-0 kubenswrapper[23041]: I0308 00:53:47.735845 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/33d52628-a63b-48e6-ac86-d8df7b20a8e9-var-lib-ironic\") pod \"ironic-inspector-db-sync-sksvm\" (UID: \"33d52628-a63b-48e6-ac86-d8df7b20a8e9\") " pod="openstack/ironic-inspector-db-sync-sksvm"
Mar 08 00:53:47.736115 master-0 kubenswrapper[23041]: I0308 00:53:47.735923 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33d52628-a63b-48e6-ac86-d8df7b20a8e9-scripts\") pod \"ironic-inspector-db-sync-sksvm\" (UID: \"33d52628-a63b-48e6-ac86-d8df7b20a8e9\") " pod="openstack/ironic-inspector-db-sync-sksvm"
Mar 08 00:53:47.736115 master-0 kubenswrapper[23041]: I0308 00:53:47.735994 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33d52628-a63b-48e6-ac86-d8df7b20a8e9-combined-ca-bundle\") pod \"ironic-inspector-db-sync-sksvm\" (UID: \"33d52628-a63b-48e6-ac86-d8df7b20a8e9\") " pod="openstack/ironic-inspector-db-sync-sksvm"
Mar 08 00:53:47.736704 master-0 kubenswrapper[23041]: I0308 00:53:47.736401 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/33d52628-a63b-48e6-ac86-d8df7b20a8e9-etc-podinfo\") pod \"ironic-inspector-db-sync-sksvm\" (UID: \"33d52628-a63b-48e6-ac86-d8df7b20a8e9\") " pod="openstack/ironic-inspector-db-sync-sksvm"
Mar 08 00:53:47.736704 master-0 kubenswrapper[23041]: I0308 00:53:47.736500 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/33d52628-a63b-48e6-ac86-d8df7b20a8e9-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-db-sync-sksvm\" (UID: \"33d52628-a63b-48e6-ac86-d8df7b20a8e9\") " pod="openstack/ironic-inspector-db-sync-sksvm"
Mar 08 00:53:47.838781 master-0 kubenswrapper[23041]: I0308 00:53:47.838665 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/33d52628-a63b-48e6-ac86-d8df7b20a8e9-etc-podinfo\") pod \"ironic-inspector-db-sync-sksvm\" (UID: \"33d52628-a63b-48e6-ac86-d8df7b20a8e9\") " pod="openstack/ironic-inspector-db-sync-sksvm"
Mar 08 00:53:47.839525 master-0 kubenswrapper[23041]: I0308 00:53:47.839466 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/33d52628-a63b-48e6-ac86-d8df7b20a8e9-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-db-sync-sksvm\" (UID: \"33d52628-a63b-48e6-ac86-d8df7b20a8e9\") " pod="openstack/ironic-inspector-db-sync-sksvm"
Mar 08 00:53:47.839759 master-0 kubenswrapper[23041]: I0308 00:53:47.839674 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/33d52628-a63b-48e6-ac86-d8df7b20a8e9-config\") pod \"ironic-inspector-db-sync-sksvm\" (UID: \"33d52628-a63b-48e6-ac86-d8df7b20a8e9\") " pod="openstack/ironic-inspector-db-sync-sksvm"
Mar 08 00:53:47.840120 master-0 kubenswrapper[23041]: I0308 00:53:47.840050 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zg9z\" (UniqueName: \"kubernetes.io/projected/33d52628-a63b-48e6-ac86-d8df7b20a8e9-kube-api-access-7zg9z\") pod \"ironic-inspector-db-sync-sksvm\" (UID: \"33d52628-a63b-48e6-ac86-d8df7b20a8e9\") " pod="openstack/ironic-inspector-db-sync-sksvm"
Mar 08 00:53:47.843237 master-0 kubenswrapper[23041]: I0308 00:53:47.842517 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/33d52628-a63b-48e6-ac86-d8df7b20a8e9-etc-podinfo\") pod \"ironic-inspector-db-sync-sksvm\" (UID: \"33d52628-a63b-48e6-ac86-d8df7b20a8e9\") " pod="openstack/ironic-inspector-db-sync-sksvm"
Mar 08 00:53:47.843237 master-0 kubenswrapper[23041]: I0308 00:53:47.842805 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/33d52628-a63b-48e6-ac86-d8df7b20a8e9-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-db-sync-sksvm\" (UID: \"33d52628-a63b-48e6-ac86-d8df7b20a8e9\") " pod="openstack/ironic-inspector-db-sync-sksvm"
Mar 08 00:53:47.846333 master-0 kubenswrapper[23041]: I0308 00:53:47.846198 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/33d52628-a63b-48e6-ac86-d8df7b20a8e9-config\") pod \"ironic-inspector-db-sync-sksvm\" (UID: \"33d52628-a63b-48e6-ac86-d8df7b20a8e9\") " pod="openstack/ironic-inspector-db-sync-sksvm"
Mar 08 00:53:47.847637 master-0 kubenswrapper[23041]: I0308 00:53:47.847566 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/33d52628-a63b-48e6-ac86-d8df7b20a8e9-var-lib-ironic\") pod \"ironic-inspector-db-sync-sksvm\" (UID: \"33d52628-a63b-48e6-ac86-d8df7b20a8e9\") " pod="openstack/ironic-inspector-db-sync-sksvm"
Mar 08 00:53:47.848009 master-0 kubenswrapper[23041]: I0308 00:53:47.847772 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33d52628-a63b-48e6-ac86-d8df7b20a8e9-scripts\") pod \"ironic-inspector-db-sync-sksvm\" (UID: \"33d52628-a63b-48e6-ac86-d8df7b20a8e9\") " pod="openstack/ironic-inspector-db-sync-sksvm"
Mar 08 00:53:47.848009 master-0 kubenswrapper[23041]: I0308 00:53:47.847935 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33d52628-a63b-48e6-ac86-d8df7b20a8e9-combined-ca-bundle\") pod \"ironic-inspector-db-sync-sksvm\" (UID: \"33d52628-a63b-48e6-ac86-d8df7b20a8e9\") " pod="openstack/ironic-inspector-db-sync-sksvm"
Mar 08 00:53:47.848009 master-0 kubenswrapper[23041]: I0308 00:53:47.847962 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/33d52628-a63b-48e6-ac86-d8df7b20a8e9-var-lib-ironic\") pod \"ironic-inspector-db-sync-sksvm\" (UID: \"33d52628-a63b-48e6-ac86-d8df7b20a8e9\") " pod="openstack/ironic-inspector-db-sync-sksvm"
Mar 08 00:53:47.853727 master-0 kubenswrapper[23041]: I0308 00:53:47.853687 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33d52628-a63b-48e6-ac86-d8df7b20a8e9-scripts\") pod \"ironic-inspector-db-sync-sksvm\" (UID: \"33d52628-a63b-48e6-ac86-d8df7b20a8e9\") " pod="openstack/ironic-inspector-db-sync-sksvm"
Mar 08 00:53:47.886862 master-0 kubenswrapper[23041]: I0308 00:53:47.886777 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zg9z\" (UniqueName: \"kubernetes.io/projected/33d52628-a63b-48e6-ac86-d8df7b20a8e9-kube-api-access-7zg9z\") pod \"ironic-inspector-db-sync-sksvm\" (UID: \"33d52628-a63b-48e6-ac86-d8df7b20a8e9\") " pod="openstack/ironic-inspector-db-sync-sksvm"
Mar 08 00:53:47.888103 master-0 kubenswrapper[23041]: I0308 00:53:47.887654 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33d52628-a63b-48e6-ac86-d8df7b20a8e9-combined-ca-bundle\") pod \"ironic-inspector-db-sync-sksvm\" (UID: \"33d52628-a63b-48e6-ac86-d8df7b20a8e9\") " pod="openstack/ironic-inspector-db-sync-sksvm"
Mar 08 00:53:48.020674 master-0 kubenswrapper[23041]: I0308 00:53:48.019346 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-sync-sksvm"
Mar 08 00:53:48.418320 master-0 kubenswrapper[23041]: I0308 00:53:48.416316 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-78756bd8-c6jzz"
Mar 08 00:53:48.445249 master-0 kubenswrapper[23041]: I0308 00:53:48.442975 23041 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ironic-657ddbd5bb-fdfgw"
Mar 08 00:53:48.445249 master-0 kubenswrapper[23041]: I0308 00:53:48.443062 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-657ddbd5bb-fdfgw"
Mar 08 00:53:48.488239 master-0 kubenswrapper[23041]: I0308 00:53:48.488116 23041 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-76cc655964-lxxvl" podUID="dda659a2-1e52-4b13-8b9d-401d3fcaf800" containerName="placement-api" probeResult="failure" output="Get \"https://10.128.0.220:8778/\": read tcp 10.128.0.2:36074->10.128.0.220:8778: read: connection reset by peer"
Mar 08 00:53:48.488821 master-0 kubenswrapper[23041]: I0308 00:53:48.488116 23041 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-76cc655964-lxxvl" podUID="dda659a2-1e52-4b13-8b9d-401d3fcaf800" containerName="placement-log" probeResult="failure" output="Get \"https://10.128.0.220:8778/\": read tcp 10.128.0.2:36068->10.128.0.220:8778: read: connection reset by peer"
Mar 08 00:53:48.563245 master-0 kubenswrapper[23041]: I0308 00:53:48.555721 23041 scope.go:117] "RemoveContainer" containerID="ee23f3e7fd35bb7254573240107509f760785253c3278ef6d8405718c503d038"
Mar 08 00:53:48.563245 master-0 kubenswrapper[23041]: E0308 00:53:48.555979 23041 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-api pod=ironic-657ddbd5bb-fdfgw_openstack(85f7cb75-9466-47eb-bd3a-da17df2b5c2a)\"" pod="openstack/ironic-657ddbd5bb-fdfgw" podUID="85f7cb75-9466-47eb-bd3a-da17df2b5c2a"
Mar 08 00:53:48.564281 master-0 kubenswrapper[23041]: I0308 00:53:48.564014 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-sync-sksvm"]
Mar 08 00:53:48.749651 master-0 kubenswrapper[23041]: I0308 00:53:48.746533 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-neutron-agent-7dffdc6989-dw4bq"
Mar 08 00:53:48.873020 master-0 kubenswrapper[23041]: I0308 00:53:48.872953 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-675ba-api-0"
Mar 08 00:53:49.210556 master-0 kubenswrapper[23041]: I0308 00:53:49.210307 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-76cc655964-lxxvl"
Mar 08 00:53:49.307317 master-0 kubenswrapper[23041]: I0308 00:53:49.295875 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dda659a2-1e52-4b13-8b9d-401d3fcaf800-internal-tls-certs\") pod \"dda659a2-1e52-4b13-8b9d-401d3fcaf800\" (UID: \"dda659a2-1e52-4b13-8b9d-401d3fcaf800\") "
Mar 08 00:53:49.307317 master-0 kubenswrapper[23041]: I0308 00:53:49.296101 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdrq2\" (UniqueName: \"kubernetes.io/projected/dda659a2-1e52-4b13-8b9d-401d3fcaf800-kube-api-access-cdrq2\") pod \"dda659a2-1e52-4b13-8b9d-401d3fcaf800\" (UID: \"dda659a2-1e52-4b13-8b9d-401d3fcaf800\") "
Mar 08 00:53:49.307317 master-0 kubenswrapper[23041]: I0308 00:53:49.296150 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dda659a2-1e52-4b13-8b9d-401d3fcaf800-public-tls-certs\") pod \"dda659a2-1e52-4b13-8b9d-401d3fcaf800\" (UID: \"dda659a2-1e52-4b13-8b9d-401d3fcaf800\") "
Mar 08 00:53:49.307317 master-0 kubenswrapper[23041]: I0308 00:53:49.296206 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dda659a2-1e52-4b13-8b9d-401d3fcaf800-scripts\") pod \"dda659a2-1e52-4b13-8b9d-401d3fcaf800\" (UID: \"dda659a2-1e52-4b13-8b9d-401d3fcaf800\") "
Mar 08 00:53:49.307317 master-0 kubenswrapper[23041]: I0308 00:53:49.296350 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dda659a2-1e52-4b13-8b9d-401d3fcaf800-config-data\") pod \"dda659a2-1e52-4b13-8b9d-401d3fcaf800\" (UID: \"dda659a2-1e52-4b13-8b9d-401d3fcaf800\") "
Mar 08 00:53:49.307317 master-0 kubenswrapper[23041]: I0308 00:53:49.296398 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dda659a2-1e52-4b13-8b9d-401d3fcaf800-combined-ca-bundle\") pod \"dda659a2-1e52-4b13-8b9d-401d3fcaf800\" (UID: \"dda659a2-1e52-4b13-8b9d-401d3fcaf800\") "
Mar 08 00:53:49.307317 master-0 kubenswrapper[23041]: I0308 00:53:49.296524 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dda659a2-1e52-4b13-8b9d-401d3fcaf800-logs\") pod \"dda659a2-1e52-4b13-8b9d-401d3fcaf800\" (UID: \"dda659a2-1e52-4b13-8b9d-401d3fcaf800\") "
Mar 08 00:53:49.314994 master-0 kubenswrapper[23041]: I0308 00:53:49.309915 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dda659a2-1e52-4b13-8b9d-401d3fcaf800-logs" (OuterVolumeSpecName: "logs") pod "dda659a2-1e52-4b13-8b9d-401d3fcaf800" (UID: "dda659a2-1e52-4b13-8b9d-401d3fcaf800"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:53:49.342909 master-0 kubenswrapper[23041]: I0308 00:53:49.338139 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dda659a2-1e52-4b13-8b9d-401d3fcaf800-scripts" (OuterVolumeSpecName: "scripts") pod "dda659a2-1e52-4b13-8b9d-401d3fcaf800" (UID: "dda659a2-1e52-4b13-8b9d-401d3fcaf800"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:53:49.350339 master-0 kubenswrapper[23041]: I0308 00:53:49.346815 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dda659a2-1e52-4b13-8b9d-401d3fcaf800-kube-api-access-cdrq2" (OuterVolumeSpecName: "kube-api-access-cdrq2") pod "dda659a2-1e52-4b13-8b9d-401d3fcaf800" (UID: "dda659a2-1e52-4b13-8b9d-401d3fcaf800"). InnerVolumeSpecName "kube-api-access-cdrq2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:53:49.401287 master-0 kubenswrapper[23041]: I0308 00:53:49.401073 23041 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dda659a2-1e52-4b13-8b9d-401d3fcaf800-logs\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:49.401287 master-0 kubenswrapper[23041]: I0308 00:53:49.401159 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdrq2\" (UniqueName: \"kubernetes.io/projected/dda659a2-1e52-4b13-8b9d-401d3fcaf800-kube-api-access-cdrq2\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:49.401287 master-0 kubenswrapper[23041]: I0308 00:53:49.401170 23041 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dda659a2-1e52-4b13-8b9d-401d3fcaf800-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:49.449241 master-0 kubenswrapper[23041]: I0308 00:53:49.448484 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dda659a2-1e52-4b13-8b9d-401d3fcaf800-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dda659a2-1e52-4b13-8b9d-401d3fcaf800" (UID: "dda659a2-1e52-4b13-8b9d-401d3fcaf800"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:53:49.490631 master-0 kubenswrapper[23041]: I0308 00:53:49.487532 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dda659a2-1e52-4b13-8b9d-401d3fcaf800-config-data" (OuterVolumeSpecName: "config-data") pod "dda659a2-1e52-4b13-8b9d-401d3fcaf800" (UID: "dda659a2-1e52-4b13-8b9d-401d3fcaf800"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:53:49.506280 master-0 kubenswrapper[23041]: I0308 00:53:49.504062 23041 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dda659a2-1e52-4b13-8b9d-401d3fcaf800-config-data\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:49.506280 master-0 kubenswrapper[23041]: I0308 00:53:49.504123 23041 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dda659a2-1e52-4b13-8b9d-401d3fcaf800-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:49.522971 master-0 kubenswrapper[23041]: I0308 00:53:49.522507 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dda659a2-1e52-4b13-8b9d-401d3fcaf800-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "dda659a2-1e52-4b13-8b9d-401d3fcaf800" (UID: "dda659a2-1e52-4b13-8b9d-401d3fcaf800"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:53:49.543898 master-0 kubenswrapper[23041]: I0308 00:53:49.541266 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dda659a2-1e52-4b13-8b9d-401d3fcaf800-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "dda659a2-1e52-4b13-8b9d-401d3fcaf800" (UID: "dda659a2-1e52-4b13-8b9d-401d3fcaf800"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:53:49.602056 master-0 kubenswrapper[23041]: I0308 00:53:49.601083 23041 generic.go:334] "Generic (PLEG): container finished" podID="dda659a2-1e52-4b13-8b9d-401d3fcaf800" containerID="9735e8ddc8805560fa3e97dc71eb29456ad289e49e08e0456728c29f4e068ccc" exitCode=0
Mar 08 00:53:49.602056 master-0 kubenswrapper[23041]: I0308 00:53:49.601274 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-76cc655964-lxxvl"
Mar 08 00:53:49.602281 master-0 kubenswrapper[23041]: I0308 00:53:49.602187 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76cc655964-lxxvl" event={"ID":"dda659a2-1e52-4b13-8b9d-401d3fcaf800","Type":"ContainerDied","Data":"9735e8ddc8805560fa3e97dc71eb29456ad289e49e08e0456728c29f4e068ccc"}
Mar 08 00:53:49.602281 master-0 kubenswrapper[23041]: I0308 00:53:49.602269 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76cc655964-lxxvl" event={"ID":"dda659a2-1e52-4b13-8b9d-401d3fcaf800","Type":"ContainerDied","Data":"a24542608e30dbbd4d9c68873ded34f607f87481e14cdcd9d45b5024a584b2ec"}
Mar 08 00:53:49.602366 master-0 kubenswrapper[23041]: I0308 00:53:49.602293 23041 scope.go:117] "RemoveContainer" containerID="9735e8ddc8805560fa3e97dc71eb29456ad289e49e08e0456728c29f4e068ccc"
Mar 08 00:53:49.609141 master-0 kubenswrapper[23041]: I0308 00:53:49.607214 23041 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dda659a2-1e52-4b13-8b9d-401d3fcaf800-internal-tls-certs\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:49.609141 master-0 kubenswrapper[23041]: I0308 00:53:49.607266 23041 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dda659a2-1e52-4b13-8b9d-401d3fcaf800-public-tls-certs\") on node \"master-0\" DevicePath \"\""
Mar 08 00:53:49.613818 master-0 kubenswrapper[23041]: I0308 00:53:49.613772 23041 scope.go:117] "RemoveContainer" containerID="ee23f3e7fd35bb7254573240107509f760785253c3278ef6d8405718c503d038"
Mar 08 00:53:49.614046 master-0 kubenswrapper[23041]: E0308 00:53:49.614008 23041 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-api pod=ironic-657ddbd5bb-fdfgw_openstack(85f7cb75-9466-47eb-bd3a-da17df2b5c2a)\"" pod="openstack/ironic-657ddbd5bb-fdfgw" podUID="85f7cb75-9466-47eb-bd3a-da17df2b5c2a"
Mar 08 00:53:49.614376 master-0 kubenswrapper[23041]: I0308 00:53:49.614346 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-sksvm" event={"ID":"33d52628-a63b-48e6-ac86-d8df7b20a8e9","Type":"ContainerStarted","Data":"6812e5c3e0e27f2ac34a265f93d9ec898ad60376561d3a003ef5ccc302e50d11"}
Mar 08 00:53:49.635354 master-0 kubenswrapper[23041]: I0308 00:53:49.630260 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-675ba-scheduler-0"
Mar 08 00:53:49.646886 master-0 kubenswrapper[23041]: I0308 00:53:49.646538 23041 scope.go:117] "RemoveContainer" containerID="360ea98e790bef4d537b7b519f38ac7d02200079fce92bb8aa2572a221740ebf"
Mar 08 00:53:49.722232 master-0 kubenswrapper[23041]: I0308 00:53:49.721896 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-76cc655964-lxxvl"]
Mar 08 00:53:49.746941 master-0 kubenswrapper[23041]: I0308 00:53:49.746878 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-76cc655964-lxxvl"]
Mar 08 00:53:49.776511 master-0 kubenswrapper[23041]: I0308 00:53:49.776460 23041 scope.go:117] "RemoveContainer" containerID="9735e8ddc8805560fa3e97dc71eb29456ad289e49e08e0456728c29f4e068ccc"
Mar 08 00:53:49.779562 master-0 kubenswrapper[23041]: E0308 00:53:49.777028 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9735e8ddc8805560fa3e97dc71eb29456ad289e49e08e0456728c29f4e068ccc\": container with ID starting with 9735e8ddc8805560fa3e97dc71eb29456ad289e49e08e0456728c29f4e068ccc not found: ID does not exist" containerID="9735e8ddc8805560fa3e97dc71eb29456ad289e49e08e0456728c29f4e068ccc"
Mar 08 00:53:49.779562 master-0 kubenswrapper[23041]: I0308 00:53:49.777059 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9735e8ddc8805560fa3e97dc71eb29456ad289e49e08e0456728c29f4e068ccc"} err="failed to get container status \"9735e8ddc8805560fa3e97dc71eb29456ad289e49e08e0456728c29f4e068ccc\": rpc error: code = NotFound desc = could not find container \"9735e8ddc8805560fa3e97dc71eb29456ad289e49e08e0456728c29f4e068ccc\": container with ID starting with 9735e8ddc8805560fa3e97dc71eb29456ad289e49e08e0456728c29f4e068ccc not found: ID does not exist"
Mar 08 00:53:49.779562 master-0 kubenswrapper[23041]: I0308 00:53:49.777078 23041 scope.go:117] "RemoveContainer" containerID="360ea98e790bef4d537b7b519f38ac7d02200079fce92bb8aa2572a221740ebf"
Mar 08 00:53:49.779562 master-0 kubenswrapper[23041]: E0308 00:53:49.777315 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"360ea98e790bef4d537b7b519f38ac7d02200079fce92bb8aa2572a221740ebf\": container with ID starting with 360ea98e790bef4d537b7b519f38ac7d02200079fce92bb8aa2572a221740ebf not found: ID does not exist" containerID="360ea98e790bef4d537b7b519f38ac7d02200079fce92bb8aa2572a221740ebf"
Mar 08 00:53:49.779562 master-0 kubenswrapper[23041]: I0308 00:53:49.777332 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"360ea98e790bef4d537b7b519f38ac7d02200079fce92bb8aa2572a221740ebf"} err="failed to get container status \"360ea98e790bef4d537b7b519f38ac7d02200079fce92bb8aa2572a221740ebf\": rpc error: code = NotFound desc = could not find container \"360ea98e790bef4d537b7b519f38ac7d02200079fce92bb8aa2572a221740ebf\": container with ID starting with 360ea98e790bef4d537b7b519f38ac7d02200079fce92bb8aa2572a221740ebf not found: ID does not exist"
Mar 08 00:53:49.823703 master-0 kubenswrapper[23041]: I0308 00:53:49.823631 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-675ba-volume-lvm-iscsi-0"
Mar 08 00:53:49.959889 master-0 kubenswrapper[23041]: I0308 00:53:49.959720 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-675ba-backup-0"
Mar 08 00:53:49.974547 master-0 kubenswrapper[23041]: I0308 00:53:49.974489 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-565c7fbf46-lqmmt"
Mar 08 00:53:50.083278 master-0 kubenswrapper[23041]: I0308 00:53:50.077321 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-657ddbd5bb-fdfgw"]
Mar 08 00:53:50.298237 master-0 kubenswrapper[23041]: I0308 00:53:50.295152 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-675ba-backup-0"
Mar 08 00:53:50.644401 master-0 kubenswrapper[23041]: I0308 00:53:50.643850 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ironic-657ddbd5bb-fdfgw" podUID="85f7cb75-9466-47eb-bd3a-da17df2b5c2a" containerName="ironic-api-log" containerID="cri-o://0b73dc3da2facaad884834e4e3d982f5a7e048f7f84409b02726bbee41d64a4f" gracePeriod=60
Mar 08 00:53:50.842441 master-0 kubenswrapper[23041]: I0308 00:53:50.842258 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dda659a2-1e52-4b13-8b9d-401d3fcaf800" path="/var/lib/kubelet/pods/dda659a2-1e52-4b13-8b9d-401d3fcaf800/volumes"
Mar 08 00:53:51.662252 master-0 kubenswrapper[23041]: I0308 00:53:51.662189 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-79bd95bbf9-vglm6"
Mar 08 00:53:51.800245 master-0 kubenswrapper[23041]: I0308 00:53:51.791680 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-856bf8b6f6-t9lvl"]
Mar 08 00:53:51.800245 master-0 kubenswrapper[23041]: E0308 00:53:51.792192 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dda659a2-1e52-4b13-8b9d-401d3fcaf800" containerName="placement-api"
Mar 08 00:53:51.800245 master-0 kubenswrapper[23041]: I0308 00:53:51.792226 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="dda659a2-1e52-4b13-8b9d-401d3fcaf800" containerName="placement-api"
Mar 08 00:53:51.800245 master-0 kubenswrapper[23041]: E0308 00:53:51.792282 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dda659a2-1e52-4b13-8b9d-401d3fcaf800" containerName="placement-log"
Mar 08 00:53:51.800245 master-0 kubenswrapper[23041]: I0308 00:53:51.792289 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="dda659a2-1e52-4b13-8b9d-401d3fcaf800" containerName="placement-log"
Mar 08 00:53:51.800245 master-0 kubenswrapper[23041]: I0308 00:53:51.792487 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="dda659a2-1e52-4b13-8b9d-401d3fcaf800" containerName="placement-log"
Mar 08 00:53:51.800245 master-0 kubenswrapper[23041]: I0308 00:53:51.792521 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="dda659a2-1e52-4b13-8b9d-401d3fcaf800" containerName="placement-api"
Mar 08 00:53:51.800245 master-0 kubenswrapper[23041]: I0308 00:53:51.793677 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-856bf8b6f6-t9lvl"
Mar 08 00:53:51.835318 master-0 kubenswrapper[23041]: I0308 00:53:51.823430 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc"
Mar 08 00:53:51.853291 master-0 kubenswrapper[23041]: I0308 00:53:51.852206 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc"
Mar 08 00:53:51.853291 master-0 kubenswrapper[23041]: I0308 00:53:51.852463 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Mar 08 00:53:51.958235 master-0 kubenswrapper[23041]: I0308 00:53:51.946101 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa53a15c-ef65-4753-a29f-53894c55f42f-internal-tls-certs\") pod \"swift-proxy-856bf8b6f6-t9lvl\" (UID: \"fa53a15c-ef65-4753-a29f-53894c55f42f\") " pod="openstack/swift-proxy-856bf8b6f6-t9lvl"
Mar 08 00:53:51.958235 master-0 kubenswrapper[23041]: I0308 00:53:51.946339 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa53a15c-ef65-4753-a29f-53894c55f42f-run-httpd\") pod \"swift-proxy-856bf8b6f6-t9lvl\" (UID: \"fa53a15c-ef65-4753-a29f-53894c55f42f\") " pod="openstack/swift-proxy-856bf8b6f6-t9lvl"
Mar 08 00:53:51.958235 master-0 kubenswrapper[23041]: I0308 00:53:51.946395 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa53a15c-ef65-4753-a29f-53894c55f42f-public-tls-certs\") pod \"swift-proxy-856bf8b6f6-t9lvl\" (UID: \"fa53a15c-ef65-4753-a29f-53894c55f42f\") " pod="openstack/swift-proxy-856bf8b6f6-t9lvl"
Mar 08 00:53:51.958235 master-0 kubenswrapper[23041]: I0308 00:53:51.946501 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa53a15c-ef65-4753-a29f-53894c55f42f-config-data\") pod \"swift-proxy-856bf8b6f6-t9lvl\" (UID: \"fa53a15c-ef65-4753-a29f-53894c55f42f\") " pod="openstack/swift-proxy-856bf8b6f6-t9lvl"
Mar 08 00:53:51.958235 master-0 kubenswrapper[23041]: I0308 00:53:51.946527 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa53a15c-ef65-4753-a29f-53894c55f42f-log-httpd\") pod \"swift-proxy-856bf8b6f6-t9lvl\" (UID: \"fa53a15c-ef65-4753-a29f-53894c55f42f\") " pod="openstack/swift-proxy-856bf8b6f6-t9lvl"
Mar 08 00:53:51.958235 master-0 kubenswrapper[23041]: I0308 00:53:51.946587 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbxtg\" (UniqueName: \"kubernetes.io/projected/fa53a15c-ef65-4753-a29f-53894c55f42f-kube-api-access-vbxtg\") pod \"swift-proxy-856bf8b6f6-t9lvl\" (UID: \"fa53a15c-ef65-4753-a29f-53894c55f42f\") " pod="openstack/swift-proxy-856bf8b6f6-t9lvl"
Mar 08 00:53:51.958235 master-0 kubenswrapper[23041]: I0308 00:53:51.946627 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa53a15c-ef65-4753-a29f-53894c55f42f-combined-ca-bundle\") pod \"swift-proxy-856bf8b6f6-t9lvl\" (UID: \"fa53a15c-ef65-4753-a29f-53894c55f42f\") " pod="openstack/swift-proxy-856bf8b6f6-t9lvl"
Mar 08 00:53:51.958235 master-0 kubenswrapper[23041]: I0308 00:53:51.946824 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fa53a15c-ef65-4753-a29f-53894c55f42f-etc-swift\") pod \"swift-proxy-856bf8b6f6-t9lvl\" (UID: \"fa53a15c-ef65-4753-a29f-53894c55f42f\") " pod="openstack/swift-proxy-856bf8b6f6-t9lvl"
Mar 08 00:53:52.039241 master-0 kubenswrapper[23041]: I0308 00:53:52.022089 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-856bf8b6f6-t9lvl"]
Mar 08 00:53:52.055657 master-0 kubenswrapper[23041]: I0308 00:53:52.055604 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fa53a15c-ef65-4753-a29f-53894c55f42f-etc-swift\") pod \"swift-proxy-856bf8b6f6-t9lvl\" (UID: \"fa53a15c-ef65-4753-a29f-53894c55f42f\") " pod="openstack/swift-proxy-856bf8b6f6-t9lvl"
Mar 08 00:53:52.055765 master-0 kubenswrapper[23041]: I0308 00:53:52.055663 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa53a15c-ef65-4753-a29f-53894c55f42f-internal-tls-certs\") pod \"swift-proxy-856bf8b6f6-t9lvl\" (UID: \"fa53a15c-ef65-4753-a29f-53894c55f42f\") " pod="openstack/swift-proxy-856bf8b6f6-t9lvl"
Mar 08 00:53:52.055765 master-0 kubenswrapper[23041]: I0308 00:53:52.055720 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa53a15c-ef65-4753-a29f-53894c55f42f-run-httpd\") pod \"swift-proxy-856bf8b6f6-t9lvl\" (UID: \"fa53a15c-ef65-4753-a29f-53894c55f42f\") " pod="openstack/swift-proxy-856bf8b6f6-t9lvl"
Mar 08 00:53:52.055765 master-0 kubenswrapper[23041]: I0308 00:53:52.055746 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa53a15c-ef65-4753-a29f-53894c55f42f-public-tls-certs\") pod \"swift-proxy-856bf8b6f6-t9lvl\" (UID: \"fa53a15c-ef65-4753-a29f-53894c55f42f\") " pod="openstack/swift-proxy-856bf8b6f6-t9lvl"
Mar 08 00:53:52.055862 master-0 kubenswrapper[23041]: I0308 00:53:52.055782 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa53a15c-ef65-4753-a29f-53894c55f42f-config-data\") pod \"swift-proxy-856bf8b6f6-t9lvl\" (UID: \"fa53a15c-ef65-4753-a29f-53894c55f42f\") " pod="openstack/swift-proxy-856bf8b6f6-t9lvl"
Mar 08 00:53:52.055862 master-0 kubenswrapper[23041]: I0308 00:53:52.055800 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa53a15c-ef65-4753-a29f-53894c55f42f-log-httpd\") pod \"swift-proxy-856bf8b6f6-t9lvl\" (UID: \"fa53a15c-ef65-4753-a29f-53894c55f42f\") " pod="openstack/swift-proxy-856bf8b6f6-t9lvl"
Mar 08 00:53:52.055862 master-0 kubenswrapper[23041]: I0308 00:53:52.055833 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbxtg\" (UniqueName: \"kubernetes.io/projected/fa53a15c-ef65-4753-a29f-53894c55f42f-kube-api-access-vbxtg\") pod \"swift-proxy-856bf8b6f6-t9lvl\" (UID: \"fa53a15c-ef65-4753-a29f-53894c55f42f\") " pod="openstack/swift-proxy-856bf8b6f6-t9lvl"
Mar 08 00:53:52.055862 master-0 kubenswrapper[23041]: I0308 00:53:52.055861 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa53a15c-ef65-4753-a29f-53894c55f42f-combined-ca-bundle\") pod \"swift-proxy-856bf8b6f6-t9lvl\" (UID: \"fa53a15c-ef65-4753-a29f-53894c55f42f\") " pod="openstack/swift-proxy-856bf8b6f6-t9lvl"
Mar 08 00:53:52.067203 master-0 kubenswrapper[23041]: I0308 00:53:52.066782 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa53a15c-ef65-4753-a29f-53894c55f42f-run-httpd\") pod \"swift-proxy-856bf8b6f6-t9lvl\" (UID: \"fa53a15c-ef65-4753-a29f-53894c55f42f\") " pod="openstack/swift-proxy-856bf8b6f6-t9lvl"
Mar 08 00:53:52.076614 master-0 kubenswrapper[23041]: I0308 00:53:52.072781 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa53a15c-ef65-4753-a29f-53894c55f42f-log-httpd\") pod \"swift-proxy-856bf8b6f6-t9lvl\" (UID: \"fa53a15c-ef65-4753-a29f-53894c55f42f\") " pod="openstack/swift-proxy-856bf8b6f6-t9lvl"
Mar 08 00:53:52.076614 master-0 kubenswrapper[23041]: I0308 00:53:52.076377 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa53a15c-ef65-4753-a29f-53894c55f42f-config-data\") pod \"swift-proxy-856bf8b6f6-t9lvl\" (UID: \"fa53a15c-ef65-4753-a29f-53894c55f42f\") " pod="openstack/swift-proxy-856bf8b6f6-t9lvl"
Mar 08 00:53:52.099572 master-0 kubenswrapper[23041]: I0308 00:53:52.098038 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa53a15c-ef65-4753-a29f-53894c55f42f-internal-tls-certs\") pod \"swift-proxy-856bf8b6f6-t9lvl\" (UID: \"fa53a15c-ef65-4753-a29f-53894c55f42f\") " pod="openstack/swift-proxy-856bf8b6f6-t9lvl"
Mar 08 00:53:52.118242 master-0 kubenswrapper[23041]: I0308 00:53:52.101872 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa53a15c-ef65-4753-a29f-53894c55f42f-public-tls-certs\") pod \"swift-proxy-856bf8b6f6-t9lvl\" (UID: \"fa53a15c-ef65-4753-a29f-53894c55f42f\") " pod="openstack/swift-proxy-856bf8b6f6-t9lvl"
Mar 08 00:53:52.118242 master-0 kubenswrapper[23041]: I0308 00:53:52.117119 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa53a15c-ef65-4753-a29f-53894c55f42f-combined-ca-bundle\") pod \"swift-proxy-856bf8b6f6-t9lvl\" (UID: \"fa53a15c-ef65-4753-a29f-53894c55f42f\") " pod="openstack/swift-proxy-856bf8b6f6-t9lvl"
Mar 08 00:53:52.139327 master-0 kubenswrapper[23041]: I0308 00:53:52.125505 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fa53a15c-ef65-4753-a29f-53894c55f42f-etc-swift\") pod \"swift-proxy-856bf8b6f6-t9lvl\" (UID: \"fa53a15c-ef65-4753-a29f-53894c55f42f\") " pod="openstack/swift-proxy-856bf8b6f6-t9lvl"
Mar 08 00:53:52.180494 master-0 kubenswrapper[23041]: I0308 00:53:52.180444 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbxtg\" (UniqueName: \"kubernetes.io/projected/fa53a15c-ef65-4753-a29f-53894c55f42f-kube-api-access-vbxtg\") pod \"swift-proxy-856bf8b6f6-t9lvl\" (UID: \"fa53a15c-ef65-4753-a29f-53894c55f42f\") " pod="openstack/swift-proxy-856bf8b6f6-t9lvl"
Mar 08 00:53:52.185695 master-0 kubenswrapper[23041]: I0308 00:53:52.185657 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-78756bd8-c6jzz"]
Mar 08 00:53:52.186182 master-0 kubenswrapper[23041]: I0308 00:53:52.186152 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-78756bd8-c6jzz" podUID="47c3f888-4804-49d5-859a-73983e7c5414" containerName="neutron-api" containerID="cri-o://755712859b90d8761fbf147c13e124a50c1f56fa9d9c90abcd9e2282903e91a4" gracePeriod=30
Mar 08 00:53:52.186614 master-0 kubenswrapper[23041]: I0308 00:53:52.186562 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-78756bd8-c6jzz" podUID="47c3f888-4804-49d5-859a-73983e7c5414" containerName="neutron-httpd" containerID="cri-o://43f3d316ce033f2e3b7eb9dd2cc9c82219374cedb6454734b3b655f66bc9ce28" gracePeriod=30
Mar 08 00:53:52.277900 master-0 kubenswrapper[23041]: I0308 00:53:52.277845 23041 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/swift-proxy-856bf8b6f6-t9lvl" Mar 08 00:53:52.572386 master-0 kubenswrapper[23041]: I0308 00:53:52.572322 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-45pj6"] Mar 08 00:53:52.574748 master-0 kubenswrapper[23041]: I0308 00:53:52.574682 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-45pj6" Mar 08 00:53:52.623303 master-0 kubenswrapper[23041]: I0308 00:53:52.623230 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f8bj\" (UniqueName: \"kubernetes.io/projected/9b782b43-8b7d-487d-9ded-698b28da172d-kube-api-access-2f8bj\") pod \"nova-api-db-create-45pj6\" (UID: \"9b782b43-8b7d-487d-9ded-698b28da172d\") " pod="openstack/nova-api-db-create-45pj6" Mar 08 00:53:52.623643 master-0 kubenswrapper[23041]: I0308 00:53:52.623593 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b782b43-8b7d-487d-9ded-698b28da172d-operator-scripts\") pod \"nova-api-db-create-45pj6\" (UID: \"9b782b43-8b7d-487d-9ded-698b28da172d\") " pod="openstack/nova-api-db-create-45pj6" Mar 08 00:53:52.628172 master-0 kubenswrapper[23041]: I0308 00:53:52.627101 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-45pj6"] Mar 08 00:53:52.742052 master-0 kubenswrapper[23041]: I0308 00:53:52.741825 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b782b43-8b7d-487d-9ded-698b28da172d-operator-scripts\") pod \"nova-api-db-create-45pj6\" (UID: \"9b782b43-8b7d-487d-9ded-698b28da172d\") " pod="openstack/nova-api-db-create-45pj6" Mar 08 00:53:52.743500 master-0 kubenswrapper[23041]: I0308 00:53:52.743470 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b782b43-8b7d-487d-9ded-698b28da172d-operator-scripts\") pod \"nova-api-db-create-45pj6\" (UID: \"9b782b43-8b7d-487d-9ded-698b28da172d\") " pod="openstack/nova-api-db-create-45pj6" Mar 08 00:53:52.743758 master-0 kubenswrapper[23041]: I0308 00:53:52.743711 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f8bj\" (UniqueName: \"kubernetes.io/projected/9b782b43-8b7d-487d-9ded-698b28da172d-kube-api-access-2f8bj\") pod \"nova-api-db-create-45pj6\" (UID: \"9b782b43-8b7d-487d-9ded-698b28da172d\") " pod="openstack/nova-api-db-create-45pj6" Mar 08 00:53:52.768530 master-0 kubenswrapper[23041]: I0308 00:53:52.768497 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f8bj\" (UniqueName: \"kubernetes.io/projected/9b782b43-8b7d-487d-9ded-698b28da172d-kube-api-access-2f8bj\") pod \"nova-api-db-create-45pj6\" (UID: \"9b782b43-8b7d-487d-9ded-698b28da172d\") " pod="openstack/nova-api-db-create-45pj6" Mar 08 00:53:52.769544 master-0 kubenswrapper[23041]: I0308 00:53:52.769420 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-kg45w"] Mar 08 00:53:52.771385 master-0 kubenswrapper[23041]: I0308 00:53:52.771350 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-kg45w" Mar 08 00:53:52.783776 master-0 kubenswrapper[23041]: I0308 00:53:52.783738 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-da7e-account-create-update-6k64t"] Mar 08 00:53:52.789995 master-0 kubenswrapper[23041]: I0308 00:53:52.789964 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-da7e-account-create-update-6k64t" Mar 08 00:53:52.793652 master-0 kubenswrapper[23041]: I0308 00:53:52.793630 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 08 00:53:52.800611 master-0 kubenswrapper[23041]: I0308 00:53:52.800573 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-kg45w"] Mar 08 00:53:52.922562 master-0 kubenswrapper[23041]: I0308 00:53:52.922324 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-45pj6" Mar 08 00:53:52.939788 master-0 kubenswrapper[23041]: I0308 00:53:52.939746 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-da7e-account-create-update-6k64t"] Mar 08 00:53:52.950681 master-0 kubenswrapper[23041]: I0308 00:53:52.949785 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-rztgz"] Mar 08 00:53:52.979013 master-0 kubenswrapper[23041]: I0308 00:53:52.978820 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-rztgz" Mar 08 00:53:52.995374 master-0 kubenswrapper[23041]: I0308 00:53:52.994242 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-rztgz"] Mar 08 00:53:53.012772 master-0 kubenswrapper[23041]: I0308 00:53:53.012727 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51f57766-45a4-4e34-918e-d730a59edecd-operator-scripts\") pod \"nova-cell1-db-create-rztgz\" (UID: \"51f57766-45a4-4e34-918e-d730a59edecd\") " pod="openstack/nova-cell1-db-create-rztgz" Mar 08 00:53:53.013018 master-0 kubenswrapper[23041]: I0308 00:53:53.013003 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c44z\" (UniqueName: \"kubernetes.io/projected/51f57766-45a4-4e34-918e-d730a59edecd-kube-api-access-6c44z\") pod \"nova-cell1-db-create-rztgz\" (UID: \"51f57766-45a4-4e34-918e-d730a59edecd\") " pod="openstack/nova-cell1-db-create-rztgz" Mar 08 00:53:53.013314 master-0 kubenswrapper[23041]: I0308 00:53:53.013296 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf3fafa4-62ca-4b35-bd19-00651b4a9d48-operator-scripts\") pod \"nova-api-da7e-account-create-update-6k64t\" (UID: \"cf3fafa4-62ca-4b35-bd19-00651b4a9d48\") " pod="openstack/nova-api-da7e-account-create-update-6k64t" Mar 08 00:53:53.013458 master-0 kubenswrapper[23041]: I0308 00:53:53.013442 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmzdr\" (UniqueName: \"kubernetes.io/projected/cf3fafa4-62ca-4b35-bd19-00651b4a9d48-kube-api-access-dmzdr\") pod \"nova-api-da7e-account-create-update-6k64t\" (UID: \"cf3fafa4-62ca-4b35-bd19-00651b4a9d48\") " 
pod="openstack/nova-api-da7e-account-create-update-6k64t" Mar 08 00:53:53.013612 master-0 kubenswrapper[23041]: I0308 00:53:53.013597 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54db0dbf-0b0a-4fdd-9811-876d166896ea-operator-scripts\") pod \"nova-cell0-db-create-kg45w\" (UID: \"54db0dbf-0b0a-4fdd-9811-876d166896ea\") " pod="openstack/nova-cell0-db-create-kg45w" Mar 08 00:53:53.013836 master-0 kubenswrapper[23041]: I0308 00:53:53.013821 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qklhp\" (UniqueName: \"kubernetes.io/projected/54db0dbf-0b0a-4fdd-9811-876d166896ea-kube-api-access-qklhp\") pod \"nova-cell0-db-create-kg45w\" (UID: \"54db0dbf-0b0a-4fdd-9811-876d166896ea\") " pod="openstack/nova-cell0-db-create-kg45w" Mar 08 00:53:53.071108 master-0 kubenswrapper[23041]: I0308 00:53:53.071005 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-91a3-account-create-update-jwmlg"] Mar 08 00:53:53.073025 master-0 kubenswrapper[23041]: I0308 00:53:53.073001 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-91a3-account-create-update-jwmlg" Mar 08 00:53:53.081844 master-0 kubenswrapper[23041]: I0308 00:53:53.080790 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-91a3-account-create-update-jwmlg"] Mar 08 00:53:53.086899 master-0 kubenswrapper[23041]: I0308 00:53:53.083464 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 08 00:53:53.093139 master-0 kubenswrapper[23041]: I0308 00:53:53.093099 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-40fc-account-create-update-hfwpl"] Mar 08 00:53:53.097671 master-0 kubenswrapper[23041]: I0308 00:53:53.097411 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-40fc-account-create-update-hfwpl" Mar 08 00:53:53.105376 master-0 kubenswrapper[23041]: I0308 00:53:53.099411 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 08 00:53:53.105376 master-0 kubenswrapper[23041]: I0308 00:53:53.102665 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-40fc-account-create-update-hfwpl"] Mar 08 00:53:53.136740 master-0 kubenswrapper[23041]: I0308 00:53:53.133686 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf3fafa4-62ca-4b35-bd19-00651b4a9d48-operator-scripts\") pod \"nova-api-da7e-account-create-update-6k64t\" (UID: \"cf3fafa4-62ca-4b35-bd19-00651b4a9d48\") " pod="openstack/nova-api-da7e-account-create-update-6k64t" Mar 08 00:53:53.136740 master-0 kubenswrapper[23041]: I0308 00:53:53.133768 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmzdr\" (UniqueName: \"kubernetes.io/projected/cf3fafa4-62ca-4b35-bd19-00651b4a9d48-kube-api-access-dmzdr\") pod \"nova-api-da7e-account-create-update-6k64t\" 
(UID: \"cf3fafa4-62ca-4b35-bd19-00651b4a9d48\") " pod="openstack/nova-api-da7e-account-create-update-6k64t" Mar 08 00:53:53.136740 master-0 kubenswrapper[23041]: I0308 00:53:53.133827 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54db0dbf-0b0a-4fdd-9811-876d166896ea-operator-scripts\") pod \"nova-cell0-db-create-kg45w\" (UID: \"54db0dbf-0b0a-4fdd-9811-876d166896ea\") " pod="openstack/nova-cell0-db-create-kg45w" Mar 08 00:53:53.136740 master-0 kubenswrapper[23041]: I0308 00:53:53.134202 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qklhp\" (UniqueName: \"kubernetes.io/projected/54db0dbf-0b0a-4fdd-9811-876d166896ea-kube-api-access-qklhp\") pod \"nova-cell0-db-create-kg45w\" (UID: \"54db0dbf-0b0a-4fdd-9811-876d166896ea\") " pod="openstack/nova-cell0-db-create-kg45w" Mar 08 00:53:53.136740 master-0 kubenswrapper[23041]: I0308 00:53:53.134460 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51f57766-45a4-4e34-918e-d730a59edecd-operator-scripts\") pod \"nova-cell1-db-create-rztgz\" (UID: \"51f57766-45a4-4e34-918e-d730a59edecd\") " pod="openstack/nova-cell1-db-create-rztgz" Mar 08 00:53:53.136740 master-0 kubenswrapper[23041]: I0308 00:53:53.134483 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c44z\" (UniqueName: \"kubernetes.io/projected/51f57766-45a4-4e34-918e-d730a59edecd-kube-api-access-6c44z\") pod \"nova-cell1-db-create-rztgz\" (UID: \"51f57766-45a4-4e34-918e-d730a59edecd\") " pod="openstack/nova-cell1-db-create-rztgz" Mar 08 00:53:53.136740 master-0 kubenswrapper[23041]: I0308 00:53:53.134777 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/cf3fafa4-62ca-4b35-bd19-00651b4a9d48-operator-scripts\") pod \"nova-api-da7e-account-create-update-6k64t\" (UID: \"cf3fafa4-62ca-4b35-bd19-00651b4a9d48\") " pod="openstack/nova-api-da7e-account-create-update-6k64t" Mar 08 00:53:53.136740 master-0 kubenswrapper[23041]: I0308 00:53:53.135433 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54db0dbf-0b0a-4fdd-9811-876d166896ea-operator-scripts\") pod \"nova-cell0-db-create-kg45w\" (UID: \"54db0dbf-0b0a-4fdd-9811-876d166896ea\") " pod="openstack/nova-cell0-db-create-kg45w" Mar 08 00:53:53.136740 master-0 kubenswrapper[23041]: I0308 00:53:53.136091 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51f57766-45a4-4e34-918e-d730a59edecd-operator-scripts\") pod \"nova-cell1-db-create-rztgz\" (UID: \"51f57766-45a4-4e34-918e-d730a59edecd\") " pod="openstack/nova-cell1-db-create-rztgz" Mar 08 00:53:53.157624 master-0 kubenswrapper[23041]: I0308 00:53:53.156862 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmzdr\" (UniqueName: \"kubernetes.io/projected/cf3fafa4-62ca-4b35-bd19-00651b4a9d48-kube-api-access-dmzdr\") pod \"nova-api-da7e-account-create-update-6k64t\" (UID: \"cf3fafa4-62ca-4b35-bd19-00651b4a9d48\") " pod="openstack/nova-api-da7e-account-create-update-6k64t" Mar 08 00:53:53.158726 master-0 kubenswrapper[23041]: I0308 00:53:53.158682 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c44z\" (UniqueName: \"kubernetes.io/projected/51f57766-45a4-4e34-918e-d730a59edecd-kube-api-access-6c44z\") pod \"nova-cell1-db-create-rztgz\" (UID: \"51f57766-45a4-4e34-918e-d730a59edecd\") " pod="openstack/nova-cell1-db-create-rztgz" Mar 08 00:53:53.166615 master-0 kubenswrapper[23041]: I0308 00:53:53.166567 23041 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-qklhp\" (UniqueName: \"kubernetes.io/projected/54db0dbf-0b0a-4fdd-9811-876d166896ea-kube-api-access-qklhp\") pod \"nova-cell0-db-create-kg45w\" (UID: \"54db0dbf-0b0a-4fdd-9811-876d166896ea\") " pod="openstack/nova-cell0-db-create-kg45w" Mar 08 00:53:53.199302 master-0 kubenswrapper[23041]: I0308 00:53:53.196950 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-kg45w" Mar 08 00:53:53.217183 master-0 kubenswrapper[23041]: I0308 00:53:53.216257 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-da7e-account-create-update-6k64t" Mar 08 00:53:53.243399 master-0 kubenswrapper[23041]: I0308 00:53:53.243334 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tkb2\" (UniqueName: \"kubernetes.io/projected/2d98d6c6-e6a9-4b7a-b1a4-f2640bbca524-kube-api-access-5tkb2\") pod \"nova-cell1-40fc-account-create-update-hfwpl\" (UID: \"2d98d6c6-e6a9-4b7a-b1a4-f2640bbca524\") " pod="openstack/nova-cell1-40fc-account-create-update-hfwpl" Mar 08 00:53:53.243602 master-0 kubenswrapper[23041]: I0308 00:53:53.243437 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22vtf\" (UniqueName: \"kubernetes.io/projected/7c8614ce-5cee-41f4-a083-9e7d67b46633-kube-api-access-22vtf\") pod \"nova-cell0-91a3-account-create-update-jwmlg\" (UID: \"7c8614ce-5cee-41f4-a083-9e7d67b46633\") " pod="openstack/nova-cell0-91a3-account-create-update-jwmlg" Mar 08 00:53:53.243602 master-0 kubenswrapper[23041]: I0308 00:53:53.243486 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c8614ce-5cee-41f4-a083-9e7d67b46633-operator-scripts\") pod \"nova-cell0-91a3-account-create-update-jwmlg\" (UID: 
\"7c8614ce-5cee-41f4-a083-9e7d67b46633\") " pod="openstack/nova-cell0-91a3-account-create-update-jwmlg" Mar 08 00:53:53.243602 master-0 kubenswrapper[23041]: I0308 00:53:53.243509 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d98d6c6-e6a9-4b7a-b1a4-f2640bbca524-operator-scripts\") pod \"nova-cell1-40fc-account-create-update-hfwpl\" (UID: \"2d98d6c6-e6a9-4b7a-b1a4-f2640bbca524\") " pod="openstack/nova-cell1-40fc-account-create-update-hfwpl" Mar 08 00:53:53.328939 master-0 kubenswrapper[23041]: I0308 00:53:53.328331 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-rztgz" Mar 08 00:53:53.347038 master-0 kubenswrapper[23041]: I0308 00:53:53.346342 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tkb2\" (UniqueName: \"kubernetes.io/projected/2d98d6c6-e6a9-4b7a-b1a4-f2640bbca524-kube-api-access-5tkb2\") pod \"nova-cell1-40fc-account-create-update-hfwpl\" (UID: \"2d98d6c6-e6a9-4b7a-b1a4-f2640bbca524\") " pod="openstack/nova-cell1-40fc-account-create-update-hfwpl" Mar 08 00:53:53.347038 master-0 kubenswrapper[23041]: I0308 00:53:53.346417 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22vtf\" (UniqueName: \"kubernetes.io/projected/7c8614ce-5cee-41f4-a083-9e7d67b46633-kube-api-access-22vtf\") pod \"nova-cell0-91a3-account-create-update-jwmlg\" (UID: \"7c8614ce-5cee-41f4-a083-9e7d67b46633\") " pod="openstack/nova-cell0-91a3-account-create-update-jwmlg" Mar 08 00:53:53.347038 master-0 kubenswrapper[23041]: I0308 00:53:53.346466 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c8614ce-5cee-41f4-a083-9e7d67b46633-operator-scripts\") pod \"nova-cell0-91a3-account-create-update-jwmlg\" (UID: 
\"7c8614ce-5cee-41f4-a083-9e7d67b46633\") " pod="openstack/nova-cell0-91a3-account-create-update-jwmlg" Mar 08 00:53:53.347038 master-0 kubenswrapper[23041]: I0308 00:53:53.346494 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d98d6c6-e6a9-4b7a-b1a4-f2640bbca524-operator-scripts\") pod \"nova-cell1-40fc-account-create-update-hfwpl\" (UID: \"2d98d6c6-e6a9-4b7a-b1a4-f2640bbca524\") " pod="openstack/nova-cell1-40fc-account-create-update-hfwpl" Mar 08 00:53:53.351395 master-0 kubenswrapper[23041]: I0308 00:53:53.348177 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d98d6c6-e6a9-4b7a-b1a4-f2640bbca524-operator-scripts\") pod \"nova-cell1-40fc-account-create-update-hfwpl\" (UID: \"2d98d6c6-e6a9-4b7a-b1a4-f2640bbca524\") " pod="openstack/nova-cell1-40fc-account-create-update-hfwpl" Mar 08 00:53:53.351395 master-0 kubenswrapper[23041]: I0308 00:53:53.348251 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c8614ce-5cee-41f4-a083-9e7d67b46633-operator-scripts\") pod \"nova-cell0-91a3-account-create-update-jwmlg\" (UID: \"7c8614ce-5cee-41f4-a083-9e7d67b46633\") " pod="openstack/nova-cell0-91a3-account-create-update-jwmlg" Mar 08 00:53:53.378363 master-0 kubenswrapper[23041]: I0308 00:53:53.373006 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22vtf\" (UniqueName: \"kubernetes.io/projected/7c8614ce-5cee-41f4-a083-9e7d67b46633-kube-api-access-22vtf\") pod \"nova-cell0-91a3-account-create-update-jwmlg\" (UID: \"7c8614ce-5cee-41f4-a083-9e7d67b46633\") " pod="openstack/nova-cell0-91a3-account-create-update-jwmlg" Mar 08 00:53:53.397919 master-0 kubenswrapper[23041]: I0308 00:53:53.397772 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5tkb2\" (UniqueName: \"kubernetes.io/projected/2d98d6c6-e6a9-4b7a-b1a4-f2640bbca524-kube-api-access-5tkb2\") pod \"nova-cell1-40fc-account-create-update-hfwpl\" (UID: \"2d98d6c6-e6a9-4b7a-b1a4-f2640bbca524\") " pod="openstack/nova-cell1-40fc-account-create-update-hfwpl" Mar 08 00:53:53.424498 master-0 kubenswrapper[23041]: I0308 00:53:53.418028 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-91a3-account-create-update-jwmlg" Mar 08 00:53:53.461455 master-0 kubenswrapper[23041]: I0308 00:53:53.461203 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-40fc-account-create-update-hfwpl" Mar 08 00:53:53.672174 master-0 kubenswrapper[23041]: E0308 00:53:53.672049 23041 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4792fdf65f1907ff7e2565afb3a964e9ef62317dd71687a74905d3610b602a64 is running failed: container process not found" containerID="4792fdf65f1907ff7e2565afb3a964e9ef62317dd71687a74905d3610b602a64" cmd=["/bin/true"] Mar 08 00:53:53.672446 master-0 kubenswrapper[23041]: E0308 00:53:53.672310 23041 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4792fdf65f1907ff7e2565afb3a964e9ef62317dd71687a74905d3610b602a64 is running failed: container process not found" containerID="4792fdf65f1907ff7e2565afb3a964e9ef62317dd71687a74905d3610b602a64" cmd=["/bin/true"] Mar 08 00:53:53.674823 master-0 kubenswrapper[23041]: E0308 00:53:53.673095 23041 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4792fdf65f1907ff7e2565afb3a964e9ef62317dd71687a74905d3610b602a64 is running failed: container process not found" 
containerID="4792fdf65f1907ff7e2565afb3a964e9ef62317dd71687a74905d3610b602a64" cmd=["/bin/true"] Mar 08 00:53:53.674823 master-0 kubenswrapper[23041]: E0308 00:53:53.673274 23041 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4792fdf65f1907ff7e2565afb3a964e9ef62317dd71687a74905d3610b602a64 is running failed: container process not found" containerID="4792fdf65f1907ff7e2565afb3a964e9ef62317dd71687a74905d3610b602a64" cmd=["/bin/true"] Mar 08 00:53:53.674823 master-0 kubenswrapper[23041]: E0308 00:53:53.673572 23041 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4792fdf65f1907ff7e2565afb3a964e9ef62317dd71687a74905d3610b602a64 is running failed: container process not found" containerID="4792fdf65f1907ff7e2565afb3a964e9ef62317dd71687a74905d3610b602a64" cmd=["/bin/true"] Mar 08 00:53:53.674823 master-0 kubenswrapper[23041]: E0308 00:53:53.673599 23041 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4792fdf65f1907ff7e2565afb3a964e9ef62317dd71687a74905d3610b602a64 is running failed: container process not found" probeType="Readiness" pod="openstack/ironic-neutron-agent-7dffdc6989-dw4bq" podUID="a94dba9c-1e25-42ed-b30a-d278979d1de9" containerName="ironic-neutron-agent" Mar 08 00:53:53.674823 master-0 kubenswrapper[23041]: E0308 00:53:53.673888 23041 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4792fdf65f1907ff7e2565afb3a964e9ef62317dd71687a74905d3610b602a64 is running failed: container process not found" containerID="4792fdf65f1907ff7e2565afb3a964e9ef62317dd71687a74905d3610b602a64" cmd=["/bin/true"] Mar 08 00:53:53.674823 master-0 kubenswrapper[23041]: E0308 00:53:53.673909 23041 prober.go:104] 
"Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4792fdf65f1907ff7e2565afb3a964e9ef62317dd71687a74905d3610b602a64 is running failed: container process not found" probeType="Liveness" pod="openstack/ironic-neutron-agent-7dffdc6989-dw4bq" podUID="a94dba9c-1e25-42ed-b30a-d278979d1de9" containerName="ironic-neutron-agent" Mar 08 00:53:54.750560 master-0 kubenswrapper[23041]: I0308 00:53:54.750407 23041 generic.go:334] "Generic (PLEG): container finished" podID="a94dba9c-1e25-42ed-b30a-d278979d1de9" containerID="4792fdf65f1907ff7e2565afb3a964e9ef62317dd71687a74905d3610b602a64" exitCode=1 Mar 08 00:53:54.750560 master-0 kubenswrapper[23041]: I0308 00:53:54.750544 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-7dffdc6989-dw4bq" event={"ID":"a94dba9c-1e25-42ed-b30a-d278979d1de9","Type":"ContainerDied","Data":"4792fdf65f1907ff7e2565afb3a964e9ef62317dd71687a74905d3610b602a64"} Mar 08 00:53:54.751136 master-0 kubenswrapper[23041]: I0308 00:53:54.750615 23041 scope.go:117] "RemoveContainer" containerID="980e0193fca57a50a8a4c6a02259bd0293fe297a9746c1f73921a999809f92ba" Mar 08 00:53:54.751530 master-0 kubenswrapper[23041]: I0308 00:53:54.751486 23041 scope.go:117] "RemoveContainer" containerID="4792fdf65f1907ff7e2565afb3a964e9ef62317dd71687a74905d3610b602a64" Mar 08 00:53:54.751873 master-0 kubenswrapper[23041]: E0308 00:53:54.751808 23041 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-neutron-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-neutron-agent pod=ironic-neutron-agent-7dffdc6989-dw4bq_openstack(a94dba9c-1e25-42ed-b30a-d278979d1de9)\"" pod="openstack/ironic-neutron-agent-7dffdc6989-dw4bq" podUID="a94dba9c-1e25-42ed-b30a-d278979d1de9" Mar 08 00:53:54.760956 master-0 kubenswrapper[23041]: I0308 00:53:54.760006 23041 generic.go:334] "Generic (PLEG): container finished" 
podID="85f7cb75-9466-47eb-bd3a-da17df2b5c2a" containerID="0b73dc3da2facaad884834e4e3d982f5a7e048f7f84409b02726bbee41d64a4f" exitCode=143 Mar 08 00:53:54.760956 master-0 kubenswrapper[23041]: I0308 00:53:54.760078 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-657ddbd5bb-fdfgw" event={"ID":"85f7cb75-9466-47eb-bd3a-da17df2b5c2a","Type":"ContainerDied","Data":"0b73dc3da2facaad884834e4e3d982f5a7e048f7f84409b02726bbee41d64a4f"} Mar 08 00:53:54.762967 master-0 kubenswrapper[23041]: I0308 00:53:54.762939 23041 generic.go:334] "Generic (PLEG): container finished" podID="47c3f888-4804-49d5-859a-73983e7c5414" containerID="43f3d316ce033f2e3b7eb9dd2cc9c82219374cedb6454734b3b655f66bc9ce28" exitCode=0 Mar 08 00:53:54.763036 master-0 kubenswrapper[23041]: I0308 00:53:54.762971 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78756bd8-c6jzz" event={"ID":"47c3f888-4804-49d5-859a-73983e7c5414","Type":"ContainerDied","Data":"43f3d316ce033f2e3b7eb9dd2cc9c82219374cedb6454734b3b655f66bc9ce28"} Mar 08 00:53:55.794717 master-0 kubenswrapper[23041]: I0308 00:53:55.787266 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-657ddbd5bb-fdfgw" event={"ID":"85f7cb75-9466-47eb-bd3a-da17df2b5c2a","Type":"ContainerDied","Data":"54701ef6952f63b967319cc4bc8a773790e8135f554f1520eea7fccbb70bdcf4"} Mar 08 00:53:55.794717 master-0 kubenswrapper[23041]: I0308 00:53:55.787317 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54701ef6952f63b967319cc4bc8a773790e8135f554f1520eea7fccbb70bdcf4" Mar 08 00:53:56.205866 master-0 kubenswrapper[23041]: I0308 00:53:56.205822 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-657ddbd5bb-fdfgw" Mar 08 00:53:56.382717 master-0 kubenswrapper[23041]: I0308 00:53:56.381707 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-config-data-custom\") pod \"85f7cb75-9466-47eb-bd3a-da17df2b5c2a\" (UID: \"85f7cb75-9466-47eb-bd3a-da17df2b5c2a\") " Mar 08 00:53:56.382717 master-0 kubenswrapper[23041]: I0308 00:53:56.381960 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-scripts\") pod \"85f7cb75-9466-47eb-bd3a-da17df2b5c2a\" (UID: \"85f7cb75-9466-47eb-bd3a-da17df2b5c2a\") " Mar 08 00:53:56.382717 master-0 kubenswrapper[23041]: I0308 00:53:56.382004 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-logs\") pod \"85f7cb75-9466-47eb-bd3a-da17df2b5c2a\" (UID: \"85f7cb75-9466-47eb-bd3a-da17df2b5c2a\") " Mar 08 00:53:56.382717 master-0 kubenswrapper[23041]: I0308 00:53:56.382035 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-config-data-merged\") pod \"85f7cb75-9466-47eb-bd3a-da17df2b5c2a\" (UID: \"85f7cb75-9466-47eb-bd3a-da17df2b5c2a\") " Mar 08 00:53:56.382717 master-0 kubenswrapper[23041]: I0308 00:53:56.382071 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-etc-podinfo\") pod \"85f7cb75-9466-47eb-bd3a-da17df2b5c2a\" (UID: \"85f7cb75-9466-47eb-bd3a-da17df2b5c2a\") " Mar 08 00:53:56.382717 master-0 kubenswrapper[23041]: I0308 00:53:56.382123 23041 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-config-data\") pod \"85f7cb75-9466-47eb-bd3a-da17df2b5c2a\" (UID: \"85f7cb75-9466-47eb-bd3a-da17df2b5c2a\") " Mar 08 00:53:56.382717 master-0 kubenswrapper[23041]: I0308 00:53:56.382185 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-combined-ca-bundle\") pod \"85f7cb75-9466-47eb-bd3a-da17df2b5c2a\" (UID: \"85f7cb75-9466-47eb-bd3a-da17df2b5c2a\") " Mar 08 00:53:56.382717 master-0 kubenswrapper[23041]: I0308 00:53:56.382257 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glqxz\" (UniqueName: \"kubernetes.io/projected/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-kube-api-access-glqxz\") pod \"85f7cb75-9466-47eb-bd3a-da17df2b5c2a\" (UID: \"85f7cb75-9466-47eb-bd3a-da17df2b5c2a\") " Mar 08 00:53:56.382717 master-0 kubenswrapper[23041]: I0308 00:53:56.382545 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-logs" (OuterVolumeSpecName: "logs") pod "85f7cb75-9466-47eb-bd3a-da17df2b5c2a" (UID: "85f7cb75-9466-47eb-bd3a-da17df2b5c2a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:53:56.383454 master-0 kubenswrapper[23041]: I0308 00:53:56.383251 23041 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-logs\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:56.388235 master-0 kubenswrapper[23041]: I0308 00:53:56.388167 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "85f7cb75-9466-47eb-bd3a-da17df2b5c2a" (UID: "85f7cb75-9466-47eb-bd3a-da17df2b5c2a"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:53:56.389937 master-0 kubenswrapper[23041]: I0308 00:53:56.389829 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-kube-api-access-glqxz" (OuterVolumeSpecName: "kube-api-access-glqxz") pod "85f7cb75-9466-47eb-bd3a-da17df2b5c2a" (UID: "85f7cb75-9466-47eb-bd3a-da17df2b5c2a"). InnerVolumeSpecName "kube-api-access-glqxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:53:56.394788 master-0 kubenswrapper[23041]: I0308 00:53:56.394696 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "85f7cb75-9466-47eb-bd3a-da17df2b5c2a" (UID: "85f7cb75-9466-47eb-bd3a-da17df2b5c2a"). InnerVolumeSpecName "etc-podinfo". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 08 00:53:56.395624 master-0 kubenswrapper[23041]: I0308 00:53:56.395503 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "85f7cb75-9466-47eb-bd3a-da17df2b5c2a" (UID: "85f7cb75-9466-47eb-bd3a-da17df2b5c2a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:53:56.422586 master-0 kubenswrapper[23041]: I0308 00:53:56.422506 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-scripts" (OuterVolumeSpecName: "scripts") pod "85f7cb75-9466-47eb-bd3a-da17df2b5c2a" (UID: "85f7cb75-9466-47eb-bd3a-da17df2b5c2a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:53:56.459416 master-0 kubenswrapper[23041]: I0308 00:53:56.459339 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-config-data" (OuterVolumeSpecName: "config-data") pod "85f7cb75-9466-47eb-bd3a-da17df2b5c2a" (UID: "85f7cb75-9466-47eb-bd3a-da17df2b5c2a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:53:56.486148 master-0 kubenswrapper[23041]: I0308 00:53:56.485994 23041 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:56.486148 master-0 kubenswrapper[23041]: I0308 00:53:56.486051 23041 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-config-data-merged\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:56.486148 master-0 kubenswrapper[23041]: I0308 00:53:56.486067 23041 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-etc-podinfo\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:56.486148 master-0 kubenswrapper[23041]: I0308 00:53:56.486079 23041 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:56.486148 master-0 kubenswrapper[23041]: I0308 00:53:56.486091 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glqxz\" (UniqueName: \"kubernetes.io/projected/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-kube-api-access-glqxz\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:56.486148 master-0 kubenswrapper[23041]: I0308 00:53:56.486105 23041 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-config-data-custom\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:56.640567 master-0 kubenswrapper[23041]: I0308 00:53:56.640389 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "85f7cb75-9466-47eb-bd3a-da17df2b5c2a" (UID: "85f7cb75-9466-47eb-bd3a-da17df2b5c2a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:53:56.693711 master-0 kubenswrapper[23041]: I0308 00:53:56.692240 23041 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85f7cb75-9466-47eb-bd3a-da17df2b5c2a-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:56.803785 master-0 kubenswrapper[23041]: I0308 00:53:56.803635 23041 generic.go:334] "Generic (PLEG): container finished" podID="47c3f888-4804-49d5-859a-73983e7c5414" containerID="755712859b90d8761fbf147c13e124a50c1f56fa9d9c90abcd9e2282903e91a4" exitCode=0 Mar 08 00:53:56.803785 master-0 kubenswrapper[23041]: I0308 00:53:56.803713 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78756bd8-c6jzz" event={"ID":"47c3f888-4804-49d5-859a-73983e7c5414","Type":"ContainerDied","Data":"755712859b90d8761fbf147c13e124a50c1f56fa9d9c90abcd9e2282903e91a4"} Mar 08 00:53:56.815348 master-0 kubenswrapper[23041]: I0308 00:53:56.815302 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-657ddbd5bb-fdfgw" Mar 08 00:53:56.850280 master-0 kubenswrapper[23041]: I0308 00:53:56.850200 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-inspector-db-sync-sksvm" podStartSLOduration=2.571588587 podStartE2EDuration="9.850132741s" podCreationTimestamp="2026-03-08 00:53:47 +0000 UTC" firstStartedPulling="2026-03-08 00:53:48.611358011 +0000 UTC m=+1334.084194565" lastFinishedPulling="2026-03-08 00:53:55.889902165 +0000 UTC m=+1341.362738719" observedRunningTime="2026-03-08 00:53:56.835529364 +0000 UTC m=+1342.308365918" watchObservedRunningTime="2026-03-08 00:53:56.850132741 +0000 UTC m=+1342.322969295" Mar 08 00:53:56.873253 master-0 kubenswrapper[23041]: I0308 00:53:56.872562 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-sksvm" event={"ID":"33d52628-a63b-48e6-ac86-d8df7b20a8e9","Type":"ContainerStarted","Data":"aeb465b11b699ea8b8b910a82242ec60e620aa6f8e2764e1890479487625d414"} Mar 08 00:53:56.973656 master-0 kubenswrapper[23041]: I0308 00:53:56.973282 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-657ddbd5bb-fdfgw"] Mar 08 00:53:57.023458 master-0 kubenswrapper[23041]: I0308 00:53:57.023407 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-657ddbd5bb-fdfgw"] Mar 08 00:53:57.291468 master-0 kubenswrapper[23041]: W0308 00:53:57.291417 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54db0dbf_0b0a_4fdd_9811_876d166896ea.slice/crio-fe346ed1d1e36e287c6ea41bdd79b1de35f438366aae51fdb4fc10dd0ae8ead3 WatchSource:0}: Error finding container fe346ed1d1e36e287c6ea41bdd79b1de35f438366aae51fdb4fc10dd0ae8ead3: Status 404 returned error can't find the container with id fe346ed1d1e36e287c6ea41bdd79b1de35f438366aae51fdb4fc10dd0ae8ead3 Mar 08 00:53:57.293272 master-0 kubenswrapper[23041]: W0308 
00:53:57.293178 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51f57766_45a4_4e34_918e_d730a59edecd.slice/crio-928a4a1b3ce04cb06a3ed972cb6a9358fc47484efc02a9b795984de0a8451050 WatchSource:0}: Error finding container 928a4a1b3ce04cb06a3ed972cb6a9358fc47484efc02a9b795984de0a8451050: Status 404 returned error can't find the container with id 928a4a1b3ce04cb06a3ed972cb6a9358fc47484efc02a9b795984de0a8451050 Mar 08 00:53:57.316945 master-0 kubenswrapper[23041]: I0308 00:53:57.316866 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-kg45w"] Mar 08 00:53:57.346082 master-0 kubenswrapper[23041]: I0308 00:53:57.345997 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-rztgz"] Mar 08 00:53:57.391445 master-0 kubenswrapper[23041]: I0308 00:53:57.391383 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-da7e-account-create-update-6k64t"] Mar 08 00:53:57.434334 master-0 kubenswrapper[23041]: I0308 00:53:57.424132 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-40fc-account-create-update-hfwpl"] Mar 08 00:53:57.439169 master-0 kubenswrapper[23041]: I0308 00:53:57.437489 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-856bf8b6f6-t9lvl"] Mar 08 00:53:57.530901 master-0 kubenswrapper[23041]: I0308 00:53:57.523472 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-78756bd8-c6jzz" Mar 08 00:53:57.600542 master-0 kubenswrapper[23041]: I0308 00:53:57.600167 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-91a3-account-create-update-jwmlg"] Mar 08 00:53:57.640831 master-0 kubenswrapper[23041]: I0308 00:53:57.639267 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-45pj6"] Mar 08 00:53:57.645681 master-0 kubenswrapper[23041]: I0308 00:53:57.645632 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/47c3f888-4804-49d5-859a-73983e7c5414-httpd-config\") pod \"47c3f888-4804-49d5-859a-73983e7c5414\" (UID: \"47c3f888-4804-49d5-859a-73983e7c5414\") " Mar 08 00:53:57.645852 master-0 kubenswrapper[23041]: I0308 00:53:57.645771 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/47c3f888-4804-49d5-859a-73983e7c5414-ovndb-tls-certs\") pod \"47c3f888-4804-49d5-859a-73983e7c5414\" (UID: \"47c3f888-4804-49d5-859a-73983e7c5414\") " Mar 08 00:53:57.645932 master-0 kubenswrapper[23041]: I0308 00:53:57.645910 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tf9p2\" (UniqueName: \"kubernetes.io/projected/47c3f888-4804-49d5-859a-73983e7c5414-kube-api-access-tf9p2\") pod \"47c3f888-4804-49d5-859a-73983e7c5414\" (UID: \"47c3f888-4804-49d5-859a-73983e7c5414\") " Mar 08 00:53:57.646020 master-0 kubenswrapper[23041]: I0308 00:53:57.646001 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/47c3f888-4804-49d5-859a-73983e7c5414-config\") pod \"47c3f888-4804-49d5-859a-73983e7c5414\" (UID: \"47c3f888-4804-49d5-859a-73983e7c5414\") " Mar 08 00:53:57.646113 master-0 kubenswrapper[23041]: I0308 00:53:57.646092 23041 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47c3f888-4804-49d5-859a-73983e7c5414-combined-ca-bundle\") pod \"47c3f888-4804-49d5-859a-73983e7c5414\" (UID: \"47c3f888-4804-49d5-859a-73983e7c5414\") " Mar 08 00:53:57.658925 master-0 kubenswrapper[23041]: I0308 00:53:57.658880 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47c3f888-4804-49d5-859a-73983e7c5414-kube-api-access-tf9p2" (OuterVolumeSpecName: "kube-api-access-tf9p2") pod "47c3f888-4804-49d5-859a-73983e7c5414" (UID: "47c3f888-4804-49d5-859a-73983e7c5414"). InnerVolumeSpecName "kube-api-access-tf9p2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:53:57.712026 master-0 kubenswrapper[23041]: I0308 00:53:57.711949 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47c3f888-4804-49d5-859a-73983e7c5414-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "47c3f888-4804-49d5-859a-73983e7c5414" (UID: "47c3f888-4804-49d5-859a-73983e7c5414"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:53:57.753885 master-0 kubenswrapper[23041]: I0308 00:53:57.753844 23041 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/47c3f888-4804-49d5-859a-73983e7c5414-httpd-config\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:57.754100 master-0 kubenswrapper[23041]: I0308 00:53:57.754085 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tf9p2\" (UniqueName: \"kubernetes.io/projected/47c3f888-4804-49d5-859a-73983e7c5414-kube-api-access-tf9p2\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:57.844427 master-0 kubenswrapper[23041]: I0308 00:53:57.842572 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-45pj6" event={"ID":"9b782b43-8b7d-487d-9ded-698b28da172d","Type":"ContainerStarted","Data":"01fd064e5eeaa379bcee8a70638149537f90db4bf4d3e8b4bbf1a531d9e18cd1"} Mar 08 00:53:57.852178 master-0 kubenswrapper[23041]: I0308 00:53:57.852119 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-kg45w" event={"ID":"54db0dbf-0b0a-4fdd-9811-876d166896ea","Type":"ContainerStarted","Data":"fe346ed1d1e36e287c6ea41bdd79b1de35f438366aae51fdb4fc10dd0ae8ead3"} Mar 08 00:53:57.865470 master-0 kubenswrapper[23041]: I0308 00:53:57.865340 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-856bf8b6f6-t9lvl" event={"ID":"fa53a15c-ef65-4753-a29f-53894c55f42f","Type":"ContainerStarted","Data":"c5bbe8ac5c160cd428ce8cad7f087ffa26e1eb4a97c82cdf02514902a436a7ac"} Mar 08 00:53:57.894764 master-0 kubenswrapper[23041]: I0308 00:53:57.877678 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-rztgz" event={"ID":"51f57766-45a4-4e34-918e-d730a59edecd","Type":"ContainerStarted","Data":"570fa8ba9de1434e429a43a4870f6d406ac7e270c7555d159e17a3923a1b7f50"} Mar 08 00:53:57.894764 master-0 kubenswrapper[23041]: 
I0308 00:53:57.877756 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-rztgz" event={"ID":"51f57766-45a4-4e34-918e-d730a59edecd","Type":"ContainerStarted","Data":"928a4a1b3ce04cb06a3ed972cb6a9358fc47484efc02a9b795984de0a8451050"} Mar 08 00:53:57.905019 master-0 kubenswrapper[23041]: I0308 00:53:57.904795 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78756bd8-c6jzz" event={"ID":"47c3f888-4804-49d5-859a-73983e7c5414","Type":"ContainerDied","Data":"d2912b3f5fdf83b455b6dde8396ead89c07d1b8442d19b8e891ffe9e63480bd7"} Mar 08 00:53:57.905019 master-0 kubenswrapper[23041]: I0308 00:53:57.904835 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-78756bd8-c6jzz" Mar 08 00:53:57.905019 master-0 kubenswrapper[23041]: I0308 00:53:57.904867 23041 scope.go:117] "RemoveContainer" containerID="43f3d316ce033f2e3b7eb9dd2cc9c82219374cedb6454734b3b655f66bc9ce28" Mar 08 00:53:57.912002 master-0 kubenswrapper[23041]: I0308 00:53:57.911164 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-rztgz" podStartSLOduration=5.91114025 podStartE2EDuration="5.91114025s" podCreationTimestamp="2026-03-08 00:53:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:53:57.90255458 +0000 UTC m=+1343.375391124" watchObservedRunningTime="2026-03-08 00:53:57.91114025 +0000 UTC m=+1343.383976804" Mar 08 00:53:57.912578 master-0 kubenswrapper[23041]: I0308 00:53:57.912038 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-40fc-account-create-update-hfwpl" event={"ID":"2d98d6c6-e6a9-4b7a-b1a4-f2640bbca524","Type":"ContainerStarted","Data":"e3db896314ad04b3e55b47c98cd9748581a65d483cbea8a2fa43c89f181e4021"} Mar 08 00:53:57.914285 master-0 kubenswrapper[23041]: I0308 00:53:57.914233 23041 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-da7e-account-create-update-6k64t" event={"ID":"cf3fafa4-62ca-4b35-bd19-00651b4a9d48","Type":"ContainerStarted","Data":"87ba0c745bb2c515c82c84abb4a2f652aca9b78eb1267dcd699c4cae92b7a55f"} Mar 08 00:53:57.917087 master-0 kubenswrapper[23041]: I0308 00:53:57.916929 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-91a3-account-create-update-jwmlg" event={"ID":"7c8614ce-5cee-41f4-a083-9e7d67b46633","Type":"ContainerStarted","Data":"57922bcc7c8c1b6d804a056d53e80b8e92b2b111f9226bf0d0ab16638eb0d41f"} Mar 08 00:53:57.980082 master-0 kubenswrapper[23041]: I0308 00:53:57.979790 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47c3f888-4804-49d5-859a-73983e7c5414-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47c3f888-4804-49d5-859a-73983e7c5414" (UID: "47c3f888-4804-49d5-859a-73983e7c5414"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:53:58.063155 master-0 kubenswrapper[23041]: I0308 00:53:58.063017 23041 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47c3f888-4804-49d5-859a-73983e7c5414-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:58.078544 master-0 kubenswrapper[23041]: I0308 00:53:58.078485 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47c3f888-4804-49d5-859a-73983e7c5414-config" (OuterVolumeSpecName: "config") pod "47c3f888-4804-49d5-859a-73983e7c5414" (UID: "47c3f888-4804-49d5-859a-73983e7c5414"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:53:58.133477 master-0 kubenswrapper[23041]: I0308 00:53:58.133399 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47c3f888-4804-49d5-859a-73983e7c5414-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "47c3f888-4804-49d5-859a-73983e7c5414" (UID: "47c3f888-4804-49d5-859a-73983e7c5414"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:53:58.185289 master-0 kubenswrapper[23041]: I0308 00:53:58.176577 23041 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/47c3f888-4804-49d5-859a-73983e7c5414-config\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:58.185289 master-0 kubenswrapper[23041]: I0308 00:53:58.176625 23041 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/47c3f888-4804-49d5-859a-73983e7c5414-ovndb-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 08 00:53:58.203870 master-0 kubenswrapper[23041]: I0308 00:53:58.201923 23041 scope.go:117] "RemoveContainer" containerID="755712859b90d8761fbf147c13e124a50c1f56fa9d9c90abcd9e2282903e91a4" Mar 08 00:53:58.276844 master-0 kubenswrapper[23041]: I0308 00:53:58.276716 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-78756bd8-c6jzz"] Mar 08 00:53:58.292460 master-0 kubenswrapper[23041]: I0308 00:53:58.289535 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-78756bd8-c6jzz"] Mar 08 00:53:58.389315 master-0 kubenswrapper[23041]: E0308 00:53:58.389237 23041 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47c3f888_4804_49d5_859a_73983e7c5414.slice/crio-d2912b3f5fdf83b455b6dde8396ead89c07d1b8442d19b8e891ffe9e63480bd7\": RecentStats: unable to find 
data in memory cache]" Mar 08 00:53:58.673377 master-0 kubenswrapper[23041]: I0308 00:53:58.672406 23041 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ironic-neutron-agent-7dffdc6989-dw4bq" Mar 08 00:53:58.673377 master-0 kubenswrapper[23041]: I0308 00:53:58.673249 23041 scope.go:117] "RemoveContainer" containerID="4792fdf65f1907ff7e2565afb3a964e9ef62317dd71687a74905d3610b602a64" Mar 08 00:53:58.673667 master-0 kubenswrapper[23041]: E0308 00:53:58.673483 23041 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-neutron-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-neutron-agent pod=ironic-neutron-agent-7dffdc6989-dw4bq_openstack(a94dba9c-1e25-42ed-b30a-d278979d1de9)\"" pod="openstack/ironic-neutron-agent-7dffdc6989-dw4bq" podUID="a94dba9c-1e25-42ed-b30a-d278979d1de9" Mar 08 00:53:58.673667 master-0 kubenswrapper[23041]: I0308 00:53:58.673528 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-7dffdc6989-dw4bq" Mar 08 00:53:58.833914 master-0 kubenswrapper[23041]: I0308 00:53:58.833199 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47c3f888-4804-49d5-859a-73983e7c5414" path="/var/lib/kubelet/pods/47c3f888-4804-49d5-859a-73983e7c5414/volumes" Mar 08 00:53:58.836603 master-0 kubenswrapper[23041]: I0308 00:53:58.834684 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85f7cb75-9466-47eb-bd3a-da17df2b5c2a" path="/var/lib/kubelet/pods/85f7cb75-9466-47eb-bd3a-da17df2b5c2a/volumes" Mar 08 00:53:58.959991 master-0 kubenswrapper[23041]: I0308 00:53:58.959555 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-856bf8b6f6-t9lvl" event={"ID":"fa53a15c-ef65-4753-a29f-53894c55f42f","Type":"ContainerStarted","Data":"4be2472f77993bf211860aead251034cc072e2bac2722bf12ee6a95a7c8278f4"} Mar 08 00:53:58.972881 master-0 
kubenswrapper[23041]: I0308 00:53:58.972310 23041 generic.go:334] "Generic (PLEG): container finished" podID="51f57766-45a4-4e34-918e-d730a59edecd" containerID="570fa8ba9de1434e429a43a4870f6d406ac7e270c7555d159e17a3923a1b7f50" exitCode=0 Mar 08 00:53:58.972881 master-0 kubenswrapper[23041]: I0308 00:53:58.972427 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-rztgz" event={"ID":"51f57766-45a4-4e34-918e-d730a59edecd","Type":"ContainerDied","Data":"570fa8ba9de1434e429a43a4870f6d406ac7e270c7555d159e17a3923a1b7f50"} Mar 08 00:53:58.984487 master-0 kubenswrapper[23041]: I0308 00:53:58.982262 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-40fc-account-create-update-hfwpl" event={"ID":"2d98d6c6-e6a9-4b7a-b1a4-f2640bbca524","Type":"ContainerStarted","Data":"944d552255fede5544aa89290745d562cd790f384fd73998a81990e5eff9877e"} Mar 08 00:53:58.985014 master-0 kubenswrapper[23041]: I0308 00:53:58.984936 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-da7e-account-create-update-6k64t" event={"ID":"cf3fafa4-62ca-4b35-bd19-00651b4a9d48","Type":"ContainerStarted","Data":"9ac500683606cdb5164248aecf601e684df7d1f2b89999be77816e61f2b391f9"} Mar 08 00:53:59.003919 master-0 kubenswrapper[23041]: I0308 00:53:59.003864 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-91a3-account-create-update-jwmlg" event={"ID":"7c8614ce-5cee-41f4-a083-9e7d67b46633","Type":"ContainerStarted","Data":"77098c5727b88fdac0ce58e504d450fb8262c8c8f6fabbc2f127b2bdfff72fbb"} Mar 08 00:53:59.010907 master-0 kubenswrapper[23041]: I0308 00:53:59.010859 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-45pj6" event={"ID":"9b782b43-8b7d-487d-9ded-698b28da172d","Type":"ContainerStarted","Data":"94dc6c5d1ba5377cbdc2674baf891cc2b564e03093f188a475d0505af3bd7796"} Mar 08 00:53:59.021435 master-0 kubenswrapper[23041]: I0308 00:53:59.021365 23041 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-40fc-account-create-update-hfwpl" podStartSLOduration=7.021345111 podStartE2EDuration="7.021345111s" podCreationTimestamp="2026-03-08 00:53:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:53:59.020293595 +0000 UTC m=+1344.493130169" watchObservedRunningTime="2026-03-08 00:53:59.021345111 +0000 UTC m=+1344.494181665" Mar 08 00:53:59.030406 master-0 kubenswrapper[23041]: I0308 00:53:59.029661 23041 scope.go:117] "RemoveContainer" containerID="4792fdf65f1907ff7e2565afb3a964e9ef62317dd71687a74905d3610b602a64" Mar 08 00:53:59.030406 master-0 kubenswrapper[23041]: E0308 00:53:59.029986 23041 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-neutron-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-neutron-agent pod=ironic-neutron-agent-7dffdc6989-dw4bq_openstack(a94dba9c-1e25-42ed-b30a-d278979d1de9)\"" pod="openstack/ironic-neutron-agent-7dffdc6989-dw4bq" podUID="a94dba9c-1e25-42ed-b30a-d278979d1de9" Mar 08 00:53:59.030406 master-0 kubenswrapper[23041]: I0308 00:53:59.030356 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-kg45w" event={"ID":"54db0dbf-0b0a-4fdd-9811-876d166896ea","Type":"ContainerStarted","Data":"1d8ad32a2cab1f3ce51692bf35ec05f10f521bf759deaa403ac00e483056b022"} Mar 08 00:53:59.074628 master-0 kubenswrapper[23041]: I0308 00:53:59.073272 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-da7e-account-create-update-6k64t" podStartSLOduration=7.073251339 podStartE2EDuration="7.073251339s" podCreationTimestamp="2026-03-08 00:53:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:53:59.047723605 
+0000 UTC m=+1344.520560159" watchObservedRunningTime="2026-03-08 00:53:59.073251339 +0000 UTC m=+1344.546087893" Mar 08 00:53:59.103568 master-0 kubenswrapper[23041]: I0308 00:53:59.103479 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-kg45w" podStartSLOduration=7.103462828 podStartE2EDuration="7.103462828s" podCreationTimestamp="2026-03-08 00:53:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:53:59.075270238 +0000 UTC m=+1344.548106802" watchObservedRunningTime="2026-03-08 00:53:59.103462828 +0000 UTC m=+1344.576299382" Mar 08 00:53:59.104575 master-0 kubenswrapper[23041]: I0308 00:53:59.104536 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-45pj6" podStartSLOduration=7.104529014 podStartE2EDuration="7.104529014s" podCreationTimestamp="2026-03-08 00:53:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:53:59.096686232 +0000 UTC m=+1344.569522796" watchObservedRunningTime="2026-03-08 00:53:59.104529014 +0000 UTC m=+1344.577365568" Mar 08 00:53:59.228271 master-0 kubenswrapper[23041]: I0308 00:53:59.228092 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-91a3-account-create-update-jwmlg" podStartSLOduration=7.228073193 podStartE2EDuration="7.228073193s" podCreationTimestamp="2026-03-08 00:53:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:53:59.141413195 +0000 UTC m=+1344.614249749" watchObservedRunningTime="2026-03-08 00:53:59.228073193 +0000 UTC m=+1344.700909747" Mar 08 00:54:06.164677 master-0 kubenswrapper[23041]: I0308 00:54:06.164612 23041 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/swift-proxy-856bf8b6f6-t9lvl" event={"ID":"fa53a15c-ef65-4753-a29f-53894c55f42f","Type":"ContainerStarted","Data":"58eb91d2885499e76d0db8394c80106db7dba6642f5bb0608394625a9ea31996"} Mar 08 00:54:06.166569 master-0 kubenswrapper[23041]: I0308 00:54:06.166537 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-856bf8b6f6-t9lvl" Mar 08 00:54:06.166569 master-0 kubenswrapper[23041]: I0308 00:54:06.166564 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-856bf8b6f6-t9lvl" Mar 08 00:54:06.175035 master-0 kubenswrapper[23041]: I0308 00:54:06.174885 23041 generic.go:334] "Generic (PLEG): container finished" podID="2d98d6c6-e6a9-4b7a-b1a4-f2640bbca524" containerID="944d552255fede5544aa89290745d562cd790f384fd73998a81990e5eff9877e" exitCode=0 Mar 08 00:54:06.175035 master-0 kubenswrapper[23041]: I0308 00:54:06.174973 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-40fc-account-create-update-hfwpl" event={"ID":"2d98d6c6-e6a9-4b7a-b1a4-f2640bbca524","Type":"ContainerDied","Data":"944d552255fede5544aa89290745d562cd790f384fd73998a81990e5eff9877e"} Mar 08 00:54:06.177787 master-0 kubenswrapper[23041]: I0308 00:54:06.177748 23041 generic.go:334] "Generic (PLEG): container finished" podID="cf3fafa4-62ca-4b35-bd19-00651b4a9d48" containerID="9ac500683606cdb5164248aecf601e684df7d1f2b89999be77816e61f2b391f9" exitCode=0 Mar 08 00:54:06.177862 master-0 kubenswrapper[23041]: I0308 00:54:06.177801 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-da7e-account-create-update-6k64t" event={"ID":"cf3fafa4-62ca-4b35-bd19-00651b4a9d48","Type":"ContainerDied","Data":"9ac500683606cdb5164248aecf601e684df7d1f2b89999be77816e61f2b391f9"} Mar 08 00:54:06.180985 master-0 kubenswrapper[23041]: I0308 00:54:06.180938 23041 generic.go:334] "Generic (PLEG): container finished" podID="7c8614ce-5cee-41f4-a083-9e7d67b46633" 
containerID="77098c5727b88fdac0ce58e504d450fb8262c8c8f6fabbc2f127b2bdfff72fbb" exitCode=0 Mar 08 00:54:06.181057 master-0 kubenswrapper[23041]: I0308 00:54:06.181010 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-91a3-account-create-update-jwmlg" event={"ID":"7c8614ce-5cee-41f4-a083-9e7d67b46633","Type":"ContainerDied","Data":"77098c5727b88fdac0ce58e504d450fb8262c8c8f6fabbc2f127b2bdfff72fbb"} Mar 08 00:54:06.183918 master-0 kubenswrapper[23041]: I0308 00:54:06.183691 23041 generic.go:334] "Generic (PLEG): container finished" podID="9b782b43-8b7d-487d-9ded-698b28da172d" containerID="94dc6c5d1ba5377cbdc2674baf891cc2b564e03093f188a475d0505af3bd7796" exitCode=0 Mar 08 00:54:06.184005 master-0 kubenswrapper[23041]: I0308 00:54:06.183931 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-45pj6" event={"ID":"9b782b43-8b7d-487d-9ded-698b28da172d","Type":"ContainerDied","Data":"94dc6c5d1ba5377cbdc2674baf891cc2b564e03093f188a475d0505af3bd7796"} Mar 08 00:54:06.186003 master-0 kubenswrapper[23041]: I0308 00:54:06.185896 23041 generic.go:334] "Generic (PLEG): container finished" podID="33d52628-a63b-48e6-ac86-d8df7b20a8e9" containerID="aeb465b11b699ea8b8b910a82242ec60e620aa6f8e2764e1890479487625d414" exitCode=0 Mar 08 00:54:06.186277 master-0 kubenswrapper[23041]: I0308 00:54:06.185965 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-sksvm" event={"ID":"33d52628-a63b-48e6-ac86-d8df7b20a8e9","Type":"ContainerDied","Data":"aeb465b11b699ea8b8b910a82242ec60e620aa6f8e2764e1890479487625d414"} Mar 08 00:54:06.191221 master-0 kubenswrapper[23041]: I0308 00:54:06.187315 23041 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-856bf8b6f6-t9lvl" podUID="fa53a15c-ef65-4753-a29f-53894c55f42f" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 08 00:54:06.195812 master-0 kubenswrapper[23041]: 
I0308 00:54:06.194518 23041 generic.go:334] "Generic (PLEG): container finished" podID="54db0dbf-0b0a-4fdd-9811-876d166896ea" containerID="1d8ad32a2cab1f3ce51692bf35ec05f10f521bf759deaa403ac00e483056b022" exitCode=0 Mar 08 00:54:06.195812 master-0 kubenswrapper[23041]: I0308 00:54:06.194583 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-kg45w" event={"ID":"54db0dbf-0b0a-4fdd-9811-876d166896ea","Type":"ContainerDied","Data":"1d8ad32a2cab1f3ce51692bf35ec05f10f521bf759deaa403ac00e483056b022"} Mar 08 00:54:06.719161 master-0 kubenswrapper[23041]: I0308 00:54:06.719092 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-856bf8b6f6-t9lvl" podStartSLOduration=15.719050968 podStartE2EDuration="15.719050968s" podCreationTimestamp="2026-03-08 00:53:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:54:06.713245296 +0000 UTC m=+1352.186081880" watchObservedRunningTime="2026-03-08 00:54:06.719050968 +0000 UTC m=+1352.191887522" Mar 08 00:54:07.218898 master-0 kubenswrapper[23041]: I0308 00:54:07.218764 23041 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-856bf8b6f6-t9lvl" podUID="fa53a15c-ef65-4753-a29f-53894c55f42f" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 08 00:54:07.298300 master-0 kubenswrapper[23041]: I0308 00:54:07.296883 23041 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-856bf8b6f6-t9lvl" podUID="fa53a15c-ef65-4753-a29f-53894c55f42f" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 08 00:54:07.304842 master-0 kubenswrapper[23041]: I0308 00:54:07.300274 23041 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-856bf8b6f6-t9lvl" podUID="fa53a15c-ef65-4753-a29f-53894c55f42f" 
containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 08 00:54:08.224179 master-0 kubenswrapper[23041]: I0308 00:54:08.224100 23041 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-856bf8b6f6-t9lvl" podUID="fa53a15c-ef65-4753-a29f-53894c55f42f" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 08 00:54:10.946614 master-0 kubenswrapper[23041]: I0308 00:54:10.945730 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-1280f-default-external-api-0"] Mar 08 00:54:10.947249 master-0 kubenswrapper[23041]: I0308 00:54:10.946872 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-1280f-default-external-api-0" podUID="c631af1a-025f-4c65-b202-678d31efbc2d" containerName="glance-log" containerID="cri-o://fb23bcb3de29cd56ece6d3cdc15a6d92669f6ab087162b1b8b2aa35516d08ebc" gracePeriod=30 Mar 08 00:54:10.947651 master-0 kubenswrapper[23041]: I0308 00:54:10.947446 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-1280f-default-external-api-0" podUID="c631af1a-025f-4c65-b202-678d31efbc2d" containerName="glance-httpd" containerID="cri-o://3024bab15b9ea083977d343416d78095bf57369694338550e0140f3e4baef939" gracePeriod=30 Mar 08 00:54:11.335795 master-0 kubenswrapper[23041]: I0308 00:54:11.335746 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-45pj6" Mar 08 00:54:11.353840 master-0 kubenswrapper[23041]: I0308 00:54:11.353767 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-sksvm" event={"ID":"33d52628-a63b-48e6-ac86-d8df7b20a8e9","Type":"ContainerDied","Data":"6812e5c3e0e27f2ac34a265f93d9ec898ad60376561d3a003ef5ccc302e50d11"} Mar 08 00:54:11.353840 master-0 kubenswrapper[23041]: I0308 00:54:11.353830 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6812e5c3e0e27f2ac34a265f93d9ec898ad60376561d3a003ef5ccc302e50d11" Mar 08 00:54:11.357620 master-0 kubenswrapper[23041]: I0308 00:54:11.357530 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-kg45w" event={"ID":"54db0dbf-0b0a-4fdd-9811-876d166896ea","Type":"ContainerDied","Data":"fe346ed1d1e36e287c6ea41bdd79b1de35f438366aae51fdb4fc10dd0ae8ead3"} Mar 08 00:54:11.357786 master-0 kubenswrapper[23041]: I0308 00:54:11.357637 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe346ed1d1e36e287c6ea41bdd79b1de35f438366aae51fdb4fc10dd0ae8ead3" Mar 08 00:54:11.361608 master-0 kubenswrapper[23041]: I0308 00:54:11.361549 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-rztgz" event={"ID":"51f57766-45a4-4e34-918e-d730a59edecd","Type":"ContainerDied","Data":"928a4a1b3ce04cb06a3ed972cb6a9358fc47484efc02a9b795984de0a8451050"} Mar 08 00:54:11.361608 master-0 kubenswrapper[23041]: I0308 00:54:11.361606 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="928a4a1b3ce04cb06a3ed972cb6a9358fc47484efc02a9b795984de0a8451050" Mar 08 00:54:11.367806 master-0 kubenswrapper[23041]: I0308 00:54:11.367760 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-db-sync-sksvm" Mar 08 00:54:11.373279 master-0 kubenswrapper[23041]: I0308 00:54:11.370580 23041 generic.go:334] "Generic (PLEG): container finished" podID="c631af1a-025f-4c65-b202-678d31efbc2d" containerID="fb23bcb3de29cd56ece6d3cdc15a6d92669f6ab087162b1b8b2aa35516d08ebc" exitCode=143 Mar 08 00:54:11.373279 master-0 kubenswrapper[23041]: I0308 00:54:11.370712 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1280f-default-external-api-0" event={"ID":"c631af1a-025f-4c65-b202-678d31efbc2d","Type":"ContainerDied","Data":"fb23bcb3de29cd56ece6d3cdc15a6d92669f6ab087162b1b8b2aa35516d08ebc"} Mar 08 00:54:11.373603 master-0 kubenswrapper[23041]: I0308 00:54:11.373445 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-40fc-account-create-update-hfwpl" event={"ID":"2d98d6c6-e6a9-4b7a-b1a4-f2640bbca524","Type":"ContainerDied","Data":"e3db896314ad04b3e55b47c98cd9748581a65d483cbea8a2fa43c89f181e4021"} Mar 08 00:54:11.373603 master-0 kubenswrapper[23041]: I0308 00:54:11.373484 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3db896314ad04b3e55b47c98cd9748581a65d483cbea8a2fa43c89f181e4021" Mar 08 00:54:11.375271 master-0 kubenswrapper[23041]: I0308 00:54:11.375126 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-kg45w" Mar 08 00:54:11.375537 master-0 kubenswrapper[23041]: I0308 00:54:11.375487 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-da7e-account-create-update-6k64t" event={"ID":"cf3fafa4-62ca-4b35-bd19-00651b4a9d48","Type":"ContainerDied","Data":"87ba0c745bb2c515c82c84abb4a2f652aca9b78eb1267dcd699c4cae92b7a55f"} Mar 08 00:54:11.375537 master-0 kubenswrapper[23041]: I0308 00:54:11.375531 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87ba0c745bb2c515c82c84abb4a2f652aca9b78eb1267dcd699c4cae92b7a55f" Mar 08 00:54:11.378373 master-0 kubenswrapper[23041]: I0308 00:54:11.378312 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-91a3-account-create-update-jwmlg" event={"ID":"7c8614ce-5cee-41f4-a083-9e7d67b46633","Type":"ContainerDied","Data":"57922bcc7c8c1b6d804a056d53e80b8e92b2b111f9226bf0d0ab16638eb0d41f"} Mar 08 00:54:11.378489 master-0 kubenswrapper[23041]: I0308 00:54:11.378379 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57922bcc7c8c1b6d804a056d53e80b8e92b2b111f9226bf0d0ab16638eb0d41f" Mar 08 00:54:11.381228 master-0 kubenswrapper[23041]: I0308 00:54:11.381168 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-45pj6" event={"ID":"9b782b43-8b7d-487d-9ded-698b28da172d","Type":"ContainerDied","Data":"01fd064e5eeaa379bcee8a70638149537f90db4bf4d3e8b4bbf1a531d9e18cd1"} Mar 08 00:54:11.381376 master-0 kubenswrapper[23041]: I0308 00:54:11.381233 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01fd064e5eeaa379bcee8a70638149537f90db4bf4d3e8b4bbf1a531d9e18cd1" Mar 08 00:54:11.381376 master-0 kubenswrapper[23041]: I0308 00:54:11.381313 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-45pj6" Mar 08 00:54:11.392241 master-0 kubenswrapper[23041]: I0308 00:54:11.390812 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-91a3-account-create-update-jwmlg" Mar 08 00:54:11.441153 master-0 kubenswrapper[23041]: I0308 00:54:11.441077 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33d52628-a63b-48e6-ac86-d8df7b20a8e9-combined-ca-bundle\") pod \"33d52628-a63b-48e6-ac86-d8df7b20a8e9\" (UID: \"33d52628-a63b-48e6-ac86-d8df7b20a8e9\") " Mar 08 00:54:11.441485 master-0 kubenswrapper[23041]: I0308 00:54:11.441284 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/33d52628-a63b-48e6-ac86-d8df7b20a8e9-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"33d52628-a63b-48e6-ac86-d8df7b20a8e9\" (UID: \"33d52628-a63b-48e6-ac86-d8df7b20a8e9\") " Mar 08 00:54:11.441485 master-0 kubenswrapper[23041]: I0308 00:54:11.441369 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/33d52628-a63b-48e6-ac86-d8df7b20a8e9-etc-podinfo\") pod \"33d52628-a63b-48e6-ac86-d8df7b20a8e9\" (UID: \"33d52628-a63b-48e6-ac86-d8df7b20a8e9\") " Mar 08 00:54:11.441485 master-0 kubenswrapper[23041]: I0308 00:54:11.441420 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33d52628-a63b-48e6-ac86-d8df7b20a8e9-scripts\") pod \"33d52628-a63b-48e6-ac86-d8df7b20a8e9\" (UID: \"33d52628-a63b-48e6-ac86-d8df7b20a8e9\") " Mar 08 00:54:11.441485 master-0 kubenswrapper[23041]: I0308 00:54:11.441445 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/33d52628-a63b-48e6-ac86-d8df7b20a8e9-config\") pod \"33d52628-a63b-48e6-ac86-d8df7b20a8e9\" (UID: \"33d52628-a63b-48e6-ac86-d8df7b20a8e9\") " Mar 08 00:54:11.441485 master-0 kubenswrapper[23041]: I0308 00:54:11.441470 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b782b43-8b7d-487d-9ded-698b28da172d-operator-scripts\") pod \"9b782b43-8b7d-487d-9ded-698b28da172d\" (UID: \"9b782b43-8b7d-487d-9ded-698b28da172d\") " Mar 08 00:54:11.441705 master-0 kubenswrapper[23041]: I0308 00:54:11.441552 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2f8bj\" (UniqueName: \"kubernetes.io/projected/9b782b43-8b7d-487d-9ded-698b28da172d-kube-api-access-2f8bj\") pod \"9b782b43-8b7d-487d-9ded-698b28da172d\" (UID: \"9b782b43-8b7d-487d-9ded-698b28da172d\") " Mar 08 00:54:11.441705 master-0 kubenswrapper[23041]: I0308 00:54:11.441595 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/33d52628-a63b-48e6-ac86-d8df7b20a8e9-var-lib-ironic\") pod \"33d52628-a63b-48e6-ac86-d8df7b20a8e9\" (UID: \"33d52628-a63b-48e6-ac86-d8df7b20a8e9\") " Mar 08 00:54:11.441705 master-0 kubenswrapper[23041]: I0308 00:54:11.441639 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zg9z\" (UniqueName: \"kubernetes.io/projected/33d52628-a63b-48e6-ac86-d8df7b20a8e9-kube-api-access-7zg9z\") pod \"33d52628-a63b-48e6-ac86-d8df7b20a8e9\" (UID: \"33d52628-a63b-48e6-ac86-d8df7b20a8e9\") " Mar 08 00:54:11.445594 master-0 kubenswrapper[23041]: I0308 00:54:11.445552 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33d52628-a63b-48e6-ac86-d8df7b20a8e9-var-lib-ironic" (OuterVolumeSpecName: "var-lib-ironic") pod "33d52628-a63b-48e6-ac86-d8df7b20a8e9" (UID: 
"33d52628-a63b-48e6-ac86-d8df7b20a8e9"). InnerVolumeSpecName "var-lib-ironic". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:54:11.445820 master-0 kubenswrapper[23041]: I0308 00:54:11.445752 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33d52628-a63b-48e6-ac86-d8df7b20a8e9-var-lib-ironic-inspector-dhcp-hostsdir" (OuterVolumeSpecName: "var-lib-ironic-inspector-dhcp-hostsdir") pod "33d52628-a63b-48e6-ac86-d8df7b20a8e9" (UID: "33d52628-a63b-48e6-ac86-d8df7b20a8e9"). InnerVolumeSpecName "var-lib-ironic-inspector-dhcp-hostsdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:54:11.446169 master-0 kubenswrapper[23041]: I0308 00:54:11.446023 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33d52628-a63b-48e6-ac86-d8df7b20a8e9-scripts" (OuterVolumeSpecName: "scripts") pod "33d52628-a63b-48e6-ac86-d8df7b20a8e9" (UID: "33d52628-a63b-48e6-ac86-d8df7b20a8e9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:54:11.450755 master-0 kubenswrapper[23041]: I0308 00:54:11.449364 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b782b43-8b7d-487d-9ded-698b28da172d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9b782b43-8b7d-487d-9ded-698b28da172d" (UID: "9b782b43-8b7d-487d-9ded-698b28da172d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:54:11.459459 master-0 kubenswrapper[23041]: I0308 00:54:11.459394 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-da7e-account-create-update-6k64t" Mar 08 00:54:11.469986 master-0 kubenswrapper[23041]: I0308 00:54:11.469909 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33d52628-a63b-48e6-ac86-d8df7b20a8e9-kube-api-access-7zg9z" (OuterVolumeSpecName: "kube-api-access-7zg9z") pod "33d52628-a63b-48e6-ac86-d8df7b20a8e9" (UID: "33d52628-a63b-48e6-ac86-d8df7b20a8e9"). InnerVolumeSpecName "kube-api-access-7zg9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:54:11.470375 master-0 kubenswrapper[23041]: I0308 00:54:11.470093 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b782b43-8b7d-487d-9ded-698b28da172d-kube-api-access-2f8bj" (OuterVolumeSpecName: "kube-api-access-2f8bj") pod "9b782b43-8b7d-487d-9ded-698b28da172d" (UID: "9b782b43-8b7d-487d-9ded-698b28da172d"). InnerVolumeSpecName "kube-api-access-2f8bj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:54:11.481843 master-0 kubenswrapper[23041]: I0308 00:54:11.481772 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/33d52628-a63b-48e6-ac86-d8df7b20a8e9-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "33d52628-a63b-48e6-ac86-d8df7b20a8e9" (UID: "33d52628-a63b-48e6-ac86-d8df7b20a8e9"). InnerVolumeSpecName "etc-podinfo". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 08 00:54:11.486920 master-0 kubenswrapper[23041]: I0308 00:54:11.486859 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-40fc-account-create-update-hfwpl" Mar 08 00:54:11.517962 master-0 kubenswrapper[23041]: I0308 00:54:11.517833 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-rztgz" Mar 08 00:54:11.523135 master-0 kubenswrapper[23041]: I0308 00:54:11.523045 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33d52628-a63b-48e6-ac86-d8df7b20a8e9-config" (OuterVolumeSpecName: "config") pod "33d52628-a63b-48e6-ac86-d8df7b20a8e9" (UID: "33d52628-a63b-48e6-ac86-d8df7b20a8e9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:54:11.551922 master-0 kubenswrapper[23041]: I0308 00:54:11.551846 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33d52628-a63b-48e6-ac86-d8df7b20a8e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "33d52628-a63b-48e6-ac86-d8df7b20a8e9" (UID: "33d52628-a63b-48e6-ac86-d8df7b20a8e9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:54:11.553301 master-0 kubenswrapper[23041]: I0308 00:54:11.552892 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54db0dbf-0b0a-4fdd-9811-876d166896ea-operator-scripts\") pod \"54db0dbf-0b0a-4fdd-9811-876d166896ea\" (UID: \"54db0dbf-0b0a-4fdd-9811-876d166896ea\") " Mar 08 00:54:11.553301 master-0 kubenswrapper[23041]: I0308 00:54:11.552981 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22vtf\" (UniqueName: \"kubernetes.io/projected/7c8614ce-5cee-41f4-a083-9e7d67b46633-kube-api-access-22vtf\") pod \"7c8614ce-5cee-41f4-a083-9e7d67b46633\" (UID: \"7c8614ce-5cee-41f4-a083-9e7d67b46633\") " Mar 08 00:54:11.553301 master-0 kubenswrapper[23041]: I0308 00:54:11.553032 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmzdr\" (UniqueName: \"kubernetes.io/projected/cf3fafa4-62ca-4b35-bd19-00651b4a9d48-kube-api-access-dmzdr\") 
pod \"cf3fafa4-62ca-4b35-bd19-00651b4a9d48\" (UID: \"cf3fafa4-62ca-4b35-bd19-00651b4a9d48\") " Mar 08 00:54:11.553301 master-0 kubenswrapper[23041]: I0308 00:54:11.553098 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33d52628-a63b-48e6-ac86-d8df7b20a8e9-combined-ca-bundle\") pod \"33d52628-a63b-48e6-ac86-d8df7b20a8e9\" (UID: \"33d52628-a63b-48e6-ac86-d8df7b20a8e9\") " Mar 08 00:54:11.554713 master-0 kubenswrapper[23041]: I0308 00:54:11.553724 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54db0dbf-0b0a-4fdd-9811-876d166896ea-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "54db0dbf-0b0a-4fdd-9811-876d166896ea" (UID: "54db0dbf-0b0a-4fdd-9811-876d166896ea"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:54:11.554713 master-0 kubenswrapper[23041]: W0308 00:54:11.553782 23041 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/33d52628-a63b-48e6-ac86-d8df7b20a8e9/volumes/kubernetes.io~secret/combined-ca-bundle Mar 08 00:54:11.554713 master-0 kubenswrapper[23041]: I0308 00:54:11.553820 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33d52628-a63b-48e6-ac86-d8df7b20a8e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "33d52628-a63b-48e6-ac86-d8df7b20a8e9" (UID: "33d52628-a63b-48e6-ac86-d8df7b20a8e9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:54:11.555952 master-0 kubenswrapper[23041]: I0308 00:54:11.555435 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qklhp\" (UniqueName: \"kubernetes.io/projected/54db0dbf-0b0a-4fdd-9811-876d166896ea-kube-api-access-qklhp\") pod \"54db0dbf-0b0a-4fdd-9811-876d166896ea\" (UID: \"54db0dbf-0b0a-4fdd-9811-876d166896ea\") " Mar 08 00:54:11.555952 master-0 kubenswrapper[23041]: I0308 00:54:11.555504 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf3fafa4-62ca-4b35-bd19-00651b4a9d48-operator-scripts\") pod \"cf3fafa4-62ca-4b35-bd19-00651b4a9d48\" (UID: \"cf3fafa4-62ca-4b35-bd19-00651b4a9d48\") " Mar 08 00:54:11.555952 master-0 kubenswrapper[23041]: I0308 00:54:11.555600 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c8614ce-5cee-41f4-a083-9e7d67b46633-operator-scripts\") pod \"7c8614ce-5cee-41f4-a083-9e7d67b46633\" (UID: \"7c8614ce-5cee-41f4-a083-9e7d67b46633\") " Mar 08 00:54:11.556573 master-0 kubenswrapper[23041]: I0308 00:54:11.556508 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c8614ce-5cee-41f4-a083-9e7d67b46633-kube-api-access-22vtf" (OuterVolumeSpecName: "kube-api-access-22vtf") pod "7c8614ce-5cee-41f4-a083-9e7d67b46633" (UID: "7c8614ce-5cee-41f4-a083-9e7d67b46633"). InnerVolumeSpecName "kube-api-access-22vtf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:54:11.559521 master-0 kubenswrapper[23041]: I0308 00:54:11.559457 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf3fafa4-62ca-4b35-bd19-00651b4a9d48-kube-api-access-dmzdr" (OuterVolumeSpecName: "kube-api-access-dmzdr") pod "cf3fafa4-62ca-4b35-bd19-00651b4a9d48" (UID: "cf3fafa4-62ca-4b35-bd19-00651b4a9d48"). InnerVolumeSpecName "kube-api-access-dmzdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:54:11.559521 master-0 kubenswrapper[23041]: I0308 00:54:11.559465 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf3fafa4-62ca-4b35-bd19-00651b4a9d48-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cf3fafa4-62ca-4b35-bd19-00651b4a9d48" (UID: "cf3fafa4-62ca-4b35-bd19-00651b4a9d48"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:54:11.559964 master-0 kubenswrapper[23041]: I0308 00:54:11.559904 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c8614ce-5cee-41f4-a083-9e7d67b46633-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7c8614ce-5cee-41f4-a083-9e7d67b46633" (UID: "7c8614ce-5cee-41f4-a083-9e7d67b46633"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:54:11.560863 master-0 kubenswrapper[23041]: I0308 00:54:11.560823 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22vtf\" (UniqueName: \"kubernetes.io/projected/7c8614ce-5cee-41f4-a083-9e7d67b46633-kube-api-access-22vtf\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:11.561019 master-0 kubenswrapper[23041]: I0308 00:54:11.560952 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmzdr\" (UniqueName: \"kubernetes.io/projected/cf3fafa4-62ca-4b35-bd19-00651b4a9d48-kube-api-access-dmzdr\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:11.561019 master-0 kubenswrapper[23041]: I0308 00:54:11.560972 23041 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33d52628-a63b-48e6-ac86-d8df7b20a8e9-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:11.561019 master-0 kubenswrapper[23041]: I0308 00:54:11.560987 23041 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/33d52628-a63b-48e6-ac86-d8df7b20a8e9-var-lib-ironic-inspector-dhcp-hostsdir\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:11.561223 master-0 kubenswrapper[23041]: I0308 00:54:11.561001 23041 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/33d52628-a63b-48e6-ac86-d8df7b20a8e9-etc-podinfo\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:11.561378 master-0 kubenswrapper[23041]: I0308 00:54:11.561172 23041 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33d52628-a63b-48e6-ac86-d8df7b20a8e9-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:11.561378 master-0 kubenswrapper[23041]: I0308 00:54:11.561315 23041 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/33d52628-a63b-48e6-ac86-d8df7b20a8e9-config\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:11.561378 master-0 kubenswrapper[23041]: I0308 00:54:11.561329 23041 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b782b43-8b7d-487d-9ded-698b28da172d-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:11.561378 master-0 kubenswrapper[23041]: I0308 00:54:11.561342 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2f8bj\" (UniqueName: \"kubernetes.io/projected/9b782b43-8b7d-487d-9ded-698b28da172d-kube-api-access-2f8bj\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:11.561378 master-0 kubenswrapper[23041]: I0308 00:54:11.561356 23041 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf3fafa4-62ca-4b35-bd19-00651b4a9d48-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:11.561649 master-0 kubenswrapper[23041]: I0308 00:54:11.561580 23041 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/33d52628-a63b-48e6-ac86-d8df7b20a8e9-var-lib-ironic\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:11.561649 master-0 kubenswrapper[23041]: I0308 00:54:11.561599 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zg9z\" (UniqueName: \"kubernetes.io/projected/33d52628-a63b-48e6-ac86-d8df7b20a8e9-kube-api-access-7zg9z\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:11.561649 master-0 kubenswrapper[23041]: I0308 00:54:11.561610 23041 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c8614ce-5cee-41f4-a083-9e7d67b46633-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:11.561649 master-0 kubenswrapper[23041]: I0308 00:54:11.561622 23041 reconciler_common.go:293] "Volume detached for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54db0dbf-0b0a-4fdd-9811-876d166896ea-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:11.570452 master-0 kubenswrapper[23041]: I0308 00:54:11.563309 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54db0dbf-0b0a-4fdd-9811-876d166896ea-kube-api-access-qklhp" (OuterVolumeSpecName: "kube-api-access-qklhp") pod "54db0dbf-0b0a-4fdd-9811-876d166896ea" (UID: "54db0dbf-0b0a-4fdd-9811-876d166896ea"). InnerVolumeSpecName "kube-api-access-qklhp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:54:11.663418 master-0 kubenswrapper[23041]: I0308 00:54:11.663350 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tkb2\" (UniqueName: \"kubernetes.io/projected/2d98d6c6-e6a9-4b7a-b1a4-f2640bbca524-kube-api-access-5tkb2\") pod \"2d98d6c6-e6a9-4b7a-b1a4-f2640bbca524\" (UID: \"2d98d6c6-e6a9-4b7a-b1a4-f2640bbca524\") " Mar 08 00:54:11.663418 master-0 kubenswrapper[23041]: I0308 00:54:11.663432 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6c44z\" (UniqueName: \"kubernetes.io/projected/51f57766-45a4-4e34-918e-d730a59edecd-kube-api-access-6c44z\") pod \"51f57766-45a4-4e34-918e-d730a59edecd\" (UID: \"51f57766-45a4-4e34-918e-d730a59edecd\") " Mar 08 00:54:11.663956 master-0 kubenswrapper[23041]: I0308 00:54:11.663483 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51f57766-45a4-4e34-918e-d730a59edecd-operator-scripts\") pod \"51f57766-45a4-4e34-918e-d730a59edecd\" (UID: \"51f57766-45a4-4e34-918e-d730a59edecd\") " Mar 08 00:54:11.663956 master-0 kubenswrapper[23041]: I0308 00:54:11.663540 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/2d98d6c6-e6a9-4b7a-b1a4-f2640bbca524-operator-scripts\") pod \"2d98d6c6-e6a9-4b7a-b1a4-f2640bbca524\" (UID: \"2d98d6c6-e6a9-4b7a-b1a4-f2640bbca524\") " Mar 08 00:54:11.664338 master-0 kubenswrapper[23041]: I0308 00:54:11.664267 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qklhp\" (UniqueName: \"kubernetes.io/projected/54db0dbf-0b0a-4fdd-9811-876d166896ea-kube-api-access-qklhp\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:11.664861 master-0 kubenswrapper[23041]: I0308 00:54:11.664823 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d98d6c6-e6a9-4b7a-b1a4-f2640bbca524-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2d98d6c6-e6a9-4b7a-b1a4-f2640bbca524" (UID: "2d98d6c6-e6a9-4b7a-b1a4-f2640bbca524"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:54:11.664962 master-0 kubenswrapper[23041]: I0308 00:54:11.664913 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51f57766-45a4-4e34-918e-d730a59edecd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "51f57766-45a4-4e34-918e-d730a59edecd" (UID: "51f57766-45a4-4e34-918e-d730a59edecd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:54:11.666760 master-0 kubenswrapper[23041]: I0308 00:54:11.666714 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d98d6c6-e6a9-4b7a-b1a4-f2640bbca524-kube-api-access-5tkb2" (OuterVolumeSpecName: "kube-api-access-5tkb2") pod "2d98d6c6-e6a9-4b7a-b1a4-f2640bbca524" (UID: "2d98d6c6-e6a9-4b7a-b1a4-f2640bbca524"). InnerVolumeSpecName "kube-api-access-5tkb2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:54:11.669322 master-0 kubenswrapper[23041]: I0308 00:54:11.669172 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51f57766-45a4-4e34-918e-d730a59edecd-kube-api-access-6c44z" (OuterVolumeSpecName: "kube-api-access-6c44z") pod "51f57766-45a4-4e34-918e-d730a59edecd" (UID: "51f57766-45a4-4e34-918e-d730a59edecd"). InnerVolumeSpecName "kube-api-access-6c44z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:54:11.766693 master-0 kubenswrapper[23041]: I0308 00:54:11.766619 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tkb2\" (UniqueName: \"kubernetes.io/projected/2d98d6c6-e6a9-4b7a-b1a4-f2640bbca524-kube-api-access-5tkb2\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:11.766693 master-0 kubenswrapper[23041]: I0308 00:54:11.766684 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6c44z\" (UniqueName: \"kubernetes.io/projected/51f57766-45a4-4e34-918e-d730a59edecd-kube-api-access-6c44z\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:11.766693 master-0 kubenswrapper[23041]: I0308 00:54:11.766701 23041 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51f57766-45a4-4e34-918e-d730a59edecd-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:11.767032 master-0 kubenswrapper[23041]: I0308 00:54:11.766718 23041 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d98d6c6-e6a9-4b7a-b1a4-f2640bbca524-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:11.936182 master-0 kubenswrapper[23041]: I0308 00:54:11.936053 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-1280f-default-internal-api-0"] Mar 08 00:54:11.936812 master-0 kubenswrapper[23041]: I0308 00:54:11.936704 23041 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-1280f-default-internal-api-0" podUID="c81e602d-26e5-49ac-92d1-71fea2607868" containerName="glance-log" containerID="cri-o://952aad800122f0c5297b7769b87af95cfeabc1e5b270679a6f4cf94801ac2b3f" gracePeriod=30 Mar 08 00:54:11.936920 master-0 kubenswrapper[23041]: I0308 00:54:11.936729 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-1280f-default-internal-api-0" podUID="c81e602d-26e5-49ac-92d1-71fea2607868" containerName="glance-httpd" containerID="cri-o://052793deb444ddfa766fae4a335f25279de72c0d02a2316a2e213e56bed30759" gracePeriod=30 Mar 08 00:54:12.287777 master-0 kubenswrapper[23041]: I0308 00:54:12.287636 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-856bf8b6f6-t9lvl" Mar 08 00:54:12.289223 master-0 kubenswrapper[23041]: I0308 00:54:12.289006 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-856bf8b6f6-t9lvl" Mar 08 00:54:12.423586 master-0 kubenswrapper[23041]: I0308 00:54:12.423521 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"7de414ad-fdb9-4bfb-a953-439cbf3fb81e","Type":"ContainerStarted","Data":"8ec1470ce184a70d9e2d06c294f014f8e29e9a0d53b4b76d2e51c924b2b53d7e"} Mar 08 00:54:12.440946 master-0 kubenswrapper[23041]: I0308 00:54:12.440851 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"5fd31740-3478-41e5-8295-d4b50f40db04","Type":"ContainerStarted","Data":"aa121cb8dfbc38b21bce954d5ae242477fe582d22b9dfb0358c3bd4674afb90b"} Mar 08 00:54:12.479221 master-0 kubenswrapper[23041]: I0308 00:54:12.479150 23041 generic.go:334] "Generic (PLEG): container finished" podID="c81e602d-26e5-49ac-92d1-71fea2607868" containerID="952aad800122f0c5297b7769b87af95cfeabc1e5b270679a6f4cf94801ac2b3f" exitCode=143 Mar 08 00:54:12.480130 
master-0 kubenswrapper[23041]: I0308 00:54:12.480108 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-91a3-account-create-update-jwmlg" Mar 08 00:54:12.491262 master-0 kubenswrapper[23041]: I0308 00:54:12.480336 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1280f-default-internal-api-0" event={"ID":"c81e602d-26e5-49ac-92d1-71fea2607868","Type":"ContainerDied","Data":"952aad800122f0c5297b7769b87af95cfeabc1e5b270679a6f4cf94801ac2b3f"} Mar 08 00:54:12.491262 master-0 kubenswrapper[23041]: I0308 00:54:12.480456 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-sync-sksvm" Mar 08 00:54:12.492190 master-0 kubenswrapper[23041]: I0308 00:54:12.485462 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-rztgz" Mar 08 00:54:12.497865 master-0 kubenswrapper[23041]: I0308 00:54:12.485510 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-40fc-account-create-update-hfwpl" Mar 08 00:54:12.499737 master-0 kubenswrapper[23041]: I0308 00:54:12.485540 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-da7e-account-create-update-6k64t" Mar 08 00:54:12.503035 master-0 kubenswrapper[23041]: I0308 00:54:12.485588 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-kg45w" Mar 08 00:54:12.589246 master-0 kubenswrapper[23041]: I0308 00:54:12.586616 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.324367238 podStartE2EDuration="27.586593298s" podCreationTimestamp="2026-03-08 00:53:45 +0000 UTC" firstStartedPulling="2026-03-08 00:53:46.836042387 +0000 UTC m=+1332.308878931" lastFinishedPulling="2026-03-08 00:54:11.098268447 +0000 UTC m=+1356.571104991" observedRunningTime="2026-03-08 00:54:12.457422032 +0000 UTC m=+1357.930258606" watchObservedRunningTime="2026-03-08 00:54:12.586593298 +0000 UTC m=+1358.059429852" Mar 08 00:54:12.810776 master-0 kubenswrapper[23041]: I0308 00:54:12.810646 23041 scope.go:117] "RemoveContainer" containerID="4792fdf65f1907ff7e2565afb3a964e9ef62317dd71687a74905d3610b602a64" Mar 08 00:54:13.492427 master-0 kubenswrapper[23041]: I0308 00:54:13.492289 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-7dffdc6989-dw4bq" event={"ID":"a94dba9c-1e25-42ed-b30a-d278979d1de9","Type":"ContainerStarted","Data":"f4bcaa21d867603e5501b47db25ae76442c636dc119f44d4e9a88ea4eda273f5"} Mar 08 00:54:13.493036 master-0 kubenswrapper[23041]: I0308 00:54:13.492523 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-7dffdc6989-dw4bq" Mar 08 00:54:13.647576 master-0 kubenswrapper[23041]: I0308 00:54:13.646799 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wcsvr"] Mar 08 00:54:13.647576 master-0 kubenswrapper[23041]: E0308 00:54:13.647261 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85f7cb75-9466-47eb-bd3a-da17df2b5c2a" containerName="ironic-api-log" Mar 08 00:54:13.647576 master-0 kubenswrapper[23041]: I0308 00:54:13.647275 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="85f7cb75-9466-47eb-bd3a-da17df2b5c2a" 
containerName="ironic-api-log" Mar 08 00:54:13.647576 master-0 kubenswrapper[23041]: E0308 00:54:13.647293 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85f7cb75-9466-47eb-bd3a-da17df2b5c2a" containerName="ironic-api" Mar 08 00:54:13.647576 master-0 kubenswrapper[23041]: I0308 00:54:13.647300 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="85f7cb75-9466-47eb-bd3a-da17df2b5c2a" containerName="ironic-api" Mar 08 00:54:13.647576 master-0 kubenswrapper[23041]: E0308 00:54:13.647315 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf3fafa4-62ca-4b35-bd19-00651b4a9d48" containerName="mariadb-account-create-update" Mar 08 00:54:13.647576 master-0 kubenswrapper[23041]: I0308 00:54:13.647325 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf3fafa4-62ca-4b35-bd19-00651b4a9d48" containerName="mariadb-account-create-update" Mar 08 00:54:13.647576 master-0 kubenswrapper[23041]: E0308 00:54:13.647337 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85f7cb75-9466-47eb-bd3a-da17df2b5c2a" containerName="init" Mar 08 00:54:13.647576 master-0 kubenswrapper[23041]: I0308 00:54:13.647344 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="85f7cb75-9466-47eb-bd3a-da17df2b5c2a" containerName="init" Mar 08 00:54:13.647576 master-0 kubenswrapper[23041]: E0308 00:54:13.647380 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33d52628-a63b-48e6-ac86-d8df7b20a8e9" containerName="ironic-inspector-db-sync" Mar 08 00:54:13.647576 master-0 kubenswrapper[23041]: I0308 00:54:13.647388 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="33d52628-a63b-48e6-ac86-d8df7b20a8e9" containerName="ironic-inspector-db-sync" Mar 08 00:54:13.647576 master-0 kubenswrapper[23041]: E0308 00:54:13.647431 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b782b43-8b7d-487d-9ded-698b28da172d" containerName="mariadb-database-create" Mar 08 00:54:13.647576 
master-0 kubenswrapper[23041]: I0308 00:54:13.647439 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b782b43-8b7d-487d-9ded-698b28da172d" containerName="mariadb-database-create" Mar 08 00:54:13.647576 master-0 kubenswrapper[23041]: E0308 00:54:13.647454 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c8614ce-5cee-41f4-a083-9e7d67b46633" containerName="mariadb-account-create-update" Mar 08 00:54:13.647576 master-0 kubenswrapper[23041]: I0308 00:54:13.647462 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c8614ce-5cee-41f4-a083-9e7d67b46633" containerName="mariadb-account-create-update" Mar 08 00:54:13.647576 master-0 kubenswrapper[23041]: E0308 00:54:13.647475 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d98d6c6-e6a9-4b7a-b1a4-f2640bbca524" containerName="mariadb-account-create-update" Mar 08 00:54:13.647576 master-0 kubenswrapper[23041]: I0308 00:54:13.647481 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d98d6c6-e6a9-4b7a-b1a4-f2640bbca524" containerName="mariadb-account-create-update" Mar 08 00:54:13.647576 master-0 kubenswrapper[23041]: E0308 00:54:13.647501 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47c3f888-4804-49d5-859a-73983e7c5414" containerName="neutron-httpd" Mar 08 00:54:13.647576 master-0 kubenswrapper[23041]: I0308 00:54:13.647507 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="47c3f888-4804-49d5-859a-73983e7c5414" containerName="neutron-httpd" Mar 08 00:54:13.647576 master-0 kubenswrapper[23041]: E0308 00:54:13.647516 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47c3f888-4804-49d5-859a-73983e7c5414" containerName="neutron-api" Mar 08 00:54:13.648660 master-0 kubenswrapper[23041]: I0308 00:54:13.647521 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="47c3f888-4804-49d5-859a-73983e7c5414" containerName="neutron-api" Mar 08 00:54:13.648660 master-0 kubenswrapper[23041]: E0308 
00:54:13.648549 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54db0dbf-0b0a-4fdd-9811-876d166896ea" containerName="mariadb-database-create" Mar 08 00:54:13.648660 master-0 kubenswrapper[23041]: I0308 00:54:13.648567 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="54db0dbf-0b0a-4fdd-9811-876d166896ea" containerName="mariadb-database-create" Mar 08 00:54:13.648660 master-0 kubenswrapper[23041]: E0308 00:54:13.648592 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51f57766-45a4-4e34-918e-d730a59edecd" containerName="mariadb-database-create" Mar 08 00:54:13.648660 master-0 kubenswrapper[23041]: I0308 00:54:13.648598 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="51f57766-45a4-4e34-918e-d730a59edecd" containerName="mariadb-database-create" Mar 08 00:54:13.648884 master-0 kubenswrapper[23041]: I0308 00:54:13.648823 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c8614ce-5cee-41f4-a083-9e7d67b46633" containerName="mariadb-account-create-update" Mar 08 00:54:13.648884 master-0 kubenswrapper[23041]: I0308 00:54:13.648841 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="85f7cb75-9466-47eb-bd3a-da17df2b5c2a" containerName="ironic-api-log" Mar 08 00:54:13.648884 master-0 kubenswrapper[23041]: I0308 00:54:13.648852 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b782b43-8b7d-487d-9ded-698b28da172d" containerName="mariadb-database-create" Mar 08 00:54:13.648884 master-0 kubenswrapper[23041]: I0308 00:54:13.648877 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="85f7cb75-9466-47eb-bd3a-da17df2b5c2a" containerName="ironic-api" Mar 08 00:54:13.648884 master-0 kubenswrapper[23041]: I0308 00:54:13.648886 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="51f57766-45a4-4e34-918e-d730a59edecd" containerName="mariadb-database-create" Mar 08 00:54:13.649115 master-0 kubenswrapper[23041]: I0308 
00:54:13.648898 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="33d52628-a63b-48e6-ac86-d8df7b20a8e9" containerName="ironic-inspector-db-sync" Mar 08 00:54:13.649115 master-0 kubenswrapper[23041]: I0308 00:54:13.648909 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="85f7cb75-9466-47eb-bd3a-da17df2b5c2a" containerName="ironic-api" Mar 08 00:54:13.649115 master-0 kubenswrapper[23041]: I0308 00:54:13.648922 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d98d6c6-e6a9-4b7a-b1a4-f2640bbca524" containerName="mariadb-account-create-update" Mar 08 00:54:13.649115 master-0 kubenswrapper[23041]: I0308 00:54:13.648932 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="47c3f888-4804-49d5-859a-73983e7c5414" containerName="neutron-api" Mar 08 00:54:13.649115 master-0 kubenswrapper[23041]: I0308 00:54:13.648948 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="54db0dbf-0b0a-4fdd-9811-876d166896ea" containerName="mariadb-database-create" Mar 08 00:54:13.649115 master-0 kubenswrapper[23041]: I0308 00:54:13.648954 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf3fafa4-62ca-4b35-bd19-00651b4a9d48" containerName="mariadb-account-create-update" Mar 08 00:54:13.649115 master-0 kubenswrapper[23041]: I0308 00:54:13.648968 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="47c3f888-4804-49d5-859a-73983e7c5414" containerName="neutron-httpd" Mar 08 00:54:13.649666 master-0 kubenswrapper[23041]: I0308 00:54:13.649633 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wcsvr" Mar 08 00:54:13.657005 master-0 kubenswrapper[23041]: I0308 00:54:13.656954 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 08 00:54:13.658553 master-0 kubenswrapper[23041]: I0308 00:54:13.658521 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 08 00:54:13.668710 master-0 kubenswrapper[23041]: I0308 00:54:13.668654 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wcsvr"] Mar 08 00:54:13.731695 master-0 kubenswrapper[23041]: I0308 00:54:13.731609 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvbhf\" (UniqueName: \"kubernetes.io/projected/65e42302-3592-46fd-b7a9-b125bf61382b-kube-api-access-bvbhf\") pod \"nova-cell0-conductor-db-sync-wcsvr\" (UID: \"65e42302-3592-46fd-b7a9-b125bf61382b\") " pod="openstack/nova-cell0-conductor-db-sync-wcsvr" Mar 08 00:54:13.732417 master-0 kubenswrapper[23041]: I0308 00:54:13.732336 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65e42302-3592-46fd-b7a9-b125bf61382b-scripts\") pod \"nova-cell0-conductor-db-sync-wcsvr\" (UID: \"65e42302-3592-46fd-b7a9-b125bf61382b\") " pod="openstack/nova-cell0-conductor-db-sync-wcsvr" Mar 08 00:54:13.732754 master-0 kubenswrapper[23041]: I0308 00:54:13.732732 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65e42302-3592-46fd-b7a9-b125bf61382b-config-data\") pod \"nova-cell0-conductor-db-sync-wcsvr\" (UID: \"65e42302-3592-46fd-b7a9-b125bf61382b\") " pod="openstack/nova-cell0-conductor-db-sync-wcsvr" Mar 08 00:54:13.732957 master-0 kubenswrapper[23041]: I0308 
00:54:13.732918 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65e42302-3592-46fd-b7a9-b125bf61382b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wcsvr\" (UID: \"65e42302-3592-46fd-b7a9-b125bf61382b\") " pod="openstack/nova-cell0-conductor-db-sync-wcsvr" Mar 08 00:54:13.835577 master-0 kubenswrapper[23041]: I0308 00:54:13.835493 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65e42302-3592-46fd-b7a9-b125bf61382b-config-data\") pod \"nova-cell0-conductor-db-sync-wcsvr\" (UID: \"65e42302-3592-46fd-b7a9-b125bf61382b\") " pod="openstack/nova-cell0-conductor-db-sync-wcsvr" Mar 08 00:54:13.835577 master-0 kubenswrapper[23041]: I0308 00:54:13.835571 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65e42302-3592-46fd-b7a9-b125bf61382b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wcsvr\" (UID: \"65e42302-3592-46fd-b7a9-b125bf61382b\") " pod="openstack/nova-cell0-conductor-db-sync-wcsvr" Mar 08 00:54:13.836024 master-0 kubenswrapper[23041]: I0308 00:54:13.835661 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvbhf\" (UniqueName: \"kubernetes.io/projected/65e42302-3592-46fd-b7a9-b125bf61382b-kube-api-access-bvbhf\") pod \"nova-cell0-conductor-db-sync-wcsvr\" (UID: \"65e42302-3592-46fd-b7a9-b125bf61382b\") " pod="openstack/nova-cell0-conductor-db-sync-wcsvr" Mar 08 00:54:13.836024 master-0 kubenswrapper[23041]: I0308 00:54:13.835779 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65e42302-3592-46fd-b7a9-b125bf61382b-scripts\") pod \"nova-cell0-conductor-db-sync-wcsvr\" (UID: \"65e42302-3592-46fd-b7a9-b125bf61382b\") " 
pod="openstack/nova-cell0-conductor-db-sync-wcsvr" Mar 08 00:54:13.848351 master-0 kubenswrapper[23041]: I0308 00:54:13.839524 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65e42302-3592-46fd-b7a9-b125bf61382b-scripts\") pod \"nova-cell0-conductor-db-sync-wcsvr\" (UID: \"65e42302-3592-46fd-b7a9-b125bf61382b\") " pod="openstack/nova-cell0-conductor-db-sync-wcsvr" Mar 08 00:54:13.848351 master-0 kubenswrapper[23041]: I0308 00:54:13.840174 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65e42302-3592-46fd-b7a9-b125bf61382b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wcsvr\" (UID: \"65e42302-3592-46fd-b7a9-b125bf61382b\") " pod="openstack/nova-cell0-conductor-db-sync-wcsvr" Mar 08 00:54:13.848351 master-0 kubenswrapper[23041]: I0308 00:54:13.845913 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65e42302-3592-46fd-b7a9-b125bf61382b-config-data\") pod \"nova-cell0-conductor-db-sync-wcsvr\" (UID: \"65e42302-3592-46fd-b7a9-b125bf61382b\") " pod="openstack/nova-cell0-conductor-db-sync-wcsvr" Mar 08 00:54:13.859294 master-0 kubenswrapper[23041]: I0308 00:54:13.859248 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvbhf\" (UniqueName: \"kubernetes.io/projected/65e42302-3592-46fd-b7a9-b125bf61382b-kube-api-access-bvbhf\") pod \"nova-cell0-conductor-db-sync-wcsvr\" (UID: \"65e42302-3592-46fd-b7a9-b125bf61382b\") " pod="openstack/nova-cell0-conductor-db-sync-wcsvr" Mar 08 00:54:13.969118 master-0 kubenswrapper[23041]: I0308 00:54:13.969058 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wcsvr" Mar 08 00:54:14.542517 master-0 kubenswrapper[23041]: I0308 00:54:14.542447 23041 generic.go:334] "Generic (PLEG): container finished" podID="c631af1a-025f-4c65-b202-678d31efbc2d" containerID="3024bab15b9ea083977d343416d78095bf57369694338550e0140f3e4baef939" exitCode=0 Mar 08 00:54:14.543481 master-0 kubenswrapper[23041]: I0308 00:54:14.543337 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1280f-default-external-api-0" event={"ID":"c631af1a-025f-4c65-b202-678d31efbc2d","Type":"ContainerDied","Data":"3024bab15b9ea083977d343416d78095bf57369694338550e0140f3e4baef939"} Mar 08 00:54:14.658408 master-0 kubenswrapper[23041]: I0308 00:54:14.656808 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wcsvr"] Mar 08 00:54:14.708486 master-0 kubenswrapper[23041]: I0308 00:54:14.704784 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77475956d7-pp5mp"] Mar 08 00:54:14.708486 master-0 kubenswrapper[23041]: E0308 00:54:14.706294 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85f7cb75-9466-47eb-bd3a-da17df2b5c2a" containerName="ironic-api" Mar 08 00:54:14.708486 master-0 kubenswrapper[23041]: I0308 00:54:14.706317 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="85f7cb75-9466-47eb-bd3a-da17df2b5c2a" containerName="ironic-api" Mar 08 00:54:14.710192 master-0 kubenswrapper[23041]: I0308 00:54:14.708637 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77475956d7-pp5mp" Mar 08 00:54:14.784367 master-0 kubenswrapper[23041]: I0308 00:54:14.783684 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77475956d7-pp5mp"] Mar 08 00:54:14.855261 master-0 kubenswrapper[23041]: I0308 00:54:14.855178 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-0"] Mar 08 00:54:14.861177 master-0 kubenswrapper[23041]: I0308 00:54:14.861120 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-0" Mar 08 00:54:14.870516 master-0 kubenswrapper[23041]: I0308 00:54:14.870469 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-transport-url-ironic-inspector-transport" Mar 08 00:54:14.870710 master-0 kubenswrapper[23041]: I0308 00:54:14.870557 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-config-data" Mar 08 00:54:14.870710 master-0 kubenswrapper[23041]: I0308 00:54:14.870485 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-scripts" Mar 08 00:54:14.873760 master-0 kubenswrapper[23041]: I0308 00:54:14.873642 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/184ea98e-2eac-49e3-a090-c412857c1df4-ovsdbserver-nb\") pod \"dnsmasq-dns-77475956d7-pp5mp\" (UID: \"184ea98e-2eac-49e3-a090-c412857c1df4\") " pod="openstack/dnsmasq-dns-77475956d7-pp5mp" Mar 08 00:54:14.873760 master-0 kubenswrapper[23041]: I0308 00:54:14.873699 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/184ea98e-2eac-49e3-a090-c412857c1df4-ovsdbserver-sb\") pod \"dnsmasq-dns-77475956d7-pp5mp\" (UID: \"184ea98e-2eac-49e3-a090-c412857c1df4\") " 
pod="openstack/dnsmasq-dns-77475956d7-pp5mp" Mar 08 00:54:14.873760 master-0 kubenswrapper[23041]: I0308 00:54:14.873752 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwnsz\" (UniqueName: \"kubernetes.io/projected/184ea98e-2eac-49e3-a090-c412857c1df4-kube-api-access-gwnsz\") pod \"dnsmasq-dns-77475956d7-pp5mp\" (UID: \"184ea98e-2eac-49e3-a090-c412857c1df4\") " pod="openstack/dnsmasq-dns-77475956d7-pp5mp" Mar 08 00:54:14.874529 master-0 kubenswrapper[23041]: I0308 00:54:14.873770 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/184ea98e-2eac-49e3-a090-c412857c1df4-config\") pod \"dnsmasq-dns-77475956d7-pp5mp\" (UID: \"184ea98e-2eac-49e3-a090-c412857c1df4\") " pod="openstack/dnsmasq-dns-77475956d7-pp5mp" Mar 08 00:54:14.874529 master-0 kubenswrapper[23041]: I0308 00:54:14.873823 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/184ea98e-2eac-49e3-a090-c412857c1df4-dns-swift-storage-0\") pod \"dnsmasq-dns-77475956d7-pp5mp\" (UID: \"184ea98e-2eac-49e3-a090-c412857c1df4\") " pod="openstack/dnsmasq-dns-77475956d7-pp5mp" Mar 08 00:54:14.879018 master-0 kubenswrapper[23041]: I0308 00:54:14.878909 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/184ea98e-2eac-49e3-a090-c412857c1df4-dns-svc\") pod \"dnsmasq-dns-77475956d7-pp5mp\" (UID: \"184ea98e-2eac-49e3-a090-c412857c1df4\") " pod="openstack/dnsmasq-dns-77475956d7-pp5mp" Mar 08 00:54:14.895715 master-0 kubenswrapper[23041]: I0308 00:54:14.895635 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"] Mar 08 00:54:14.930829 master-0 kubenswrapper[23041]: I0308 00:54:14.928087 23041 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1280f-default-external-api-0" Mar 08 00:54:14.990182 master-0 kubenswrapper[23041]: I0308 00:54:14.990037 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^60a4d41a-f098-4c1e-9a4b-472c7a2f79cf\") pod \"c631af1a-025f-4c65-b202-678d31efbc2d\" (UID: \"c631af1a-025f-4c65-b202-678d31efbc2d\") " Mar 08 00:54:14.990182 master-0 kubenswrapper[23041]: I0308 00:54:14.990116 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c631af1a-025f-4c65-b202-678d31efbc2d-httpd-run\") pod \"c631af1a-025f-4c65-b202-678d31efbc2d\" (UID: \"c631af1a-025f-4c65-b202-678d31efbc2d\") " Mar 08 00:54:14.990423 master-0 kubenswrapper[23041]: I0308 00:54:14.990257 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c631af1a-025f-4c65-b202-678d31efbc2d-combined-ca-bundle\") pod \"c631af1a-025f-4c65-b202-678d31efbc2d\" (UID: \"c631af1a-025f-4c65-b202-678d31efbc2d\") " Mar 08 00:54:14.990423 master-0 kubenswrapper[23041]: I0308 00:54:14.990306 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jpm5\" (UniqueName: \"kubernetes.io/projected/c631af1a-025f-4c65-b202-678d31efbc2d-kube-api-access-8jpm5\") pod \"c631af1a-025f-4c65-b202-678d31efbc2d\" (UID: \"c631af1a-025f-4c65-b202-678d31efbc2d\") " Mar 08 00:54:14.990537 master-0 kubenswrapper[23041]: I0308 00:54:14.990508 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c631af1a-025f-4c65-b202-678d31efbc2d-public-tls-certs\") pod \"c631af1a-025f-4c65-b202-678d31efbc2d\" (UID: \"c631af1a-025f-4c65-b202-678d31efbc2d\") " Mar 08 00:54:14.990755 master-0 kubenswrapper[23041]: I0308 
00:54:14.990730 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/184ea98e-2eac-49e3-a090-c412857c1df4-ovsdbserver-nb\") pod \"dnsmasq-dns-77475956d7-pp5mp\" (UID: \"184ea98e-2eac-49e3-a090-c412857c1df4\") " pod="openstack/dnsmasq-dns-77475956d7-pp5mp" Mar 08 00:54:14.990804 master-0 kubenswrapper[23041]: I0308 00:54:14.990788 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/184ea98e-2eac-49e3-a090-c412857c1df4-ovsdbserver-sb\") pod \"dnsmasq-dns-77475956d7-pp5mp\" (UID: \"184ea98e-2eac-49e3-a090-c412857c1df4\") " pod="openstack/dnsmasq-dns-77475956d7-pp5mp" Mar 08 00:54:14.990840 master-0 kubenswrapper[23041]: I0308 00:54:14.990824 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/43a61c38-6179-4e19-9027-afc0efca9ea6-config\") pod \"ironic-inspector-0\" (UID: \"43a61c38-6179-4e19-9027-afc0efca9ea6\") " pod="openstack/ironic-inspector-0" Mar 08 00:54:14.991291 master-0 kubenswrapper[23041]: I0308 00:54:14.991266 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwnsz\" (UniqueName: \"kubernetes.io/projected/184ea98e-2eac-49e3-a090-c412857c1df4-kube-api-access-gwnsz\") pod \"dnsmasq-dns-77475956d7-pp5mp\" (UID: \"184ea98e-2eac-49e3-a090-c412857c1df4\") " pod="openstack/dnsmasq-dns-77475956d7-pp5mp" Mar 08 00:54:14.991360 master-0 kubenswrapper[23041]: I0308 00:54:14.991305 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/184ea98e-2eac-49e3-a090-c412857c1df4-config\") pod \"dnsmasq-dns-77475956d7-pp5mp\" (UID: \"184ea98e-2eac-49e3-a090-c412857c1df4\") " pod="openstack/dnsmasq-dns-77475956d7-pp5mp" Mar 08 00:54:14.995338 master-0 kubenswrapper[23041]: I0308 
00:54:14.992427 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/43a61c38-6179-4e19-9027-afc0efca9ea6-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"43a61c38-6179-4e19-9027-afc0efca9ea6\") " pod="openstack/ironic-inspector-0"
Mar 08 00:54:14.995338 master-0 kubenswrapper[23041]: I0308 00:54:14.993524 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/184ea98e-2eac-49e3-a090-c412857c1df4-ovsdbserver-nb\") pod \"dnsmasq-dns-77475956d7-pp5mp\" (UID: \"184ea98e-2eac-49e3-a090-c412857c1df4\") " pod="openstack/dnsmasq-dns-77475956d7-pp5mp"
Mar 08 00:54:14.995338 master-0 kubenswrapper[23041]: I0308 00:54:14.994548 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/184ea98e-2eac-49e3-a090-c412857c1df4-ovsdbserver-sb\") pod \"dnsmasq-dns-77475956d7-pp5mp\" (UID: \"184ea98e-2eac-49e3-a090-c412857c1df4\") " pod="openstack/dnsmasq-dns-77475956d7-pp5mp"
Mar 08 00:54:14.998256 master-0 kubenswrapper[23041]: I0308 00:54:14.996513 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c631af1a-025f-4c65-b202-678d31efbc2d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c631af1a-025f-4c65-b202-678d31efbc2d" (UID: "c631af1a-025f-4c65-b202-678d31efbc2d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:54:14.998256 master-0 kubenswrapper[23041]: I0308 00:54:14.997341 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/184ea98e-2eac-49e3-a090-c412857c1df4-config\") pod \"dnsmasq-dns-77475956d7-pp5mp\" (UID: \"184ea98e-2eac-49e3-a090-c412857c1df4\") " pod="openstack/dnsmasq-dns-77475956d7-pp5mp"
Mar 08 00:54:14.998256 master-0 kubenswrapper[23041]: I0308 00:54:14.997480 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/184ea98e-2eac-49e3-a090-c412857c1df4-dns-swift-storage-0\") pod \"dnsmasq-dns-77475956d7-pp5mp\" (UID: \"184ea98e-2eac-49e3-a090-c412857c1df4\") " pod="openstack/dnsmasq-dns-77475956d7-pp5mp"
Mar 08 00:54:14.998256 master-0 kubenswrapper[23041]: I0308 00:54:14.997523 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/43a61c38-6179-4e19-9027-afc0efca9ea6-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"43a61c38-6179-4e19-9027-afc0efca9ea6\") " pod="openstack/ironic-inspector-0"
Mar 08 00:54:14.998256 master-0 kubenswrapper[23041]: I0308 00:54:14.997605 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4r5c\" (UniqueName: \"kubernetes.io/projected/43a61c38-6179-4e19-9027-afc0efca9ea6-kube-api-access-r4r5c\") pod \"ironic-inspector-0\" (UID: \"43a61c38-6179-4e19-9027-afc0efca9ea6\") " pod="openstack/ironic-inspector-0"
Mar 08 00:54:14.998256 master-0 kubenswrapper[23041]: I0308 00:54:14.997859 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/43a61c38-6179-4e19-9027-afc0efca9ea6-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"43a61c38-6179-4e19-9027-afc0efca9ea6\") " pod="openstack/ironic-inspector-0"
Mar 08 00:54:14.998256 master-0 kubenswrapper[23041]: I0308 00:54:14.998089 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/184ea98e-2eac-49e3-a090-c412857c1df4-dns-svc\") pod \"dnsmasq-dns-77475956d7-pp5mp\" (UID: \"184ea98e-2eac-49e3-a090-c412857c1df4\") " pod="openstack/dnsmasq-dns-77475956d7-pp5mp"
Mar 08 00:54:14.998710 master-0 kubenswrapper[23041]: I0308 00:54:14.998276 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43a61c38-6179-4e19-9027-afc0efca9ea6-scripts\") pod \"ironic-inspector-0\" (UID: \"43a61c38-6179-4e19-9027-afc0efca9ea6\") " pod="openstack/ironic-inspector-0"
Mar 08 00:54:14.998710 master-0 kubenswrapper[23041]: I0308 00:54:14.998316 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43a61c38-6179-4e19-9027-afc0efca9ea6-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"43a61c38-6179-4e19-9027-afc0efca9ea6\") " pod="openstack/ironic-inspector-0"
Mar 08 00:54:14.998710 master-0 kubenswrapper[23041]: I0308 00:54:14.998447 23041 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c631af1a-025f-4c65-b202-678d31efbc2d-httpd-run\") on node \"master-0\" DevicePath \"\""
Mar 08 00:54:14.998710 master-0 kubenswrapper[23041]: I0308 00:54:14.998681 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/184ea98e-2eac-49e3-a090-c412857c1df4-dns-swift-storage-0\") pod \"dnsmasq-dns-77475956d7-pp5mp\" (UID: \"184ea98e-2eac-49e3-a090-c412857c1df4\") " pod="openstack/dnsmasq-dns-77475956d7-pp5mp"
Mar 08 00:54:15.017742 master-0 kubenswrapper[23041]: I0308 00:54:15.008893 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/184ea98e-2eac-49e3-a090-c412857c1df4-dns-svc\") pod \"dnsmasq-dns-77475956d7-pp5mp\" (UID: \"184ea98e-2eac-49e3-a090-c412857c1df4\") " pod="openstack/dnsmasq-dns-77475956d7-pp5mp"
Mar 08 00:54:15.026944 master-0 kubenswrapper[23041]: I0308 00:54:15.026869 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwnsz\" (UniqueName: \"kubernetes.io/projected/184ea98e-2eac-49e3-a090-c412857c1df4-kube-api-access-gwnsz\") pod \"dnsmasq-dns-77475956d7-pp5mp\" (UID: \"184ea98e-2eac-49e3-a090-c412857c1df4\") " pod="openstack/dnsmasq-dns-77475956d7-pp5mp"
Mar 08 00:54:15.028056 master-0 kubenswrapper[23041]: I0308 00:54:15.028014 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c631af1a-025f-4c65-b202-678d31efbc2d-kube-api-access-8jpm5" (OuterVolumeSpecName: "kube-api-access-8jpm5") pod "c631af1a-025f-4c65-b202-678d31efbc2d" (UID: "c631af1a-025f-4c65-b202-678d31efbc2d"). InnerVolumeSpecName "kube-api-access-8jpm5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:54:15.032735 master-0 kubenswrapper[23041]: I0308 00:54:15.030764 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^60a4d41a-f098-4c1e-9a4b-472c7a2f79cf" (OuterVolumeSpecName: "glance") pod "c631af1a-025f-4c65-b202-678d31efbc2d" (UID: "c631af1a-025f-4c65-b202-678d31efbc2d"). InnerVolumeSpecName "pvc-70c9925e-bbc2-47ea-836c-8b4fadf77223". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 08 00:54:15.045223 master-0 kubenswrapper[23041]: I0308 00:54:15.045155 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c631af1a-025f-4c65-b202-678d31efbc2d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c631af1a-025f-4c65-b202-678d31efbc2d" (UID: "c631af1a-025f-4c65-b202-678d31efbc2d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:54:15.084895 master-0 kubenswrapper[23041]: I0308 00:54:15.081296 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77475956d7-pp5mp"
Mar 08 00:54:15.099785 master-0 kubenswrapper[23041]: I0308 00:54:15.099724 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c631af1a-025f-4c65-b202-678d31efbc2d-logs\") pod \"c631af1a-025f-4c65-b202-678d31efbc2d\" (UID: \"c631af1a-025f-4c65-b202-678d31efbc2d\") "
Mar 08 00:54:15.100046 master-0 kubenswrapper[23041]: I0308 00:54:15.100030 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c631af1a-025f-4c65-b202-678d31efbc2d-config-data\") pod \"c631af1a-025f-4c65-b202-678d31efbc2d\" (UID: \"c631af1a-025f-4c65-b202-678d31efbc2d\") "
Mar 08 00:54:15.100416 master-0 kubenswrapper[23041]: I0308 00:54:15.100402 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c631af1a-025f-4c65-b202-678d31efbc2d-scripts\") pod \"c631af1a-025f-4c65-b202-678d31efbc2d\" (UID: \"c631af1a-025f-4c65-b202-678d31efbc2d\") "
Mar 08 00:54:15.102689 master-0 kubenswrapper[23041]: I0308 00:54:15.102539 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c631af1a-025f-4c65-b202-678d31efbc2d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c631af1a-025f-4c65-b202-678d31efbc2d" (UID: "c631af1a-025f-4c65-b202-678d31efbc2d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:54:15.102689 master-0 kubenswrapper[23041]: I0308 00:54:15.102627 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/43a61c38-6179-4e19-9027-afc0efca9ea6-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"43a61c38-6179-4e19-9027-afc0efca9ea6\") " pod="openstack/ironic-inspector-0"
Mar 08 00:54:15.102876 master-0 kubenswrapper[23041]: I0308 00:54:15.102853 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/43a61c38-6179-4e19-9027-afc0efca9ea6-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"43a61c38-6179-4e19-9027-afc0efca9ea6\") " pod="openstack/ironic-inspector-0"
Mar 08 00:54:15.109938 master-0 kubenswrapper[23041]: I0308 00:54:15.109448 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43a61c38-6179-4e19-9027-afc0efca9ea6-scripts\") pod \"ironic-inspector-0\" (UID: \"43a61c38-6179-4e19-9027-afc0efca9ea6\") " pod="openstack/ironic-inspector-0"
Mar 08 00:54:15.109938 master-0 kubenswrapper[23041]: I0308 00:54:15.109500 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43a61c38-6179-4e19-9027-afc0efca9ea6-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"43a61c38-6179-4e19-9027-afc0efca9ea6\") " pod="openstack/ironic-inspector-0"
Mar 08 00:54:15.109938 master-0 kubenswrapper[23041]: I0308 00:54:15.109558 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c631af1a-025f-4c65-b202-678d31efbc2d-scripts" (OuterVolumeSpecName: "scripts") pod "c631af1a-025f-4c65-b202-678d31efbc2d" (UID: "c631af1a-025f-4c65-b202-678d31efbc2d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:54:15.109938 master-0 kubenswrapper[23041]: I0308 00:54:15.109627 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/43a61c38-6179-4e19-9027-afc0efca9ea6-config\") pod \"ironic-inspector-0\" (UID: \"43a61c38-6179-4e19-9027-afc0efca9ea6\") " pod="openstack/ironic-inspector-0"
Mar 08 00:54:15.110996 master-0 kubenswrapper[23041]: I0308 00:54:15.109731 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c631af1a-025f-4c65-b202-678d31efbc2d-logs" (OuterVolumeSpecName: "logs") pod "c631af1a-025f-4c65-b202-678d31efbc2d" (UID: "c631af1a-025f-4c65-b202-678d31efbc2d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:54:15.110996 master-0 kubenswrapper[23041]: I0308 00:54:15.109818 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/43a61c38-6179-4e19-9027-afc0efca9ea6-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"43a61c38-6179-4e19-9027-afc0efca9ea6\") " pod="openstack/ironic-inspector-0"
Mar 08 00:54:15.110996 master-0 kubenswrapper[23041]: I0308 00:54:15.110894 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/43a61c38-6179-4e19-9027-afc0efca9ea6-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"43a61c38-6179-4e19-9027-afc0efca9ea6\") " pod="openstack/ironic-inspector-0"
Mar 08 00:54:15.110996 master-0 kubenswrapper[23041]: I0308 00:54:15.110951 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4r5c\" (UniqueName: \"kubernetes.io/projected/43a61c38-6179-4e19-9027-afc0efca9ea6-kube-api-access-r4r5c\") pod \"ironic-inspector-0\" (UID: \"43a61c38-6179-4e19-9027-afc0efca9ea6\") " pod="openstack/ironic-inspector-0"
Mar 08 00:54:15.111149 master-0 kubenswrapper[23041]: I0308 00:54:15.111089 23041 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c631af1a-025f-4c65-b202-678d31efbc2d-public-tls-certs\") on node \"master-0\" DevicePath \"\""
Mar 08 00:54:15.111149 master-0 kubenswrapper[23041]: I0308 00:54:15.111102 23041 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c631af1a-025f-4c65-b202-678d31efbc2d-logs\") on node \"master-0\" DevicePath \"\""
Mar 08 00:54:15.111149 master-0 kubenswrapper[23041]: I0308 00:54:15.111111 23041 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c631af1a-025f-4c65-b202-678d31efbc2d-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 00:54:15.111902 master-0 kubenswrapper[23041]: I0308 00:54:15.111839 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/43a61c38-6179-4e19-9027-afc0efca9ea6-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"43a61c38-6179-4e19-9027-afc0efca9ea6\") " pod="openstack/ironic-inspector-0"
Mar 08 00:54:15.112848 master-0 kubenswrapper[23041]: I0308 00:54:15.112808 23041 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-70c9925e-bbc2-47ea-836c-8b4fadf77223\" (UniqueName: \"kubernetes.io/csi/topolvm.io^60a4d41a-f098-4c1e-9a4b-472c7a2f79cf\") on node \"master-0\" "
Mar 08 00:54:15.112848 master-0 kubenswrapper[23041]: I0308 00:54:15.112834 23041 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c631af1a-025f-4c65-b202-678d31efbc2d-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 00:54:15.112969 master-0 kubenswrapper[23041]: I0308 00:54:15.112856 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jpm5\" (UniqueName: \"kubernetes.io/projected/c631af1a-025f-4c65-b202-678d31efbc2d-kube-api-access-8jpm5\") on node \"master-0\" DevicePath \"\""
Mar 08 00:54:15.120052 master-0 kubenswrapper[23041]: I0308 00:54:15.119996 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/43a61c38-6179-4e19-9027-afc0efca9ea6-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"43a61c38-6179-4e19-9027-afc0efca9ea6\") " pod="openstack/ironic-inspector-0"
Mar 08 00:54:15.120934 master-0 kubenswrapper[23041]: I0308 00:54:15.120884 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43a61c38-6179-4e19-9027-afc0efca9ea6-scripts\") pod \"ironic-inspector-0\" (UID: \"43a61c38-6179-4e19-9027-afc0efca9ea6\") " pod="openstack/ironic-inspector-0"
Mar 08 00:54:15.121784 master-0 kubenswrapper[23041]: I0308 00:54:15.121763 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43a61c38-6179-4e19-9027-afc0efca9ea6-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"43a61c38-6179-4e19-9027-afc0efca9ea6\") " pod="openstack/ironic-inspector-0"
Mar 08 00:54:15.128077 master-0 kubenswrapper[23041]: I0308 00:54:15.127996 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/43a61c38-6179-4e19-9027-afc0efca9ea6-config\") pod \"ironic-inspector-0\" (UID: \"43a61c38-6179-4e19-9027-afc0efca9ea6\") " pod="openstack/ironic-inspector-0"
Mar 08 00:54:15.148470 master-0 kubenswrapper[23041]: I0308 00:54:15.143867 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4r5c\" (UniqueName: \"kubernetes.io/projected/43a61c38-6179-4e19-9027-afc0efca9ea6-kube-api-access-r4r5c\") pod \"ironic-inspector-0\" (UID: \"43a61c38-6179-4e19-9027-afc0efca9ea6\") " pod="openstack/ironic-inspector-0"
Mar 08 00:54:15.198667 master-0 kubenswrapper[23041]: I0308 00:54:15.198463 23041 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 08 00:54:15.198667 master-0 kubenswrapper[23041]: I0308 00:54:15.198626 23041 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-70c9925e-bbc2-47ea-836c-8b4fadf77223" (UniqueName: "kubernetes.io/csi/topolvm.io^60a4d41a-f098-4c1e-9a4b-472c7a2f79cf") on node "master-0"
Mar 08 00:54:15.215642 master-0 kubenswrapper[23041]: I0308 00:54:15.214994 23041 reconciler_common.go:293] "Volume detached for volume \"pvc-70c9925e-bbc2-47ea-836c-8b4fadf77223\" (UniqueName: \"kubernetes.io/csi/topolvm.io^60a4d41a-f098-4c1e-9a4b-472c7a2f79cf\") on node \"master-0\" DevicePath \"\""
Mar 08 00:54:15.264775 master-0 kubenswrapper[23041]: I0308 00:54:15.264578 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-0"
Mar 08 00:54:15.309491 master-0 kubenswrapper[23041]: I0308 00:54:15.309275 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c631af1a-025f-4c65-b202-678d31efbc2d-config-data" (OuterVolumeSpecName: "config-data") pod "c631af1a-025f-4c65-b202-678d31efbc2d" (UID: "c631af1a-025f-4c65-b202-678d31efbc2d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:54:15.323862 master-0 kubenswrapper[23041]: I0308 00:54:15.323814 23041 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c631af1a-025f-4c65-b202-678d31efbc2d-config-data\") on node \"master-0\" DevicePath \"\""
Mar 08 00:54:15.688638 master-0 kubenswrapper[23041]: I0308 00:54:15.688579 23041 generic.go:334] "Generic (PLEG): container finished" podID="c81e602d-26e5-49ac-92d1-71fea2607868" containerID="052793deb444ddfa766fae4a335f25279de72c0d02a2316a2e213e56bed30759" exitCode=0
Mar 08 00:54:15.690854 master-0 kubenswrapper[23041]: I0308 00:54:15.688674 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1280f-default-internal-api-0" event={"ID":"c81e602d-26e5-49ac-92d1-71fea2607868","Type":"ContainerDied","Data":"052793deb444ddfa766fae4a335f25279de72c0d02a2316a2e213e56bed30759"}
Mar 08 00:54:15.705478 master-0 kubenswrapper[23041]: I0308 00:54:15.699443 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:54:15.705478 master-0 kubenswrapper[23041]: I0308 00:54:15.699813 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1280f-default-external-api-0" event={"ID":"c631af1a-025f-4c65-b202-678d31efbc2d","Type":"ContainerDied","Data":"cf01c813d8dc7eaac943bccc02616c5295672f014238fbc58ab73966795d3f0e"}
Mar 08 00:54:15.705478 master-0 kubenswrapper[23041]: I0308 00:54:15.699868 23041 scope.go:117] "RemoveContainer" containerID="3024bab15b9ea083977d343416d78095bf57369694338550e0140f3e4baef939"
Mar 08 00:54:15.726523 master-0 kubenswrapper[23041]: I0308 00:54:15.726213 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wcsvr" event={"ID":"65e42302-3592-46fd-b7a9-b125bf61382b","Type":"ContainerStarted","Data":"bdf9a866d900eeef92a31437989e83717a6864f53eeafc41bf5117510aa72a01"}
Mar 08 00:54:15.736680 master-0 kubenswrapper[23041]: I0308 00:54:15.736562 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77475956d7-pp5mp"]
Mar 08 00:54:15.748158 master-0 kubenswrapper[23041]: I0308 00:54:15.748115 23041 scope.go:117] "RemoveContainer" containerID="fb23bcb3de29cd56ece6d3cdc15a6d92669f6ab087162b1b8b2aa35516d08ebc"
Mar 08 00:54:15.824494 master-0 kubenswrapper[23041]: I0308 00:54:15.814171 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-1280f-default-external-api-0"]
Mar 08 00:54:15.830937 master-0 kubenswrapper[23041]: I0308 00:54:15.830631 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-1280f-default-external-api-0"]
Mar 08 00:54:15.865573 master-0 kubenswrapper[23041]: I0308 00:54:15.864280 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-1280f-default-external-api-0"]
Mar 08 00:54:15.865573 master-0 kubenswrapper[23041]: E0308 00:54:15.865145 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c631af1a-025f-4c65-b202-678d31efbc2d" containerName="glance-log"
Mar 08 00:54:15.865573 master-0 kubenswrapper[23041]: I0308 00:54:15.865164 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="c631af1a-025f-4c65-b202-678d31efbc2d" containerName="glance-log"
Mar 08 00:54:15.865573 master-0 kubenswrapper[23041]: E0308 00:54:15.865263 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c631af1a-025f-4c65-b202-678d31efbc2d" containerName="glance-httpd"
Mar 08 00:54:15.865573 master-0 kubenswrapper[23041]: I0308 00:54:15.865271 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="c631af1a-025f-4c65-b202-678d31efbc2d" containerName="glance-httpd"
Mar 08 00:54:15.865911 master-0 kubenswrapper[23041]: I0308 00:54:15.865630 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="c631af1a-025f-4c65-b202-678d31efbc2d" containerName="glance-log"
Mar 08 00:54:15.865911 master-0 kubenswrapper[23041]: I0308 00:54:15.865657 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="c631af1a-025f-4c65-b202-678d31efbc2d" containerName="glance-httpd"
Mar 08 00:54:15.872471 master-0 kubenswrapper[23041]: I0308 00:54:15.871589 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:54:15.884582 master-0 kubenswrapper[23041]: I0308 00:54:15.875105 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 08 00:54:15.884582 master-0 kubenswrapper[23041]: I0308 00:54:15.875428 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-1280f-default-external-config-data"
Mar 08 00:54:15.884582 master-0 kubenswrapper[23041]: I0308 00:54:15.879845 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1280f-default-external-api-0"]
Mar 08 00:54:15.976683 master-0 kubenswrapper[23041]: I0308 00:54:15.974909 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/65244579-afd4-4973-8b2f-5568ba7974c4-public-tls-certs\") pod \"glance-1280f-default-external-api-0\" (UID: \"65244579-afd4-4973-8b2f-5568ba7974c4\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:54:15.976683 master-0 kubenswrapper[23041]: I0308 00:54:15.974995 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/65244579-afd4-4973-8b2f-5568ba7974c4-httpd-run\") pod \"glance-1280f-default-external-api-0\" (UID: \"65244579-afd4-4973-8b2f-5568ba7974c4\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:54:15.976683 master-0 kubenswrapper[23041]: I0308 00:54:15.975144 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65244579-afd4-4973-8b2f-5568ba7974c4-logs\") pod \"glance-1280f-default-external-api-0\" (UID: \"65244579-afd4-4973-8b2f-5568ba7974c4\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:54:15.976683 master-0 kubenswrapper[23041]: I0308 00:54:15.975168 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65244579-afd4-4973-8b2f-5568ba7974c4-config-data\") pod \"glance-1280f-default-external-api-0\" (UID: \"65244579-afd4-4973-8b2f-5568ba7974c4\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:54:15.976683 master-0 kubenswrapper[23041]: I0308 00:54:15.975243 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tsmn\" (UniqueName: \"kubernetes.io/projected/65244579-afd4-4973-8b2f-5568ba7974c4-kube-api-access-7tsmn\") pod \"glance-1280f-default-external-api-0\" (UID: \"65244579-afd4-4973-8b2f-5568ba7974c4\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:54:15.976683 master-0 kubenswrapper[23041]: I0308 00:54:15.975404 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65244579-afd4-4973-8b2f-5568ba7974c4-scripts\") pod \"glance-1280f-default-external-api-0\" (UID: \"65244579-afd4-4973-8b2f-5568ba7974c4\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:54:15.976683 master-0 kubenswrapper[23041]: I0308 00:54:15.975482 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65244579-afd4-4973-8b2f-5568ba7974c4-combined-ca-bundle\") pod \"glance-1280f-default-external-api-0\" (UID: \"65244579-afd4-4973-8b2f-5568ba7974c4\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:54:15.976683 master-0 kubenswrapper[23041]: I0308 00:54:15.975606 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-70c9925e-bbc2-47ea-836c-8b4fadf77223\" (UniqueName: \"kubernetes.io/csi/topolvm.io^60a4d41a-f098-4c1e-9a4b-472c7a2f79cf\") pod \"glance-1280f-default-external-api-0\" (UID: \"65244579-afd4-4973-8b2f-5568ba7974c4\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:54:16.085337 master-0 kubenswrapper[23041]: I0308 00:54:16.085280 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65244579-afd4-4973-8b2f-5568ba7974c4-config-data\") pod \"glance-1280f-default-external-api-0\" (UID: \"65244579-afd4-4973-8b2f-5568ba7974c4\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:54:16.085337 master-0 kubenswrapper[23041]: I0308 00:54:16.085335 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65244579-afd4-4973-8b2f-5568ba7974c4-logs\") pod \"glance-1280f-default-external-api-0\" (UID: \"65244579-afd4-4973-8b2f-5568ba7974c4\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:54:16.085596 master-0 kubenswrapper[23041]: I0308 00:54:16.085375 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tsmn\" (UniqueName: \"kubernetes.io/projected/65244579-afd4-4973-8b2f-5568ba7974c4-kube-api-access-7tsmn\") pod \"glance-1280f-default-external-api-0\" (UID: \"65244579-afd4-4973-8b2f-5568ba7974c4\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:54:16.085596 master-0 kubenswrapper[23041]: I0308 00:54:16.085444 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65244579-afd4-4973-8b2f-5568ba7974c4-scripts\") pod \"glance-1280f-default-external-api-0\" (UID: \"65244579-afd4-4973-8b2f-5568ba7974c4\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:54:16.087335 master-0 kubenswrapper[23041]: I0308 00:54:16.086168 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65244579-afd4-4973-8b2f-5568ba7974c4-logs\") pod \"glance-1280f-default-external-api-0\" (UID: \"65244579-afd4-4973-8b2f-5568ba7974c4\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:54:16.087335 master-0 kubenswrapper[23041]: I0308 00:54:16.086724 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65244579-afd4-4973-8b2f-5568ba7974c4-combined-ca-bundle\") pod \"glance-1280f-default-external-api-0\" (UID: \"65244579-afd4-4973-8b2f-5568ba7974c4\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:54:16.087335 master-0 kubenswrapper[23041]: I0308 00:54:16.086823 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-70c9925e-bbc2-47ea-836c-8b4fadf77223\" (UniqueName: \"kubernetes.io/csi/topolvm.io^60a4d41a-f098-4c1e-9a4b-472c7a2f79cf\") pod \"glance-1280f-default-external-api-0\" (UID: \"65244579-afd4-4973-8b2f-5568ba7974c4\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:54:16.087335 master-0 kubenswrapper[23041]: I0308 00:54:16.086891 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/65244579-afd4-4973-8b2f-5568ba7974c4-public-tls-certs\") pod \"glance-1280f-default-external-api-0\" (UID: \"65244579-afd4-4973-8b2f-5568ba7974c4\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:54:16.087335 master-0 kubenswrapper[23041]: I0308 00:54:16.086919 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/65244579-afd4-4973-8b2f-5568ba7974c4-httpd-run\") pod \"glance-1280f-default-external-api-0\" (UID: \"65244579-afd4-4973-8b2f-5568ba7974c4\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:54:16.087594 master-0 kubenswrapper[23041]: I0308 00:54:16.087479 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/65244579-afd4-4973-8b2f-5568ba7974c4-httpd-run\") pod \"glance-1280f-default-external-api-0\" (UID: \"65244579-afd4-4973-8b2f-5568ba7974c4\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:54:16.094163 master-0 kubenswrapper[23041]: I0308 00:54:16.094116 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65244579-afd4-4973-8b2f-5568ba7974c4-config-data\") pod \"glance-1280f-default-external-api-0\" (UID: \"65244579-afd4-4973-8b2f-5568ba7974c4\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:54:16.095146 master-0 kubenswrapper[23041]: I0308 00:54:16.095104 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65244579-afd4-4973-8b2f-5568ba7974c4-combined-ca-bundle\") pod \"glance-1280f-default-external-api-0\" (UID: \"65244579-afd4-4973-8b2f-5568ba7974c4\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:54:16.101384 master-0 kubenswrapper[23041]: I0308 00:54:16.099514 23041 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 08 00:54:16.101384 master-0 kubenswrapper[23041]: I0308 00:54:16.099560 23041 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-70c9925e-bbc2-47ea-836c-8b4fadf77223\" (UniqueName: \"kubernetes.io/csi/topolvm.io^60a4d41a-f098-4c1e-9a4b-472c7a2f79cf\") pod \"glance-1280f-default-external-api-0\" (UID: \"65244579-afd4-4973-8b2f-5568ba7974c4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/fba811848206921d87eed675e9d53cf2e2311d13264fbd23c3492e9c4520fe29/globalmount\"" pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:54:16.102343 master-0 kubenswrapper[23041]: I0308 00:54:16.102175 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65244579-afd4-4973-8b2f-5568ba7974c4-scripts\") pod \"glance-1280f-default-external-api-0\" (UID: \"65244579-afd4-4973-8b2f-5568ba7974c4\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:54:16.103464 master-0 kubenswrapper[23041]: I0308 00:54:16.103426 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/65244579-afd4-4973-8b2f-5568ba7974c4-public-tls-certs\") pod \"glance-1280f-default-external-api-0\" (UID: \"65244579-afd4-4973-8b2f-5568ba7974c4\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:54:16.120240 master-0 kubenswrapper[23041]: I0308 00:54:16.120094 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tsmn\" (UniqueName: \"kubernetes.io/projected/65244579-afd4-4973-8b2f-5568ba7974c4-kube-api-access-7tsmn\") pod \"glance-1280f-default-external-api-0\" (UID: \"65244579-afd4-4973-8b2f-5568ba7974c4\") " pod="openstack/glance-1280f-default-external-api-0"
Mar 08 00:54:16.154245 master-0 kubenswrapper[23041]: I0308 00:54:16.146661 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"]
Mar 08 00:54:16.212921 master-0 kubenswrapper[23041]: I0308 00:54:16.212126 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1280f-default-internal-api-0"
Mar 08 00:54:16.401751 master-0 kubenswrapper[23041]: I0308 00:54:16.398945 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^f5214b97-526b-4460-9399-392e4c5e0c2e\") pod \"c81e602d-26e5-49ac-92d1-71fea2607868\" (UID: \"c81e602d-26e5-49ac-92d1-71fea2607868\") "
Mar 08 00:54:16.401751 master-0 kubenswrapper[23041]: I0308 00:54:16.399283 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c81e602d-26e5-49ac-92d1-71fea2607868-scripts\") pod \"c81e602d-26e5-49ac-92d1-71fea2607868\" (UID: \"c81e602d-26e5-49ac-92d1-71fea2607868\") "
Mar 08 00:54:16.401751 master-0 kubenswrapper[23041]: I0308 00:54:16.399463 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8v9vh\" (UniqueName: \"kubernetes.io/projected/c81e602d-26e5-49ac-92d1-71fea2607868-kube-api-access-8v9vh\") pod \"c81e602d-26e5-49ac-92d1-71fea2607868\" (UID: \"c81e602d-26e5-49ac-92d1-71fea2607868\") "
Mar 08 00:54:16.401751 master-0 kubenswrapper[23041]: I0308 00:54:16.399525 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c81e602d-26e5-49ac-92d1-71fea2607868-internal-tls-certs\") pod \"c81e602d-26e5-49ac-92d1-71fea2607868\" (UID: \"c81e602d-26e5-49ac-92d1-71fea2607868\") "
Mar 08 00:54:16.401751 master-0 kubenswrapper[23041]: I0308 00:54:16.399555 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c81e602d-26e5-49ac-92d1-71fea2607868-config-data\") pod \"c81e602d-26e5-49ac-92d1-71fea2607868\" (UID: \"c81e602d-26e5-49ac-92d1-71fea2607868\") "
Mar 08 00:54:16.401751 master-0 kubenswrapper[23041]: I0308 00:54:16.400000 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c81e602d-26e5-49ac-92d1-71fea2607868-httpd-run\") pod \"c81e602d-26e5-49ac-92d1-71fea2607868\" (UID: \"c81e602d-26e5-49ac-92d1-71fea2607868\") "
Mar 08 00:54:16.401751 master-0 kubenswrapper[23041]: I0308 00:54:16.400043 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c81e602d-26e5-49ac-92d1-71fea2607868-combined-ca-bundle\") pod \"c81e602d-26e5-49ac-92d1-71fea2607868\" (UID: \"c81e602d-26e5-49ac-92d1-71fea2607868\") "
Mar 08 00:54:16.401751 master-0 kubenswrapper[23041]: I0308 00:54:16.400105 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c81e602d-26e5-49ac-92d1-71fea2607868-logs\") pod \"c81e602d-26e5-49ac-92d1-71fea2607868\" (UID: \"c81e602d-26e5-49ac-92d1-71fea2607868\") "
Mar 08 00:54:16.401751 master-0 kubenswrapper[23041]: I0308 00:54:16.401156 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c81e602d-26e5-49ac-92d1-71fea2607868-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c81e602d-26e5-49ac-92d1-71fea2607868" (UID: "c81e602d-26e5-49ac-92d1-71fea2607868"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:54:16.404893 master-0 kubenswrapper[23041]: I0308 00:54:16.401949 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c81e602d-26e5-49ac-92d1-71fea2607868-logs" (OuterVolumeSpecName: "logs") pod "c81e602d-26e5-49ac-92d1-71fea2607868" (UID: "c81e602d-26e5-49ac-92d1-71fea2607868"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:54:16.404893 master-0 kubenswrapper[23041]: I0308 00:54:16.403241 23041 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c81e602d-26e5-49ac-92d1-71fea2607868-httpd-run\") on node \"master-0\" DevicePath \"\""
Mar 08 00:54:16.404893 master-0 kubenswrapper[23041]: I0308 00:54:16.403268 23041 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c81e602d-26e5-49ac-92d1-71fea2607868-logs\") on node \"master-0\" DevicePath \"\""
Mar 08 00:54:16.406909 master-0 kubenswrapper[23041]: I0308 00:54:16.406546 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c81e602d-26e5-49ac-92d1-71fea2607868-scripts" (OuterVolumeSpecName: "scripts") pod "c81e602d-26e5-49ac-92d1-71fea2607868" (UID: "c81e602d-26e5-49ac-92d1-71fea2607868"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:54:16.412654 master-0 kubenswrapper[23041]: I0308 00:54:16.412603 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c81e602d-26e5-49ac-92d1-71fea2607868-kube-api-access-8v9vh" (OuterVolumeSpecName: "kube-api-access-8v9vh") pod "c81e602d-26e5-49ac-92d1-71fea2607868" (UID: "c81e602d-26e5-49ac-92d1-71fea2607868"). InnerVolumeSpecName "kube-api-access-8v9vh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:54:16.449704 master-0 kubenswrapper[23041]: I0308 00:54:16.446569 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c81e602d-26e5-49ac-92d1-71fea2607868-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c81e602d-26e5-49ac-92d1-71fea2607868" (UID: "c81e602d-26e5-49ac-92d1-71fea2607868"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:54:16.487054 master-0 kubenswrapper[23041]: I0308 00:54:16.486954 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c81e602d-26e5-49ac-92d1-71fea2607868-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c81e602d-26e5-49ac-92d1-71fea2607868" (UID: "c81e602d-26e5-49ac-92d1-71fea2607868"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:54:16.506442 master-0 kubenswrapper[23041]: I0308 00:54:16.506380 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8v9vh\" (UniqueName: \"kubernetes.io/projected/c81e602d-26e5-49ac-92d1-71fea2607868-kube-api-access-8v9vh\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:16.506442 master-0 kubenswrapper[23041]: I0308 00:54:16.506439 23041 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c81e602d-26e5-49ac-92d1-71fea2607868-internal-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:16.506761 master-0 kubenswrapper[23041]: I0308 00:54:16.506457 23041 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c81e602d-26e5-49ac-92d1-71fea2607868-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:16.506761 master-0 kubenswrapper[23041]: I0308 00:54:16.506469 23041 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c81e602d-26e5-49ac-92d1-71fea2607868-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:16.549466 master-0 kubenswrapper[23041]: I0308 00:54:16.549369 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c81e602d-26e5-49ac-92d1-71fea2607868-config-data" (OuterVolumeSpecName: "config-data") pod "c81e602d-26e5-49ac-92d1-71fea2607868" (UID: 
"c81e602d-26e5-49ac-92d1-71fea2607868"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:54:16.609280 master-0 kubenswrapper[23041]: I0308 00:54:16.608249 23041 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c81e602d-26e5-49ac-92d1-71fea2607868-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:16.748222 master-0 kubenswrapper[23041]: I0308 00:54:16.748137 23041 generic.go:334] "Generic (PLEG): container finished" podID="5fd31740-3478-41e5-8295-d4b50f40db04" containerID="aa121cb8dfbc38b21bce954d5ae242477fe582d22b9dfb0358c3bd4674afb90b" exitCode=0 Mar 08 00:54:16.748721 master-0 kubenswrapper[23041]: I0308 00:54:16.748264 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"5fd31740-3478-41e5-8295-d4b50f40db04","Type":"ContainerDied","Data":"aa121cb8dfbc38b21bce954d5ae242477fe582d22b9dfb0358c3bd4674afb90b"} Mar 08 00:54:16.757994 master-0 kubenswrapper[23041]: I0308 00:54:16.756727 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1280f-default-internal-api-0" event={"ID":"c81e602d-26e5-49ac-92d1-71fea2607868","Type":"ContainerDied","Data":"335191ee8e87e94345fbe2487bf145b34d18bafafb0ac0996779eacbdf40dcdc"} Mar 08 00:54:16.757994 master-0 kubenswrapper[23041]: I0308 00:54:16.756769 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:54:16.757994 master-0 kubenswrapper[23041]: I0308 00:54:16.756908 23041 scope.go:117] "RemoveContainer" containerID="052793deb444ddfa766fae4a335f25279de72c0d02a2316a2e213e56bed30759" Mar 08 00:54:16.770693 master-0 kubenswrapper[23041]: I0308 00:54:16.770011 23041 generic.go:334] "Generic (PLEG): container finished" podID="43a61c38-6179-4e19-9027-afc0efca9ea6" containerID="351a4e0a5b520d1816d12f9938534ebde72b3497bcdc71ef10a6fce211ec2e01" exitCode=0 Mar 08 00:54:16.770693 master-0 kubenswrapper[23041]: I0308 00:54:16.770449 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"43a61c38-6179-4e19-9027-afc0efca9ea6","Type":"ContainerDied","Data":"351a4e0a5b520d1816d12f9938534ebde72b3497bcdc71ef10a6fce211ec2e01"} Mar 08 00:54:16.770693 master-0 kubenswrapper[23041]: I0308 00:54:16.770547 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"43a61c38-6179-4e19-9027-afc0efca9ea6","Type":"ContainerStarted","Data":"b9b1c3bef1019fe259fb9b996500033b8b035f452ead2834ce325428047a86c6"} Mar 08 00:54:16.774570 master-0 kubenswrapper[23041]: I0308 00:54:16.772942 23041 generic.go:334] "Generic (PLEG): container finished" podID="184ea98e-2eac-49e3-a090-c412857c1df4" containerID="70109aebc0714cb899cf3f07e744f45a361919fa1862191281a75cf168b540e2" exitCode=0 Mar 08 00:54:16.774570 master-0 kubenswrapper[23041]: I0308 00:54:16.772997 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77475956d7-pp5mp" event={"ID":"184ea98e-2eac-49e3-a090-c412857c1df4","Type":"ContainerDied","Data":"70109aebc0714cb899cf3f07e744f45a361919fa1862191281a75cf168b540e2"} Mar 08 00:54:16.774570 master-0 kubenswrapper[23041]: I0308 00:54:16.773031 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77475956d7-pp5mp" 
event={"ID":"184ea98e-2eac-49e3-a090-c412857c1df4","Type":"ContainerStarted","Data":"615f4dea517bb3e594bd2aedbe25561b6e130738bac71939d302794616d32ba2"} Mar 08 00:54:16.795508 master-0 kubenswrapper[23041]: I0308 00:54:16.792771 23041 scope.go:117] "RemoveContainer" containerID="952aad800122f0c5297b7769b87af95cfeabc1e5b270679a6f4cf94801ac2b3f" Mar 08 00:54:16.833940 master-0 kubenswrapper[23041]: I0308 00:54:16.833492 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c631af1a-025f-4c65-b202-678d31efbc2d" path="/var/lib/kubelet/pods/c631af1a-025f-4c65-b202-678d31efbc2d/volumes" Mar 08 00:54:17.004389 master-0 kubenswrapper[23041]: I0308 00:54:16.998923 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^f5214b97-526b-4460-9399-392e4c5e0c2e" (OuterVolumeSpecName: "glance") pod "c81e602d-26e5-49ac-92d1-71fea2607868" (UID: "c81e602d-26e5-49ac-92d1-71fea2607868"). InnerVolumeSpecName "pvc-7512aa1f-2488-47af-b61f-945377082816". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 08 00:54:17.024720 master-0 kubenswrapper[23041]: I0308 00:54:17.021853 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-70c9925e-bbc2-47ea-836c-8b4fadf77223\" (UniqueName: \"kubernetes.io/csi/topolvm.io^60a4d41a-f098-4c1e-9a4b-472c7a2f79cf\") pod \"glance-1280f-default-external-api-0\" (UID: \"65244579-afd4-4973-8b2f-5568ba7974c4\") " pod="openstack/glance-1280f-default-external-api-0" Mar 08 00:54:17.037193 master-0 kubenswrapper[23041]: I0308 00:54:17.036865 23041 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-7512aa1f-2488-47af-b61f-945377082816\" (UniqueName: \"kubernetes.io/csi/topolvm.io^f5214b97-526b-4460-9399-392e4c5e0c2e\") on node \"master-0\" " Mar 08 00:54:17.071645 master-0 kubenswrapper[23041]: I0308 00:54:17.070376 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-1280f-default-external-api-0" Mar 08 00:54:17.093707 master-0 kubenswrapper[23041]: I0308 00:54:17.093496 23041 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 08 00:54:17.094892 master-0 kubenswrapper[23041]: I0308 00:54:17.094868 23041 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-7512aa1f-2488-47af-b61f-945377082816" (UniqueName: "kubernetes.io/csi/topolvm.io^f5214b97-526b-4460-9399-392e4c5e0c2e") on node "master-0" Mar 08 00:54:17.144110 master-0 kubenswrapper[23041]: I0308 00:54:17.144066 23041 reconciler_common.go:293] "Volume detached for volume \"pvc-7512aa1f-2488-47af-b61f-945377082816\" (UniqueName: \"kubernetes.io/csi/topolvm.io^f5214b97-526b-4460-9399-392e4c5e0c2e\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:17.227318 master-0 kubenswrapper[23041]: I0308 00:54:17.221962 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-1280f-default-internal-api-0"] Mar 08 00:54:17.227318 master-0 kubenswrapper[23041]: I0308 00:54:17.226948 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-1280f-default-internal-api-0"] Mar 08 00:54:17.237875 master-0 kubenswrapper[23041]: I0308 00:54:17.237732 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-1280f-default-internal-api-0"] Mar 08 00:54:17.238449 master-0 kubenswrapper[23041]: E0308 00:54:17.238420 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c81e602d-26e5-49ac-92d1-71fea2607868" containerName="glance-httpd" Mar 08 00:54:17.238449 master-0 kubenswrapper[23041]: I0308 00:54:17.238443 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="c81e602d-26e5-49ac-92d1-71fea2607868" containerName="glance-httpd" Mar 08 00:54:17.238537 master-0 kubenswrapper[23041]: E0308 00:54:17.238475 23041 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c81e602d-26e5-49ac-92d1-71fea2607868" containerName="glance-log" Mar 08 00:54:17.238537 master-0 kubenswrapper[23041]: I0308 00:54:17.238483 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="c81e602d-26e5-49ac-92d1-71fea2607868" containerName="glance-log" Mar 08 00:54:17.238778 master-0 kubenswrapper[23041]: I0308 00:54:17.238751 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="c81e602d-26e5-49ac-92d1-71fea2607868" containerName="glance-log" Mar 08 00:54:17.238820 master-0 kubenswrapper[23041]: I0308 00:54:17.238788 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="c81e602d-26e5-49ac-92d1-71fea2607868" containerName="glance-httpd" Mar 08 00:54:17.243129 master-0 kubenswrapper[23041]: I0308 00:54:17.242357 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:54:17.258466 master-0 kubenswrapper[23041]: I0308 00:54:17.248221 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-1280f-default-internal-config-data" Mar 08 00:54:17.258466 master-0 kubenswrapper[23041]: I0308 00:54:17.249746 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 08 00:54:17.258466 master-0 kubenswrapper[23041]: I0308 00:54:17.253194 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1280f-default-internal-api-0"] Mar 08 00:54:17.400366 master-0 kubenswrapper[23041]: I0308 00:54:17.372980 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shc2j\" (UniqueName: \"kubernetes.io/projected/e6694041-7676-4736-a6d7-1854137deeba-kube-api-access-shc2j\") pod \"glance-1280f-default-internal-api-0\" (UID: \"e6694041-7676-4736-a6d7-1854137deeba\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:54:17.415488 master-0 kubenswrapper[23041]: I0308 
00:54:17.415419 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6694041-7676-4736-a6d7-1854137deeba-internal-tls-certs\") pod \"glance-1280f-default-internal-api-0\" (UID: \"e6694041-7676-4736-a6d7-1854137deeba\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:54:17.415896 master-0 kubenswrapper[23041]: I0308 00:54:17.415871 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7512aa1f-2488-47af-b61f-945377082816\" (UniqueName: \"kubernetes.io/csi/topolvm.io^f5214b97-526b-4460-9399-392e4c5e0c2e\") pod \"glance-1280f-default-internal-api-0\" (UID: \"e6694041-7676-4736-a6d7-1854137deeba\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:54:17.416029 master-0 kubenswrapper[23041]: I0308 00:54:17.416012 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6694041-7676-4736-a6d7-1854137deeba-httpd-run\") pod \"glance-1280f-default-internal-api-0\" (UID: \"e6694041-7676-4736-a6d7-1854137deeba\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:54:17.416106 master-0 kubenswrapper[23041]: I0308 00:54:17.416087 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6694041-7676-4736-a6d7-1854137deeba-config-data\") pod \"glance-1280f-default-internal-api-0\" (UID: \"e6694041-7676-4736-a6d7-1854137deeba\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:54:17.416146 master-0 kubenswrapper[23041]: I0308 00:54:17.416124 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6694041-7676-4736-a6d7-1854137deeba-combined-ca-bundle\") pod 
\"glance-1280f-default-internal-api-0\" (UID: \"e6694041-7676-4736-a6d7-1854137deeba\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:54:17.416285 master-0 kubenswrapper[23041]: I0308 00:54:17.416187 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6694041-7676-4736-a6d7-1854137deeba-logs\") pod \"glance-1280f-default-internal-api-0\" (UID: \"e6694041-7676-4736-a6d7-1854137deeba\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:54:17.416285 master-0 kubenswrapper[23041]: I0308 00:54:17.416216 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6694041-7676-4736-a6d7-1854137deeba-scripts\") pod \"glance-1280f-default-internal-api-0\" (UID: \"e6694041-7676-4736-a6d7-1854137deeba\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:54:17.523376 master-0 kubenswrapper[23041]: I0308 00:54:17.522908 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7512aa1f-2488-47af-b61f-945377082816\" (UniqueName: \"kubernetes.io/csi/topolvm.io^f5214b97-526b-4460-9399-392e4c5e0c2e\") pod \"glance-1280f-default-internal-api-0\" (UID: \"e6694041-7676-4736-a6d7-1854137deeba\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:54:17.523376 master-0 kubenswrapper[23041]: I0308 00:54:17.523003 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6694041-7676-4736-a6d7-1854137deeba-httpd-run\") pod \"glance-1280f-default-internal-api-0\" (UID: \"e6694041-7676-4736-a6d7-1854137deeba\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:54:17.523376 master-0 kubenswrapper[23041]: I0308 00:54:17.523041 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/e6694041-7676-4736-a6d7-1854137deeba-config-data\") pod \"glance-1280f-default-internal-api-0\" (UID: \"e6694041-7676-4736-a6d7-1854137deeba\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:54:17.523376 master-0 kubenswrapper[23041]: I0308 00:54:17.523063 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6694041-7676-4736-a6d7-1854137deeba-combined-ca-bundle\") pod \"glance-1280f-default-internal-api-0\" (UID: \"e6694041-7676-4736-a6d7-1854137deeba\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:54:17.523376 master-0 kubenswrapper[23041]: I0308 00:54:17.523104 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6694041-7676-4736-a6d7-1854137deeba-logs\") pod \"glance-1280f-default-internal-api-0\" (UID: \"e6694041-7676-4736-a6d7-1854137deeba\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:54:17.523376 master-0 kubenswrapper[23041]: I0308 00:54:17.523125 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6694041-7676-4736-a6d7-1854137deeba-scripts\") pod \"glance-1280f-default-internal-api-0\" (UID: \"e6694041-7676-4736-a6d7-1854137deeba\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:54:17.523376 master-0 kubenswrapper[23041]: I0308 00:54:17.523184 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shc2j\" (UniqueName: \"kubernetes.io/projected/e6694041-7676-4736-a6d7-1854137deeba-kube-api-access-shc2j\") pod \"glance-1280f-default-internal-api-0\" (UID: \"e6694041-7676-4736-a6d7-1854137deeba\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:54:17.523376 master-0 kubenswrapper[23041]: I0308 00:54:17.523216 23041 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6694041-7676-4736-a6d7-1854137deeba-internal-tls-certs\") pod \"glance-1280f-default-internal-api-0\" (UID: \"e6694041-7676-4736-a6d7-1854137deeba\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:54:17.530091 master-0 kubenswrapper[23041]: I0308 00:54:17.529790 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6694041-7676-4736-a6d7-1854137deeba-httpd-run\") pod \"glance-1280f-default-internal-api-0\" (UID: \"e6694041-7676-4736-a6d7-1854137deeba\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:54:17.547415 master-0 kubenswrapper[23041]: I0308 00:54:17.546958 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6694041-7676-4736-a6d7-1854137deeba-logs\") pod \"glance-1280f-default-internal-api-0\" (UID: \"e6694041-7676-4736-a6d7-1854137deeba\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:54:17.550394 master-0 kubenswrapper[23041]: I0308 00:54:17.549504 23041 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 08 00:54:17.550394 master-0 kubenswrapper[23041]: I0308 00:54:17.549534 23041 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7512aa1f-2488-47af-b61f-945377082816\" (UniqueName: \"kubernetes.io/csi/topolvm.io^f5214b97-526b-4460-9399-392e4c5e0c2e\") pod \"glance-1280f-default-internal-api-0\" (UID: \"e6694041-7676-4736-a6d7-1854137deeba\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/7937ab9a4c8d614dfdb5fc98362cfde4f447c9044ef1b15cf0facb998bc5a885/globalmount\"" pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:54:17.553491 master-0 kubenswrapper[23041]: I0308 00:54:17.552854 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6694041-7676-4736-a6d7-1854137deeba-scripts\") pod \"glance-1280f-default-internal-api-0\" (UID: \"e6694041-7676-4736-a6d7-1854137deeba\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:54:17.568887 master-0 kubenswrapper[23041]: I0308 00:54:17.568838 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6694041-7676-4736-a6d7-1854137deeba-config-data\") pod \"glance-1280f-default-internal-api-0\" (UID: \"e6694041-7676-4736-a6d7-1854137deeba\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:54:17.569547 master-0 kubenswrapper[23041]: I0308 00:54:17.569519 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6694041-7676-4736-a6d7-1854137deeba-internal-tls-certs\") pod \"glance-1280f-default-internal-api-0\" (UID: \"e6694041-7676-4736-a6d7-1854137deeba\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:54:17.586109 master-0 kubenswrapper[23041]: I0308 00:54:17.586051 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shc2j\" (UniqueName: 
\"kubernetes.io/projected/e6694041-7676-4736-a6d7-1854137deeba-kube-api-access-shc2j\") pod \"glance-1280f-default-internal-api-0\" (UID: \"e6694041-7676-4736-a6d7-1854137deeba\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:54:17.594384 master-0 kubenswrapper[23041]: I0308 00:54:17.593190 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6694041-7676-4736-a6d7-1854137deeba-combined-ca-bundle\") pod \"glance-1280f-default-internal-api-0\" (UID: \"e6694041-7676-4736-a6d7-1854137deeba\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:54:17.796496 master-0 kubenswrapper[23041]: I0308 00:54:17.791678 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77475956d7-pp5mp" event={"ID":"184ea98e-2eac-49e3-a090-c412857c1df4","Type":"ContainerStarted","Data":"afb5824a1f58d2d4b1c3ce13f0c9fca6eafa2ca4d4e874bf24d094c4b9522d26"} Mar 08 00:54:17.796496 master-0 kubenswrapper[23041]: I0308 00:54:17.794363 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77475956d7-pp5mp" Mar 08 00:54:17.821239 master-0 kubenswrapper[23041]: I0308 00:54:17.818760 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77475956d7-pp5mp" podStartSLOduration=3.818742892 podStartE2EDuration="3.818742892s" podCreationTimestamp="2026-03-08 00:54:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:54:17.818539047 +0000 UTC m=+1363.291375611" watchObservedRunningTime="2026-03-08 00:54:17.818742892 +0000 UTC m=+1363.291579446" Mar 08 00:54:17.914352 master-0 kubenswrapper[23041]: I0308 00:54:17.914280 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1280f-default-external-api-0"] Mar 08 00:54:18.704314 master-0 kubenswrapper[23041]: I0308 
00:54:18.704062 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-neutron-agent-7dffdc6989-dw4bq" Mar 08 00:54:18.830826 master-0 kubenswrapper[23041]: I0308 00:54:18.830682 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c81e602d-26e5-49ac-92d1-71fea2607868" path="/var/lib/kubelet/pods/c81e602d-26e5-49ac-92d1-71fea2607868/volumes" Mar 08 00:54:18.835311 master-0 kubenswrapper[23041]: I0308 00:54:18.833162 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1280f-default-external-api-0" event={"ID":"65244579-afd4-4973-8b2f-5568ba7974c4","Type":"ContainerStarted","Data":"1507fb2932f33a4566fda69a810c24200c44cd0161f189b38321429d3ccca523"} Mar 08 00:54:18.835311 master-0 kubenswrapper[23041]: I0308 00:54:18.833704 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1280f-default-external-api-0" event={"ID":"65244579-afd4-4973-8b2f-5568ba7974c4","Type":"ContainerStarted","Data":"4f90e4de25c063ff50bfe1c647033035e73351ba050c181fec1f4a8d38fa45bf"} Mar 08 00:54:18.949322 master-0 kubenswrapper[23041]: I0308 00:54:18.948426 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-0"] Mar 08 00:54:19.842390 master-0 kubenswrapper[23041]: I0308 00:54:19.837811 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1280f-default-external-api-0" event={"ID":"65244579-afd4-4973-8b2f-5568ba7974c4","Type":"ContainerStarted","Data":"98c0629a90ba93d91ebe3fb528f51d7fd691ef09f02f605fb069b045e221882c"} Mar 08 00:54:19.903055 master-0 kubenswrapper[23041]: I0308 00:54:19.902948 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-1280f-default-external-api-0" podStartSLOduration=4.902922305 podStartE2EDuration="4.902922305s" podCreationTimestamp="2026-03-08 00:54:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-03-08 00:54:19.883295845 +0000 UTC m=+1365.356132429" watchObservedRunningTime="2026-03-08 00:54:19.902922305 +0000 UTC m=+1365.375758869" Mar 08 00:54:20.721481 master-0 kubenswrapper[23041]: I0308 00:54:20.721439 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7512aa1f-2488-47af-b61f-945377082816\" (UniqueName: \"kubernetes.io/csi/topolvm.io^f5214b97-526b-4460-9399-392e4c5e0c2e\") pod \"glance-1280f-default-internal-api-0\" (UID: \"e6694041-7676-4736-a6d7-1854137deeba\") " pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:54:20.930867 master-0 kubenswrapper[23041]: I0308 00:54:20.930488 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:54:25.084559 master-0 kubenswrapper[23041]: I0308 00:54:25.084482 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77475956d7-pp5mp" Mar 08 00:54:25.845754 master-0 kubenswrapper[23041]: I0308 00:54:25.845674 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bf78b7-cqc9l"] Mar 08 00:54:25.846076 master-0 kubenswrapper[23041]: I0308 00:54:25.846006 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bf78b7-cqc9l" podUID="9a83bcf0-62ae-4284-b870-14ba623be2e1" containerName="dnsmasq-dns" containerID="cri-o://0d9413b60d51154485a6e003c7e9fbc007d394072de70e65f8c8a2c6a295ce84" gracePeriod=10 Mar 08 00:54:26.974283 master-0 kubenswrapper[23041]: I0308 00:54:26.964464 23041 generic.go:334] "Generic (PLEG): container finished" podID="9a83bcf0-62ae-4284-b870-14ba623be2e1" containerID="0d9413b60d51154485a6e003c7e9fbc007d394072de70e65f8c8a2c6a295ce84" exitCode=0 Mar 08 00:54:26.974283 master-0 kubenswrapper[23041]: I0308 00:54:26.964528 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bf78b7-cqc9l" 
event={"ID":"9a83bcf0-62ae-4284-b870-14ba623be2e1","Type":"ContainerDied","Data":"0d9413b60d51154485a6e003c7e9fbc007d394072de70e65f8c8a2c6a295ce84"} Mar 08 00:54:27.072273 master-0 kubenswrapper[23041]: I0308 00:54:27.072191 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-1280f-default-external-api-0" Mar 08 00:54:27.072557 master-0 kubenswrapper[23041]: I0308 00:54:27.072285 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-1280f-default-external-api-0" Mar 08 00:54:27.106258 master-0 kubenswrapper[23041]: I0308 00:54:27.106128 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-1280f-default-external-api-0" Mar 08 00:54:27.119704 master-0 kubenswrapper[23041]: I0308 00:54:27.119653 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-1280f-default-external-api-0" Mar 08 00:54:27.513606 master-0 kubenswrapper[23041]: I0308 00:54:27.513525 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bf78b7-cqc9l" Mar 08 00:54:27.668976 master-0 kubenswrapper[23041]: I0308 00:54:27.667445 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glrwh\" (UniqueName: \"kubernetes.io/projected/9a83bcf0-62ae-4284-b870-14ba623be2e1-kube-api-access-glrwh\") pod \"9a83bcf0-62ae-4284-b870-14ba623be2e1\" (UID: \"9a83bcf0-62ae-4284-b870-14ba623be2e1\") " Mar 08 00:54:27.668976 master-0 kubenswrapper[23041]: I0308 00:54:27.667564 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a83bcf0-62ae-4284-b870-14ba623be2e1-ovsdbserver-nb\") pod \"9a83bcf0-62ae-4284-b870-14ba623be2e1\" (UID: \"9a83bcf0-62ae-4284-b870-14ba623be2e1\") " Mar 08 00:54:27.668976 master-0 kubenswrapper[23041]: I0308 00:54:27.667589 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a83bcf0-62ae-4284-b870-14ba623be2e1-ovsdbserver-sb\") pod \"9a83bcf0-62ae-4284-b870-14ba623be2e1\" (UID: \"9a83bcf0-62ae-4284-b870-14ba623be2e1\") " Mar 08 00:54:27.668976 master-0 kubenswrapper[23041]: I0308 00:54:27.667613 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a83bcf0-62ae-4284-b870-14ba623be2e1-dns-swift-storage-0\") pod \"9a83bcf0-62ae-4284-b870-14ba623be2e1\" (UID: \"9a83bcf0-62ae-4284-b870-14ba623be2e1\") " Mar 08 00:54:27.668976 master-0 kubenswrapper[23041]: I0308 00:54:27.667656 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a83bcf0-62ae-4284-b870-14ba623be2e1-dns-svc\") pod \"9a83bcf0-62ae-4284-b870-14ba623be2e1\" (UID: \"9a83bcf0-62ae-4284-b870-14ba623be2e1\") " Mar 08 00:54:27.668976 master-0 kubenswrapper[23041]: I0308 00:54:27.667714 23041 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a83bcf0-62ae-4284-b870-14ba623be2e1-config\") pod \"9a83bcf0-62ae-4284-b870-14ba623be2e1\" (UID: \"9a83bcf0-62ae-4284-b870-14ba623be2e1\") " Mar 08 00:54:27.711706 master-0 kubenswrapper[23041]: I0308 00:54:27.710658 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a83bcf0-62ae-4284-b870-14ba623be2e1-kube-api-access-glrwh" (OuterVolumeSpecName: "kube-api-access-glrwh") pod "9a83bcf0-62ae-4284-b870-14ba623be2e1" (UID: "9a83bcf0-62ae-4284-b870-14ba623be2e1"). InnerVolumeSpecName "kube-api-access-glrwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:54:27.745052 master-0 kubenswrapper[23041]: I0308 00:54:27.744738 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a83bcf0-62ae-4284-b870-14ba623be2e1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9a83bcf0-62ae-4284-b870-14ba623be2e1" (UID: "9a83bcf0-62ae-4284-b870-14ba623be2e1"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:54:27.747107 master-0 kubenswrapper[23041]: I0308 00:54:27.746963 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a83bcf0-62ae-4284-b870-14ba623be2e1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9a83bcf0-62ae-4284-b870-14ba623be2e1" (UID: "9a83bcf0-62ae-4284-b870-14ba623be2e1"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:54:27.751379 master-0 kubenswrapper[23041]: I0308 00:54:27.751328 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a83bcf0-62ae-4284-b870-14ba623be2e1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9a83bcf0-62ae-4284-b870-14ba623be2e1" (UID: "9a83bcf0-62ae-4284-b870-14ba623be2e1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:54:27.759679 master-0 kubenswrapper[23041]: I0308 00:54:27.759619 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a83bcf0-62ae-4284-b870-14ba623be2e1-config" (OuterVolumeSpecName: "config") pod "9a83bcf0-62ae-4284-b870-14ba623be2e1" (UID: "9a83bcf0-62ae-4284-b870-14ba623be2e1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:54:27.786712 master-0 kubenswrapper[23041]: I0308 00:54:27.773642 23041 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a83bcf0-62ae-4284-b870-14ba623be2e1-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:27.786712 master-0 kubenswrapper[23041]: I0308 00:54:27.773704 23041 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a83bcf0-62ae-4284-b870-14ba623be2e1-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:27.786712 master-0 kubenswrapper[23041]: I0308 00:54:27.773716 23041 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a83bcf0-62ae-4284-b870-14ba623be2e1-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:27.786712 master-0 kubenswrapper[23041]: I0308 00:54:27.773725 23041 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9a83bcf0-62ae-4284-b870-14ba623be2e1-config\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:27.786712 master-0 kubenswrapper[23041]: I0308 00:54:27.773734 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glrwh\" (UniqueName: \"kubernetes.io/projected/9a83bcf0-62ae-4284-b870-14ba623be2e1-kube-api-access-glrwh\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:27.799121 master-0 kubenswrapper[23041]: I0308 00:54:27.798945 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a83bcf0-62ae-4284-b870-14ba623be2e1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9a83bcf0-62ae-4284-b870-14ba623be2e1" (UID: "9a83bcf0-62ae-4284-b870-14ba623be2e1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:54:27.877125 master-0 kubenswrapper[23041]: I0308 00:54:27.876317 23041 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a83bcf0-62ae-4284-b870-14ba623be2e1-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:28.006014 master-0 kubenswrapper[23041]: I0308 00:54:28.005914 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bf78b7-cqc9l" event={"ID":"9a83bcf0-62ae-4284-b870-14ba623be2e1","Type":"ContainerDied","Data":"b733f536418a0c47d4085a9e572557c0b9c02c718e16ac29e9a233f891167c60"} Mar 08 00:54:28.006714 master-0 kubenswrapper[23041]: I0308 00:54:28.006029 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-1280f-default-external-api-0" Mar 08 00:54:28.006714 master-0 kubenswrapper[23041]: I0308 00:54:28.006056 23041 scope.go:117] "RemoveContainer" containerID="0d9413b60d51154485a6e003c7e9fbc007d394072de70e65f8c8a2c6a295ce84" Mar 08 00:54:28.006714 master-0 kubenswrapper[23041]: I0308 00:54:28.006282 23041 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-6bf78b7-cqc9l" Mar 08 00:54:28.008522 master-0 kubenswrapper[23041]: I0308 00:54:28.008473 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-1280f-default-external-api-0" Mar 08 00:54:28.423347 master-0 kubenswrapper[23041]: I0308 00:54:28.423089 23041 scope.go:117] "RemoveContainer" containerID="2908f761aa1e9d490c9c35d272e8c239ec37bf7e2bd973e5b3d3a3ba3e72c66d" Mar 08 00:54:29.017486 master-0 kubenswrapper[23041]: I0308 00:54:29.017411 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"5fd31740-3478-41e5-8295-d4b50f40db04","Type":"ContainerStarted","Data":"8743ecc8e441a1be02a5a33ce6e46345de67d9b5a580d996ad8b953f8e919372"} Mar 08 00:54:29.020028 master-0 kubenswrapper[23041]: I0308 00:54:29.019992 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"43a61c38-6179-4e19-9027-afc0efca9ea6","Type":"ContainerStarted","Data":"05f6836f089e9e0e37525236e203cdb2261504ca22149f66a12ba28ea8f78ed9"} Mar 08 00:54:29.020159 master-0 kubenswrapper[23041]: I0308 00:54:29.020124 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ironic-inspector-0" podUID="43a61c38-6179-4e19-9027-afc0efca9ea6" containerName="inspector-pxe-init" containerID="cri-o://05f6836f089e9e0e37525236e203cdb2261504ca22149f66a12ba28ea8f78ed9" gracePeriod=60 Mar 08 00:54:29.064061 master-0 kubenswrapper[23041]: I0308 00:54:29.063999 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bf78b7-cqc9l"] Mar 08 00:54:29.088810 master-0 kubenswrapper[23041]: I0308 00:54:29.087680 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bf78b7-cqc9l"] Mar 08 00:54:29.187246 master-0 kubenswrapper[23041]: W0308 00:54:29.179680 23041 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6694041_7676_4736_a6d7_1854137deeba.slice/crio-38a7faadf64ba27ba8e605493bca4b27cb7a15282e5f74d26994f2e3d73c814a WatchSource:0}: Error finding container 38a7faadf64ba27ba8e605493bca4b27cb7a15282e5f74d26994f2e3d73c814a: Status 404 returned error can't find the container with id 38a7faadf64ba27ba8e605493bca4b27cb7a15282e5f74d26994f2e3d73c814a Mar 08 00:54:29.190783 master-0 kubenswrapper[23041]: I0308 00:54:29.190693 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1280f-default-internal-api-0"] Mar 08 00:54:29.981713 master-0 kubenswrapper[23041]: I0308 00:54:29.981655 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-0" Mar 08 00:54:30.052144 master-0 kubenswrapper[23041]: I0308 00:54:30.052086 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wcsvr" event={"ID":"65e42302-3592-46fd-b7a9-b125bf61382b","Type":"ContainerStarted","Data":"ec9ba3fc66949789bd7323a7f91558f505dd61226bf32af867cbd79f94938620"} Mar 08 00:54:30.057188 master-0 kubenswrapper[23041]: I0308 00:54:30.057148 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1280f-default-internal-api-0" event={"ID":"e6694041-7676-4736-a6d7-1854137deeba","Type":"ContainerStarted","Data":"b11aa0af5d61b17015ca7bfc5e08a5393bcddd7261e20a7b43745c6c1355fed3"} Mar 08 00:54:30.057570 master-0 kubenswrapper[23041]: I0308 00:54:30.057539 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1280f-default-internal-api-0" event={"ID":"e6694041-7676-4736-a6d7-1854137deeba","Type":"ContainerStarted","Data":"38a7faadf64ba27ba8e605493bca4b27cb7a15282e5f74d26994f2e3d73c814a"} Mar 08 00:54:30.060110 master-0 kubenswrapper[23041]: I0308 00:54:30.060081 23041 generic.go:334] "Generic (PLEG): container finished" podID="43a61c38-6179-4e19-9027-afc0efca9ea6" 
containerID="05f6836f089e9e0e37525236e203cdb2261504ca22149f66a12ba28ea8f78ed9" exitCode=0 Mar 08 00:54:30.060353 master-0 kubenswrapper[23041]: I0308 00:54:30.060334 23041 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 00:54:30.060447 master-0 kubenswrapper[23041]: I0308 00:54:30.060434 23041 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 00:54:30.060998 master-0 kubenswrapper[23041]: I0308 00:54:30.060925 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"43a61c38-6179-4e19-9027-afc0efca9ea6","Type":"ContainerDied","Data":"05f6836f089e9e0e37525236e203cdb2261504ca22149f66a12ba28ea8f78ed9"} Mar 08 00:54:30.061086 master-0 kubenswrapper[23041]: I0308 00:54:30.061018 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"43a61c38-6179-4e19-9027-afc0efca9ea6","Type":"ContainerDied","Data":"b9b1c3bef1019fe259fb9b996500033b8b035f452ead2834ce325428047a86c6"} Mar 08 00:54:30.061086 master-0 kubenswrapper[23041]: I0308 00:54:30.061052 23041 scope.go:117] "RemoveContainer" containerID="05f6836f089e9e0e37525236e203cdb2261504ca22149f66a12ba28ea8f78ed9" Mar 08 00:54:30.061480 master-0 kubenswrapper[23041]: I0308 00:54:30.060966 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-0" Mar 08 00:54:30.081391 master-0 kubenswrapper[23041]: I0308 00:54:30.078351 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-wcsvr" podStartSLOduration=3.048800129 podStartE2EDuration="17.078327273s" podCreationTimestamp="2026-03-08 00:54:13 +0000 UTC" firstStartedPulling="2026-03-08 00:54:14.621664282 +0000 UTC m=+1360.094500826" lastFinishedPulling="2026-03-08 00:54:28.651191416 +0000 UTC m=+1374.124027970" observedRunningTime="2026-03-08 00:54:30.074388586 +0000 UTC m=+1375.547225140" watchObservedRunningTime="2026-03-08 00:54:30.078327273 +0000 UTC m=+1375.551163837" Mar 08 00:54:30.108845 master-0 kubenswrapper[23041]: I0308 00:54:30.108631 23041 scope.go:117] "RemoveContainer" containerID="351a4e0a5b520d1816d12f9938534ebde72b3497bcdc71ef10a6fce211ec2e01" Mar 08 00:54:30.138248 master-0 kubenswrapper[23041]: I0308 00:54:30.137408 23041 scope.go:117] "RemoveContainer" containerID="05f6836f089e9e0e37525236e203cdb2261504ca22149f66a12ba28ea8f78ed9" Mar 08 00:54:30.138248 master-0 kubenswrapper[23041]: I0308 00:54:30.138063 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43a61c38-6179-4e19-9027-afc0efca9ea6-combined-ca-bundle\") pod \"43a61c38-6179-4e19-9027-afc0efca9ea6\" (UID: \"43a61c38-6179-4e19-9027-afc0efca9ea6\") " Mar 08 00:54:30.138876 master-0 kubenswrapper[23041]: I0308 00:54:30.138465 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43a61c38-6179-4e19-9027-afc0efca9ea6-scripts\") pod \"43a61c38-6179-4e19-9027-afc0efca9ea6\" (UID: \"43a61c38-6179-4e19-9027-afc0efca9ea6\") " Mar 08 00:54:30.138876 master-0 kubenswrapper[23041]: E0308 00:54:30.138734 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"05f6836f089e9e0e37525236e203cdb2261504ca22149f66a12ba28ea8f78ed9\": container with ID starting with 05f6836f089e9e0e37525236e203cdb2261504ca22149f66a12ba28ea8f78ed9 not found: ID does not exist" containerID="05f6836f089e9e0e37525236e203cdb2261504ca22149f66a12ba28ea8f78ed9" Mar 08 00:54:30.138876 master-0 kubenswrapper[23041]: I0308 00:54:30.138765 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05f6836f089e9e0e37525236e203cdb2261504ca22149f66a12ba28ea8f78ed9"} err="failed to get container status \"05f6836f089e9e0e37525236e203cdb2261504ca22149f66a12ba28ea8f78ed9\": rpc error: code = NotFound desc = could not find container \"05f6836f089e9e0e37525236e203cdb2261504ca22149f66a12ba28ea8f78ed9\": container with ID starting with 05f6836f089e9e0e37525236e203cdb2261504ca22149f66a12ba28ea8f78ed9 not found: ID does not exist" Mar 08 00:54:30.138876 master-0 kubenswrapper[23041]: I0308 00:54:30.138792 23041 scope.go:117] "RemoveContainer" containerID="351a4e0a5b520d1816d12f9938534ebde72b3497bcdc71ef10a6fce211ec2e01" Mar 08 00:54:30.141241 master-0 kubenswrapper[23041]: I0308 00:54:30.138519 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/43a61c38-6179-4e19-9027-afc0efca9ea6-config\") pod \"43a61c38-6179-4e19-9027-afc0efca9ea6\" (UID: \"43a61c38-6179-4e19-9027-afc0efca9ea6\") " Mar 08 00:54:30.141241 master-0 kubenswrapper[23041]: I0308 00:54:30.140063 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/43a61c38-6179-4e19-9027-afc0efca9ea6-var-lib-ironic\") pod \"43a61c38-6179-4e19-9027-afc0efca9ea6\" (UID: \"43a61c38-6179-4e19-9027-afc0efca9ea6\") " Mar 08 00:54:30.141241 master-0 kubenswrapper[23041]: I0308 00:54:30.140131 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/43a61c38-6179-4e19-9027-afc0efca9ea6-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"43a61c38-6179-4e19-9027-afc0efca9ea6\" (UID: \"43a61c38-6179-4e19-9027-afc0efca9ea6\") " Mar 08 00:54:30.141241 master-0 kubenswrapper[23041]: I0308 00:54:30.140249 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4r5c\" (UniqueName: \"kubernetes.io/projected/43a61c38-6179-4e19-9027-afc0efca9ea6-kube-api-access-r4r5c\") pod \"43a61c38-6179-4e19-9027-afc0efca9ea6\" (UID: \"43a61c38-6179-4e19-9027-afc0efca9ea6\") " Mar 08 00:54:30.141241 master-0 kubenswrapper[23041]: I0308 00:54:30.140395 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/43a61c38-6179-4e19-9027-afc0efca9ea6-etc-podinfo\") pod \"43a61c38-6179-4e19-9027-afc0efca9ea6\" (UID: \"43a61c38-6179-4e19-9027-afc0efca9ea6\") " Mar 08 00:54:30.142403 master-0 kubenswrapper[23041]: E0308 00:54:30.142355 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"351a4e0a5b520d1816d12f9938534ebde72b3497bcdc71ef10a6fce211ec2e01\": container with ID starting with 351a4e0a5b520d1816d12f9938534ebde72b3497bcdc71ef10a6fce211ec2e01 not found: ID does not exist" containerID="351a4e0a5b520d1816d12f9938534ebde72b3497bcdc71ef10a6fce211ec2e01" Mar 08 00:54:30.142498 master-0 kubenswrapper[23041]: I0308 00:54:30.142411 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"351a4e0a5b520d1816d12f9938534ebde72b3497bcdc71ef10a6fce211ec2e01"} err="failed to get container status \"351a4e0a5b520d1816d12f9938534ebde72b3497bcdc71ef10a6fce211ec2e01\": rpc error: code = NotFound desc = could not find container \"351a4e0a5b520d1816d12f9938534ebde72b3497bcdc71ef10a6fce211ec2e01\": container with ID starting with 
351a4e0a5b520d1816d12f9938534ebde72b3497bcdc71ef10a6fce211ec2e01 not found: ID does not exist" Mar 08 00:54:30.143138 master-0 kubenswrapper[23041]: I0308 00:54:30.143088 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43a61c38-6179-4e19-9027-afc0efca9ea6-var-lib-ironic-inspector-dhcp-hostsdir" (OuterVolumeSpecName: "var-lib-ironic-inspector-dhcp-hostsdir") pod "43a61c38-6179-4e19-9027-afc0efca9ea6" (UID: "43a61c38-6179-4e19-9027-afc0efca9ea6"). InnerVolumeSpecName "var-lib-ironic-inspector-dhcp-hostsdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:54:30.147156 master-0 kubenswrapper[23041]: I0308 00:54:30.147110 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43a61c38-6179-4e19-9027-afc0efca9ea6-var-lib-ironic" (OuterVolumeSpecName: "var-lib-ironic") pod "43a61c38-6179-4e19-9027-afc0efca9ea6" (UID: "43a61c38-6179-4e19-9027-afc0efca9ea6"). InnerVolumeSpecName "var-lib-ironic". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:54:30.150706 master-0 kubenswrapper[23041]: I0308 00:54:30.150579 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43a61c38-6179-4e19-9027-afc0efca9ea6-config" (OuterVolumeSpecName: "config") pod "43a61c38-6179-4e19-9027-afc0efca9ea6" (UID: "43a61c38-6179-4e19-9027-afc0efca9ea6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:54:30.152905 master-0 kubenswrapper[23041]: I0308 00:54:30.152467 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43a61c38-6179-4e19-9027-afc0efca9ea6-kube-api-access-r4r5c" (OuterVolumeSpecName: "kube-api-access-r4r5c") pod "43a61c38-6179-4e19-9027-afc0efca9ea6" (UID: "43a61c38-6179-4e19-9027-afc0efca9ea6"). InnerVolumeSpecName "kube-api-access-r4r5c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:54:30.152905 master-0 kubenswrapper[23041]: I0308 00:54:30.152845 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/43a61c38-6179-4e19-9027-afc0efca9ea6-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "43a61c38-6179-4e19-9027-afc0efca9ea6" (UID: "43a61c38-6179-4e19-9027-afc0efca9ea6"). InnerVolumeSpecName "etc-podinfo". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 08 00:54:30.153489 master-0 kubenswrapper[23041]: I0308 00:54:30.153453 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43a61c38-6179-4e19-9027-afc0efca9ea6-scripts" (OuterVolumeSpecName: "scripts") pod "43a61c38-6179-4e19-9027-afc0efca9ea6" (UID: "43a61c38-6179-4e19-9027-afc0efca9ea6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:54:30.212259 master-0 kubenswrapper[23041]: I0308 00:54:30.212122 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43a61c38-6179-4e19-9027-afc0efca9ea6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43a61c38-6179-4e19-9027-afc0efca9ea6" (UID: "43a61c38-6179-4e19-9027-afc0efca9ea6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:54:30.243740 master-0 kubenswrapper[23041]: I0308 00:54:30.243664 23041 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43a61c38-6179-4e19-9027-afc0efca9ea6-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:30.243740 master-0 kubenswrapper[23041]: I0308 00:54:30.243698 23041 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/43a61c38-6179-4e19-9027-afc0efca9ea6-config\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:30.243740 master-0 kubenswrapper[23041]: I0308 00:54:30.243709 23041 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/43a61c38-6179-4e19-9027-afc0efca9ea6-var-lib-ironic\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:30.243740 master-0 kubenswrapper[23041]: I0308 00:54:30.243720 23041 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/43a61c38-6179-4e19-9027-afc0efca9ea6-var-lib-ironic-inspector-dhcp-hostsdir\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:30.243740 master-0 kubenswrapper[23041]: I0308 00:54:30.243733 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4r5c\" (UniqueName: \"kubernetes.io/projected/43a61c38-6179-4e19-9027-afc0efca9ea6-kube-api-access-r4r5c\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:30.243740 master-0 kubenswrapper[23041]: I0308 00:54:30.243746 23041 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/43a61c38-6179-4e19-9027-afc0efca9ea6-etc-podinfo\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:30.243740 master-0 kubenswrapper[23041]: I0308 00:54:30.243755 23041 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/43a61c38-6179-4e19-9027-afc0efca9ea6-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:30.501149 master-0 kubenswrapper[23041]: I0308 00:54:30.501099 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-0"] Mar 08 00:54:30.506828 master-0 kubenswrapper[23041]: I0308 00:54:30.506770 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-inspector-0"] Mar 08 00:54:30.656362 master-0 kubenswrapper[23041]: I0308 00:54:30.656297 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-0"] Mar 08 00:54:30.660341 master-0 kubenswrapper[23041]: E0308 00:54:30.660270 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a83bcf0-62ae-4284-b870-14ba623be2e1" containerName="dnsmasq-dns" Mar 08 00:54:30.660613 master-0 kubenswrapper[23041]: I0308 00:54:30.660598 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a83bcf0-62ae-4284-b870-14ba623be2e1" containerName="dnsmasq-dns" Mar 08 00:54:30.660785 master-0 kubenswrapper[23041]: E0308 00:54:30.660768 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43a61c38-6179-4e19-9027-afc0efca9ea6" containerName="ironic-python-agent-init" Mar 08 00:54:30.660878 master-0 kubenswrapper[23041]: I0308 00:54:30.660863 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="43a61c38-6179-4e19-9027-afc0efca9ea6" containerName="ironic-python-agent-init" Mar 08 00:54:30.660963 master-0 kubenswrapper[23041]: E0308 00:54:30.660952 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a83bcf0-62ae-4284-b870-14ba623be2e1" containerName="init" Mar 08 00:54:30.661018 master-0 kubenswrapper[23041]: I0308 00:54:30.661008 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a83bcf0-62ae-4284-b870-14ba623be2e1" containerName="init" Mar 08 00:54:30.661102 master-0 kubenswrapper[23041]: E0308 00:54:30.661092 23041 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="43a61c38-6179-4e19-9027-afc0efca9ea6" containerName="inspector-pxe-init" Mar 08 00:54:30.661158 master-0 kubenswrapper[23041]: I0308 00:54:30.661148 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="43a61c38-6179-4e19-9027-afc0efca9ea6" containerName="inspector-pxe-init" Mar 08 00:54:30.676044 master-0 kubenswrapper[23041]: I0308 00:54:30.675978 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a83bcf0-62ae-4284-b870-14ba623be2e1" containerName="dnsmasq-dns" Mar 08 00:54:30.676289 master-0 kubenswrapper[23041]: I0308 00:54:30.676273 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="43a61c38-6179-4e19-9027-afc0efca9ea6" containerName="inspector-pxe-init" Mar 08 00:54:30.720245 master-0 kubenswrapper[23041]: I0308 00:54:30.710493 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"] Mar 08 00:54:30.721418 master-0 kubenswrapper[23041]: I0308 00:54:30.712897 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-0" Mar 08 00:54:30.732241 master-0 kubenswrapper[23041]: I0308 00:54:30.725810 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-scripts" Mar 08 00:54:30.732241 master-0 kubenswrapper[23041]: I0308 00:54:30.726328 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-config-data" Mar 08 00:54:30.732241 master-0 kubenswrapper[23041]: I0308 00:54:30.726667 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-inspector-public-svc" Mar 08 00:54:30.732241 master-0 kubenswrapper[23041]: I0308 00:54:30.729654 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-transport-url-ironic-inspector-transport" Mar 08 00:54:30.732241 master-0 kubenswrapper[23041]: I0308 00:54:30.729885 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-inspector-internal-svc" Mar 08 00:54:30.828227 master-0 kubenswrapper[23041]: I0308 00:54:30.821902 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr689\" (UniqueName: \"kubernetes.io/projected/e9ed55cb-51ca-49ca-89fb-de02ac4605c5-kube-api-access-dr689\") pod \"ironic-inspector-0\" (UID: \"e9ed55cb-51ca-49ca-89fb-de02ac4605c5\") " pod="openstack/ironic-inspector-0" Mar 08 00:54:30.828227 master-0 kubenswrapper[23041]: I0308 00:54:30.822001 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9ed55cb-51ca-49ca-89fb-de02ac4605c5-internal-tls-certs\") pod \"ironic-inspector-0\" (UID: \"e9ed55cb-51ca-49ca-89fb-de02ac4605c5\") " pod="openstack/ironic-inspector-0" Mar 08 00:54:30.828227 master-0 kubenswrapper[23041]: I0308 00:54:30.822085 23041 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/e9ed55cb-51ca-49ca-89fb-de02ac4605c5-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"e9ed55cb-51ca-49ca-89fb-de02ac4605c5\") " pod="openstack/ironic-inspector-0" Mar 08 00:54:30.828227 master-0 kubenswrapper[23041]: I0308 00:54:30.822115 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9ed55cb-51ca-49ca-89fb-de02ac4605c5-public-tls-certs\") pod \"ironic-inspector-0\" (UID: \"e9ed55cb-51ca-49ca-89fb-de02ac4605c5\") " pod="openstack/ironic-inspector-0" Mar 08 00:54:30.828227 master-0 kubenswrapper[23041]: I0308 00:54:30.822175 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e9ed55cb-51ca-49ca-89fb-de02ac4605c5-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"e9ed55cb-51ca-49ca-89fb-de02ac4605c5\") " pod="openstack/ironic-inspector-0" Mar 08 00:54:30.832240 master-0 kubenswrapper[23041]: I0308 00:54:30.829834 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9ed55cb-51ca-49ca-89fb-de02ac4605c5-scripts\") pod \"ironic-inspector-0\" (UID: \"e9ed55cb-51ca-49ca-89fb-de02ac4605c5\") " pod="openstack/ironic-inspector-0" Mar 08 00:54:30.832240 master-0 kubenswrapper[23041]: I0308 00:54:30.830104 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9ed55cb-51ca-49ca-89fb-de02ac4605c5-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"e9ed55cb-51ca-49ca-89fb-de02ac4605c5\") " pod="openstack/ironic-inspector-0" Mar 08 00:54:30.832240 master-0 kubenswrapper[23041]: I0308 00:54:30.830275 23041 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e9ed55cb-51ca-49ca-89fb-de02ac4605c5-config\") pod \"ironic-inspector-0\" (UID: \"e9ed55cb-51ca-49ca-89fb-de02ac4605c5\") " pod="openstack/ironic-inspector-0" Mar 08 00:54:30.832240 master-0 kubenswrapper[23041]: I0308 00:54:30.830323 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/e9ed55cb-51ca-49ca-89fb-de02ac4605c5-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"e9ed55cb-51ca-49ca-89fb-de02ac4605c5\") " pod="openstack/ironic-inspector-0" Mar 08 00:54:30.878241 master-0 kubenswrapper[23041]: I0308 00:54:30.876350 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43a61c38-6179-4e19-9027-afc0efca9ea6" path="/var/lib/kubelet/pods/43a61c38-6179-4e19-9027-afc0efca9ea6/volumes" Mar 08 00:54:30.878241 master-0 kubenswrapper[23041]: I0308 00:54:30.877291 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a83bcf0-62ae-4284-b870-14ba623be2e1" path="/var/lib/kubelet/pods/9a83bcf0-62ae-4284-b870-14ba623be2e1/volumes" Mar 08 00:54:30.933250 master-0 kubenswrapper[23041]: I0308 00:54:30.933163 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9ed55cb-51ca-49ca-89fb-de02ac4605c5-public-tls-certs\") pod \"ironic-inspector-0\" (UID: \"e9ed55cb-51ca-49ca-89fb-de02ac4605c5\") " pod="openstack/ironic-inspector-0" Mar 08 00:54:30.933482 master-0 kubenswrapper[23041]: I0308 00:54:30.933289 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e9ed55cb-51ca-49ca-89fb-de02ac4605c5-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"e9ed55cb-51ca-49ca-89fb-de02ac4605c5\") " 
pod="openstack/ironic-inspector-0" Mar 08 00:54:30.933482 master-0 kubenswrapper[23041]: I0308 00:54:30.933334 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9ed55cb-51ca-49ca-89fb-de02ac4605c5-scripts\") pod \"ironic-inspector-0\" (UID: \"e9ed55cb-51ca-49ca-89fb-de02ac4605c5\") " pod="openstack/ironic-inspector-0" Mar 08 00:54:30.934737 master-0 kubenswrapper[23041]: I0308 00:54:30.934676 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9ed55cb-51ca-49ca-89fb-de02ac4605c5-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"e9ed55cb-51ca-49ca-89fb-de02ac4605c5\") " pod="openstack/ironic-inspector-0" Mar 08 00:54:30.934998 master-0 kubenswrapper[23041]: I0308 00:54:30.934971 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e9ed55cb-51ca-49ca-89fb-de02ac4605c5-config\") pod \"ironic-inspector-0\" (UID: \"e9ed55cb-51ca-49ca-89fb-de02ac4605c5\") " pod="openstack/ironic-inspector-0" Mar 08 00:54:30.935064 master-0 kubenswrapper[23041]: I0308 00:54:30.935043 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/e9ed55cb-51ca-49ca-89fb-de02ac4605c5-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"e9ed55cb-51ca-49ca-89fb-de02ac4605c5\") " pod="openstack/ironic-inspector-0" Mar 08 00:54:30.935737 master-0 kubenswrapper[23041]: I0308 00:54:30.935647 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr689\" (UniqueName: \"kubernetes.io/projected/e9ed55cb-51ca-49ca-89fb-de02ac4605c5-kube-api-access-dr689\") pod \"ironic-inspector-0\" (UID: \"e9ed55cb-51ca-49ca-89fb-de02ac4605c5\") " pod="openstack/ironic-inspector-0" Mar 08 
00:54:30.935737 master-0 kubenswrapper[23041]: I0308 00:54:30.935725 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9ed55cb-51ca-49ca-89fb-de02ac4605c5-internal-tls-certs\") pod \"ironic-inspector-0\" (UID: \"e9ed55cb-51ca-49ca-89fb-de02ac4605c5\") " pod="openstack/ironic-inspector-0" Mar 08 00:54:30.936264 master-0 kubenswrapper[23041]: I0308 00:54:30.935959 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/e9ed55cb-51ca-49ca-89fb-de02ac4605c5-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"e9ed55cb-51ca-49ca-89fb-de02ac4605c5\") " pod="openstack/ironic-inspector-0" Mar 08 00:54:30.937243 master-0 kubenswrapper[23041]: I0308 00:54:30.936710 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/e9ed55cb-51ca-49ca-89fb-de02ac4605c5-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"e9ed55cb-51ca-49ca-89fb-de02ac4605c5\") " pod="openstack/ironic-inspector-0" Mar 08 00:54:30.937243 master-0 kubenswrapper[23041]: I0308 00:54:30.937043 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9ed55cb-51ca-49ca-89fb-de02ac4605c5-public-tls-certs\") pod \"ironic-inspector-0\" (UID: \"e9ed55cb-51ca-49ca-89fb-de02ac4605c5\") " pod="openstack/ironic-inspector-0" Mar 08 00:54:30.938492 master-0 kubenswrapper[23041]: I0308 00:54:30.938423 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9ed55cb-51ca-49ca-89fb-de02ac4605c5-scripts\") pod \"ironic-inspector-0\" (UID: \"e9ed55cb-51ca-49ca-89fb-de02ac4605c5\") " pod="openstack/ironic-inspector-0" Mar 08 00:54:30.942789 master-0 kubenswrapper[23041]: I0308 00:54:30.942745 23041 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9ed55cb-51ca-49ca-89fb-de02ac4605c5-internal-tls-certs\") pod \"ironic-inspector-0\" (UID: \"e9ed55cb-51ca-49ca-89fb-de02ac4605c5\") " pod="openstack/ironic-inspector-0" Mar 08 00:54:30.949318 master-0 kubenswrapper[23041]: I0308 00:54:30.949051 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/e9ed55cb-51ca-49ca-89fb-de02ac4605c5-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"e9ed55cb-51ca-49ca-89fb-de02ac4605c5\") " pod="openstack/ironic-inspector-0" Mar 08 00:54:30.950151 master-0 kubenswrapper[23041]: I0308 00:54:30.950112 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e9ed55cb-51ca-49ca-89fb-de02ac4605c5-config\") pod \"ironic-inspector-0\" (UID: \"e9ed55cb-51ca-49ca-89fb-de02ac4605c5\") " pod="openstack/ironic-inspector-0" Mar 08 00:54:30.951167 master-0 kubenswrapper[23041]: I0308 00:54:30.951105 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e9ed55cb-51ca-49ca-89fb-de02ac4605c5-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"e9ed55cb-51ca-49ca-89fb-de02ac4605c5\") " pod="openstack/ironic-inspector-0" Mar 08 00:54:30.952104 master-0 kubenswrapper[23041]: I0308 00:54:30.952046 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9ed55cb-51ca-49ca-89fb-de02ac4605c5-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"e9ed55cb-51ca-49ca-89fb-de02ac4605c5\") " pod="openstack/ironic-inspector-0" Mar 08 00:54:30.962873 master-0 kubenswrapper[23041]: I0308 00:54:30.962800 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr689\" (UniqueName: 
\"kubernetes.io/projected/e9ed55cb-51ca-49ca-89fb-de02ac4605c5-kube-api-access-dr689\") pod \"ironic-inspector-0\" (UID: \"e9ed55cb-51ca-49ca-89fb-de02ac4605c5\") " pod="openstack/ironic-inspector-0" Mar 08 00:54:31.070311 master-0 kubenswrapper[23041]: I0308 00:54:31.067343 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-0" Mar 08 00:54:31.091271 master-0 kubenswrapper[23041]: I0308 00:54:31.089715 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1280f-default-internal-api-0" event={"ID":"e6694041-7676-4736-a6d7-1854137deeba","Type":"ContainerStarted","Data":"f5abe9e91f22c36887826907dd4dffb2c08f3acab65a6a8b80cf0b61c2a0985c"} Mar 08 00:54:31.146299 master-0 kubenswrapper[23041]: I0308 00:54:31.138994 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-1280f-default-internal-api-0" podStartSLOduration=14.138964472 podStartE2EDuration="14.138964472s" podCreationTimestamp="2026-03-08 00:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:54:31.112337371 +0000 UTC m=+1376.585173935" watchObservedRunningTime="2026-03-08 00:54:31.138964472 +0000 UTC m=+1376.611801036" Mar 08 00:54:32.131280 master-0 kubenswrapper[23041]: I0308 00:54:32.129841 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"] Mar 08 00:54:32.147166 master-0 kubenswrapper[23041]: W0308 00:54:32.146938 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9ed55cb_51ca_49ca_89fb_de02ac4605c5.slice/crio-aa7e82a21a82cf4c1a23f94352c6edeac5b77527a44277985c325492359d5bb5 WatchSource:0}: Error finding container aa7e82a21a82cf4c1a23f94352c6edeac5b77527a44277985c325492359d5bb5: Status 404 returned error can't find the container with id 
aa7e82a21a82cf4c1a23f94352c6edeac5b77527a44277985c325492359d5bb5 Mar 08 00:54:32.662990 master-0 kubenswrapper[23041]: I0308 00:54:32.662926 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-1280f-default-external-api-0" Mar 08 00:54:32.663265 master-0 kubenswrapper[23041]: I0308 00:54:32.663025 23041 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 00:54:32.674430 master-0 kubenswrapper[23041]: I0308 00:54:32.674025 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-1280f-default-external-api-0" Mar 08 00:54:33.128421 master-0 kubenswrapper[23041]: I0308 00:54:33.114984 23041 generic.go:334] "Generic (PLEG): container finished" podID="e9ed55cb-51ca-49ca-89fb-de02ac4605c5" containerID="715ef4207ff7c2072d3f44cb166a4dc41103a8d094dc16ccdf62ebb011f0e9f8" exitCode=0 Mar 08 00:54:33.128421 master-0 kubenswrapper[23041]: I0308 00:54:33.116312 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"e9ed55cb-51ca-49ca-89fb-de02ac4605c5","Type":"ContainerDied","Data":"715ef4207ff7c2072d3f44cb166a4dc41103a8d094dc16ccdf62ebb011f0e9f8"} Mar 08 00:54:33.128421 master-0 kubenswrapper[23041]: I0308 00:54:33.116336 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"e9ed55cb-51ca-49ca-89fb-de02ac4605c5","Type":"ContainerStarted","Data":"aa7e82a21a82cf4c1a23f94352c6edeac5b77527a44277985c325492359d5bb5"} Mar 08 00:54:34.128358 master-0 kubenswrapper[23041]: I0308 00:54:34.128305 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"e9ed55cb-51ca-49ca-89fb-de02ac4605c5","Type":"ContainerStarted","Data":"a1fd9f2d5a471780bff1fb66040274d5d676fe8d36e3389ca74c95eebe8bd83d"} Mar 08 00:54:35.146368 master-0 kubenswrapper[23041]: I0308 00:54:35.146274 23041 generic.go:334] "Generic (PLEG): container finished" 
podID="e9ed55cb-51ca-49ca-89fb-de02ac4605c5" containerID="a1fd9f2d5a471780bff1fb66040274d5d676fe8d36e3389ca74c95eebe8bd83d" exitCode=0 Mar 08 00:54:35.146368 master-0 kubenswrapper[23041]: I0308 00:54:35.146373 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"e9ed55cb-51ca-49ca-89fb-de02ac4605c5","Type":"ContainerDied","Data":"a1fd9f2d5a471780bff1fb66040274d5d676fe8d36e3389ca74c95eebe8bd83d"} Mar 08 00:54:36.159696 master-0 kubenswrapper[23041]: I0308 00:54:36.159633 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"e9ed55cb-51ca-49ca-89fb-de02ac4605c5","Type":"ContainerStarted","Data":"c3b32182cbbf65b0b96479a2a716ae036853944325d3392e566e86ae43d3fbb6"} Mar 08 00:54:37.180647 master-0 kubenswrapper[23041]: I0308 00:54:37.180556 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"e9ed55cb-51ca-49ca-89fb-de02ac4605c5","Type":"ContainerStarted","Data":"6335f7ee706a3578b2f377ce002e2c1d3d3fa0bbd9e8d43d9c474ef39a3f7b5a"} Mar 08 00:54:37.180647 master-0 kubenswrapper[23041]: I0308 00:54:37.180654 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"e9ed55cb-51ca-49ca-89fb-de02ac4605c5","Type":"ContainerStarted","Data":"e174abe46595b2efef0b87521fbc6a8416fa7068bf360d3cf03bda5e661abd4a"} Mar 08 00:54:37.181299 master-0 kubenswrapper[23041]: I0308 00:54:37.180667 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"e9ed55cb-51ca-49ca-89fb-de02ac4605c5","Type":"ContainerStarted","Data":"dfc6cb33827605485c591bd7e4e911392b9bc4574a4bfede58bb09c192f26e6e"} Mar 08 00:54:38.198985 master-0 kubenswrapper[23041]: I0308 00:54:38.198915 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" 
event={"ID":"e9ed55cb-51ca-49ca-89fb-de02ac4605c5","Type":"ContainerStarted","Data":"e86cedab1effe718e724a47b828e7b342a0c829c533957a7a41d165c7956ac5b"} Mar 08 00:54:38.199928 master-0 kubenswrapper[23041]: I0308 00:54:38.199178 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0" Mar 08 00:54:38.257060 master-0 kubenswrapper[23041]: I0308 00:54:38.256935 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-inspector-0" podStartSLOduration=8.25691076 podStartE2EDuration="8.25691076s" podCreationTimestamp="2026-03-08 00:54:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:54:38.245897991 +0000 UTC m=+1383.718734565" watchObservedRunningTime="2026-03-08 00:54:38.25691076 +0000 UTC m=+1383.729747314" Mar 08 00:54:39.208549 master-0 kubenswrapper[23041]: I0308 00:54:39.208420 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0" Mar 08 00:54:40.322024 master-0 kubenswrapper[23041]: I0308 00:54:40.321974 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0" Mar 08 00:54:40.931422 master-0 kubenswrapper[23041]: I0308 00:54:40.931359 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:54:40.931422 master-0 kubenswrapper[23041]: I0308 00:54:40.931418 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:54:40.968559 master-0 kubenswrapper[23041]: I0308 00:54:40.968439 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:54:40.985500 master-0 kubenswrapper[23041]: I0308 00:54:40.985428 23041 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:54:41.068783 master-0 kubenswrapper[23041]: I0308 00:54:41.068700 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ironic-inspector-0" Mar 08 00:54:41.068783 master-0 kubenswrapper[23041]: I0308 00:54:41.068784 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0" Mar 08 00:54:41.069235 master-0 kubenswrapper[23041]: I0308 00:54:41.068805 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0" Mar 08 00:54:41.069235 master-0 kubenswrapper[23041]: I0308 00:54:41.068821 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ironic-inspector-0" Mar 08 00:54:41.132661 master-0 kubenswrapper[23041]: I0308 00:54:41.132600 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ironic-inspector-0" Mar 08 00:54:41.136374 master-0 kubenswrapper[23041]: I0308 00:54:41.136342 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ironic-inspector-0" Mar 08 00:54:41.236038 master-0 kubenswrapper[23041]: I0308 00:54:41.235867 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:54:41.236038 master-0 kubenswrapper[23041]: I0308 00:54:41.235945 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:54:41.236318 master-0 kubenswrapper[23041]: I0308 00:54:41.236160 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0" Mar 08 00:54:41.247112 master-0 kubenswrapper[23041]: I0308 00:54:41.246681 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0" Mar 08 00:54:41.266281 
master-0 kubenswrapper[23041]: I0308 00:54:41.252195 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0" Mar 08 00:54:43.266585 master-0 kubenswrapper[23041]: I0308 00:54:43.266369 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:54:43.268999 master-0 kubenswrapper[23041]: I0308 00:54:43.268969 23041 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 00:54:43.279952 master-0 kubenswrapper[23041]: I0308 00:54:43.279896 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-1280f-default-internal-api-0" Mar 08 00:54:48.345590 master-0 kubenswrapper[23041]: I0308 00:54:48.345524 23041 generic.go:334] "Generic (PLEG): container finished" podID="65e42302-3592-46fd-b7a9-b125bf61382b" containerID="ec9ba3fc66949789bd7323a7f91558f505dd61226bf32af867cbd79f94938620" exitCode=0 Mar 08 00:54:48.345590 master-0 kubenswrapper[23041]: I0308 00:54:48.345583 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wcsvr" event={"ID":"65e42302-3592-46fd-b7a9-b125bf61382b","Type":"ContainerDied","Data":"ec9ba3fc66949789bd7323a7f91558f505dd61226bf32af867cbd79f94938620"} Mar 08 00:54:49.782063 master-0 kubenswrapper[23041]: I0308 00:54:49.780405 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wcsvr" Mar 08 00:54:49.872178 master-0 kubenswrapper[23041]: I0308 00:54:49.869344 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65e42302-3592-46fd-b7a9-b125bf61382b-scripts\") pod \"65e42302-3592-46fd-b7a9-b125bf61382b\" (UID: \"65e42302-3592-46fd-b7a9-b125bf61382b\") " Mar 08 00:54:49.872178 master-0 kubenswrapper[23041]: I0308 00:54:49.869446 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65e42302-3592-46fd-b7a9-b125bf61382b-config-data\") pod \"65e42302-3592-46fd-b7a9-b125bf61382b\" (UID: \"65e42302-3592-46fd-b7a9-b125bf61382b\") " Mar 08 00:54:49.872178 master-0 kubenswrapper[23041]: I0308 00:54:49.869752 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65e42302-3592-46fd-b7a9-b125bf61382b-combined-ca-bundle\") pod \"65e42302-3592-46fd-b7a9-b125bf61382b\" (UID: \"65e42302-3592-46fd-b7a9-b125bf61382b\") " Mar 08 00:54:49.872178 master-0 kubenswrapper[23041]: I0308 00:54:49.869924 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvbhf\" (UniqueName: \"kubernetes.io/projected/65e42302-3592-46fd-b7a9-b125bf61382b-kube-api-access-bvbhf\") pod \"65e42302-3592-46fd-b7a9-b125bf61382b\" (UID: \"65e42302-3592-46fd-b7a9-b125bf61382b\") " Mar 08 00:54:49.874282 master-0 kubenswrapper[23041]: I0308 00:54:49.874127 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65e42302-3592-46fd-b7a9-b125bf61382b-scripts" (OuterVolumeSpecName: "scripts") pod "65e42302-3592-46fd-b7a9-b125bf61382b" (UID: "65e42302-3592-46fd-b7a9-b125bf61382b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:54:49.894563 master-0 kubenswrapper[23041]: I0308 00:54:49.892965 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65e42302-3592-46fd-b7a9-b125bf61382b-kube-api-access-bvbhf" (OuterVolumeSpecName: "kube-api-access-bvbhf") pod "65e42302-3592-46fd-b7a9-b125bf61382b" (UID: "65e42302-3592-46fd-b7a9-b125bf61382b"). InnerVolumeSpecName "kube-api-access-bvbhf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:54:49.915530 master-0 kubenswrapper[23041]: I0308 00:54:49.915446 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65e42302-3592-46fd-b7a9-b125bf61382b-config-data" (OuterVolumeSpecName: "config-data") pod "65e42302-3592-46fd-b7a9-b125bf61382b" (UID: "65e42302-3592-46fd-b7a9-b125bf61382b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:54:49.924843 master-0 kubenswrapper[23041]: I0308 00:54:49.924749 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65e42302-3592-46fd-b7a9-b125bf61382b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65e42302-3592-46fd-b7a9-b125bf61382b" (UID: "65e42302-3592-46fd-b7a9-b125bf61382b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:54:49.975478 master-0 kubenswrapper[23041]: I0308 00:54:49.975411 23041 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65e42302-3592-46fd-b7a9-b125bf61382b-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:49.975478 master-0 kubenswrapper[23041]: I0308 00:54:49.975461 23041 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65e42302-3592-46fd-b7a9-b125bf61382b-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:49.975478 master-0 kubenswrapper[23041]: I0308 00:54:49.975477 23041 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65e42302-3592-46fd-b7a9-b125bf61382b-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:49.975478 master-0 kubenswrapper[23041]: I0308 00:54:49.975493 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvbhf\" (UniqueName: \"kubernetes.io/projected/65e42302-3592-46fd-b7a9-b125bf61382b-kube-api-access-bvbhf\") on node \"master-0\" DevicePath \"\"" Mar 08 00:54:50.377490 master-0 kubenswrapper[23041]: I0308 00:54:50.377396 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wcsvr" event={"ID":"65e42302-3592-46fd-b7a9-b125bf61382b","Type":"ContainerDied","Data":"bdf9a866d900eeef92a31437989e83717a6864f53eeafc41bf5117510aa72a01"} Mar 08 00:54:50.377490 master-0 kubenswrapper[23041]: I0308 00:54:50.377469 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdf9a866d900eeef92a31437989e83717a6864f53eeafc41bf5117510aa72a01" Mar 08 00:54:50.377841 master-0 kubenswrapper[23041]: I0308 00:54:50.377511 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wcsvr" Mar 08 00:54:50.548934 master-0 kubenswrapper[23041]: I0308 00:54:50.548851 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 00:54:50.549655 master-0 kubenswrapper[23041]: E0308 00:54:50.549616 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65e42302-3592-46fd-b7a9-b125bf61382b" containerName="nova-cell0-conductor-db-sync" Mar 08 00:54:50.549655 master-0 kubenswrapper[23041]: I0308 00:54:50.549644 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="65e42302-3592-46fd-b7a9-b125bf61382b" containerName="nova-cell0-conductor-db-sync" Mar 08 00:54:50.589381 master-0 kubenswrapper[23041]: I0308 00:54:50.564674 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="65e42302-3592-46fd-b7a9-b125bf61382b" containerName="nova-cell0-conductor-db-sync" Mar 08 00:54:50.589381 master-0 kubenswrapper[23041]: I0308 00:54:50.565791 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 00:54:50.589381 master-0 kubenswrapper[23041]: I0308 00:54:50.565917 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 08 00:54:50.594131 master-0 kubenswrapper[23041]: I0308 00:54:50.593340 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 08 00:54:50.697609 master-0 kubenswrapper[23041]: I0308 00:54:50.697404 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptr5t\" (UniqueName: \"kubernetes.io/projected/fd1d89af-a975-466c-b6cc-59d800d75ae5-kube-api-access-ptr5t\") pod \"nova-cell0-conductor-0\" (UID: \"fd1d89af-a975-466c-b6cc-59d800d75ae5\") " pod="openstack/nova-cell0-conductor-0" Mar 08 00:54:50.697609 master-0 kubenswrapper[23041]: I0308 00:54:50.697551 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd1d89af-a975-466c-b6cc-59d800d75ae5-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fd1d89af-a975-466c-b6cc-59d800d75ae5\") " pod="openstack/nova-cell0-conductor-0" Mar 08 00:54:50.697991 master-0 kubenswrapper[23041]: I0308 00:54:50.697667 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd1d89af-a975-466c-b6cc-59d800d75ae5-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fd1d89af-a975-466c-b6cc-59d800d75ae5\") " pod="openstack/nova-cell0-conductor-0" Mar 08 00:54:50.800643 master-0 kubenswrapper[23041]: I0308 00:54:50.800560 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptr5t\" (UniqueName: \"kubernetes.io/projected/fd1d89af-a975-466c-b6cc-59d800d75ae5-kube-api-access-ptr5t\") pod \"nova-cell0-conductor-0\" (UID: \"fd1d89af-a975-466c-b6cc-59d800d75ae5\") " pod="openstack/nova-cell0-conductor-0" Mar 08 00:54:50.800643 master-0 kubenswrapper[23041]: I0308 00:54:50.800628 23041 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd1d89af-a975-466c-b6cc-59d800d75ae5-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fd1d89af-a975-466c-b6cc-59d800d75ae5\") " pod="openstack/nova-cell0-conductor-0" Mar 08 00:54:50.801705 master-0 kubenswrapper[23041]: I0308 00:54:50.800685 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd1d89af-a975-466c-b6cc-59d800d75ae5-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fd1d89af-a975-466c-b6cc-59d800d75ae5\") " pod="openstack/nova-cell0-conductor-0" Mar 08 00:54:50.805050 master-0 kubenswrapper[23041]: I0308 00:54:50.804005 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd1d89af-a975-466c-b6cc-59d800d75ae5-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fd1d89af-a975-466c-b6cc-59d800d75ae5\") " pod="openstack/nova-cell0-conductor-0" Mar 08 00:54:50.805329 master-0 kubenswrapper[23041]: I0308 00:54:50.805254 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd1d89af-a975-466c-b6cc-59d800d75ae5-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fd1d89af-a975-466c-b6cc-59d800d75ae5\") " pod="openstack/nova-cell0-conductor-0" Mar 08 00:54:50.819247 master-0 kubenswrapper[23041]: I0308 00:54:50.819154 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptr5t\" (UniqueName: \"kubernetes.io/projected/fd1d89af-a975-466c-b6cc-59d800d75ae5-kube-api-access-ptr5t\") pod \"nova-cell0-conductor-0\" (UID: \"fd1d89af-a975-466c-b6cc-59d800d75ae5\") " pod="openstack/nova-cell0-conductor-0" Mar 08 00:54:50.908445 master-0 kubenswrapper[23041]: I0308 00:54:50.908360 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 08 00:54:51.416699 master-0 kubenswrapper[23041]: W0308 00:54:51.416601 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd1d89af_a975_466c_b6cc_59d800d75ae5.slice/crio-2ccb5e8f1e5a3b8f5452944e7b76f57b2937cd175c4a9f5118a31593bc1a63fa WatchSource:0}: Error finding container 2ccb5e8f1e5a3b8f5452944e7b76f57b2937cd175c4a9f5118a31593bc1a63fa: Status 404 returned error can't find the container with id 2ccb5e8f1e5a3b8f5452944e7b76f57b2937cd175c4a9f5118a31593bc1a63fa Mar 08 00:54:51.438478 master-0 kubenswrapper[23041]: I0308 00:54:51.434940 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 00:54:52.410181 master-0 kubenswrapper[23041]: I0308 00:54:52.410052 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"fd1d89af-a975-466c-b6cc-59d800d75ae5","Type":"ContainerStarted","Data":"879fd9bd242c65ab28d73db95a1bbdf165b3d49fb2f88469b85a1bf3671bba9b"} Mar 08 00:54:52.410181 master-0 kubenswrapper[23041]: I0308 00:54:52.410110 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"fd1d89af-a975-466c-b6cc-59d800d75ae5","Type":"ContainerStarted","Data":"2ccb5e8f1e5a3b8f5452944e7b76f57b2937cd175c4a9f5118a31593bc1a63fa"} Mar 08 00:54:52.411006 master-0 kubenswrapper[23041]: I0308 00:54:52.410297 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 08 00:54:52.450293 master-0 kubenswrapper[23041]: I0308 00:54:52.442452 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.442434697 podStartE2EDuration="2.442434697s" podCreationTimestamp="2026-03-08 00:54:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:54:52.429784548 +0000 UTC m=+1397.902621102" watchObservedRunningTime="2026-03-08 00:54:52.442434697 +0000 UTC m=+1397.915271251" Mar 08 00:55:00.971329 master-0 kubenswrapper[23041]: I0308 00:55:00.971036 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 08 00:55:01.638526 master-0 kubenswrapper[23041]: I0308 00:55:01.638464 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-xkrf7"] Mar 08 00:55:01.640948 master-0 kubenswrapper[23041]: I0308 00:55:01.640914 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xkrf7" Mar 08 00:55:01.644164 master-0 kubenswrapper[23041]: I0308 00:55:01.644130 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 08 00:55:01.647903 master-0 kubenswrapper[23041]: I0308 00:55:01.647873 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 08 00:55:01.715729 master-0 kubenswrapper[23041]: I0308 00:55:01.674764 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-xkrf7"] Mar 08 00:55:01.723385 master-0 kubenswrapper[23041]: I0308 00:55:01.723311 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d9dfee3-973a-4663-9df5-1ea29d47096a-config-data\") pod \"nova-cell0-cell-mapping-xkrf7\" (UID: \"6d9dfee3-973a-4663-9df5-1ea29d47096a\") " pod="openstack/nova-cell0-cell-mapping-xkrf7" Mar 08 00:55:01.723624 master-0 kubenswrapper[23041]: I0308 00:55:01.723410 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6d9dfee3-973a-4663-9df5-1ea29d47096a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xkrf7\" (UID: \"6d9dfee3-973a-4663-9df5-1ea29d47096a\") " pod="openstack/nova-cell0-cell-mapping-xkrf7" Mar 08 00:55:01.723624 master-0 kubenswrapper[23041]: I0308 00:55:01.723533 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d9dfee3-973a-4663-9df5-1ea29d47096a-scripts\") pod \"nova-cell0-cell-mapping-xkrf7\" (UID: \"6d9dfee3-973a-4663-9df5-1ea29d47096a\") " pod="openstack/nova-cell0-cell-mapping-xkrf7" Mar 08 00:55:01.723624 master-0 kubenswrapper[23041]: I0308 00:55:01.723565 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm7ss\" (UniqueName: \"kubernetes.io/projected/6d9dfee3-973a-4663-9df5-1ea29d47096a-kube-api-access-zm7ss\") pod \"nova-cell0-cell-mapping-xkrf7\" (UID: \"6d9dfee3-973a-4663-9df5-1ea29d47096a\") " pod="openstack/nova-cell0-cell-mapping-xkrf7" Mar 08 00:55:01.817430 master-0 kubenswrapper[23041]: I0308 00:55:01.814727 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-compute-ironic-compute-0"] Mar 08 00:55:01.818546 master-0 kubenswrapper[23041]: I0308 00:55:01.818398 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 08 00:55:01.830223 master-0 kubenswrapper[23041]: I0308 00:55:01.829556 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d9dfee3-973a-4663-9df5-1ea29d47096a-config-data\") pod \"nova-cell0-cell-mapping-xkrf7\" (UID: \"6d9dfee3-973a-4663-9df5-1ea29d47096a\") " pod="openstack/nova-cell0-cell-mapping-xkrf7" Mar 08 00:55:01.830223 master-0 kubenswrapper[23041]: I0308 00:55:01.829693 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d9dfee3-973a-4663-9df5-1ea29d47096a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xkrf7\" (UID: \"6d9dfee3-973a-4663-9df5-1ea29d47096a\") " pod="openstack/nova-cell0-cell-mapping-xkrf7" Mar 08 00:55:01.830223 master-0 kubenswrapper[23041]: I0308 00:55:01.829795 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f052268-95a2-4ee5-9955-0e851ca8894a-config-data\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"4f052268-95a2-4ee5-9955-0e851ca8894a\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 08 00:55:01.830223 master-0 kubenswrapper[23041]: I0308 00:55:01.829861 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d9dfee3-973a-4663-9df5-1ea29d47096a-scripts\") pod \"nova-cell0-cell-mapping-xkrf7\" (UID: \"6d9dfee3-973a-4663-9df5-1ea29d47096a\") " pod="openstack/nova-cell0-cell-mapping-xkrf7" Mar 08 00:55:01.830223 master-0 kubenswrapper[23041]: I0308 00:55:01.829892 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm7ss\" (UniqueName: \"kubernetes.io/projected/6d9dfee3-973a-4663-9df5-1ea29d47096a-kube-api-access-zm7ss\") pod 
\"nova-cell0-cell-mapping-xkrf7\" (UID: \"6d9dfee3-973a-4663-9df5-1ea29d47096a\") " pod="openstack/nova-cell0-cell-mapping-xkrf7" Mar 08 00:55:01.834191 master-0 kubenswrapper[23041]: I0308 00:55:01.834148 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-ironic-compute-config-data" Mar 08 00:55:01.843611 master-0 kubenswrapper[23041]: I0308 00:55:01.843541 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d9dfee3-973a-4663-9df5-1ea29d47096a-scripts\") pod \"nova-cell0-cell-mapping-xkrf7\" (UID: \"6d9dfee3-973a-4663-9df5-1ea29d47096a\") " pod="openstack/nova-cell0-cell-mapping-xkrf7" Mar 08 00:55:01.849984 master-0 kubenswrapper[23041]: I0308 00:55:01.849896 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d9dfee3-973a-4663-9df5-1ea29d47096a-config-data\") pod \"nova-cell0-cell-mapping-xkrf7\" (UID: \"6d9dfee3-973a-4663-9df5-1ea29d47096a\") " pod="openstack/nova-cell0-cell-mapping-xkrf7" Mar 08 00:55:01.875229 master-0 kubenswrapper[23041]: I0308 00:55:01.870730 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d9dfee3-973a-4663-9df5-1ea29d47096a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xkrf7\" (UID: \"6d9dfee3-973a-4663-9df5-1ea29d47096a\") " pod="openstack/nova-cell0-cell-mapping-xkrf7" Mar 08 00:55:01.885242 master-0 kubenswrapper[23041]: I0308 00:55:01.883385 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-compute-ironic-compute-0"] Mar 08 00:55:01.886282 master-0 kubenswrapper[23041]: I0308 00:55:01.886185 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm7ss\" (UniqueName: \"kubernetes.io/projected/6d9dfee3-973a-4663-9df5-1ea29d47096a-kube-api-access-zm7ss\") pod 
\"nova-cell0-cell-mapping-xkrf7\" (UID: \"6d9dfee3-973a-4663-9df5-1ea29d47096a\") " pod="openstack/nova-cell0-cell-mapping-xkrf7" Mar 08 00:55:01.932766 master-0 kubenswrapper[23041]: I0308 00:55:01.932608 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f052268-95a2-4ee5-9955-0e851ca8894a-combined-ca-bundle\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"4f052268-95a2-4ee5-9955-0e851ca8894a\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 08 00:55:01.932766 master-0 kubenswrapper[23041]: I0308 00:55:01.932710 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88b2r\" (UniqueName: \"kubernetes.io/projected/4f052268-95a2-4ee5-9955-0e851ca8894a-kube-api-access-88b2r\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"4f052268-95a2-4ee5-9955-0e851ca8894a\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 08 00:55:01.933041 master-0 kubenswrapper[23041]: I0308 00:55:01.932809 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f052268-95a2-4ee5-9955-0e851ca8894a-config-data\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"4f052268-95a2-4ee5-9955-0e851ca8894a\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 08 00:55:01.964284 master-0 kubenswrapper[23041]: I0308 00:55:01.964170 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xkrf7" Mar 08 00:55:01.976238 master-0 kubenswrapper[23041]: I0308 00:55:01.976165 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f052268-95a2-4ee5-9955-0e851ca8894a-config-data\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"4f052268-95a2-4ee5-9955-0e851ca8894a\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 08 00:55:02.047972 master-0 kubenswrapper[23041]: I0308 00:55:02.045174 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f052268-95a2-4ee5-9955-0e851ca8894a-combined-ca-bundle\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"4f052268-95a2-4ee5-9955-0e851ca8894a\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 08 00:55:02.047972 master-0 kubenswrapper[23041]: I0308 00:55:02.045303 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88b2r\" (UniqueName: \"kubernetes.io/projected/4f052268-95a2-4ee5-9955-0e851ca8894a-kube-api-access-88b2r\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"4f052268-95a2-4ee5-9955-0e851ca8894a\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 08 00:55:02.066061 master-0 kubenswrapper[23041]: I0308 00:55:02.056831 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f052268-95a2-4ee5-9955-0e851ca8894a-combined-ca-bundle\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"4f052268-95a2-4ee5-9955-0e851ca8894a\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 08 00:55:03.478540 master-0 kubenswrapper[23041]: I0308 00:55:03.475534 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88b2r\" (UniqueName: 
\"kubernetes.io/projected/4f052268-95a2-4ee5-9955-0e851ca8894a-kube-api-access-88b2r\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"4f052268-95a2-4ee5-9955-0e851ca8894a\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 08 00:55:03.485453 master-0 kubenswrapper[23041]: I0308 00:55:03.485261 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-xkrf7"] Mar 08 00:55:03.591355 master-0 kubenswrapper[23041]: I0308 00:55:03.591273 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 08 00:55:03.592292 master-0 kubenswrapper[23041]: I0308 00:55:03.592101 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xkrf7" event={"ID":"6d9dfee3-973a-4663-9df5-1ea29d47096a","Type":"ContainerStarted","Data":"c1451ea855ff6aaf7b1095ac17d89ccea09a6224679e4ff61e15b2e63c4f9020"} Mar 08 00:55:03.646380 master-0 kubenswrapper[23041]: I0308 00:55:03.645962 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 08 00:55:03.648401 master-0 kubenswrapper[23041]: I0308 00:55:03.648363 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 00:55:03.652996 master-0 kubenswrapper[23041]: I0308 00:55:03.652955 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 08 00:55:03.665578 master-0 kubenswrapper[23041]: I0308 00:55:03.665496 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 00:55:03.668771 master-0 kubenswrapper[23041]: I0308 00:55:03.668704 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 00:55:03.671480 master-0 kubenswrapper[23041]: I0308 00:55:03.671450 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 08 00:55:03.889230 master-0 kubenswrapper[23041]: I0308 00:55:03.867176 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efeb5e5d-c3a7-4b40-8903-a36ee311f97c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"efeb5e5d-c3a7-4b40-8903-a36ee311f97c\") " pod="openstack/nova-scheduler-0" Mar 08 00:55:03.889230 master-0 kubenswrapper[23041]: I0308 00:55:03.867439 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efeb5e5d-c3a7-4b40-8903-a36ee311f97c-config-data\") pod \"nova-scheduler-0\" (UID: \"efeb5e5d-c3a7-4b40-8903-a36ee311f97c\") " pod="openstack/nova-scheduler-0" Mar 08 00:55:03.889230 master-0 kubenswrapper[23041]: I0308 00:55:03.867819 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/988af452-10f3-4d0b-806e-7f91af9135a0-config-data\") pod \"nova-api-0\" (UID: \"988af452-10f3-4d0b-806e-7f91af9135a0\") " pod="openstack/nova-api-0" Mar 08 00:55:03.889230 master-0 kubenswrapper[23041]: I0308 00:55:03.867931 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/988af452-10f3-4d0b-806e-7f91af9135a0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"988af452-10f3-4d0b-806e-7f91af9135a0\") " pod="openstack/nova-api-0" Mar 08 00:55:03.889230 master-0 kubenswrapper[23041]: I0308 00:55:03.867985 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/988af452-10f3-4d0b-806e-7f91af9135a0-logs\") pod \"nova-api-0\" (UID: \"988af452-10f3-4d0b-806e-7f91af9135a0\") " pod="openstack/nova-api-0" Mar 08 00:55:03.889230 master-0 kubenswrapper[23041]: I0308 00:55:03.868366 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk7bs\" (UniqueName: \"kubernetes.io/projected/efeb5e5d-c3a7-4b40-8903-a36ee311f97c-kube-api-access-wk7bs\") pod \"nova-scheduler-0\" (UID: \"efeb5e5d-c3a7-4b40-8903-a36ee311f97c\") " pod="openstack/nova-scheduler-0" Mar 08 00:55:03.889230 master-0 kubenswrapper[23041]: I0308 00:55:03.868574 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvpnt\" (UniqueName: \"kubernetes.io/projected/988af452-10f3-4d0b-806e-7f91af9135a0-kube-api-access-wvpnt\") pod \"nova-api-0\" (UID: \"988af452-10f3-4d0b-806e-7f91af9135a0\") " pod="openstack/nova-api-0" Mar 08 00:55:03.889230 master-0 kubenswrapper[23041]: I0308 00:55:03.883279 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 00:55:03.897238 master-0 kubenswrapper[23041]: I0308 00:55:03.894196 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 00:55:03.972253 master-0 kubenswrapper[23041]: I0308 00:55:03.972144 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efeb5e5d-c3a7-4b40-8903-a36ee311f97c-config-data\") pod \"nova-scheduler-0\" (UID: \"efeb5e5d-c3a7-4b40-8903-a36ee311f97c\") " pod="openstack/nova-scheduler-0" Mar 08 00:55:03.972637 master-0 kubenswrapper[23041]: I0308 00:55:03.972340 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/988af452-10f3-4d0b-806e-7f91af9135a0-config-data\") pod \"nova-api-0\" (UID: 
\"988af452-10f3-4d0b-806e-7f91af9135a0\") " pod="openstack/nova-api-0" Mar 08 00:55:03.972637 master-0 kubenswrapper[23041]: I0308 00:55:03.972384 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/988af452-10f3-4d0b-806e-7f91af9135a0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"988af452-10f3-4d0b-806e-7f91af9135a0\") " pod="openstack/nova-api-0" Mar 08 00:55:03.972637 master-0 kubenswrapper[23041]: I0308 00:55:03.972411 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/988af452-10f3-4d0b-806e-7f91af9135a0-logs\") pod \"nova-api-0\" (UID: \"988af452-10f3-4d0b-806e-7f91af9135a0\") " pod="openstack/nova-api-0" Mar 08 00:55:03.972637 master-0 kubenswrapper[23041]: I0308 00:55:03.972530 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk7bs\" (UniqueName: \"kubernetes.io/projected/efeb5e5d-c3a7-4b40-8903-a36ee311f97c-kube-api-access-wk7bs\") pod \"nova-scheduler-0\" (UID: \"efeb5e5d-c3a7-4b40-8903-a36ee311f97c\") " pod="openstack/nova-scheduler-0" Mar 08 00:55:03.972637 master-0 kubenswrapper[23041]: I0308 00:55:03.972601 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvpnt\" (UniqueName: \"kubernetes.io/projected/988af452-10f3-4d0b-806e-7f91af9135a0-kube-api-access-wvpnt\") pod \"nova-api-0\" (UID: \"988af452-10f3-4d0b-806e-7f91af9135a0\") " pod="openstack/nova-api-0" Mar 08 00:55:03.972839 master-0 kubenswrapper[23041]: I0308 00:55:03.972670 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efeb5e5d-c3a7-4b40-8903-a36ee311f97c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"efeb5e5d-c3a7-4b40-8903-a36ee311f97c\") " pod="openstack/nova-scheduler-0" Mar 08 00:55:03.980993 master-0 
kubenswrapper[23041]: I0308 00:55:03.980913 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/988af452-10f3-4d0b-806e-7f91af9135a0-logs\") pod \"nova-api-0\" (UID: \"988af452-10f3-4d0b-806e-7f91af9135a0\") " pod="openstack/nova-api-0" Mar 08 00:55:03.986125 master-0 kubenswrapper[23041]: I0308 00:55:03.986014 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efeb5e5d-c3a7-4b40-8903-a36ee311f97c-config-data\") pod \"nova-scheduler-0\" (UID: \"efeb5e5d-c3a7-4b40-8903-a36ee311f97c\") " pod="openstack/nova-scheduler-0" Mar 08 00:55:03.993125 master-0 kubenswrapper[23041]: I0308 00:55:03.992272 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/988af452-10f3-4d0b-806e-7f91af9135a0-config-data\") pod \"nova-api-0\" (UID: \"988af452-10f3-4d0b-806e-7f91af9135a0\") " pod="openstack/nova-api-0" Mar 08 00:55:03.994586 master-0 kubenswrapper[23041]: I0308 00:55:03.994540 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/988af452-10f3-4d0b-806e-7f91af9135a0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"988af452-10f3-4d0b-806e-7f91af9135a0\") " pod="openstack/nova-api-0" Mar 08 00:55:04.030508 master-0 kubenswrapper[23041]: I0308 00:55:04.030430 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efeb5e5d-c3a7-4b40-8903-a36ee311f97c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"efeb5e5d-c3a7-4b40-8903-a36ee311f97c\") " pod="openstack/nova-scheduler-0" Mar 08 00:55:04.114241 master-0 kubenswrapper[23041]: I0308 00:55:04.109026 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk7bs\" (UniqueName: 
\"kubernetes.io/projected/efeb5e5d-c3a7-4b40-8903-a36ee311f97c-kube-api-access-wk7bs\") pod \"nova-scheduler-0\" (UID: \"efeb5e5d-c3a7-4b40-8903-a36ee311f97c\") " pod="openstack/nova-scheduler-0" Mar 08 00:55:04.114241 master-0 kubenswrapper[23041]: I0308 00:55:04.109326 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 08 00:55:04.114241 master-0 kubenswrapper[23041]: I0308 00:55:04.112159 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 00:55:04.123226 master-0 kubenswrapper[23041]: I0308 00:55:04.120966 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvpnt\" (UniqueName: \"kubernetes.io/projected/988af452-10f3-4d0b-806e-7f91af9135a0-kube-api-access-wvpnt\") pod \"nova-api-0\" (UID: \"988af452-10f3-4d0b-806e-7f91af9135a0\") " pod="openstack/nova-api-0" Mar 08 00:55:04.144292 master-0 kubenswrapper[23041]: I0308 00:55:04.141728 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 08 00:55:04.232908 master-0 kubenswrapper[23041]: I0308 00:55:04.192373 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 00:55:04.232908 master-0 kubenswrapper[23041]: I0308 00:55:04.232840 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 00:55:04.245560 master-0 kubenswrapper[23041]: I0308 00:55:04.235291 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 08 00:55:04.310367 master-0 kubenswrapper[23041]: I0308 00:55:04.310267 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3ba71b2-4318-4f20-99a5-e360a59ae1d8-logs\") pod \"nova-metadata-0\" (UID: \"d3ba71b2-4318-4f20-99a5-e360a59ae1d8\") " pod="openstack/nova-metadata-0" Mar 08 00:55:04.311152 master-0 kubenswrapper[23041]: I0308 00:55:04.310547 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3ba71b2-4318-4f20-99a5-e360a59ae1d8-config-data\") pod \"nova-metadata-0\" (UID: \"d3ba71b2-4318-4f20-99a5-e360a59ae1d8\") " pod="openstack/nova-metadata-0" Mar 08 00:55:04.311152 master-0 kubenswrapper[23041]: I0308 00:55:04.310613 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvdpv\" (UniqueName: \"kubernetes.io/projected/d3ba71b2-4318-4f20-99a5-e360a59ae1d8-kube-api-access-kvdpv\") pod \"nova-metadata-0\" (UID: \"d3ba71b2-4318-4f20-99a5-e360a59ae1d8\") " pod="openstack/nova-metadata-0" Mar 08 00:55:04.311152 master-0 kubenswrapper[23041]: I0308 00:55:04.310819 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3ba71b2-4318-4f20-99a5-e360a59ae1d8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d3ba71b2-4318-4f20-99a5-e360a59ae1d8\") " pod="openstack/nova-metadata-0" Mar 08 00:55:04.316166 master-0 kubenswrapper[23041]: I0308 00:55:04.315965 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 
08 00:55:04.347467 master-0 kubenswrapper[23041]: I0308 00:55:04.347387 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 00:55:04.355448 master-0 kubenswrapper[23041]: I0308 00:55:04.355249 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 00:55:04.389304 master-0 kubenswrapper[23041]: I0308 00:55:04.389262 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 00:55:04.536335 master-0 kubenswrapper[23041]: I0308 00:55:04.533632 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3ba71b2-4318-4f20-99a5-e360a59ae1d8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d3ba71b2-4318-4f20-99a5-e360a59ae1d8\") " pod="openstack/nova-metadata-0" Mar 08 00:55:04.536335 master-0 kubenswrapper[23041]: I0308 00:55:04.533740 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a58922b-c6fd-4668-8b74-674a1cb13323-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7a58922b-c6fd-4668-8b74-674a1cb13323\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 00:55:04.536335 master-0 kubenswrapper[23041]: I0308 00:55:04.533880 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a58922b-c6fd-4668-8b74-674a1cb13323-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7a58922b-c6fd-4668-8b74-674a1cb13323\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 00:55:04.536335 master-0 kubenswrapper[23041]: I0308 00:55:04.534031 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29nps\" (UniqueName: 
\"kubernetes.io/projected/7a58922b-c6fd-4668-8b74-674a1cb13323-kube-api-access-29nps\") pod \"nova-cell1-novncproxy-0\" (UID: \"7a58922b-c6fd-4668-8b74-674a1cb13323\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 00:55:04.536335 master-0 kubenswrapper[23041]: I0308 00:55:04.534255 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3ba71b2-4318-4f20-99a5-e360a59ae1d8-logs\") pod \"nova-metadata-0\" (UID: \"d3ba71b2-4318-4f20-99a5-e360a59ae1d8\") " pod="openstack/nova-metadata-0" Mar 08 00:55:04.536335 master-0 kubenswrapper[23041]: I0308 00:55:04.534481 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3ba71b2-4318-4f20-99a5-e360a59ae1d8-config-data\") pod \"nova-metadata-0\" (UID: \"d3ba71b2-4318-4f20-99a5-e360a59ae1d8\") " pod="openstack/nova-metadata-0" Mar 08 00:55:04.536335 master-0 kubenswrapper[23041]: I0308 00:55:04.534540 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvdpv\" (UniqueName: \"kubernetes.io/projected/d3ba71b2-4318-4f20-99a5-e360a59ae1d8-kube-api-access-kvdpv\") pod \"nova-metadata-0\" (UID: \"d3ba71b2-4318-4f20-99a5-e360a59ae1d8\") " pod="openstack/nova-metadata-0" Mar 08 00:55:04.886934 master-0 kubenswrapper[23041]: I0308 00:55:04.537723 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3ba71b2-4318-4f20-99a5-e360a59ae1d8-logs\") pod \"nova-metadata-0\" (UID: \"d3ba71b2-4318-4f20-99a5-e360a59ae1d8\") " pod="openstack/nova-metadata-0" Mar 08 00:55:04.886934 master-0 kubenswrapper[23041]: I0308 00:55:04.588652 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7wljq"] Mar 08 00:55:04.886934 master-0 kubenswrapper[23041]: I0308 00:55:04.591428 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7wljq" Mar 08 00:55:04.886934 master-0 kubenswrapper[23041]: I0308 00:55:04.598681 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 08 00:55:04.886934 master-0 kubenswrapper[23041]: I0308 00:55:04.598905 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 08 00:55:04.886934 master-0 kubenswrapper[23041]: I0308 00:55:04.616016 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3ba71b2-4318-4f20-99a5-e360a59ae1d8-config-data\") pod \"nova-metadata-0\" (UID: \"d3ba71b2-4318-4f20-99a5-e360a59ae1d8\") " pod="openstack/nova-metadata-0" Mar 08 00:55:04.886934 master-0 kubenswrapper[23041]: I0308 00:55:04.618975 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvdpv\" (UniqueName: \"kubernetes.io/projected/d3ba71b2-4318-4f20-99a5-e360a59ae1d8-kube-api-access-kvdpv\") pod \"nova-metadata-0\" (UID: \"d3ba71b2-4318-4f20-99a5-e360a59ae1d8\") " pod="openstack/nova-metadata-0" Mar 08 00:55:04.886934 master-0 kubenswrapper[23041]: I0308 00:55:04.619564 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3ba71b2-4318-4f20-99a5-e360a59ae1d8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d3ba71b2-4318-4f20-99a5-e360a59ae1d8\") " pod="openstack/nova-metadata-0" Mar 08 00:55:04.886934 master-0 kubenswrapper[23041]: I0308 00:55:04.640307 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lmth\" (UniqueName: \"kubernetes.io/projected/993b30b0-5927-4e4b-8945-6586313e285f-kube-api-access-5lmth\") pod \"nova-cell1-conductor-db-sync-7wljq\" (UID: \"993b30b0-5927-4e4b-8945-6586313e285f\") " 
pod="openstack/nova-cell1-conductor-db-sync-7wljq" Mar 08 00:55:04.886934 master-0 kubenswrapper[23041]: I0308 00:55:04.640360 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/993b30b0-5927-4e4b-8945-6586313e285f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7wljq\" (UID: \"993b30b0-5927-4e4b-8945-6586313e285f\") " pod="openstack/nova-cell1-conductor-db-sync-7wljq" Mar 08 00:55:04.886934 master-0 kubenswrapper[23041]: I0308 00:55:04.640444 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/993b30b0-5927-4e4b-8945-6586313e285f-config-data\") pod \"nova-cell1-conductor-db-sync-7wljq\" (UID: \"993b30b0-5927-4e4b-8945-6586313e285f\") " pod="openstack/nova-cell1-conductor-db-sync-7wljq" Mar 08 00:55:04.886934 master-0 kubenswrapper[23041]: I0308 00:55:04.640476 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a58922b-c6fd-4668-8b74-674a1cb13323-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7a58922b-c6fd-4668-8b74-674a1cb13323\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 00:55:04.886934 master-0 kubenswrapper[23041]: I0308 00:55:04.640528 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a58922b-c6fd-4668-8b74-674a1cb13323-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7a58922b-c6fd-4668-8b74-674a1cb13323\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 00:55:04.886934 master-0 kubenswrapper[23041]: I0308 00:55:04.640572 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29nps\" (UniqueName: \"kubernetes.io/projected/7a58922b-c6fd-4668-8b74-674a1cb13323-kube-api-access-29nps\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"7a58922b-c6fd-4668-8b74-674a1cb13323\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 00:55:04.886934 master-0 kubenswrapper[23041]: I0308 00:55:04.640606 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/993b30b0-5927-4e4b-8945-6586313e285f-scripts\") pod \"nova-cell1-conductor-db-sync-7wljq\" (UID: \"993b30b0-5927-4e4b-8945-6586313e285f\") " pod="openstack/nova-cell1-conductor-db-sync-7wljq" Mar 08 00:55:04.886934 master-0 kubenswrapper[23041]: I0308 00:55:04.666032 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7wljq"] Mar 08 00:55:04.886934 master-0 kubenswrapper[23041]: I0308 00:55:04.666122 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xkrf7" event={"ID":"6d9dfee3-973a-4663-9df5-1ea29d47096a","Type":"ContainerStarted","Data":"839f13e1d10ecbdb92ec49006f729adb8f2d656e4b95322e80df18e0b6744acd"} Mar 08 00:55:04.886934 master-0 kubenswrapper[23041]: I0308 00:55:04.683902 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-compute-ironic-compute-0"] Mar 08 00:55:04.886934 master-0 kubenswrapper[23041]: I0308 00:55:04.689164 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a58922b-c6fd-4668-8b74-674a1cb13323-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7a58922b-c6fd-4668-8b74-674a1cb13323\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 00:55:04.886934 master-0 kubenswrapper[23041]: I0308 00:55:04.700967 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a58922b-c6fd-4668-8b74-674a1cb13323-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7a58922b-c6fd-4668-8b74-674a1cb13323\") " pod="openstack/nova-cell1-novncproxy-0" 
Mar 08 00:55:04.886934 master-0 kubenswrapper[23041]: I0308 00:55:04.703110 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29nps\" (UniqueName: \"kubernetes.io/projected/7a58922b-c6fd-4668-8b74-674a1cb13323-kube-api-access-29nps\") pod \"nova-cell1-novncproxy-0\" (UID: \"7a58922b-c6fd-4668-8b74-674a1cb13323\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 00:55:04.886934 master-0 kubenswrapper[23041]: I0308 00:55:04.728180 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67f5b4fdc9-swznp"] Mar 08 00:55:04.886934 master-0 kubenswrapper[23041]: I0308 00:55:04.737082 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67f5b4fdc9-swznp" Mar 08 00:55:04.886934 master-0 kubenswrapper[23041]: I0308 00:55:04.743751 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/993b30b0-5927-4e4b-8945-6586313e285f-config-data\") pod \"nova-cell1-conductor-db-sync-7wljq\" (UID: \"993b30b0-5927-4e4b-8945-6586313e285f\") " pod="openstack/nova-cell1-conductor-db-sync-7wljq" Mar 08 00:55:04.886934 master-0 kubenswrapper[23041]: I0308 00:55:04.743919 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/993b30b0-5927-4e4b-8945-6586313e285f-scripts\") pod \"nova-cell1-conductor-db-sync-7wljq\" (UID: \"993b30b0-5927-4e4b-8945-6586313e285f\") " pod="openstack/nova-cell1-conductor-db-sync-7wljq" Mar 08 00:55:04.886934 master-0 kubenswrapper[23041]: I0308 00:55:04.744030 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lmth\" (UniqueName: \"kubernetes.io/projected/993b30b0-5927-4e4b-8945-6586313e285f-kube-api-access-5lmth\") pod \"nova-cell1-conductor-db-sync-7wljq\" (UID: \"993b30b0-5927-4e4b-8945-6586313e285f\") " 
pod="openstack/nova-cell1-conductor-db-sync-7wljq"
Mar 08 00:55:04.886934 master-0 kubenswrapper[23041]: I0308 00:55:04.744060 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/993b30b0-5927-4e4b-8945-6586313e285f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7wljq\" (UID: \"993b30b0-5927-4e4b-8945-6586313e285f\") " pod="openstack/nova-cell1-conductor-db-sync-7wljq"
Mar 08 00:55:04.886934 master-0 kubenswrapper[23041]: I0308 00:55:04.751893 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/993b30b0-5927-4e4b-8945-6586313e285f-scripts\") pod \"nova-cell1-conductor-db-sync-7wljq\" (UID: \"993b30b0-5927-4e4b-8945-6586313e285f\") " pod="openstack/nova-cell1-conductor-db-sync-7wljq"
Mar 08 00:55:04.886934 master-0 kubenswrapper[23041]: I0308 00:55:04.754396 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/993b30b0-5927-4e4b-8945-6586313e285f-config-data\") pod \"nova-cell1-conductor-db-sync-7wljq\" (UID: \"993b30b0-5927-4e4b-8945-6586313e285f\") " pod="openstack/nova-cell1-conductor-db-sync-7wljq"
Mar 08 00:55:04.886934 master-0 kubenswrapper[23041]: I0308 00:55:04.760223 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/993b30b0-5927-4e4b-8945-6586313e285f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7wljq\" (UID: \"993b30b0-5927-4e4b-8945-6586313e285f\") " pod="openstack/nova-cell1-conductor-db-sync-7wljq"
Mar 08 00:55:04.886934 master-0 kubenswrapper[23041]: I0308 00:55:04.774278 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67f5b4fdc9-swznp"]
Mar 08 00:55:04.886934 master-0 kubenswrapper[23041]: I0308 00:55:04.796535 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lmth\" (UniqueName: \"kubernetes.io/projected/993b30b0-5927-4e4b-8945-6586313e285f-kube-api-access-5lmth\") pod \"nova-cell1-conductor-db-sync-7wljq\" (UID: \"993b30b0-5927-4e4b-8945-6586313e285f\") " pod="openstack/nova-cell1-conductor-db-sync-7wljq"
Mar 08 00:55:04.886934 master-0 kubenswrapper[23041]: I0308 00:55:04.851824 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 08 00:55:04.886934 master-0 kubenswrapper[23041]: I0308 00:55:04.852673 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a7954fa-b332-42f8-8a94-671129de12dc-ovsdbserver-nb\") pod \"dnsmasq-dns-67f5b4fdc9-swznp\" (UID: \"7a7954fa-b332-42f8-8a94-671129de12dc\") " pod="openstack/dnsmasq-dns-67f5b4fdc9-swznp"
Mar 08 00:55:04.886934 master-0 kubenswrapper[23041]: I0308 00:55:04.853102 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a7954fa-b332-42f8-8a94-671129de12dc-ovsdbserver-sb\") pod \"dnsmasq-dns-67f5b4fdc9-swznp\" (UID: \"7a7954fa-b332-42f8-8a94-671129de12dc\") " pod="openstack/dnsmasq-dns-67f5b4fdc9-swznp"
Mar 08 00:55:04.886934 master-0 kubenswrapper[23041]: I0308 00:55:04.853763 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a7954fa-b332-42f8-8a94-671129de12dc-dns-swift-storage-0\") pod \"dnsmasq-dns-67f5b4fdc9-swznp\" (UID: \"7a7954fa-b332-42f8-8a94-671129de12dc\") " pod="openstack/dnsmasq-dns-67f5b4fdc9-swznp"
Mar 08 00:55:04.886934 master-0 kubenswrapper[23041]: I0308 00:55:04.854039 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a7954fa-b332-42f8-8a94-671129de12dc-dns-svc\") pod \"dnsmasq-dns-67f5b4fdc9-swznp\" (UID: \"7a7954fa-b332-42f8-8a94-671129de12dc\") " pod="openstack/dnsmasq-dns-67f5b4fdc9-swznp"
Mar 08 00:55:04.886934 master-0 kubenswrapper[23041]: I0308 00:55:04.854249 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a7954fa-b332-42f8-8a94-671129de12dc-config\") pod \"dnsmasq-dns-67f5b4fdc9-swznp\" (UID: \"7a7954fa-b332-42f8-8a94-671129de12dc\") " pod="openstack/dnsmasq-dns-67f5b4fdc9-swznp"
Mar 08 00:55:04.886934 master-0 kubenswrapper[23041]: I0308 00:55:04.854554 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsh22\" (UniqueName: \"kubernetes.io/projected/7a7954fa-b332-42f8-8a94-671129de12dc-kube-api-access-jsh22\") pod \"dnsmasq-dns-67f5b4fdc9-swznp\" (UID: \"7a7954fa-b332-42f8-8a94-671129de12dc\") " pod="openstack/dnsmasq-dns-67f5b4fdc9-swznp"
Mar 08 00:55:04.886934 master-0 kubenswrapper[23041]: I0308 00:55:04.870732 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-xkrf7" podStartSLOduration=3.870695689 podStartE2EDuration="3.870695689s" podCreationTimestamp="2026-03-08 00:55:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:55:04.723706407 +0000 UTC m=+1410.196542961" watchObservedRunningTime="2026-03-08 00:55:04.870695689 +0000 UTC m=+1410.343532253"
Mar 08 00:55:05.024749 master-0 kubenswrapper[23041]: I0308 00:55:04.957748 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a7954fa-b332-42f8-8a94-671129de12dc-ovsdbserver-nb\") pod \"dnsmasq-dns-67f5b4fdc9-swznp\" (UID: \"7a7954fa-b332-42f8-8a94-671129de12dc\") " pod="openstack/dnsmasq-dns-67f5b4fdc9-swznp"
Mar 08 00:55:05.024749 master-0 kubenswrapper[23041]: I0308 00:55:04.957847 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a7954fa-b332-42f8-8a94-671129de12dc-ovsdbserver-sb\") pod \"dnsmasq-dns-67f5b4fdc9-swznp\" (UID: \"7a7954fa-b332-42f8-8a94-671129de12dc\") " pod="openstack/dnsmasq-dns-67f5b4fdc9-swznp"
Mar 08 00:55:05.024749 master-0 kubenswrapper[23041]: I0308 00:55:04.957890 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a7954fa-b332-42f8-8a94-671129de12dc-dns-swift-storage-0\") pod \"dnsmasq-dns-67f5b4fdc9-swznp\" (UID: \"7a7954fa-b332-42f8-8a94-671129de12dc\") " pod="openstack/dnsmasq-dns-67f5b4fdc9-swznp"
Mar 08 00:55:05.024749 master-0 kubenswrapper[23041]: I0308 00:55:04.957949 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a7954fa-b332-42f8-8a94-671129de12dc-dns-svc\") pod \"dnsmasq-dns-67f5b4fdc9-swznp\" (UID: \"7a7954fa-b332-42f8-8a94-671129de12dc\") " pod="openstack/dnsmasq-dns-67f5b4fdc9-swznp"
Mar 08 00:55:05.024749 master-0 kubenswrapper[23041]: I0308 00:55:04.957981 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a7954fa-b332-42f8-8a94-671129de12dc-config\") pod \"dnsmasq-dns-67f5b4fdc9-swznp\" (UID: \"7a7954fa-b332-42f8-8a94-671129de12dc\") " pod="openstack/dnsmasq-dns-67f5b4fdc9-swznp"
Mar 08 00:55:05.024749 master-0 kubenswrapper[23041]: I0308 00:55:04.958033 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsh22\" (UniqueName: \"kubernetes.io/projected/7a7954fa-b332-42f8-8a94-671129de12dc-kube-api-access-jsh22\") pod \"dnsmasq-dns-67f5b4fdc9-swznp\" (UID: \"7a7954fa-b332-42f8-8a94-671129de12dc\") " pod="openstack/dnsmasq-dns-67f5b4fdc9-swznp"
Mar 08 00:55:05.024749 master-0 kubenswrapper[23041]: I0308 00:55:04.960366 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a7954fa-b332-42f8-8a94-671129de12dc-dns-swift-storage-0\") pod \"dnsmasq-dns-67f5b4fdc9-swznp\" (UID: \"7a7954fa-b332-42f8-8a94-671129de12dc\") " pod="openstack/dnsmasq-dns-67f5b4fdc9-swznp"
Mar 08 00:55:05.024749 master-0 kubenswrapper[23041]: I0308 00:55:04.962574 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a7954fa-b332-42f8-8a94-671129de12dc-ovsdbserver-nb\") pod \"dnsmasq-dns-67f5b4fdc9-swznp\" (UID: \"7a7954fa-b332-42f8-8a94-671129de12dc\") " pod="openstack/dnsmasq-dns-67f5b4fdc9-swznp"
Mar 08 00:55:05.024749 master-0 kubenswrapper[23041]: I0308 00:55:04.968900 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a7954fa-b332-42f8-8a94-671129de12dc-ovsdbserver-sb\") pod \"dnsmasq-dns-67f5b4fdc9-swznp\" (UID: \"7a7954fa-b332-42f8-8a94-671129de12dc\") " pod="openstack/dnsmasq-dns-67f5b4fdc9-swznp"
Mar 08 00:55:05.024749 master-0 kubenswrapper[23041]: I0308 00:55:04.970392 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a7954fa-b332-42f8-8a94-671129de12dc-dns-svc\") pod \"dnsmasq-dns-67f5b4fdc9-swznp\" (UID: \"7a7954fa-b332-42f8-8a94-671129de12dc\") " pod="openstack/dnsmasq-dns-67f5b4fdc9-swznp"
Mar 08 00:55:05.024749 master-0 kubenswrapper[23041]: I0308 00:55:04.971895 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a7954fa-b332-42f8-8a94-671129de12dc-config\") pod \"dnsmasq-dns-67f5b4fdc9-swznp\" (UID: \"7a7954fa-b332-42f8-8a94-671129de12dc\") " pod="openstack/dnsmasq-dns-67f5b4fdc9-swznp"
Mar 08 00:55:05.034522 master-0 kubenswrapper[23041]: I0308 00:55:05.034462 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 08 00:55:05.037106 master-0 kubenswrapper[23041]: I0308 00:55:05.037039 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsh22\" (UniqueName: \"kubernetes.io/projected/7a7954fa-b332-42f8-8a94-671129de12dc-kube-api-access-jsh22\") pod \"dnsmasq-dns-67f5b4fdc9-swznp\" (UID: \"7a7954fa-b332-42f8-8a94-671129de12dc\") " pod="openstack/dnsmasq-dns-67f5b4fdc9-swznp"
Mar 08 00:55:05.100810 master-0 kubenswrapper[23041]: I0308 00:55:05.096331 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7wljq"
Mar 08 00:55:05.117449 master-0 kubenswrapper[23041]: I0308 00:55:05.107465 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67f5b4fdc9-swznp"
Mar 08 00:55:05.480845 master-0 kubenswrapper[23041]: I0308 00:55:05.466928 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 08 00:55:05.536721 master-0 kubenswrapper[23041]: I0308 00:55:05.536657 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 08 00:55:05.711091 master-0 kubenswrapper[23041]: I0308 00:55:05.711014 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-compute-ironic-compute-0" event={"ID":"4f052268-95a2-4ee5-9955-0e851ca8894a","Type":"ContainerStarted","Data":"224c7dc76e5cdcde8ee6c4a62aa0c6c4f6519bf89f476976ad9cd8e68e4b6781"}
Mar 08 00:55:05.725593 master-0 kubenswrapper[23041]: I0308 00:55:05.723232 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"988af452-10f3-4d0b-806e-7f91af9135a0","Type":"ContainerStarted","Data":"4e4fbc384040bd08eaddf6bd7d8de442a6e60189315344944f10ed1d16f243ae"}
Mar 08 00:55:05.736763 master-0 kubenswrapper[23041]: I0308 00:55:05.735216 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"efeb5e5d-c3a7-4b40-8903-a36ee311f97c","Type":"ContainerStarted","Data":"11f5becedfabf10d62bc28615e4d69d12672ea59d653ece769ee5168fbe0d38d"}
Mar 08 00:55:05.846796 master-0 kubenswrapper[23041]: I0308 00:55:05.846695 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 08 00:55:05.858795 master-0 kubenswrapper[23041]: I0308 00:55:05.858458 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 08 00:55:06.132521 master-0 kubenswrapper[23041]: I0308 00:55:06.129127 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7wljq"]
Mar 08 00:55:06.142272 master-0 kubenswrapper[23041]: I0308 00:55:06.140872 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67f5b4fdc9-swznp"]
Mar 08 00:55:06.751802 master-0 kubenswrapper[23041]: I0308 00:55:06.751163 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d3ba71b2-4318-4f20-99a5-e360a59ae1d8","Type":"ContainerStarted","Data":"6a0b0bd08ea69eceafa2b58b3d64f364f44b8a85b9338f9f3627f16bbb2b6eac"}
Mar 08 00:55:06.767299 master-0 kubenswrapper[23041]: I0308 00:55:06.767046 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7a58922b-c6fd-4668-8b74-674a1cb13323","Type":"ContainerStarted","Data":"83433891b067770ebd5fe5b882a6dc4717dad3359aabe20388914ed3611ca029"}
Mar 08 00:55:06.772398 master-0 kubenswrapper[23041]: I0308 00:55:06.772342 23041 generic.go:334] "Generic (PLEG): container finished" podID="7a7954fa-b332-42f8-8a94-671129de12dc" containerID="51ff2a101385007bfd5b9739ab14f4a18b93977cf894baa68480ce356d4b7596" exitCode=0
Mar 08 00:55:06.772617 master-0 kubenswrapper[23041]: I0308 00:55:06.772415 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67f5b4fdc9-swznp" event={"ID":"7a7954fa-b332-42f8-8a94-671129de12dc","Type":"ContainerDied","Data":"51ff2a101385007bfd5b9739ab14f4a18b93977cf894baa68480ce356d4b7596"}
Mar 08 00:55:06.772617 master-0 kubenswrapper[23041]: I0308 00:55:06.772447 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67f5b4fdc9-swznp" event={"ID":"7a7954fa-b332-42f8-8a94-671129de12dc","Type":"ContainerStarted","Data":"6c840f10982004b152ee8a7160be32f5bd91a8ddba6061df2f43d7a5ffe75a93"}
Mar 08 00:55:06.777657 master-0 kubenswrapper[23041]: I0308 00:55:06.777547 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7wljq" event={"ID":"993b30b0-5927-4e4b-8945-6586313e285f","Type":"ContainerStarted","Data":"10b8342090b2046b33c098540fd76b13e14f5850eebcf56c85da9203d97b733d"}
Mar 08 00:55:06.777657 master-0 kubenswrapper[23041]: I0308 00:55:06.777604 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7wljq" event={"ID":"993b30b0-5927-4e4b-8945-6586313e285f","Type":"ContainerStarted","Data":"c2f26e2e10e76e9c5ade58d90b778b3eccf96643385dc3d7955f9cea49719e5c"}
Mar 08 00:55:06.850127 master-0 kubenswrapper[23041]: I0308 00:55:06.850041 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-7wljq" podStartSLOduration=2.85001673 podStartE2EDuration="2.85001673s" podCreationTimestamp="2026-03-08 00:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:55:06.834424619 +0000 UTC m=+1412.307261203" watchObservedRunningTime="2026-03-08 00:55:06.85001673 +0000 UTC m=+1412.322853284"
Mar 08 00:55:08.934370 master-0 kubenswrapper[23041]: I0308 00:55:08.929350 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 08 00:55:08.934370 master-0 kubenswrapper[23041]: I0308 00:55:08.933217 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 08 00:55:10.950221 master-0 kubenswrapper[23041]: I0308 00:55:10.949807 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67f5b4fdc9-swznp" event={"ID":"7a7954fa-b332-42f8-8a94-671129de12dc","Type":"ContainerStarted","Data":"622898ca42b3f6734dec3cebdc95676982199ddc85f862ecb5471f61fb68047e"}
Mar 08 00:55:10.954225 master-0 kubenswrapper[23041]: I0308 00:55:10.951185 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67f5b4fdc9-swznp"
Mar 08 00:55:10.971222 master-0 kubenswrapper[23041]: I0308 00:55:10.967350 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"988af452-10f3-4d0b-806e-7f91af9135a0","Type":"ContainerStarted","Data":"bb665cfb84bc4238ce4c3fbf323d934c308aa657e54c21f960c2f71b319f0afb"}
Mar 08 00:55:10.971222 master-0 kubenswrapper[23041]: I0308 00:55:10.967411 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"988af452-10f3-4d0b-806e-7f91af9135a0","Type":"ContainerStarted","Data":"5bef13fadb6620ae32a22db62726707c1135c45202f998789ff9f4806c2a6c6b"}
Mar 08 00:55:10.976216 master-0 kubenswrapper[23041]: I0308 00:55:10.973995 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"efeb5e5d-c3a7-4b40-8903-a36ee311f97c","Type":"ContainerStarted","Data":"ef0fb7beacc28a19684e873fc29af183693a7115d13259443ef969b1017b3dcb"}
Mar 08 00:55:10.982989 master-0 kubenswrapper[23041]: I0308 00:55:10.979290 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d3ba71b2-4318-4f20-99a5-e360a59ae1d8","Type":"ContainerStarted","Data":"b662b4fc2558c0a5aaf9a11d455ff7abe4f3ee95581d63809764d63c414e53b5"}
Mar 08 00:55:10.982989 master-0 kubenswrapper[23041]: I0308 00:55:10.979350 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d3ba71b2-4318-4f20-99a5-e360a59ae1d8","Type":"ContainerStarted","Data":"bd98b716d375b4eee6105b80b76a14cd53256e5157ea7f68a66c71e8ab58a8c8"}
Mar 08 00:55:10.982989 master-0 kubenswrapper[23041]: I0308 00:55:10.979472 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d3ba71b2-4318-4f20-99a5-e360a59ae1d8" containerName="nova-metadata-log" containerID="cri-o://bd98b716d375b4eee6105b80b76a14cd53256e5157ea7f68a66c71e8ab58a8c8" gracePeriod=30
Mar 08 00:55:10.982989 master-0 kubenswrapper[23041]: I0308 00:55:10.979809 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d3ba71b2-4318-4f20-99a5-e360a59ae1d8" containerName="nova-metadata-metadata" containerID="cri-o://b662b4fc2558c0a5aaf9a11d455ff7abe4f3ee95581d63809764d63c414e53b5" gracePeriod=30
Mar 08 00:55:10.986695 master-0 kubenswrapper[23041]: I0308 00:55:10.985460 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7a58922b-c6fd-4668-8b74-674a1cb13323","Type":"ContainerStarted","Data":"02f870291a140fba4ae8125d44d2946884f3ed615efbd920b3f263a781a3e287"}
Mar 08 00:55:10.986695 master-0 kubenswrapper[23041]: I0308 00:55:10.985603 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="7a58922b-c6fd-4668-8b74-674a1cb13323" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://02f870291a140fba4ae8125d44d2946884f3ed615efbd920b3f263a781a3e287" gracePeriod=30
Mar 08 00:55:11.011233 master-0 kubenswrapper[23041]: I0308 00:55:11.010708 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67f5b4fdc9-swznp" podStartSLOduration=7.010686118 podStartE2EDuration="7.010686118s" podCreationTimestamp="2026-03-08 00:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:55:11.003597225 +0000 UTC m=+1416.476433789" watchObservedRunningTime="2026-03-08 00:55:11.010686118 +0000 UTC m=+1416.483522672"
Mar 08 00:55:11.047224 master-0 kubenswrapper[23041]: I0308 00:55:11.046850 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.852626525 podStartE2EDuration="8.046828952s" podCreationTimestamp="2026-03-08 00:55:03 +0000 UTC" firstStartedPulling="2026-03-08 00:55:05.523430251 +0000 UTC m=+1410.996266795" lastFinishedPulling="2026-03-08 00:55:09.717632668 +0000 UTC m=+1415.190469222" observedRunningTime="2026-03-08 00:55:11.039657136 +0000 UTC m=+1416.512493690" watchObservedRunningTime="2026-03-08 00:55:11.046828952 +0000 UTC m=+1416.519665506"
Mar 08 00:55:11.074221 master-0 kubenswrapper[23041]: I0308 00:55:11.072267 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=4.169852117 podStartE2EDuration="8.072248703s" podCreationTimestamp="2026-03-08 00:55:03 +0000 UTC" firstStartedPulling="2026-03-08 00:55:05.814963956 +0000 UTC m=+1411.287800510" lastFinishedPulling="2026-03-08 00:55:09.717360542 +0000 UTC m=+1415.190197096" observedRunningTime="2026-03-08 00:55:11.065136959 +0000 UTC m=+1416.537973513" watchObservedRunningTime="2026-03-08 00:55:11.072248703 +0000 UTC m=+1416.545085257"
Mar 08 00:55:11.099262 master-0 kubenswrapper[23041]: I0308 00:55:11.097934 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.902803921 podStartE2EDuration="8.09791262s" podCreationTimestamp="2026-03-08 00:55:03 +0000 UTC" firstStartedPulling="2026-03-08 00:55:05.523309788 +0000 UTC m=+1410.996146332" lastFinishedPulling="2026-03-08 00:55:09.718418477 +0000 UTC m=+1415.191255031" observedRunningTime="2026-03-08 00:55:11.093798389 +0000 UTC m=+1416.566634963" watchObservedRunningTime="2026-03-08 00:55:11.09791262 +0000 UTC m=+1416.570749174"
Mar 08 00:55:11.124212 master-0 kubenswrapper[23041]: I0308 00:55:11.121623 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.20963361 podStartE2EDuration="8.121603009s" podCreationTimestamp="2026-03-08 00:55:03 +0000 UTC" firstStartedPulling="2026-03-08 00:55:05.81512268 +0000 UTC m=+1411.287959234" lastFinishedPulling="2026-03-08 00:55:09.727092079 +0000 UTC m=+1415.199928633" observedRunningTime="2026-03-08 00:55:11.116430002 +0000 UTC m=+1416.589266556" watchObservedRunningTime="2026-03-08 00:55:11.121603009 +0000 UTC m=+1416.594439563"
Mar 08 00:55:11.696507 master-0 kubenswrapper[23041]: I0308 00:55:11.688458 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 08 00:55:11.771235 master-0 kubenswrapper[23041]: I0308 00:55:11.766470 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvdpv\" (UniqueName: \"kubernetes.io/projected/d3ba71b2-4318-4f20-99a5-e360a59ae1d8-kube-api-access-kvdpv\") pod \"d3ba71b2-4318-4f20-99a5-e360a59ae1d8\" (UID: \"d3ba71b2-4318-4f20-99a5-e360a59ae1d8\") "
Mar 08 00:55:11.771235 master-0 kubenswrapper[23041]: I0308 00:55:11.766754 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3ba71b2-4318-4f20-99a5-e360a59ae1d8-combined-ca-bundle\") pod \"d3ba71b2-4318-4f20-99a5-e360a59ae1d8\" (UID: \"d3ba71b2-4318-4f20-99a5-e360a59ae1d8\") "
Mar 08 00:55:11.771235 master-0 kubenswrapper[23041]: I0308 00:55:11.766834 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3ba71b2-4318-4f20-99a5-e360a59ae1d8-config-data\") pod \"d3ba71b2-4318-4f20-99a5-e360a59ae1d8\" (UID: \"d3ba71b2-4318-4f20-99a5-e360a59ae1d8\") "
Mar 08 00:55:11.771235 master-0 kubenswrapper[23041]: I0308 00:55:11.766862 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3ba71b2-4318-4f20-99a5-e360a59ae1d8-logs\") pod \"d3ba71b2-4318-4f20-99a5-e360a59ae1d8\" (UID: \"d3ba71b2-4318-4f20-99a5-e360a59ae1d8\") "
Mar 08 00:55:11.771235 master-0 kubenswrapper[23041]: I0308 00:55:11.770043 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3ba71b2-4318-4f20-99a5-e360a59ae1d8-logs" (OuterVolumeSpecName: "logs") pod "d3ba71b2-4318-4f20-99a5-e360a59ae1d8" (UID: "d3ba71b2-4318-4f20-99a5-e360a59ae1d8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:55:11.787335 master-0 kubenswrapper[23041]: I0308 00:55:11.783519 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3ba71b2-4318-4f20-99a5-e360a59ae1d8-kube-api-access-kvdpv" (OuterVolumeSpecName: "kube-api-access-kvdpv") pod "d3ba71b2-4318-4f20-99a5-e360a59ae1d8" (UID: "d3ba71b2-4318-4f20-99a5-e360a59ae1d8"). InnerVolumeSpecName "kube-api-access-kvdpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:55:11.837679 master-0 kubenswrapper[23041]: I0308 00:55:11.837517 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3ba71b2-4318-4f20-99a5-e360a59ae1d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3ba71b2-4318-4f20-99a5-e360a59ae1d8" (UID: "d3ba71b2-4318-4f20-99a5-e360a59ae1d8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:55:11.850704 master-0 kubenswrapper[23041]: I0308 00:55:11.850613 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3ba71b2-4318-4f20-99a5-e360a59ae1d8-config-data" (OuterVolumeSpecName: "config-data") pod "d3ba71b2-4318-4f20-99a5-e360a59ae1d8" (UID: "d3ba71b2-4318-4f20-99a5-e360a59ae1d8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:55:11.871566 master-0 kubenswrapper[23041]: I0308 00:55:11.871480 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvdpv\" (UniqueName: \"kubernetes.io/projected/d3ba71b2-4318-4f20-99a5-e360a59ae1d8-kube-api-access-kvdpv\") on node \"master-0\" DevicePath \"\""
Mar 08 00:55:11.871566 master-0 kubenswrapper[23041]: I0308 00:55:11.871564 23041 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3ba71b2-4318-4f20-99a5-e360a59ae1d8-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 00:55:11.871566 master-0 kubenswrapper[23041]: I0308 00:55:11.871577 23041 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3ba71b2-4318-4f20-99a5-e360a59ae1d8-config-data\") on node \"master-0\" DevicePath \"\""
Mar 08 00:55:11.871566 master-0 kubenswrapper[23041]: I0308 00:55:11.871587 23041 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3ba71b2-4318-4f20-99a5-e360a59ae1d8-logs\") on node \"master-0\" DevicePath \"\""
Mar 08 00:55:11.999880 master-0 kubenswrapper[23041]: I0308 00:55:11.999820 23041 generic.go:334] "Generic (PLEG): container finished" podID="d3ba71b2-4318-4f20-99a5-e360a59ae1d8" containerID="b662b4fc2558c0a5aaf9a11d455ff7abe4f3ee95581d63809764d63c414e53b5" exitCode=0
Mar 08 00:55:11.999880 master-0 kubenswrapper[23041]: I0308 00:55:11.999864 23041 generic.go:334] "Generic (PLEG): container finished" podID="d3ba71b2-4318-4f20-99a5-e360a59ae1d8" containerID="bd98b716d375b4eee6105b80b76a14cd53256e5157ea7f68a66c71e8ab58a8c8" exitCode=143
Mar 08 00:55:12.001988 master-0 kubenswrapper[23041]: I0308 00:55:12.001963 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 08 00:55:12.007430 master-0 kubenswrapper[23041]: I0308 00:55:12.007361 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d3ba71b2-4318-4f20-99a5-e360a59ae1d8","Type":"ContainerDied","Data":"b662b4fc2558c0a5aaf9a11d455ff7abe4f3ee95581d63809764d63c414e53b5"}
Mar 08 00:55:12.007640 master-0 kubenswrapper[23041]: I0308 00:55:12.007624 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d3ba71b2-4318-4f20-99a5-e360a59ae1d8","Type":"ContainerDied","Data":"bd98b716d375b4eee6105b80b76a14cd53256e5157ea7f68a66c71e8ab58a8c8"}
Mar 08 00:55:12.007716 master-0 kubenswrapper[23041]: I0308 00:55:12.007702 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d3ba71b2-4318-4f20-99a5-e360a59ae1d8","Type":"ContainerDied","Data":"6a0b0bd08ea69eceafa2b58b3d64f364f44b8a85b9338f9f3627f16bbb2b6eac"}
Mar 08 00:55:12.007783 master-0 kubenswrapper[23041]: I0308 00:55:12.007671 23041 scope.go:117] "RemoveContainer" containerID="b662b4fc2558c0a5aaf9a11d455ff7abe4f3ee95581d63809764d63c414e53b5"
Mar 08 00:55:12.059706 master-0 kubenswrapper[23041]: I0308 00:55:12.059595 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 08 00:55:12.060801 master-0 kubenswrapper[23041]: I0308 00:55:12.060614 23041 scope.go:117] "RemoveContainer" containerID="bd98b716d375b4eee6105b80b76a14cd53256e5157ea7f68a66c71e8ab58a8c8"
Mar 08 00:55:12.122878 master-0 kubenswrapper[23041]: I0308 00:55:12.121921 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 08 00:55:12.144993 master-0 kubenswrapper[23041]: I0308 00:55:12.143939 23041 scope.go:117] "RemoveContainer" containerID="b662b4fc2558c0a5aaf9a11d455ff7abe4f3ee95581d63809764d63c414e53b5"
Mar 08 00:55:12.144993 master-0 kubenswrapper[23041]: E0308 00:55:12.144781 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b662b4fc2558c0a5aaf9a11d455ff7abe4f3ee95581d63809764d63c414e53b5\": container with ID starting with b662b4fc2558c0a5aaf9a11d455ff7abe4f3ee95581d63809764d63c414e53b5 not found: ID does not exist" containerID="b662b4fc2558c0a5aaf9a11d455ff7abe4f3ee95581d63809764d63c414e53b5"
Mar 08 00:55:12.144993 master-0 kubenswrapper[23041]: I0308 00:55:12.144867 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b662b4fc2558c0a5aaf9a11d455ff7abe4f3ee95581d63809764d63c414e53b5"} err="failed to get container status \"b662b4fc2558c0a5aaf9a11d455ff7abe4f3ee95581d63809764d63c414e53b5\": rpc error: code = NotFound desc = could not find container \"b662b4fc2558c0a5aaf9a11d455ff7abe4f3ee95581d63809764d63c414e53b5\": container with ID starting with b662b4fc2558c0a5aaf9a11d455ff7abe4f3ee95581d63809764d63c414e53b5 not found: ID does not exist"
Mar 08 00:55:12.144993 master-0 kubenswrapper[23041]: I0308 00:55:12.144918 23041 scope.go:117] "RemoveContainer" containerID="bd98b716d375b4eee6105b80b76a14cd53256e5157ea7f68a66c71e8ab58a8c8"
Mar 08 00:55:12.151320 master-0 kubenswrapper[23041]: E0308 00:55:12.151029 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd98b716d375b4eee6105b80b76a14cd53256e5157ea7f68a66c71e8ab58a8c8\": container with ID starting with bd98b716d375b4eee6105b80b76a14cd53256e5157ea7f68a66c71e8ab58a8c8 not found: ID does not exist" containerID="bd98b716d375b4eee6105b80b76a14cd53256e5157ea7f68a66c71e8ab58a8c8"
Mar 08 00:55:12.151320 master-0 kubenswrapper[23041]: I0308 00:55:12.151113 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd98b716d375b4eee6105b80b76a14cd53256e5157ea7f68a66c71e8ab58a8c8"} err="failed to get container status \"bd98b716d375b4eee6105b80b76a14cd53256e5157ea7f68a66c71e8ab58a8c8\": rpc error: code = NotFound desc = could not find container \"bd98b716d375b4eee6105b80b76a14cd53256e5157ea7f68a66c71e8ab58a8c8\": container with ID starting with bd98b716d375b4eee6105b80b76a14cd53256e5157ea7f68a66c71e8ab58a8c8 not found: ID does not exist"
Mar 08 00:55:12.151320 master-0 kubenswrapper[23041]: I0308 00:55:12.151147 23041 scope.go:117] "RemoveContainer" containerID="b662b4fc2558c0a5aaf9a11d455ff7abe4f3ee95581d63809764d63c414e53b5"
Mar 08 00:55:12.157411 master-0 kubenswrapper[23041]: I0308 00:55:12.157369 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 08 00:55:12.158400 master-0 kubenswrapper[23041]: E0308 00:55:12.158366 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3ba71b2-4318-4f20-99a5-e360a59ae1d8" containerName="nova-metadata-metadata"
Mar 08 00:55:12.158400 master-0 kubenswrapper[23041]: I0308 00:55:12.158397 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3ba71b2-4318-4f20-99a5-e360a59ae1d8" containerName="nova-metadata-metadata"
Mar 08 00:55:12.158486 master-0 kubenswrapper[23041]: E0308 00:55:12.158413 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3ba71b2-4318-4f20-99a5-e360a59ae1d8" containerName="nova-metadata-log"
Mar 08 00:55:12.158486 master-0 kubenswrapper[23041]: I0308 00:55:12.158421 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3ba71b2-4318-4f20-99a5-e360a59ae1d8" containerName="nova-metadata-log"
Mar 08 00:55:12.158752 master-0 kubenswrapper[23041]: I0308 00:55:12.158723 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3ba71b2-4318-4f20-99a5-e360a59ae1d8" containerName="nova-metadata-log"
Mar 08 00:55:12.158752 master-0 kubenswrapper[23041]: I0308 00:55:12.158748 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3ba71b2-4318-4f20-99a5-e360a59ae1d8" containerName="nova-metadata-metadata"
Mar 08 00:55:12.160635 master-0 kubenswrapper[23041]: I0308 00:55:12.160454 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 08 00:55:12.165384 master-0 kubenswrapper[23041]: I0308 00:55:12.165307 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b662b4fc2558c0a5aaf9a11d455ff7abe4f3ee95581d63809764d63c414e53b5"} err="failed to get container status \"b662b4fc2558c0a5aaf9a11d455ff7abe4f3ee95581d63809764d63c414e53b5\": rpc error: code = NotFound desc = could not find container \"b662b4fc2558c0a5aaf9a11d455ff7abe4f3ee95581d63809764d63c414e53b5\": container with ID starting with b662b4fc2558c0a5aaf9a11d455ff7abe4f3ee95581d63809764d63c414e53b5 not found: ID does not exist"
Mar 08 00:55:12.165384 master-0 kubenswrapper[23041]: I0308 00:55:12.165380 23041 scope.go:117] "RemoveContainer" containerID="bd98b716d375b4eee6105b80b76a14cd53256e5157ea7f68a66c71e8ab58a8c8"
Mar 08 00:55:12.166015 master-0 kubenswrapper[23041]: I0308 00:55:12.165963 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 08 00:55:12.166380 master-0 kubenswrapper[23041]: I0308 00:55:12.166353 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 08 00:55:12.167870 master-0 kubenswrapper[23041]: I0308 00:55:12.167831 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd98b716d375b4eee6105b80b76a14cd53256e5157ea7f68a66c71e8ab58a8c8"} err="failed to get container status \"bd98b716d375b4eee6105b80b76a14cd53256e5157ea7f68a66c71e8ab58a8c8\": rpc error: code = NotFound desc = could not find container \"bd98b716d375b4eee6105b80b76a14cd53256e5157ea7f68a66c71e8ab58a8c8\": container with ID starting with bd98b716d375b4eee6105b80b76a14cd53256e5157ea7f68a66c71e8ab58a8c8 not found: ID does not exist"
Mar 08 00:55:12.187229 master-0 kubenswrapper[23041]: I0308 00:55:12.187160 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 08 00:55:12.289782 master-0 kubenswrapper[23041]: I0308 00:55:12.289713 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/366c5ae2-34a4-413e-8f57-bcbdb57e73d5-logs\") pod \"nova-metadata-0\" (UID: \"366c5ae2-34a4-413e-8f57-bcbdb57e73d5\") " pod="openstack/nova-metadata-0"
Mar 08 00:55:12.289782 master-0 kubenswrapper[23041]: I0308 00:55:12.289780 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4pcs\" (UniqueName: \"kubernetes.io/projected/366c5ae2-34a4-413e-8f57-bcbdb57e73d5-kube-api-access-z4pcs\") pod \"nova-metadata-0\" (UID: \"366c5ae2-34a4-413e-8f57-bcbdb57e73d5\") " pod="openstack/nova-metadata-0"
Mar 08 00:55:12.290100 master-0 kubenswrapper[23041]: I0308 00:55:12.289834 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/366c5ae2-34a4-413e-8f57-bcbdb57e73d5-config-data\") pod \"nova-metadata-0\" (UID: \"366c5ae2-34a4-413e-8f57-bcbdb57e73d5\") " pod="openstack/nova-metadata-0"
Mar 08 00:55:12.290100 master-0 kubenswrapper[23041]: I0308 00:55:12.289870 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/366c5ae2-34a4-413e-8f57-bcbdb57e73d5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"366c5ae2-34a4-413e-8f57-bcbdb57e73d5\") " pod="openstack/nova-metadata-0"
Mar 08 00:55:12.290100 master-0 kubenswrapper[23041]: I0308 00:55:12.289926 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/366c5ae2-34a4-413e-8f57-bcbdb57e73d5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"366c5ae2-34a4-413e-8f57-bcbdb57e73d5\") " pod="openstack/nova-metadata-0"
Mar 08 00:55:12.393939 master-0 kubenswrapper[23041]: I0308 00:55:12.393770 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/366c5ae2-34a4-413e-8f57-bcbdb57e73d5-logs\") pod \"nova-metadata-0\" (UID: \"366c5ae2-34a4-413e-8f57-bcbdb57e73d5\") " pod="openstack/nova-metadata-0"
Mar 08 00:55:12.394154 master-0 kubenswrapper[23041]: I0308 00:55:12.393969 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4pcs\" (UniqueName: \"kubernetes.io/projected/366c5ae2-34a4-413e-8f57-bcbdb57e73d5-kube-api-access-z4pcs\") pod \"nova-metadata-0\" (UID: \"366c5ae2-34a4-413e-8f57-bcbdb57e73d5\") " pod="openstack/nova-metadata-0"
Mar 08 00:55:12.394471 master-0 kubenswrapper[23041]: I0308 00:55:12.394432 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/366c5ae2-34a4-413e-8f57-bcbdb57e73d5-logs\") pod \"nova-metadata-0\" (UID: \"366c5ae2-34a4-413e-8f57-bcbdb57e73d5\") " pod="openstack/nova-metadata-0"
Mar 08 00:55:12.396609 master-0 kubenswrapper[23041]: I0308 00:55:12.396516 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/366c5ae2-34a4-413e-8f57-bcbdb57e73d5-config-data\") pod \"nova-metadata-0\" (UID: \"366c5ae2-34a4-413e-8f57-bcbdb57e73d5\") " pod="openstack/nova-metadata-0"
Mar 08 00:55:12.397550 master-0 kubenswrapper[23041]: I0308 00:55:12.397512 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/366c5ae2-34a4-413e-8f57-bcbdb57e73d5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"366c5ae2-34a4-413e-8f57-bcbdb57e73d5\") " pod="openstack/nova-metadata-0" Mar 08 00:55:12.397816 master-0 kubenswrapper[23041]: I0308 00:55:12.397777 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/366c5ae2-34a4-413e-8f57-bcbdb57e73d5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"366c5ae2-34a4-413e-8f57-bcbdb57e73d5\") " pod="openstack/nova-metadata-0" Mar 08 00:55:12.402797 master-0 kubenswrapper[23041]: I0308 00:55:12.402748 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/366c5ae2-34a4-413e-8f57-bcbdb57e73d5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"366c5ae2-34a4-413e-8f57-bcbdb57e73d5\") " pod="openstack/nova-metadata-0" Mar 08 00:55:12.402951 master-0 kubenswrapper[23041]: I0308 00:55:12.402934 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/366c5ae2-34a4-413e-8f57-bcbdb57e73d5-config-data\") pod \"nova-metadata-0\" (UID: \"366c5ae2-34a4-413e-8f57-bcbdb57e73d5\") " pod="openstack/nova-metadata-0" Mar 08 00:55:12.404852 master-0 kubenswrapper[23041]: I0308 00:55:12.404796 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/366c5ae2-34a4-413e-8f57-bcbdb57e73d5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"366c5ae2-34a4-413e-8f57-bcbdb57e73d5\") " pod="openstack/nova-metadata-0" Mar 08 00:55:12.420585 master-0 kubenswrapper[23041]: I0308 00:55:12.420533 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4pcs\" (UniqueName: \"kubernetes.io/projected/366c5ae2-34a4-413e-8f57-bcbdb57e73d5-kube-api-access-z4pcs\") pod \"nova-metadata-0\" (UID: \"366c5ae2-34a4-413e-8f57-bcbdb57e73d5\") " pod="openstack/nova-metadata-0" Mar 08 00:55:12.492478 master-0 
kubenswrapper[23041]: I0308 00:55:12.492396 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 00:55:12.864629 master-0 kubenswrapper[23041]: I0308 00:55:12.864573 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3ba71b2-4318-4f20-99a5-e360a59ae1d8" path="/var/lib/kubelet/pods/d3ba71b2-4318-4f20-99a5-e360a59ae1d8/volumes" Mar 08 00:55:13.008030 master-0 kubenswrapper[23041]: I0308 00:55:13.007737 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 00:55:13.019610 master-0 kubenswrapper[23041]: W0308 00:55:13.019554 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod366c5ae2_34a4_413e_8f57_bcbdb57e73d5.slice/crio-08e4b7c6d203bec79ab3c2f47d0632f6d09feb76e5c2e54976899d80232b3642 WatchSource:0}: Error finding container 08e4b7c6d203bec79ab3c2f47d0632f6d09feb76e5c2e54976899d80232b3642: Status 404 returned error can't find the container with id 08e4b7c6d203bec79ab3c2f47d0632f6d09feb76e5c2e54976899d80232b3642 Mar 08 00:55:14.038305 master-0 kubenswrapper[23041]: I0308 00:55:14.038246 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"366c5ae2-34a4-413e-8f57-bcbdb57e73d5","Type":"ContainerStarted","Data":"878bb198b87b281ea42f32e5b032aa411c976be0ea3b591a7cb75c788b33efd3"} Mar 08 00:55:14.038305 master-0 kubenswrapper[23041]: I0308 00:55:14.038301 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"366c5ae2-34a4-413e-8f57-bcbdb57e73d5","Type":"ContainerStarted","Data":"67bfa88d690d1169e7f7403b4c87ee6b5d5fafab73e6d9f8b37c42f21859ec1c"} Mar 08 00:55:14.038305 master-0 kubenswrapper[23041]: I0308 00:55:14.038314 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"366c5ae2-34a4-413e-8f57-bcbdb57e73d5","Type":"ContainerStarted","Data":"08e4b7c6d203bec79ab3c2f47d0632f6d09feb76e5c2e54976899d80232b3642"} Mar 08 00:55:14.040292 master-0 kubenswrapper[23041]: I0308 00:55:14.040257 23041 generic.go:334] "Generic (PLEG): container finished" podID="6d9dfee3-973a-4663-9df5-1ea29d47096a" containerID="839f13e1d10ecbdb92ec49006f729adb8f2d656e4b95322e80df18e0b6744acd" exitCode=0 Mar 08 00:55:14.040433 master-0 kubenswrapper[23041]: I0308 00:55:14.040417 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xkrf7" event={"ID":"6d9dfee3-973a-4663-9df5-1ea29d47096a","Type":"ContainerDied","Data":"839f13e1d10ecbdb92ec49006f729adb8f2d656e4b95322e80df18e0b6744acd"} Mar 08 00:55:14.228321 master-0 kubenswrapper[23041]: I0308 00:55:14.227180 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.227153722 podStartE2EDuration="2.227153722s" podCreationTimestamp="2026-03-08 00:55:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:55:14.192939506 +0000 UTC m=+1419.665776050" watchObservedRunningTime="2026-03-08 00:55:14.227153722 +0000 UTC m=+1419.699990276" Mar 08 00:55:14.386617 master-0 kubenswrapper[23041]: I0308 00:55:14.384254 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 00:55:14.386617 master-0 kubenswrapper[23041]: I0308 00:55:14.384719 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 00:55:14.391176 master-0 kubenswrapper[23041]: I0308 00:55:14.389940 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 08 00:55:14.391176 master-0 kubenswrapper[23041]: I0308 00:55:14.389981 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-scheduler-0" Mar 08 00:55:14.438347 master-0 kubenswrapper[23041]: I0308 00:55:14.437720 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 08 00:55:15.035818 master-0 kubenswrapper[23041]: I0308 00:55:15.035748 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 08 00:55:15.109886 master-0 kubenswrapper[23041]: I0308 00:55:15.109725 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67f5b4fdc9-swznp" Mar 08 00:55:15.119399 master-0 kubenswrapper[23041]: I0308 00:55:15.118924 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 08 00:55:15.270377 master-0 kubenswrapper[23041]: I0308 00:55:15.265386 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77475956d7-pp5mp"] Mar 08 00:55:15.307411 master-0 kubenswrapper[23041]: I0308 00:55:15.307277 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77475956d7-pp5mp" podUID="184ea98e-2eac-49e3-a090-c412857c1df4" containerName="dnsmasq-dns" containerID="cri-o://afb5824a1f58d2d4b1c3ce13f0c9fca6eafa2ca4d4e874bf24d094c4b9522d26" gracePeriod=10 Mar 08 00:55:15.469369 master-0 kubenswrapper[23041]: I0308 00:55:15.468554 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="988af452-10f3-4d0b-806e-7f91af9135a0" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.128.1.3:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 00:55:15.469369 master-0 kubenswrapper[23041]: I0308 00:55:15.468887 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="988af452-10f3-4d0b-806e-7f91af9135a0" containerName="nova-api-log" probeResult="failure" output="Get 
\"http://10.128.1.3:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 00:55:17.494295 master-0 kubenswrapper[23041]: I0308 00:55:17.493429 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 00:55:17.494295 master-0 kubenswrapper[23041]: I0308 00:55:17.493495 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 00:55:20.083179 master-0 kubenswrapper[23041]: I0308 00:55:20.083113 23041 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77475956d7-pp5mp" podUID="184ea98e-2eac-49e3-a090-c412857c1df4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.251:5353: connect: connection refused" Mar 08 00:55:21.176743 master-0 kubenswrapper[23041]: I0308 00:55:21.176628 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xkrf7" event={"ID":"6d9dfee3-973a-4663-9df5-1ea29d47096a","Type":"ContainerDied","Data":"c1451ea855ff6aaf7b1095ac17d89ccea09a6224679e4ff61e15b2e63c4f9020"} Mar 08 00:55:21.176743 master-0 kubenswrapper[23041]: I0308 00:55:21.176692 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1451ea855ff6aaf7b1095ac17d89ccea09a6224679e4ff61e15b2e63c4f9020" Mar 08 00:55:21.180223 master-0 kubenswrapper[23041]: I0308 00:55:21.180177 23041 generic.go:334] "Generic (PLEG): container finished" podID="184ea98e-2eac-49e3-a090-c412857c1df4" containerID="afb5824a1f58d2d4b1c3ce13f0c9fca6eafa2ca4d4e874bf24d094c4b9522d26" exitCode=0 Mar 08 00:55:21.180324 master-0 kubenswrapper[23041]: I0308 00:55:21.180278 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77475956d7-pp5mp" event={"ID":"184ea98e-2eac-49e3-a090-c412857c1df4","Type":"ContainerDied","Data":"afb5824a1f58d2d4b1c3ce13f0c9fca6eafa2ca4d4e874bf24d094c4b9522d26"} Mar 08 00:55:21.182742 master-0 
kubenswrapper[23041]: I0308 00:55:21.182708 23041 generic.go:334] "Generic (PLEG): container finished" podID="993b30b0-5927-4e4b-8945-6586313e285f" containerID="10b8342090b2046b33c098540fd76b13e14f5850eebcf56c85da9203d97b733d" exitCode=0 Mar 08 00:55:21.182831 master-0 kubenswrapper[23041]: I0308 00:55:21.182745 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7wljq" event={"ID":"993b30b0-5927-4e4b-8945-6586313e285f","Type":"ContainerDied","Data":"10b8342090b2046b33c098540fd76b13e14f5850eebcf56c85da9203d97b733d"} Mar 08 00:55:21.367173 master-0 kubenswrapper[23041]: I0308 00:55:21.366688 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xkrf7" Mar 08 00:55:21.518519 master-0 kubenswrapper[23041]: I0308 00:55:21.518470 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d9dfee3-973a-4663-9df5-1ea29d47096a-config-data\") pod \"6d9dfee3-973a-4663-9df5-1ea29d47096a\" (UID: \"6d9dfee3-973a-4663-9df5-1ea29d47096a\") " Mar 08 00:55:21.518854 master-0 kubenswrapper[23041]: I0308 00:55:21.518741 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d9dfee3-973a-4663-9df5-1ea29d47096a-scripts\") pod \"6d9dfee3-973a-4663-9df5-1ea29d47096a\" (UID: \"6d9dfee3-973a-4663-9df5-1ea29d47096a\") " Mar 08 00:55:21.518922 master-0 kubenswrapper[23041]: I0308 00:55:21.518903 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zm7ss\" (UniqueName: \"kubernetes.io/projected/6d9dfee3-973a-4663-9df5-1ea29d47096a-kube-api-access-zm7ss\") pod \"6d9dfee3-973a-4663-9df5-1ea29d47096a\" (UID: \"6d9dfee3-973a-4663-9df5-1ea29d47096a\") " Mar 08 00:55:21.519119 master-0 kubenswrapper[23041]: I0308 00:55:21.519009 23041 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d9dfee3-973a-4663-9df5-1ea29d47096a-combined-ca-bundle\") pod \"6d9dfee3-973a-4663-9df5-1ea29d47096a\" (UID: \"6d9dfee3-973a-4663-9df5-1ea29d47096a\") " Mar 08 00:55:21.522765 master-0 kubenswrapper[23041]: I0308 00:55:21.522710 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d9dfee3-973a-4663-9df5-1ea29d47096a-scripts" (OuterVolumeSpecName: "scripts") pod "6d9dfee3-973a-4663-9df5-1ea29d47096a" (UID: "6d9dfee3-973a-4663-9df5-1ea29d47096a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:55:21.523404 master-0 kubenswrapper[23041]: I0308 00:55:21.523374 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d9dfee3-973a-4663-9df5-1ea29d47096a-kube-api-access-zm7ss" (OuterVolumeSpecName: "kube-api-access-zm7ss") pod "6d9dfee3-973a-4663-9df5-1ea29d47096a" (UID: "6d9dfee3-973a-4663-9df5-1ea29d47096a"). InnerVolumeSpecName "kube-api-access-zm7ss". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:55:21.533321 master-0 kubenswrapper[23041]: I0308 00:55:21.533264 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77475956d7-pp5mp" Mar 08 00:55:21.547495 master-0 kubenswrapper[23041]: I0308 00:55:21.547420 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d9dfee3-973a-4663-9df5-1ea29d47096a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d9dfee3-973a-4663-9df5-1ea29d47096a" (UID: "6d9dfee3-973a-4663-9df5-1ea29d47096a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:55:21.552568 master-0 kubenswrapper[23041]: I0308 00:55:21.552508 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d9dfee3-973a-4663-9df5-1ea29d47096a-config-data" (OuterVolumeSpecName: "config-data") pod "6d9dfee3-973a-4663-9df5-1ea29d47096a" (UID: "6d9dfee3-973a-4663-9df5-1ea29d47096a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:55:21.622694 master-0 kubenswrapper[23041]: I0308 00:55:21.622112 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zm7ss\" (UniqueName: \"kubernetes.io/projected/6d9dfee3-973a-4663-9df5-1ea29d47096a-kube-api-access-zm7ss\") on node \"master-0\" DevicePath \"\"" Mar 08 00:55:21.622694 master-0 kubenswrapper[23041]: I0308 00:55:21.622156 23041 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d9dfee3-973a-4663-9df5-1ea29d47096a-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 00:55:21.622694 master-0 kubenswrapper[23041]: I0308 00:55:21.622167 23041 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d9dfee3-973a-4663-9df5-1ea29d47096a-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 00:55:21.622694 master-0 kubenswrapper[23041]: I0308 00:55:21.622175 23041 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d9dfee3-973a-4663-9df5-1ea29d47096a-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 00:55:21.724133 master-0 kubenswrapper[23041]: I0308 00:55:21.723835 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/184ea98e-2eac-49e3-a090-c412857c1df4-dns-svc\") pod \"184ea98e-2eac-49e3-a090-c412857c1df4\" (UID: \"184ea98e-2eac-49e3-a090-c412857c1df4\") " Mar 08 
00:55:21.724133 master-0 kubenswrapper[23041]: I0308 00:55:21.723953 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/184ea98e-2eac-49e3-a090-c412857c1df4-ovsdbserver-nb\") pod \"184ea98e-2eac-49e3-a090-c412857c1df4\" (UID: \"184ea98e-2eac-49e3-a090-c412857c1df4\") " Mar 08 00:55:21.724133 master-0 kubenswrapper[23041]: I0308 00:55:21.724026 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/184ea98e-2eac-49e3-a090-c412857c1df4-config\") pod \"184ea98e-2eac-49e3-a090-c412857c1df4\" (UID: \"184ea98e-2eac-49e3-a090-c412857c1df4\") " Mar 08 00:55:21.724133 master-0 kubenswrapper[23041]: I0308 00:55:21.724125 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/184ea98e-2eac-49e3-a090-c412857c1df4-ovsdbserver-sb\") pod \"184ea98e-2eac-49e3-a090-c412857c1df4\" (UID: \"184ea98e-2eac-49e3-a090-c412857c1df4\") " Mar 08 00:55:21.724776 master-0 kubenswrapper[23041]: I0308 00:55:21.724238 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/184ea98e-2eac-49e3-a090-c412857c1df4-dns-swift-storage-0\") pod \"184ea98e-2eac-49e3-a090-c412857c1df4\" (UID: \"184ea98e-2eac-49e3-a090-c412857c1df4\") " Mar 08 00:55:21.724776 master-0 kubenswrapper[23041]: I0308 00:55:21.724300 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwnsz\" (UniqueName: \"kubernetes.io/projected/184ea98e-2eac-49e3-a090-c412857c1df4-kube-api-access-gwnsz\") pod \"184ea98e-2eac-49e3-a090-c412857c1df4\" (UID: \"184ea98e-2eac-49e3-a090-c412857c1df4\") " Mar 08 00:55:21.729562 master-0 kubenswrapper[23041]: I0308 00:55:21.729435 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/184ea98e-2eac-49e3-a090-c412857c1df4-kube-api-access-gwnsz" (OuterVolumeSpecName: "kube-api-access-gwnsz") pod "184ea98e-2eac-49e3-a090-c412857c1df4" (UID: "184ea98e-2eac-49e3-a090-c412857c1df4"). InnerVolumeSpecName "kube-api-access-gwnsz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:55:21.780056 master-0 kubenswrapper[23041]: I0308 00:55:21.779995 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/184ea98e-2eac-49e3-a090-c412857c1df4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "184ea98e-2eac-49e3-a090-c412857c1df4" (UID: "184ea98e-2eac-49e3-a090-c412857c1df4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:55:21.780501 master-0 kubenswrapper[23041]: I0308 00:55:21.780434 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/184ea98e-2eac-49e3-a090-c412857c1df4-config" (OuterVolumeSpecName: "config") pod "184ea98e-2eac-49e3-a090-c412857c1df4" (UID: "184ea98e-2eac-49e3-a090-c412857c1df4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:55:21.781241 master-0 kubenswrapper[23041]: I0308 00:55:21.781164 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/184ea98e-2eac-49e3-a090-c412857c1df4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "184ea98e-2eac-49e3-a090-c412857c1df4" (UID: "184ea98e-2eac-49e3-a090-c412857c1df4"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:55:21.787364 master-0 kubenswrapper[23041]: I0308 00:55:21.787322 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/184ea98e-2eac-49e3-a090-c412857c1df4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "184ea98e-2eac-49e3-a090-c412857c1df4" (UID: "184ea98e-2eac-49e3-a090-c412857c1df4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:55:21.789772 master-0 kubenswrapper[23041]: I0308 00:55:21.789742 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/184ea98e-2eac-49e3-a090-c412857c1df4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "184ea98e-2eac-49e3-a090-c412857c1df4" (UID: "184ea98e-2eac-49e3-a090-c412857c1df4"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:55:21.827420 master-0 kubenswrapper[23041]: I0308 00:55:21.827180 23041 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/184ea98e-2eac-49e3-a090-c412857c1df4-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 08 00:55:21.827420 master-0 kubenswrapper[23041]: I0308 00:55:21.827413 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwnsz\" (UniqueName: \"kubernetes.io/projected/184ea98e-2eac-49e3-a090-c412857c1df4-kube-api-access-gwnsz\") on node \"master-0\" DevicePath \"\"" Mar 08 00:55:21.827420 master-0 kubenswrapper[23041]: I0308 00:55:21.827427 23041 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/184ea98e-2eac-49e3-a090-c412857c1df4-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 08 00:55:21.827665 master-0 kubenswrapper[23041]: I0308 00:55:21.827438 23041 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/184ea98e-2eac-49e3-a090-c412857c1df4-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 08 00:55:21.827665 master-0 kubenswrapper[23041]: I0308 00:55:21.827446 23041 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/184ea98e-2eac-49e3-a090-c412857c1df4-config\") on node \"master-0\" DevicePath \"\"" Mar 08 00:55:21.827665 master-0 kubenswrapper[23041]: I0308 00:55:21.827454 23041 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/184ea98e-2eac-49e3-a090-c412857c1df4-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 08 00:55:22.197872 master-0 kubenswrapper[23041]: I0308 00:55:22.197798 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77475956d7-pp5mp" event={"ID":"184ea98e-2eac-49e3-a090-c412857c1df4","Type":"ContainerDied","Data":"615f4dea517bb3e594bd2aedbe25561b6e130738bac71939d302794616d32ba2"} Mar 08 00:55:22.197872 master-0 kubenswrapper[23041]: I0308 00:55:22.197828 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77475956d7-pp5mp" Mar 08 00:55:22.197872 master-0 kubenswrapper[23041]: I0308 00:55:22.197856 23041 scope.go:117] "RemoveContainer" containerID="afb5824a1f58d2d4b1c3ce13f0c9fca6eafa2ca4d4e874bf24d094c4b9522d26" Mar 08 00:55:22.204249 master-0 kubenswrapper[23041]: I0308 00:55:22.204127 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-compute-ironic-compute-0" event={"ID":"4f052268-95a2-4ee5-9955-0e851ca8894a","Type":"ContainerStarted","Data":"6c9c10845d49c9334b0d84aa678a87f69a9634a891c56508bd567acf1b9e3216"} Mar 08 00:55:22.204882 master-0 kubenswrapper[23041]: I0308 00:55:22.204826 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 08 00:55:22.210229 master-0 kubenswrapper[23041]: I0308 00:55:22.210131 23041 generic.go:334] "Generic (PLEG): container finished" podID="5fd31740-3478-41e5-8295-d4b50f40db04" containerID="8743ecc8e441a1be02a5a33ce6e46345de67d9b5a580d996ad8b953f8e919372" exitCode=0 Mar 08 00:55:22.210551 master-0 kubenswrapper[23041]: I0308 00:55:22.210233 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"5fd31740-3478-41e5-8295-d4b50f40db04","Type":"ContainerDied","Data":"8743ecc8e441a1be02a5a33ce6e46345de67d9b5a580d996ad8b953f8e919372"} Mar 08 00:55:22.210551 master-0 kubenswrapper[23041]: I0308 00:55:22.210256 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xkrf7" Mar 08 00:55:22.241816 master-0 kubenswrapper[23041]: I0308 00:55:22.241682 23041 scope.go:117] "RemoveContainer" containerID="70109aebc0714cb899cf3f07e744f45a361919fa1862191281a75cf168b540e2" Mar 08 00:55:22.242965 master-0 kubenswrapper[23041]: I0308 00:55:22.242873 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-compute-ironic-compute-0" podStartSLOduration=5.072265534 podStartE2EDuration="21.24284931s" podCreationTimestamp="2026-03-08 00:55:01 +0000 UTC" firstStartedPulling="2026-03-08 00:55:05.055587708 +0000 UTC m=+1410.528424262" lastFinishedPulling="2026-03-08 00:55:21.226171474 +0000 UTC m=+1426.699008038" observedRunningTime="2026-03-08 00:55:22.227348732 +0000 UTC m=+1427.700185306" watchObservedRunningTime="2026-03-08 00:55:22.24284931 +0000 UTC m=+1427.715685864" Mar 08 00:55:22.248270 master-0 kubenswrapper[23041]: I0308 00:55:22.248191 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 08 00:55:22.446348 master-0 kubenswrapper[23041]: I0308 00:55:22.446260 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77475956d7-pp5mp"] Mar 08 00:55:22.497176 master-0 kubenswrapper[23041]: I0308 00:55:22.497118 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 08 00:55:22.497176 master-0 kubenswrapper[23041]: I0308 00:55:22.497178 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77475956d7-pp5mp"] Mar 08 00:55:22.498999 master-0 kubenswrapper[23041]: I0308 00:55:22.498960 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 08 00:55:22.637492 master-0 kubenswrapper[23041]: I0308 00:55:22.629831 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] 
Mar 08 00:55:22.637492 master-0 kubenswrapper[23041]: I0308 00:55:22.630049 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="988af452-10f3-4d0b-806e-7f91af9135a0" containerName="nova-api-log" containerID="cri-o://5bef13fadb6620ae32a22db62726707c1135c45202f998789ff9f4806c2a6c6b" gracePeriod=30
Mar 08 00:55:22.637492 master-0 kubenswrapper[23041]: I0308 00:55:22.630606 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="988af452-10f3-4d0b-806e-7f91af9135a0" containerName="nova-api-api" containerID="cri-o://bb665cfb84bc4238ce4c3fbf323d934c308aa657e54c21f960c2f71b319f0afb" gracePeriod=30
Mar 08 00:55:22.651594 master-0 kubenswrapper[23041]: I0308 00:55:22.647886 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 08 00:55:22.651594 master-0 kubenswrapper[23041]: I0308 00:55:22.648105 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="efeb5e5d-c3a7-4b40-8903-a36ee311f97c" containerName="nova-scheduler-scheduler" containerID="cri-o://ef0fb7beacc28a19684e873fc29af183693a7115d13259443ef969b1017b3dcb" gracePeriod=30
Mar 08 00:55:22.753229 master-0 kubenswrapper[23041]: I0308 00:55:22.752302 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 08 00:55:22.792230 master-0 kubenswrapper[23041]: I0308 00:55:22.786193 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7wljq"
Mar 08 00:55:22.852729 master-0 kubenswrapper[23041]: I0308 00:55:22.850985 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="184ea98e-2eac-49e3-a090-c412857c1df4" path="/var/lib/kubelet/pods/184ea98e-2eac-49e3-a090-c412857c1df4/volumes"
Mar 08 00:55:22.907252 master-0 kubenswrapper[23041]: I0308 00:55:22.903121 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/993b30b0-5927-4e4b-8945-6586313e285f-combined-ca-bundle\") pod \"993b30b0-5927-4e4b-8945-6586313e285f\" (UID: \"993b30b0-5927-4e4b-8945-6586313e285f\") "
Mar 08 00:55:22.907252 master-0 kubenswrapper[23041]: I0308 00:55:22.903256 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/993b30b0-5927-4e4b-8945-6586313e285f-config-data\") pod \"993b30b0-5927-4e4b-8945-6586313e285f\" (UID: \"993b30b0-5927-4e4b-8945-6586313e285f\") "
Mar 08 00:55:22.907252 master-0 kubenswrapper[23041]: I0308 00:55:22.903409 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/993b30b0-5927-4e4b-8945-6586313e285f-scripts\") pod \"993b30b0-5927-4e4b-8945-6586313e285f\" (UID: \"993b30b0-5927-4e4b-8945-6586313e285f\") "
Mar 08 00:55:22.907252 master-0 kubenswrapper[23041]: I0308 00:55:22.903449 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lmth\" (UniqueName: \"kubernetes.io/projected/993b30b0-5927-4e4b-8945-6586313e285f-kube-api-access-5lmth\") pod \"993b30b0-5927-4e4b-8945-6586313e285f\" (UID: \"993b30b0-5927-4e4b-8945-6586313e285f\") "
Mar 08 00:55:22.916339 master-0 kubenswrapper[23041]: I0308 00:55:22.910339 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/993b30b0-5927-4e4b-8945-6586313e285f-scripts" (OuterVolumeSpecName: "scripts") pod "993b30b0-5927-4e4b-8945-6586313e285f" (UID: "993b30b0-5927-4e4b-8945-6586313e285f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:55:22.916339 master-0 kubenswrapper[23041]: I0308 00:55:22.910436 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/993b30b0-5927-4e4b-8945-6586313e285f-kube-api-access-5lmth" (OuterVolumeSpecName: "kube-api-access-5lmth") pod "993b30b0-5927-4e4b-8945-6586313e285f" (UID: "993b30b0-5927-4e4b-8945-6586313e285f"). InnerVolumeSpecName "kube-api-access-5lmth". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:55:22.940750 master-0 kubenswrapper[23041]: I0308 00:55:22.940708 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/993b30b0-5927-4e4b-8945-6586313e285f-config-data" (OuterVolumeSpecName: "config-data") pod "993b30b0-5927-4e4b-8945-6586313e285f" (UID: "993b30b0-5927-4e4b-8945-6586313e285f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:55:22.949759 master-0 kubenswrapper[23041]: I0308 00:55:22.948452 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/993b30b0-5927-4e4b-8945-6586313e285f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "993b30b0-5927-4e4b-8945-6586313e285f" (UID: "993b30b0-5927-4e4b-8945-6586313e285f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:55:23.005873 master-0 kubenswrapper[23041]: I0308 00:55:23.005811 23041 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/993b30b0-5927-4e4b-8945-6586313e285f-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 00:55:23.005873 master-0 kubenswrapper[23041]: I0308 00:55:23.005865 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lmth\" (UniqueName: \"kubernetes.io/projected/993b30b0-5927-4e4b-8945-6586313e285f-kube-api-access-5lmth\") on node \"master-0\" DevicePath \"\""
Mar 08 00:55:23.005873 master-0 kubenswrapper[23041]: I0308 00:55:23.005877 23041 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/993b30b0-5927-4e4b-8945-6586313e285f-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 00:55:23.005873 master-0 kubenswrapper[23041]: I0308 00:55:23.005885 23041 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/993b30b0-5927-4e4b-8945-6586313e285f-config-data\") on node \"master-0\" DevicePath \"\""
Mar 08 00:55:23.272444 master-0 kubenswrapper[23041]: I0308 00:55:23.268940 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"5fd31740-3478-41e5-8295-d4b50f40db04","Type":"ContainerStarted","Data":"61ca9fbdbbb4ae23bc308579e92bb6c5c7d0ebe2123d69c6c8202da1512ee45e"}
Mar 08 00:55:23.285828 master-0 kubenswrapper[23041]: I0308 00:55:23.285753 23041 generic.go:334] "Generic (PLEG): container finished" podID="988af452-10f3-4d0b-806e-7f91af9135a0" containerID="5bef13fadb6620ae32a22db62726707c1135c45202f998789ff9f4806c2a6c6b" exitCode=143
Mar 08 00:55:23.286029 master-0 kubenswrapper[23041]: I0308 00:55:23.285891 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"988af452-10f3-4d0b-806e-7f91af9135a0","Type":"ContainerDied","Data":"5bef13fadb6620ae32a22db62726707c1135c45202f998789ff9f4806c2a6c6b"}
Mar 08 00:55:23.313253 master-0 kubenswrapper[23041]: I0308 00:55:23.312434 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7wljq"
Mar 08 00:55:23.314582 master-0 kubenswrapper[23041]: I0308 00:55:23.313795 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7wljq" event={"ID":"993b30b0-5927-4e4b-8945-6586313e285f","Type":"ContainerDied","Data":"c2f26e2e10e76e9c5ade58d90b778b3eccf96643385dc3d7955f9cea49719e5c"}
Mar 08 00:55:23.314582 master-0 kubenswrapper[23041]: I0308 00:55:23.313879 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2f26e2e10e76e9c5ade58d90b778b3eccf96643385dc3d7955f9cea49719e5c"
Mar 08 00:55:23.330068 master-0 kubenswrapper[23041]: I0308 00:55:23.329645 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 08 00:55:23.331819 master-0 kubenswrapper[23041]: E0308 00:55:23.330425 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="184ea98e-2eac-49e3-a090-c412857c1df4" containerName="dnsmasq-dns"
Mar 08 00:55:23.331819 master-0 kubenswrapper[23041]: I0308 00:55:23.330520 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="184ea98e-2eac-49e3-a090-c412857c1df4" containerName="dnsmasq-dns"
Mar 08 00:55:23.331819 master-0 kubenswrapper[23041]: E0308 00:55:23.330581 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="184ea98e-2eac-49e3-a090-c412857c1df4" containerName="init"
Mar 08 00:55:23.331819 master-0 kubenswrapper[23041]: I0308 00:55:23.330593 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="184ea98e-2eac-49e3-a090-c412857c1df4" containerName="init"
Mar 08 00:55:23.331819 master-0 kubenswrapper[23041]: E0308 00:55:23.330659 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="993b30b0-5927-4e4b-8945-6586313e285f" containerName="nova-cell1-conductor-db-sync"
Mar 08 00:55:23.331819 master-0 kubenswrapper[23041]: I0308 00:55:23.330670 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="993b30b0-5927-4e4b-8945-6586313e285f" containerName="nova-cell1-conductor-db-sync"
Mar 08 00:55:23.331819 master-0 kubenswrapper[23041]: E0308 00:55:23.330688 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d9dfee3-973a-4663-9df5-1ea29d47096a" containerName="nova-manage"
Mar 08 00:55:23.331819 master-0 kubenswrapper[23041]: I0308 00:55:23.330697 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d9dfee3-973a-4663-9df5-1ea29d47096a" containerName="nova-manage"
Mar 08 00:55:23.331819 master-0 kubenswrapper[23041]: I0308 00:55:23.331017 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="993b30b0-5927-4e4b-8945-6586313e285f" containerName="nova-cell1-conductor-db-sync"
Mar 08 00:55:23.331819 master-0 kubenswrapper[23041]: I0308 00:55:23.331058 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d9dfee3-973a-4663-9df5-1ea29d47096a" containerName="nova-manage"
Mar 08 00:55:23.331819 master-0 kubenswrapper[23041]: I0308 00:55:23.331090 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="184ea98e-2eac-49e3-a090-c412857c1df4" containerName="dnsmasq-dns"
Mar 08 00:55:23.332215 master-0 kubenswrapper[23041]: I0308 00:55:23.332154 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 08 00:55:23.335887 master-0 kubenswrapper[23041]: I0308 00:55:23.335750 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Mar 08 00:55:23.374542 master-0 kubenswrapper[23041]: I0308 00:55:23.368228 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 08 00:55:23.509293 master-0 kubenswrapper[23041]: I0308 00:55:23.509224 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="366c5ae2-34a4-413e-8f57-bcbdb57e73d5" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.128.1.9:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 08 00:55:23.509708 master-0 kubenswrapper[23041]: I0308 00:55:23.509230 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="366c5ae2-34a4-413e-8f57-bcbdb57e73d5" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.128.1.9:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 08 00:55:23.520546 master-0 kubenswrapper[23041]: I0308 00:55:23.520009 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acbcd7de-972e-4a55-9847-24760df26ddf-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"acbcd7de-972e-4a55-9847-24760df26ddf\") " pod="openstack/nova-cell1-conductor-0"
Mar 08 00:55:23.520546 master-0 kubenswrapper[23041]: I0308 00:55:23.520165 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58cmz\" (UniqueName: \"kubernetes.io/projected/acbcd7de-972e-4a55-9847-24760df26ddf-kube-api-access-58cmz\") pod \"nova-cell1-conductor-0\" (UID: \"acbcd7de-972e-4a55-9847-24760df26ddf\") " pod="openstack/nova-cell1-conductor-0"
Mar 08 00:55:23.520546 master-0 kubenswrapper[23041]: I0308 00:55:23.520261 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acbcd7de-972e-4a55-9847-24760df26ddf-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"acbcd7de-972e-4a55-9847-24760df26ddf\") " pod="openstack/nova-cell1-conductor-0"
Mar 08 00:55:23.622573 master-0 kubenswrapper[23041]: I0308 00:55:23.622515 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acbcd7de-972e-4a55-9847-24760df26ddf-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"acbcd7de-972e-4a55-9847-24760df26ddf\") " pod="openstack/nova-cell1-conductor-0"
Mar 08 00:55:23.622827 master-0 kubenswrapper[23041]: I0308 00:55:23.622668 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58cmz\" (UniqueName: \"kubernetes.io/projected/acbcd7de-972e-4a55-9847-24760df26ddf-kube-api-access-58cmz\") pod \"nova-cell1-conductor-0\" (UID: \"acbcd7de-972e-4a55-9847-24760df26ddf\") " pod="openstack/nova-cell1-conductor-0"
Mar 08 00:55:23.622827 master-0 kubenswrapper[23041]: I0308 00:55:23.622757 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acbcd7de-972e-4a55-9847-24760df26ddf-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"acbcd7de-972e-4a55-9847-24760df26ddf\") " pod="openstack/nova-cell1-conductor-0"
Mar 08 00:55:23.638888 master-0 kubenswrapper[23041]: I0308 00:55:23.638822 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acbcd7de-972e-4a55-9847-24760df26ddf-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"acbcd7de-972e-4a55-9847-24760df26ddf\") " pod="openstack/nova-cell1-conductor-0"
Mar 08 00:55:23.639224 master-0 kubenswrapper[23041]: I0308 00:55:23.639174 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acbcd7de-972e-4a55-9847-24760df26ddf-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"acbcd7de-972e-4a55-9847-24760df26ddf\") " pod="openstack/nova-cell1-conductor-0"
Mar 08 00:55:23.645772 master-0 kubenswrapper[23041]: I0308 00:55:23.645715 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58cmz\" (UniqueName: \"kubernetes.io/projected/acbcd7de-972e-4a55-9847-24760df26ddf-kube-api-access-58cmz\") pod \"nova-cell1-conductor-0\" (UID: \"acbcd7de-972e-4a55-9847-24760df26ddf\") " pod="openstack/nova-cell1-conductor-0"
Mar 08 00:55:23.692283 master-0 kubenswrapper[23041]: I0308 00:55:23.691041 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 08 00:55:24.215346 master-0 kubenswrapper[23041]: I0308 00:55:24.213903 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 08 00:55:24.371185 master-0 kubenswrapper[23041]: I0308 00:55:24.368696 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"5fd31740-3478-41e5-8295-d4b50f40db04","Type":"ContainerStarted","Data":"0966ecc84c7af253856596da017e6e220469f0e5cd32794c05171c432cfa2a55"}
Mar 08 00:55:24.371185 master-0 kubenswrapper[23041]: I0308 00:55:24.368745 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"5fd31740-3478-41e5-8295-d4b50f40db04","Type":"ContainerStarted","Data":"d1b9b7696bad7e109ae6c686871ed317b1a66aba9ba4fce914572cc5ce1c4d14"}
Mar 08 00:55:24.371185 master-0 kubenswrapper[23041]: I0308 00:55:24.370752 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-conductor-0"
Mar 08 00:55:24.371185 master-0 kubenswrapper[23041]: I0308 00:55:24.370785 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-conductor-0"
Mar 08 00:55:24.374368 master-0 kubenswrapper[23041]: I0308 00:55:24.372160 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"acbcd7de-972e-4a55-9847-24760df26ddf","Type":"ContainerStarted","Data":"da42a0a5e1b31c4c930240554e18a0778602bfb3ea2d9c19f0053e50894ea222"}
Mar 08 00:55:24.374368 master-0 kubenswrapper[23041]: I0308 00:55:24.372219 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="366c5ae2-34a4-413e-8f57-bcbdb57e73d5" containerName="nova-metadata-log" containerID="cri-o://67bfa88d690d1169e7f7403b4c87ee6b5d5fafab73e6d9f8b37c42f21859ec1c" gracePeriod=30
Mar 08 00:55:24.374368 master-0 kubenswrapper[23041]: I0308 00:55:24.373326 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="366c5ae2-34a4-413e-8f57-bcbdb57e73d5" containerName="nova-metadata-metadata" containerID="cri-o://878bb198b87b281ea42f32e5b032aa411c976be0ea3b591a7cb75c788b33efd3" gracePeriod=30
Mar 08 00:55:24.397679 master-0 kubenswrapper[23041]: E0308 00:55:24.397603 23041 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ef0fb7beacc28a19684e873fc29af183693a7115d13259443ef969b1017b3dcb" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 08 00:55:24.407589 master-0 kubenswrapper[23041]: E0308 00:55:24.400097 23041 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ef0fb7beacc28a19684e873fc29af183693a7115d13259443ef969b1017b3dcb" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 08 00:55:24.407589 master-0 kubenswrapper[23041]: I0308 00:55:24.404408 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-conductor-0" podStartSLOduration=70.121913294 podStartE2EDuration="1m52.404395074s" podCreationTimestamp="2026-03-08 00:53:32 +0000 UTC" firstStartedPulling="2026-03-08 00:53:45.302589522 +0000 UTC m=+1330.775426076" lastFinishedPulling="2026-03-08 00:54:27.585071282 +0000 UTC m=+1373.057907856" observedRunningTime="2026-03-08 00:55:24.399562086 +0000 UTC m=+1429.872398640" watchObservedRunningTime="2026-03-08 00:55:24.404395074 +0000 UTC m=+1429.877231628"
Mar 08 00:55:24.417923 master-0 kubenswrapper[23041]: E0308 00:55:24.417845 23041 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ef0fb7beacc28a19684e873fc29af183693a7115d13259443ef969b1017b3dcb" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 08 00:55:24.418090 master-0 kubenswrapper[23041]: E0308 00:55:24.417930 23041 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="efeb5e5d-c3a7-4b40-8903-a36ee311f97c" containerName="nova-scheduler-scheduler"
Mar 08 00:55:24.562549 master-0 kubenswrapper[23041]: I0308 00:55:24.562449 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ironic-conductor-0"
Mar 08 00:55:25.396265 master-0 kubenswrapper[23041]: I0308 00:55:25.396188 23041 generic.go:334] "Generic (PLEG): container finished" podID="366c5ae2-34a4-413e-8f57-bcbdb57e73d5" containerID="67bfa88d690d1169e7f7403b4c87ee6b5d5fafab73e6d9f8b37c42f21859ec1c" exitCode=143
Mar 08 00:55:25.396915 master-0 kubenswrapper[23041]: I0308 00:55:25.396357 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"366c5ae2-34a4-413e-8f57-bcbdb57e73d5","Type":"ContainerDied","Data":"67bfa88d690d1169e7f7403b4c87ee6b5d5fafab73e6d9f8b37c42f21859ec1c"}
Mar 08 00:55:25.399613 master-0 kubenswrapper[23041]: I0308 00:55:25.399583 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"acbcd7de-972e-4a55-9847-24760df26ddf","Type":"ContainerStarted","Data":"cc36163a021d0d4b0b727eaf0178a11bb278c3aab16a1b28a5967ad7f4b61179"}
Mar 08 00:55:25.399687 master-0 kubenswrapper[23041]: I0308 00:55:25.399616 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Mar 08 00:55:25.432351 master-0 kubenswrapper[23041]: I0308 00:55:25.432233 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.432190051 podStartE2EDuration="2.432190051s" podCreationTimestamp="2026-03-08 00:55:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:55:25.423774026 +0000 UTC m=+1430.896610580" watchObservedRunningTime="2026-03-08 00:55:25.432190051 +0000 UTC m=+1430.905026615"
Mar 08 00:55:26.010620 master-0 kubenswrapper[23041]: I0308 00:55:26.010346 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/ironic-conductor-0" podUID="5fd31740-3478-41e5-8295-d4b50f40db04" containerName="ironic-conductor" probeResult="failure" output=<
Mar 08 00:55:26.010620 master-0 kubenswrapper[23041]: ironic-conductor-0 is offline
Mar 08 00:55:26.010620 master-0 kubenswrapper[23041]: >
Mar 08 00:55:26.416563 master-0 kubenswrapper[23041]: I0308 00:55:26.416497 23041 generic.go:334] "Generic (PLEG): container finished" podID="988af452-10f3-4d0b-806e-7f91af9135a0" containerID="bb665cfb84bc4238ce4c3fbf323d934c308aa657e54c21f960c2f71b319f0afb" exitCode=0
Mar 08 00:55:26.417078 master-0 kubenswrapper[23041]: I0308 00:55:26.416565 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"988af452-10f3-4d0b-806e-7f91af9135a0","Type":"ContainerDied","Data":"bb665cfb84bc4238ce4c3fbf323d934c308aa657e54c21f960c2f71b319f0afb"}
Mar 08 00:55:26.714923 master-0 kubenswrapper[23041]: I0308 00:55:26.714883 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 08 00:55:26.834633 master-0 kubenswrapper[23041]: I0308 00:55:26.833425 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvpnt\" (UniqueName: \"kubernetes.io/projected/988af452-10f3-4d0b-806e-7f91af9135a0-kube-api-access-wvpnt\") pod \"988af452-10f3-4d0b-806e-7f91af9135a0\" (UID: \"988af452-10f3-4d0b-806e-7f91af9135a0\") "
Mar 08 00:55:26.834633 master-0 kubenswrapper[23041]: I0308 00:55:26.833537 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/988af452-10f3-4d0b-806e-7f91af9135a0-config-data\") pod \"988af452-10f3-4d0b-806e-7f91af9135a0\" (UID: \"988af452-10f3-4d0b-806e-7f91af9135a0\") "
Mar 08 00:55:26.834633 master-0 kubenswrapper[23041]: I0308 00:55:26.834503 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/988af452-10f3-4d0b-806e-7f91af9135a0-logs" (OuterVolumeSpecName: "logs") pod "988af452-10f3-4d0b-806e-7f91af9135a0" (UID: "988af452-10f3-4d0b-806e-7f91af9135a0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:55:26.834633 master-0 kubenswrapper[23041]: I0308 00:55:26.834601 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/988af452-10f3-4d0b-806e-7f91af9135a0-logs\") pod \"988af452-10f3-4d0b-806e-7f91af9135a0\" (UID: \"988af452-10f3-4d0b-806e-7f91af9135a0\") "
Mar 08 00:55:26.834925 master-0 kubenswrapper[23041]: I0308 00:55:26.834683 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/988af452-10f3-4d0b-806e-7f91af9135a0-combined-ca-bundle\") pod \"988af452-10f3-4d0b-806e-7f91af9135a0\" (UID: \"988af452-10f3-4d0b-806e-7f91af9135a0\") "
Mar 08 00:55:26.835509 master-0 kubenswrapper[23041]: I0308 00:55:26.835477 23041 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/988af452-10f3-4d0b-806e-7f91af9135a0-logs\") on node \"master-0\" DevicePath \"\""
Mar 08 00:55:26.859629 master-0 kubenswrapper[23041]: I0308 00:55:26.859572 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/988af452-10f3-4d0b-806e-7f91af9135a0-kube-api-access-wvpnt" (OuterVolumeSpecName: "kube-api-access-wvpnt") pod "988af452-10f3-4d0b-806e-7f91af9135a0" (UID: "988af452-10f3-4d0b-806e-7f91af9135a0"). InnerVolumeSpecName "kube-api-access-wvpnt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:55:26.863503 master-0 kubenswrapper[23041]: I0308 00:55:26.863451 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/988af452-10f3-4d0b-806e-7f91af9135a0-config-data" (OuterVolumeSpecName: "config-data") pod "988af452-10f3-4d0b-806e-7f91af9135a0" (UID: "988af452-10f3-4d0b-806e-7f91af9135a0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:55:26.900456 master-0 kubenswrapper[23041]: I0308 00:55:26.900404 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/988af452-10f3-4d0b-806e-7f91af9135a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "988af452-10f3-4d0b-806e-7f91af9135a0" (UID: "988af452-10f3-4d0b-806e-7f91af9135a0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:55:26.941259 master-0 kubenswrapper[23041]: I0308 00:55:26.941220 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvpnt\" (UniqueName: \"kubernetes.io/projected/988af452-10f3-4d0b-806e-7f91af9135a0-kube-api-access-wvpnt\") on node \"master-0\" DevicePath \"\""
Mar 08 00:55:26.941710 master-0 kubenswrapper[23041]: I0308 00:55:26.941692 23041 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/988af452-10f3-4d0b-806e-7f91af9135a0-config-data\") on node \"master-0\" DevicePath \"\""
Mar 08 00:55:26.944177 master-0 kubenswrapper[23041]: I0308 00:55:26.944151 23041 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/988af452-10f3-4d0b-806e-7f91af9135a0-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 00:55:27.425214 master-0 kubenswrapper[23041]: I0308 00:55:27.425150 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 08 00:55:27.437600 master-0 kubenswrapper[23041]: I0308 00:55:27.437556 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"988af452-10f3-4d0b-806e-7f91af9135a0","Type":"ContainerDied","Data":"4e4fbc384040bd08eaddf6bd7d8de442a6e60189315344944f10ed1d16f243ae"}
Mar 08 00:55:27.437795 master-0 kubenswrapper[23041]: I0308 00:55:27.437780 23041 scope.go:117] "RemoveContainer" containerID="bb665cfb84bc4238ce4c3fbf323d934c308aa657e54c21f960c2f71b319f0afb"
Mar 08 00:55:27.438078 master-0 kubenswrapper[23041]: I0308 00:55:27.438011 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 08 00:55:27.446880 master-0 kubenswrapper[23041]: I0308 00:55:27.444568 23041 generic.go:334] "Generic (PLEG): container finished" podID="efeb5e5d-c3a7-4b40-8903-a36ee311f97c" containerID="ef0fb7beacc28a19684e873fc29af183693a7115d13259443ef969b1017b3dcb" exitCode=0
Mar 08 00:55:27.446880 master-0 kubenswrapper[23041]: I0308 00:55:27.444633 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"efeb5e5d-c3a7-4b40-8903-a36ee311f97c","Type":"ContainerDied","Data":"ef0fb7beacc28a19684e873fc29af183693a7115d13259443ef969b1017b3dcb"}
Mar 08 00:55:27.446880 master-0 kubenswrapper[23041]: I0308 00:55:27.444666 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"efeb5e5d-c3a7-4b40-8903-a36ee311f97c","Type":"ContainerDied","Data":"11f5becedfabf10d62bc28615e4d69d12672ea59d653ece769ee5168fbe0d38d"}
Mar 08 00:55:27.446880 master-0 kubenswrapper[23041]: I0308 00:55:27.444736 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 08 00:55:27.476088 master-0 kubenswrapper[23041]: I0308 00:55:27.475774 23041 scope.go:117] "RemoveContainer" containerID="5bef13fadb6620ae32a22db62726707c1135c45202f998789ff9f4806c2a6c6b"
Mar 08 00:55:27.524465 master-0 kubenswrapper[23041]: I0308 00:55:27.523174 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 08 00:55:27.526095 master-0 kubenswrapper[23041]: I0308 00:55:27.526052 23041 scope.go:117] "RemoveContainer" containerID="ef0fb7beacc28a19684e873fc29af183693a7115d13259443ef969b1017b3dcb"
Mar 08 00:55:27.551100 master-0 kubenswrapper[23041]: I0308 00:55:27.551039 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 08 00:55:27.561563 master-0 kubenswrapper[23041]: I0308 00:55:27.561501 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efeb5e5d-c3a7-4b40-8903-a36ee311f97c-combined-ca-bundle\") pod \"efeb5e5d-c3a7-4b40-8903-a36ee311f97c\" (UID: \"efeb5e5d-c3a7-4b40-8903-a36ee311f97c\") "
Mar 08 00:55:27.561708 master-0 kubenswrapper[23041]: I0308 00:55:27.561680 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efeb5e5d-c3a7-4b40-8903-a36ee311f97c-config-data\") pod \"efeb5e5d-c3a7-4b40-8903-a36ee311f97c\" (UID: \"efeb5e5d-c3a7-4b40-8903-a36ee311f97c\") "
Mar 08 00:55:27.561930 master-0 kubenswrapper[23041]: I0308 00:55:27.561899 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wk7bs\" (UniqueName: \"kubernetes.io/projected/efeb5e5d-c3a7-4b40-8903-a36ee311f97c-kube-api-access-wk7bs\") pod \"efeb5e5d-c3a7-4b40-8903-a36ee311f97c\" (UID: \"efeb5e5d-c3a7-4b40-8903-a36ee311f97c\") "
Mar 08 00:55:27.566134 master-0 kubenswrapper[23041]: I0308 00:55:27.566073 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efeb5e5d-c3a7-4b40-8903-a36ee311f97c-kube-api-access-wk7bs" (OuterVolumeSpecName: "kube-api-access-wk7bs") pod "efeb5e5d-c3a7-4b40-8903-a36ee311f97c" (UID: "efeb5e5d-c3a7-4b40-8903-a36ee311f97c"). InnerVolumeSpecName "kube-api-access-wk7bs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:55:27.570911 master-0 kubenswrapper[23041]: I0308 00:55:27.570761 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 08 00:55:27.571802 master-0 kubenswrapper[23041]: E0308 00:55:27.571767 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="988af452-10f3-4d0b-806e-7f91af9135a0" containerName="nova-api-log"
Mar 08 00:55:27.571802 master-0 kubenswrapper[23041]: I0308 00:55:27.571797 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="988af452-10f3-4d0b-806e-7f91af9135a0" containerName="nova-api-log"
Mar 08 00:55:27.571926 master-0 kubenswrapper[23041]: E0308 00:55:27.571878 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="988af452-10f3-4d0b-806e-7f91af9135a0" containerName="nova-api-api"
Mar 08 00:55:27.571926 master-0 kubenswrapper[23041]: I0308 00:55:27.571888 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="988af452-10f3-4d0b-806e-7f91af9135a0" containerName="nova-api-api"
Mar 08 00:55:27.571926 master-0 kubenswrapper[23041]: E0308 00:55:27.571909 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efeb5e5d-c3a7-4b40-8903-a36ee311f97c" containerName="nova-scheduler-scheduler"
Mar 08 00:55:27.571926 master-0 kubenswrapper[23041]: I0308 00:55:27.571917 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="efeb5e5d-c3a7-4b40-8903-a36ee311f97c" containerName="nova-scheduler-scheduler"
Mar 08 00:55:27.572291 master-0 kubenswrapper[23041]: I0308 00:55:27.572262 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="988af452-10f3-4d0b-806e-7f91af9135a0" containerName="nova-api-api"
Mar 08 00:55:27.572291 master-0 kubenswrapper[23041]: I0308 00:55:27.572285 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="efeb5e5d-c3a7-4b40-8903-a36ee311f97c" containerName="nova-scheduler-scheduler"
Mar 08 00:55:27.572403 master-0 kubenswrapper[23041]: I0308 00:55:27.572305 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="988af452-10f3-4d0b-806e-7f91af9135a0" containerName="nova-api-log"
Mar 08 00:55:27.574154 master-0 kubenswrapper[23041]: I0308 00:55:27.574118 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 08 00:55:27.587825 master-0 kubenswrapper[23041]: I0308 00:55:27.587778 23041 scope.go:117] "RemoveContainer" containerID="ef0fb7beacc28a19684e873fc29af183693a7115d13259443ef969b1017b3dcb"
Mar 08 00:55:27.588585 master-0 kubenswrapper[23041]: E0308 00:55:27.588535 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef0fb7beacc28a19684e873fc29af183693a7115d13259443ef969b1017b3dcb\": container with ID starting with ef0fb7beacc28a19684e873fc29af183693a7115d13259443ef969b1017b3dcb not found: ID does not exist" containerID="ef0fb7beacc28a19684e873fc29af183693a7115d13259443ef969b1017b3dcb"
Mar 08 00:55:27.588633 master-0 kubenswrapper[23041]: I0308 00:55:27.588591 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef0fb7beacc28a19684e873fc29af183693a7115d13259443ef969b1017b3dcb"} err="failed to get container status \"ef0fb7beacc28a19684e873fc29af183693a7115d13259443ef969b1017b3dcb\": rpc error: code = NotFound desc = could not find container \"ef0fb7beacc28a19684e873fc29af183693a7115d13259443ef969b1017b3dcb\": container with ID starting with ef0fb7beacc28a19684e873fc29af183693a7115d13259443ef969b1017b3dcb not found: ID does not exist"
Mar 08 00:55:27.589503 master-0 kubenswrapper[23041]: I0308 00:55:27.589472 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 08 00:55:27.622599 master-0 kubenswrapper[23041]: I0308 00:55:27.610388 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 08 00:55:27.636952 master-0 kubenswrapper[23041]: I0308 00:55:27.634599 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efeb5e5d-c3a7-4b40-8903-a36ee311f97c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "efeb5e5d-c3a7-4b40-8903-a36ee311f97c" (UID: "efeb5e5d-c3a7-4b40-8903-a36ee311f97c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:55:27.664729 master-0 kubenswrapper[23041]: I0308 00:55:27.664568 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efeb5e5d-c3a7-4b40-8903-a36ee311f97c-config-data" (OuterVolumeSpecName: "config-data") pod "efeb5e5d-c3a7-4b40-8903-a36ee311f97c" (UID: "efeb5e5d-c3a7-4b40-8903-a36ee311f97c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:55:27.666238 master-0 kubenswrapper[23041]: I0308 00:55:27.665679 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efeb5e5d-c3a7-4b40-8903-a36ee311f97c-config-data\") pod \"efeb5e5d-c3a7-4b40-8903-a36ee311f97c\" (UID: \"efeb5e5d-c3a7-4b40-8903-a36ee311f97c\") "
Mar 08 00:55:27.669960 master-0 kubenswrapper[23041]: I0308 00:55:27.666766 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff67x\" (UniqueName: \"kubernetes.io/projected/1dbc04ae-0661-474e-aee7-ea2a45bc3253-kube-api-access-ff67x\") pod \"nova-api-0\" (UID: \"1dbc04ae-0661-474e-aee7-ea2a45bc3253\") " pod="openstack/nova-api-0"
Mar 08 00:55:27.669960 master-0 kubenswrapper[23041]: I0308 00:55:27.666886 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dbc04ae-0661-474e-aee7-ea2a45bc3253-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1dbc04ae-0661-474e-aee7-ea2a45bc3253\") " pod="openstack/nova-api-0"
Mar 08 00:55:27.669960 master-0 kubenswrapper[23041]: I0308 00:55:27.666932 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dbc04ae-0661-474e-aee7-ea2a45bc3253-config-data\") pod \"nova-api-0\" (UID: \"1dbc04ae-0661-474e-aee7-ea2a45bc3253\") " pod="openstack/nova-api-0"
Mar 08 00:55:27.669960 master-0 kubenswrapper[23041]: I0308 00:55:27.666987 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1dbc04ae-0661-474e-aee7-ea2a45bc3253-logs\") pod \"nova-api-0\" (UID: \"1dbc04ae-0661-474e-aee7-ea2a45bc3253\") " pod="openstack/nova-api-0"
Mar 08 00:55:27.669960 master-0 kubenswrapper[23041]:
I0308 00:55:27.667313 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wk7bs\" (UniqueName: \"kubernetes.io/projected/efeb5e5d-c3a7-4b40-8903-a36ee311f97c-kube-api-access-wk7bs\") on node \"master-0\" DevicePath \"\"" Mar 08 00:55:27.669960 master-0 kubenswrapper[23041]: I0308 00:55:27.667332 23041 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efeb5e5d-c3a7-4b40-8903-a36ee311f97c-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 00:55:27.669960 master-0 kubenswrapper[23041]: W0308 00:55:27.667463 23041 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/efeb5e5d-c3a7-4b40-8903-a36ee311f97c/volumes/kubernetes.io~secret/config-data Mar 08 00:55:27.669960 master-0 kubenswrapper[23041]: I0308 00:55:27.667479 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efeb5e5d-c3a7-4b40-8903-a36ee311f97c-config-data" (OuterVolumeSpecName: "config-data") pod "efeb5e5d-c3a7-4b40-8903-a36ee311f97c" (UID: "efeb5e5d-c3a7-4b40-8903-a36ee311f97c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:55:27.771289 master-0 kubenswrapper[23041]: I0308 00:55:27.769599 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1dbc04ae-0661-474e-aee7-ea2a45bc3253-logs\") pod \"nova-api-0\" (UID: \"1dbc04ae-0661-474e-aee7-ea2a45bc3253\") " pod="openstack/nova-api-0" Mar 08 00:55:27.771289 master-0 kubenswrapper[23041]: I0308 00:55:27.769805 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff67x\" (UniqueName: \"kubernetes.io/projected/1dbc04ae-0661-474e-aee7-ea2a45bc3253-kube-api-access-ff67x\") pod \"nova-api-0\" (UID: \"1dbc04ae-0661-474e-aee7-ea2a45bc3253\") " pod="openstack/nova-api-0" Mar 08 00:55:27.771289 master-0 kubenswrapper[23041]: I0308 00:55:27.769872 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dbc04ae-0661-474e-aee7-ea2a45bc3253-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1dbc04ae-0661-474e-aee7-ea2a45bc3253\") " pod="openstack/nova-api-0" Mar 08 00:55:27.771289 master-0 kubenswrapper[23041]: I0308 00:55:27.769900 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dbc04ae-0661-474e-aee7-ea2a45bc3253-config-data\") pod \"nova-api-0\" (UID: \"1dbc04ae-0661-474e-aee7-ea2a45bc3253\") " pod="openstack/nova-api-0" Mar 08 00:55:27.771289 master-0 kubenswrapper[23041]: I0308 00:55:27.769957 23041 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efeb5e5d-c3a7-4b40-8903-a36ee311f97c-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 00:55:27.771289 master-0 kubenswrapper[23041]: I0308 00:55:27.770035 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/1dbc04ae-0661-474e-aee7-ea2a45bc3253-logs\") pod \"nova-api-0\" (UID: \"1dbc04ae-0661-474e-aee7-ea2a45bc3253\") " pod="openstack/nova-api-0" Mar 08 00:55:27.780281 master-0 kubenswrapper[23041]: I0308 00:55:27.776996 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dbc04ae-0661-474e-aee7-ea2a45bc3253-config-data\") pod \"nova-api-0\" (UID: \"1dbc04ae-0661-474e-aee7-ea2a45bc3253\") " pod="openstack/nova-api-0" Mar 08 00:55:27.786587 master-0 kubenswrapper[23041]: I0308 00:55:27.782582 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dbc04ae-0661-474e-aee7-ea2a45bc3253-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1dbc04ae-0661-474e-aee7-ea2a45bc3253\") " pod="openstack/nova-api-0" Mar 08 00:55:27.794109 master-0 kubenswrapper[23041]: I0308 00:55:27.794053 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff67x\" (UniqueName: \"kubernetes.io/projected/1dbc04ae-0661-474e-aee7-ea2a45bc3253-kube-api-access-ff67x\") pod \"nova-api-0\" (UID: \"1dbc04ae-0661-474e-aee7-ea2a45bc3253\") " pod="openstack/nova-api-0" Mar 08 00:55:27.795631 master-0 kubenswrapper[23041]: I0308 00:55:27.795594 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 00:55:27.815382 master-0 kubenswrapper[23041]: I0308 00:55:27.815309 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 00:55:27.830564 master-0 kubenswrapper[23041]: I0308 00:55:27.830492 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 00:55:27.832402 master-0 kubenswrapper[23041]: I0308 00:55:27.832362 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 00:55:27.835048 master-0 kubenswrapper[23041]: I0308 00:55:27.835013 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 08 00:55:27.850981 master-0 kubenswrapper[23041]: I0308 00:55:27.850916 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 00:55:27.977822 master-0 kubenswrapper[23041]: I0308 00:55:27.977735 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46a4d080-0f65-4840-a0b7-29508c37d813-config-data\") pod \"nova-scheduler-0\" (UID: \"46a4d080-0f65-4840-a0b7-29508c37d813\") " pod="openstack/nova-scheduler-0" Mar 08 00:55:27.978168 master-0 kubenswrapper[23041]: I0308 00:55:27.977853 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46a4d080-0f65-4840-a0b7-29508c37d813-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"46a4d080-0f65-4840-a0b7-29508c37d813\") " pod="openstack/nova-scheduler-0" Mar 08 00:55:27.978168 master-0 kubenswrapper[23041]: I0308 00:55:27.977982 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx5x6\" (UniqueName: \"kubernetes.io/projected/46a4d080-0f65-4840-a0b7-29508c37d813-kube-api-access-rx5x6\") pod \"nova-scheduler-0\" (UID: \"46a4d080-0f65-4840-a0b7-29508c37d813\") " pod="openstack/nova-scheduler-0" Mar 08 00:55:27.986103 master-0 kubenswrapper[23041]: I0308 00:55:27.986052 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 00:55:28.080327 master-0 kubenswrapper[23041]: I0308 00:55:28.079922 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46a4d080-0f65-4840-a0b7-29508c37d813-config-data\") pod \"nova-scheduler-0\" (UID: \"46a4d080-0f65-4840-a0b7-29508c37d813\") " pod="openstack/nova-scheduler-0" Mar 08 00:55:28.080740 master-0 kubenswrapper[23041]: I0308 00:55:28.080691 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46a4d080-0f65-4840-a0b7-29508c37d813-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"46a4d080-0f65-4840-a0b7-29508c37d813\") " pod="openstack/nova-scheduler-0" Mar 08 00:55:28.080987 master-0 kubenswrapper[23041]: I0308 00:55:28.080965 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx5x6\" (UniqueName: \"kubernetes.io/projected/46a4d080-0f65-4840-a0b7-29508c37d813-kube-api-access-rx5x6\") pod \"nova-scheduler-0\" (UID: \"46a4d080-0f65-4840-a0b7-29508c37d813\") " pod="openstack/nova-scheduler-0" Mar 08 00:55:28.083846 master-0 kubenswrapper[23041]: I0308 00:55:28.083779 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ironic-conductor-0" Mar 08 00:55:28.084751 master-0 kubenswrapper[23041]: I0308 00:55:28.084713 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46a4d080-0f65-4840-a0b7-29508c37d813-config-data\") pod \"nova-scheduler-0\" (UID: \"46a4d080-0f65-4840-a0b7-29508c37d813\") " pod="openstack/nova-scheduler-0" Mar 08 00:55:28.088939 master-0 kubenswrapper[23041]: I0308 00:55:28.088835 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/46a4d080-0f65-4840-a0b7-29508c37d813-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"46a4d080-0f65-4840-a0b7-29508c37d813\") " pod="openstack/nova-scheduler-0" Mar 08 00:55:28.092277 master-0 kubenswrapper[23041]: I0308 00:55:28.092228 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-conductor-0" Mar 08 00:55:28.104910 master-0 kubenswrapper[23041]: I0308 00:55:28.104390 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx5x6\" (UniqueName: \"kubernetes.io/projected/46a4d080-0f65-4840-a0b7-29508c37d813-kube-api-access-rx5x6\") pod \"nova-scheduler-0\" (UID: \"46a4d080-0f65-4840-a0b7-29508c37d813\") " pod="openstack/nova-scheduler-0" Mar 08 00:55:28.178971 master-0 kubenswrapper[23041]: I0308 00:55:28.178868 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 00:55:28.181676 master-0 kubenswrapper[23041]: I0308 00:55:28.181603 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-conductor-0" Mar 08 00:55:28.569152 master-0 kubenswrapper[23041]: I0308 00:55:28.568999 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 00:55:28.704731 master-0 kubenswrapper[23041]: I0308 00:55:28.704622 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 00:55:28.706600 master-0 kubenswrapper[23041]: W0308 00:55:28.706531 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46a4d080_0f65_4840_a0b7_29508c37d813.slice/crio-03d789b5121414cef0bad2988108d64de7af02b6467829a72944fdb38019f56d WatchSource:0}: Error finding container 03d789b5121414cef0bad2988108d64de7af02b6467829a72944fdb38019f56d: Status 404 returned error can't find the container with id 
03d789b5121414cef0bad2988108d64de7af02b6467829a72944fdb38019f56d Mar 08 00:55:28.835189 master-0 kubenswrapper[23041]: I0308 00:55:28.835047 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="988af452-10f3-4d0b-806e-7f91af9135a0" path="/var/lib/kubelet/pods/988af452-10f3-4d0b-806e-7f91af9135a0/volumes" Mar 08 00:55:28.836054 master-0 kubenswrapper[23041]: I0308 00:55:28.836018 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efeb5e5d-c3a7-4b40-8903-a36ee311f97c" path="/var/lib/kubelet/pods/efeb5e5d-c3a7-4b40-8903-a36ee311f97c/volumes" Mar 08 00:55:29.406696 master-0 kubenswrapper[23041]: I0308 00:55:29.406641 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 00:55:29.499269 master-0 kubenswrapper[23041]: I0308 00:55:29.495402 23041 generic.go:334] "Generic (PLEG): container finished" podID="366c5ae2-34a4-413e-8f57-bcbdb57e73d5" containerID="878bb198b87b281ea42f32e5b032aa411c976be0ea3b591a7cb75c788b33efd3" exitCode=0 Mar 08 00:55:29.499269 master-0 kubenswrapper[23041]: I0308 00:55:29.495466 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 00:55:29.499269 master-0 kubenswrapper[23041]: I0308 00:55:29.495496 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"366c5ae2-34a4-413e-8f57-bcbdb57e73d5","Type":"ContainerDied","Data":"878bb198b87b281ea42f32e5b032aa411c976be0ea3b591a7cb75c788b33efd3"} Mar 08 00:55:29.499269 master-0 kubenswrapper[23041]: I0308 00:55:29.495632 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"366c5ae2-34a4-413e-8f57-bcbdb57e73d5","Type":"ContainerDied","Data":"08e4b7c6d203bec79ab3c2f47d0632f6d09feb76e5c2e54976899d80232b3642"} Mar 08 00:55:29.499269 master-0 kubenswrapper[23041]: I0308 00:55:29.495908 23041 scope.go:117] "RemoveContainer" containerID="878bb198b87b281ea42f32e5b032aa411c976be0ea3b591a7cb75c788b33efd3" Mar 08 00:55:29.499269 master-0 kubenswrapper[23041]: I0308 00:55:29.498732 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"46a4d080-0f65-4840-a0b7-29508c37d813","Type":"ContainerStarted","Data":"aadc56138a65f304d33bb1c6d49c2000669787c4fd15b87f7fad77ffff482dd4"} Mar 08 00:55:29.499269 master-0 kubenswrapper[23041]: I0308 00:55:29.498847 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"46a4d080-0f65-4840-a0b7-29508c37d813","Type":"ContainerStarted","Data":"03d789b5121414cef0bad2988108d64de7af02b6467829a72944fdb38019f56d"} Mar 08 00:55:29.504380 master-0 kubenswrapper[23041]: I0308 00:55:29.504330 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1dbc04ae-0661-474e-aee7-ea2a45bc3253","Type":"ContainerStarted","Data":"9b5ed217d63f9be94ef001e292ca3f141156624cb2cd4b4eb1f95cf8f7ae0f18"} Mar 08 00:55:29.504443 master-0 kubenswrapper[23041]: I0308 00:55:29.504387 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"1dbc04ae-0661-474e-aee7-ea2a45bc3253","Type":"ContainerStarted","Data":"f33507aed2f26fbcb906e27068cfd1520876b2131bf2a505814e76008e264d72"} Mar 08 00:55:29.504443 master-0 kubenswrapper[23041]: I0308 00:55:29.504403 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1dbc04ae-0661-474e-aee7-ea2a45bc3253","Type":"ContainerStarted","Data":"e315fa08a41e4a0510f9671e764d6a7d5a941d3a60076668adbe5085bb007d48"} Mar 08 00:55:29.521448 master-0 kubenswrapper[23041]: I0308 00:55:29.521398 23041 scope.go:117] "RemoveContainer" containerID="67bfa88d690d1169e7f7403b4c87ee6b5d5fafab73e6d9f8b37c42f21859ec1c" Mar 08 00:55:29.535691 master-0 kubenswrapper[23041]: I0308 00:55:29.535606 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.53558628 podStartE2EDuration="2.53558628s" podCreationTimestamp="2026-03-08 00:55:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:55:29.530910496 +0000 UTC m=+1435.003747070" watchObservedRunningTime="2026-03-08 00:55:29.53558628 +0000 UTC m=+1435.008422834" Mar 08 00:55:29.556224 master-0 kubenswrapper[23041]: I0308 00:55:29.556128 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/366c5ae2-34a4-413e-8f57-bcbdb57e73d5-combined-ca-bundle\") pod \"366c5ae2-34a4-413e-8f57-bcbdb57e73d5\" (UID: \"366c5ae2-34a4-413e-8f57-bcbdb57e73d5\") " Mar 08 00:55:29.556388 master-0 kubenswrapper[23041]: I0308 00:55:29.556283 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/366c5ae2-34a4-413e-8f57-bcbdb57e73d5-config-data\") pod \"366c5ae2-34a4-413e-8f57-bcbdb57e73d5\" (UID: \"366c5ae2-34a4-413e-8f57-bcbdb57e73d5\") " Mar 08 00:55:29.556388 master-0 
kubenswrapper[23041]: I0308 00:55:29.556380 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/366c5ae2-34a4-413e-8f57-bcbdb57e73d5-logs\") pod \"366c5ae2-34a4-413e-8f57-bcbdb57e73d5\" (UID: \"366c5ae2-34a4-413e-8f57-bcbdb57e73d5\") " Mar 08 00:55:29.556492 master-0 kubenswrapper[23041]: I0308 00:55:29.556469 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4pcs\" (UniqueName: \"kubernetes.io/projected/366c5ae2-34a4-413e-8f57-bcbdb57e73d5-kube-api-access-z4pcs\") pod \"366c5ae2-34a4-413e-8f57-bcbdb57e73d5\" (UID: \"366c5ae2-34a4-413e-8f57-bcbdb57e73d5\") " Mar 08 00:55:29.556492 master-0 kubenswrapper[23041]: I0308 00:55:29.556491 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/366c5ae2-34a4-413e-8f57-bcbdb57e73d5-nova-metadata-tls-certs\") pod \"366c5ae2-34a4-413e-8f57-bcbdb57e73d5\" (UID: \"366c5ae2-34a4-413e-8f57-bcbdb57e73d5\") " Mar 08 00:55:29.564233 master-0 kubenswrapper[23041]: I0308 00:55:29.564135 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/366c5ae2-34a4-413e-8f57-bcbdb57e73d5-logs" (OuterVolumeSpecName: "logs") pod "366c5ae2-34a4-413e-8f57-bcbdb57e73d5" (UID: "366c5ae2-34a4-413e-8f57-bcbdb57e73d5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:55:29.578804 master-0 kubenswrapper[23041]: I0308 00:55:29.578568 23041 scope.go:117] "RemoveContainer" containerID="878bb198b87b281ea42f32e5b032aa411c976be0ea3b591a7cb75c788b33efd3" Mar 08 00:55:29.579491 master-0 kubenswrapper[23041]: E0308 00:55:29.579294 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"878bb198b87b281ea42f32e5b032aa411c976be0ea3b591a7cb75c788b33efd3\": container with ID starting with 878bb198b87b281ea42f32e5b032aa411c976be0ea3b591a7cb75c788b33efd3 not found: ID does not exist" containerID="878bb198b87b281ea42f32e5b032aa411c976be0ea3b591a7cb75c788b33efd3" Mar 08 00:55:29.579491 master-0 kubenswrapper[23041]: I0308 00:55:29.579350 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"878bb198b87b281ea42f32e5b032aa411c976be0ea3b591a7cb75c788b33efd3"} err="failed to get container status \"878bb198b87b281ea42f32e5b032aa411c976be0ea3b591a7cb75c788b33efd3\": rpc error: code = NotFound desc = could not find container \"878bb198b87b281ea42f32e5b032aa411c976be0ea3b591a7cb75c788b33efd3\": container with ID starting with 878bb198b87b281ea42f32e5b032aa411c976be0ea3b591a7cb75c788b33efd3 not found: ID does not exist" Mar 08 00:55:29.579491 master-0 kubenswrapper[23041]: I0308 00:55:29.579380 23041 scope.go:117] "RemoveContainer" containerID="67bfa88d690d1169e7f7403b4c87ee6b5d5fafab73e6d9f8b37c42f21859ec1c" Mar 08 00:55:29.579757 master-0 kubenswrapper[23041]: E0308 00:55:29.579695 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67bfa88d690d1169e7f7403b4c87ee6b5d5fafab73e6d9f8b37c42f21859ec1c\": container with ID starting with 67bfa88d690d1169e7f7403b4c87ee6b5d5fafab73e6d9f8b37c42f21859ec1c not found: ID does not exist" 
containerID="67bfa88d690d1169e7f7403b4c87ee6b5d5fafab73e6d9f8b37c42f21859ec1c" Mar 08 00:55:29.579757 master-0 kubenswrapper[23041]: I0308 00:55:29.579742 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67bfa88d690d1169e7f7403b4c87ee6b5d5fafab73e6d9f8b37c42f21859ec1c"} err="failed to get container status \"67bfa88d690d1169e7f7403b4c87ee6b5d5fafab73e6d9f8b37c42f21859ec1c\": rpc error: code = NotFound desc = could not find container \"67bfa88d690d1169e7f7403b4c87ee6b5d5fafab73e6d9f8b37c42f21859ec1c\": container with ID starting with 67bfa88d690d1169e7f7403b4c87ee6b5d5fafab73e6d9f8b37c42f21859ec1c not found: ID does not exist" Mar 08 00:55:29.580228 master-0 kubenswrapper[23041]: I0308 00:55:29.580135 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/366c5ae2-34a4-413e-8f57-bcbdb57e73d5-kube-api-access-z4pcs" (OuterVolumeSpecName: "kube-api-access-z4pcs") pod "366c5ae2-34a4-413e-8f57-bcbdb57e73d5" (UID: "366c5ae2-34a4-413e-8f57-bcbdb57e73d5"). InnerVolumeSpecName "kube-api-access-z4pcs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:55:29.597039 master-0 kubenswrapper[23041]: I0308 00:55:29.596965 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.59694446 podStartE2EDuration="2.59694446s" podCreationTimestamp="2026-03-08 00:55:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:55:29.565898271 +0000 UTC m=+1435.038734825" watchObservedRunningTime="2026-03-08 00:55:29.59694446 +0000 UTC m=+1435.069781014" Mar 08 00:55:29.598859 master-0 kubenswrapper[23041]: I0308 00:55:29.598812 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/366c5ae2-34a4-413e-8f57-bcbdb57e73d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "366c5ae2-34a4-413e-8f57-bcbdb57e73d5" (UID: "366c5ae2-34a4-413e-8f57-bcbdb57e73d5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:55:29.604252 master-0 kubenswrapper[23041]: I0308 00:55:29.604053 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/366c5ae2-34a4-413e-8f57-bcbdb57e73d5-config-data" (OuterVolumeSpecName: "config-data") pod "366c5ae2-34a4-413e-8f57-bcbdb57e73d5" (UID: "366c5ae2-34a4-413e-8f57-bcbdb57e73d5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:55:29.628323 master-0 kubenswrapper[23041]: I0308 00:55:29.627522 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/366c5ae2-34a4-413e-8f57-bcbdb57e73d5-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "366c5ae2-34a4-413e-8f57-bcbdb57e73d5" (UID: "366c5ae2-34a4-413e-8f57-bcbdb57e73d5"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:55:29.662003 master-0 kubenswrapper[23041]: I0308 00:55:29.661959 23041 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/366c5ae2-34a4-413e-8f57-bcbdb57e73d5-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 00:55:29.662003 master-0 kubenswrapper[23041]: I0308 00:55:29.662001 23041 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/366c5ae2-34a4-413e-8f57-bcbdb57e73d5-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 00:55:29.662127 master-0 kubenswrapper[23041]: I0308 00:55:29.662011 23041 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/366c5ae2-34a4-413e-8f57-bcbdb57e73d5-logs\") on node \"master-0\" DevicePath \"\"" Mar 08 00:55:29.662127 master-0 kubenswrapper[23041]: I0308 00:55:29.662022 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4pcs\" (UniqueName: \"kubernetes.io/projected/366c5ae2-34a4-413e-8f57-bcbdb57e73d5-kube-api-access-z4pcs\") on node \"master-0\" DevicePath \"\"" Mar 08 00:55:29.662127 master-0 kubenswrapper[23041]: I0308 00:55:29.662035 23041 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/366c5ae2-34a4-413e-8f57-bcbdb57e73d5-nova-metadata-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 08 00:55:29.855317 master-0 kubenswrapper[23041]: I0308 00:55:29.853473 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 00:55:29.898297 master-0 kubenswrapper[23041]: I0308 00:55:29.895755 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 00:55:29.968007 master-0 kubenswrapper[23041]: I0308 00:55:29.967807 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 08 
00:55:29.969276 master-0 kubenswrapper[23041]: E0308 00:55:29.969234 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="366c5ae2-34a4-413e-8f57-bcbdb57e73d5" containerName="nova-metadata-log" Mar 08 00:55:29.969276 master-0 kubenswrapper[23041]: I0308 00:55:29.969258 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="366c5ae2-34a4-413e-8f57-bcbdb57e73d5" containerName="nova-metadata-log" Mar 08 00:55:29.969415 master-0 kubenswrapper[23041]: E0308 00:55:29.969296 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="366c5ae2-34a4-413e-8f57-bcbdb57e73d5" containerName="nova-metadata-metadata" Mar 08 00:55:29.969415 master-0 kubenswrapper[23041]: I0308 00:55:29.969304 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="366c5ae2-34a4-413e-8f57-bcbdb57e73d5" containerName="nova-metadata-metadata" Mar 08 00:55:29.969618 master-0 kubenswrapper[23041]: I0308 00:55:29.969565 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="366c5ae2-34a4-413e-8f57-bcbdb57e73d5" containerName="nova-metadata-log" Mar 08 00:55:29.969618 master-0 kubenswrapper[23041]: I0308 00:55:29.969593 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="366c5ae2-34a4-413e-8f57-bcbdb57e73d5" containerName="nova-metadata-metadata" Mar 08 00:55:29.971163 master-0 kubenswrapper[23041]: I0308 00:55:29.971136 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 00:55:29.973759 master-0 kubenswrapper[23041]: I0308 00:55:29.973711 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 08 00:55:29.973980 master-0 kubenswrapper[23041]: I0308 00:55:29.973946 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 08 00:55:30.000237 master-0 kubenswrapper[23041]: I0308 00:55:30.000115 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 00:55:30.095388 master-0 kubenswrapper[23041]: I0308 00:55:30.095250 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41c9e34a-e1f9-4300-ae6c-9b94329b4129-config-data\") pod \"nova-metadata-0\" (UID: \"41c9e34a-e1f9-4300-ae6c-9b94329b4129\") " pod="openstack/nova-metadata-0" Mar 08 00:55:30.095569 master-0 kubenswrapper[23041]: I0308 00:55:30.095411 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/41c9e34a-e1f9-4300-ae6c-9b94329b4129-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"41c9e34a-e1f9-4300-ae6c-9b94329b4129\") " pod="openstack/nova-metadata-0" Mar 08 00:55:30.095569 master-0 kubenswrapper[23041]: I0308 00:55:30.095467 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41c9e34a-e1f9-4300-ae6c-9b94329b4129-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"41c9e34a-e1f9-4300-ae6c-9b94329b4129\") " pod="openstack/nova-metadata-0" Mar 08 00:55:30.095938 master-0 kubenswrapper[23041]: I0308 00:55:30.095825 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/41c9e34a-e1f9-4300-ae6c-9b94329b4129-logs\") pod \"nova-metadata-0\" (UID: \"41c9e34a-e1f9-4300-ae6c-9b94329b4129\") " pod="openstack/nova-metadata-0" Mar 08 00:55:30.096046 master-0 kubenswrapper[23041]: I0308 00:55:30.096007 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wsjd\" (UniqueName: \"kubernetes.io/projected/41c9e34a-e1f9-4300-ae6c-9b94329b4129-kube-api-access-6wsjd\") pod \"nova-metadata-0\" (UID: \"41c9e34a-e1f9-4300-ae6c-9b94329b4129\") " pod="openstack/nova-metadata-0" Mar 08 00:55:30.198892 master-0 kubenswrapper[23041]: I0308 00:55:30.198819 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41c9e34a-e1f9-4300-ae6c-9b94329b4129-config-data\") pod \"nova-metadata-0\" (UID: \"41c9e34a-e1f9-4300-ae6c-9b94329b4129\") " pod="openstack/nova-metadata-0" Mar 08 00:55:30.199118 master-0 kubenswrapper[23041]: I0308 00:55:30.198901 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/41c9e34a-e1f9-4300-ae6c-9b94329b4129-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"41c9e34a-e1f9-4300-ae6c-9b94329b4129\") " pod="openstack/nova-metadata-0" Mar 08 00:55:30.199118 master-0 kubenswrapper[23041]: I0308 00:55:30.198955 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41c9e34a-e1f9-4300-ae6c-9b94329b4129-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"41c9e34a-e1f9-4300-ae6c-9b94329b4129\") " pod="openstack/nova-metadata-0" Mar 08 00:55:30.199280 master-0 kubenswrapper[23041]: I0308 00:55:30.199223 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41c9e34a-e1f9-4300-ae6c-9b94329b4129-logs\") pod 
\"nova-metadata-0\" (UID: \"41c9e34a-e1f9-4300-ae6c-9b94329b4129\") " pod="openstack/nova-metadata-0" Mar 08 00:55:30.199348 master-0 kubenswrapper[23041]: I0308 00:55:30.199312 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wsjd\" (UniqueName: \"kubernetes.io/projected/41c9e34a-e1f9-4300-ae6c-9b94329b4129-kube-api-access-6wsjd\") pod \"nova-metadata-0\" (UID: \"41c9e34a-e1f9-4300-ae6c-9b94329b4129\") " pod="openstack/nova-metadata-0" Mar 08 00:55:30.203772 master-0 kubenswrapper[23041]: I0308 00:55:30.203734 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41c9e34a-e1f9-4300-ae6c-9b94329b4129-config-data\") pod \"nova-metadata-0\" (UID: \"41c9e34a-e1f9-4300-ae6c-9b94329b4129\") " pod="openstack/nova-metadata-0" Mar 08 00:55:30.204115 master-0 kubenswrapper[23041]: I0308 00:55:30.204062 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/41c9e34a-e1f9-4300-ae6c-9b94329b4129-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"41c9e34a-e1f9-4300-ae6c-9b94329b4129\") " pod="openstack/nova-metadata-0" Mar 08 00:55:30.207773 master-0 kubenswrapper[23041]: I0308 00:55:30.207709 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41c9e34a-e1f9-4300-ae6c-9b94329b4129-logs\") pod \"nova-metadata-0\" (UID: \"41c9e34a-e1f9-4300-ae6c-9b94329b4129\") " pod="openstack/nova-metadata-0" Mar 08 00:55:30.212161 master-0 kubenswrapper[23041]: I0308 00:55:30.212127 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41c9e34a-e1f9-4300-ae6c-9b94329b4129-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"41c9e34a-e1f9-4300-ae6c-9b94329b4129\") " pod="openstack/nova-metadata-0" Mar 08 00:55:30.223059 master-0 
kubenswrapper[23041]: I0308 00:55:30.221883 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wsjd\" (UniqueName: \"kubernetes.io/projected/41c9e34a-e1f9-4300-ae6c-9b94329b4129-kube-api-access-6wsjd\") pod \"nova-metadata-0\" (UID: \"41c9e34a-e1f9-4300-ae6c-9b94329b4129\") " pod="openstack/nova-metadata-0" Mar 08 00:55:30.291115 master-0 kubenswrapper[23041]: I0308 00:55:30.291046 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 00:55:30.779965 master-0 kubenswrapper[23041]: I0308 00:55:30.779920 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 00:55:30.826730 master-0 kubenswrapper[23041]: I0308 00:55:30.826674 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="366c5ae2-34a4-413e-8f57-bcbdb57e73d5" path="/var/lib/kubelet/pods/366c5ae2-34a4-413e-8f57-bcbdb57e73d5/volumes" Mar 08 00:55:31.540893 master-0 kubenswrapper[23041]: I0308 00:55:31.540794 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"41c9e34a-e1f9-4300-ae6c-9b94329b4129","Type":"ContainerStarted","Data":"6d1fe690bafc8f068ccbf54f8c61e41f9ff6888d3cfb0b9b19d9487a1bdae0c7"} Mar 08 00:55:31.540893 master-0 kubenswrapper[23041]: I0308 00:55:31.540891 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"41c9e34a-e1f9-4300-ae6c-9b94329b4129","Type":"ContainerStarted","Data":"0b657ec29bb295e0f4d5fc13f9989ac98f93de32096e44da01019e4c1db0d7c7"} Mar 08 00:55:31.541181 master-0 kubenswrapper[23041]: I0308 00:55:31.540904 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"41c9e34a-e1f9-4300-ae6c-9b94329b4129","Type":"ContainerStarted","Data":"e928e3c6e72a2fda28e1883988067ebcf08d0731583608bb2022e60ead201f8b"} Mar 08 00:55:31.954924 master-0 kubenswrapper[23041]: I0308 00:55:31.954714 23041 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.954683718 podStartE2EDuration="2.954683718s" podCreationTimestamp="2026-03-08 00:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:55:31.950512296 +0000 UTC m=+1437.423348860" watchObservedRunningTime="2026-03-08 00:55:31.954683718 +0000 UTC m=+1437.427520272" Mar 08 00:55:33.180049 master-0 kubenswrapper[23041]: I0308 00:55:33.179973 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 08 00:55:33.729272 master-0 kubenswrapper[23041]: I0308 00:55:33.729155 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 08 00:55:35.292359 master-0 kubenswrapper[23041]: I0308 00:55:35.292274 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 00:55:35.292359 master-0 kubenswrapper[23041]: I0308 00:55:35.292351 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 00:55:37.989367 master-0 kubenswrapper[23041]: I0308 00:55:37.987017 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 00:55:37.989367 master-0 kubenswrapper[23041]: I0308 00:55:37.987114 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 00:55:38.180368 master-0 kubenswrapper[23041]: I0308 00:55:38.180298 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 08 00:55:38.235084 master-0 kubenswrapper[23041]: I0308 00:55:38.235017 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 08 00:55:38.720234 master-0 kubenswrapper[23041]: 
I0308 00:55:38.718050 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 08 00:55:39.069522 master-0 kubenswrapper[23041]: I0308 00:55:39.069424 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1dbc04ae-0661-474e-aee7-ea2a45bc3253" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.128.1.11:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 00:55:39.070321 master-0 kubenswrapper[23041]: I0308 00:55:39.069439 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1dbc04ae-0661-474e-aee7-ea2a45bc3253" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.128.1.11:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 00:55:40.293415 master-0 kubenswrapper[23041]: I0308 00:55:40.293347 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 08 00:55:40.295516 master-0 kubenswrapper[23041]: I0308 00:55:40.295448 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 08 00:55:41.313533 master-0 kubenswrapper[23041]: I0308 00:55:41.313404 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="41c9e34a-e1f9-4300-ae6c-9b94329b4129" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.128.1.13:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 00:55:41.314186 master-0 kubenswrapper[23041]: I0308 00:55:41.313385 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="41c9e34a-e1f9-4300-ae6c-9b94329b4129" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.128.1.13:8775/\": net/http: request canceled (Client.Timeout exceeded 
while awaiting headers)" Mar 08 00:55:41.428293 master-0 kubenswrapper[23041]: I0308 00:55:41.428222 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 00:55:41.619705 master-0 kubenswrapper[23041]: I0308 00:55:41.619533 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a58922b-c6fd-4668-8b74-674a1cb13323-config-data\") pod \"7a58922b-c6fd-4668-8b74-674a1cb13323\" (UID: \"7a58922b-c6fd-4668-8b74-674a1cb13323\") " Mar 08 00:55:41.619975 master-0 kubenswrapper[23041]: I0308 00:55:41.619777 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29nps\" (UniqueName: \"kubernetes.io/projected/7a58922b-c6fd-4668-8b74-674a1cb13323-kube-api-access-29nps\") pod \"7a58922b-c6fd-4668-8b74-674a1cb13323\" (UID: \"7a58922b-c6fd-4668-8b74-674a1cb13323\") " Mar 08 00:55:41.620433 master-0 kubenswrapper[23041]: I0308 00:55:41.620267 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a58922b-c6fd-4668-8b74-674a1cb13323-combined-ca-bundle\") pod \"7a58922b-c6fd-4668-8b74-674a1cb13323\" (UID: \"7a58922b-c6fd-4668-8b74-674a1cb13323\") " Mar 08 00:55:41.623685 master-0 kubenswrapper[23041]: I0308 00:55:41.623628 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a58922b-c6fd-4668-8b74-674a1cb13323-kube-api-access-29nps" (OuterVolumeSpecName: "kube-api-access-29nps") pod "7a58922b-c6fd-4668-8b74-674a1cb13323" (UID: "7a58922b-c6fd-4668-8b74-674a1cb13323"). InnerVolumeSpecName "kube-api-access-29nps". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:55:41.652545 master-0 kubenswrapper[23041]: I0308 00:55:41.652380 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a58922b-c6fd-4668-8b74-674a1cb13323-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a58922b-c6fd-4668-8b74-674a1cb13323" (UID: "7a58922b-c6fd-4668-8b74-674a1cb13323"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:55:41.653244 master-0 kubenswrapper[23041]: I0308 00:55:41.653148 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a58922b-c6fd-4668-8b74-674a1cb13323-config-data" (OuterVolumeSpecName: "config-data") pod "7a58922b-c6fd-4668-8b74-674a1cb13323" (UID: "7a58922b-c6fd-4668-8b74-674a1cb13323"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:55:41.715341 master-0 kubenswrapper[23041]: I0308 00:55:41.715283 23041 generic.go:334] "Generic (PLEG): container finished" podID="7a58922b-c6fd-4668-8b74-674a1cb13323" containerID="02f870291a140fba4ae8125d44d2946884f3ed615efbd920b3f263a781a3e287" exitCode=137 Mar 08 00:55:41.715585 master-0 kubenswrapper[23041]: I0308 00:55:41.715365 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 00:55:41.715585 master-0 kubenswrapper[23041]: I0308 00:55:41.715337 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7a58922b-c6fd-4668-8b74-674a1cb13323","Type":"ContainerDied","Data":"02f870291a140fba4ae8125d44d2946884f3ed615efbd920b3f263a781a3e287"} Mar 08 00:55:41.715585 master-0 kubenswrapper[23041]: I0308 00:55:41.715514 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7a58922b-c6fd-4668-8b74-674a1cb13323","Type":"ContainerDied","Data":"83433891b067770ebd5fe5b882a6dc4717dad3359aabe20388914ed3611ca029"} Mar 08 00:55:41.715585 master-0 kubenswrapper[23041]: I0308 00:55:41.715532 23041 scope.go:117] "RemoveContainer" containerID="02f870291a140fba4ae8125d44d2946884f3ed615efbd920b3f263a781a3e287" Mar 08 00:55:41.723726 master-0 kubenswrapper[23041]: I0308 00:55:41.722509 23041 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a58922b-c6fd-4668-8b74-674a1cb13323-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 00:55:41.723726 master-0 kubenswrapper[23041]: I0308 00:55:41.722546 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29nps\" (UniqueName: \"kubernetes.io/projected/7a58922b-c6fd-4668-8b74-674a1cb13323-kube-api-access-29nps\") on node \"master-0\" DevicePath \"\"" Mar 08 00:55:41.723726 master-0 kubenswrapper[23041]: I0308 00:55:41.722557 23041 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a58922b-c6fd-4668-8b74-674a1cb13323-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 00:55:41.743506 master-0 kubenswrapper[23041]: I0308 00:55:41.743425 23041 scope.go:117] "RemoveContainer" containerID="02f870291a140fba4ae8125d44d2946884f3ed615efbd920b3f263a781a3e287" Mar 08 00:55:41.744707 master-0 
kubenswrapper[23041]: E0308 00:55:41.744627 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02f870291a140fba4ae8125d44d2946884f3ed615efbd920b3f263a781a3e287\": container with ID starting with 02f870291a140fba4ae8125d44d2946884f3ed615efbd920b3f263a781a3e287 not found: ID does not exist" containerID="02f870291a140fba4ae8125d44d2946884f3ed615efbd920b3f263a781a3e287" Mar 08 00:55:41.744707 master-0 kubenswrapper[23041]: I0308 00:55:41.744681 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02f870291a140fba4ae8125d44d2946884f3ed615efbd920b3f263a781a3e287"} err="failed to get container status \"02f870291a140fba4ae8125d44d2946884f3ed615efbd920b3f263a781a3e287\": rpc error: code = NotFound desc = could not find container \"02f870291a140fba4ae8125d44d2946884f3ed615efbd920b3f263a781a3e287\": container with ID starting with 02f870291a140fba4ae8125d44d2946884f3ed615efbd920b3f263a781a3e287 not found: ID does not exist" Mar 08 00:55:41.766691 master-0 kubenswrapper[23041]: I0308 00:55:41.763022 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 00:55:41.777585 master-0 kubenswrapper[23041]: I0308 00:55:41.777487 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 00:55:41.800239 master-0 kubenswrapper[23041]: I0308 00:55:41.799687 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 00:55:41.800239 master-0 kubenswrapper[23041]: E0308 00:55:41.800245 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a58922b-c6fd-4668-8b74-674a1cb13323" containerName="nova-cell1-novncproxy-novncproxy" Mar 08 00:55:41.800239 master-0 kubenswrapper[23041]: I0308 00:55:41.800259 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a58922b-c6fd-4668-8b74-674a1cb13323" 
containerName="nova-cell1-novncproxy-novncproxy" Mar 08 00:55:41.800733 master-0 kubenswrapper[23041]: I0308 00:55:41.800480 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a58922b-c6fd-4668-8b74-674a1cb13323" containerName="nova-cell1-novncproxy-novncproxy" Mar 08 00:55:41.802560 master-0 kubenswrapper[23041]: I0308 00:55:41.801219 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 00:55:41.842075 master-0 kubenswrapper[23041]: I0308 00:55:41.818366 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 00:55:41.844944 master-0 kubenswrapper[23041]: I0308 00:55:41.844906 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 08 00:55:41.845290 master-0 kubenswrapper[23041]: I0308 00:55:41.845258 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 08 00:55:41.847076 master-0 kubenswrapper[23041]: I0308 00:55:41.847016 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 08 00:55:41.926599 master-0 kubenswrapper[23041]: I0308 00:55:41.926459 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c85f49c6-2a1f-4b42-81dd-77e82eaeb835-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c85f49c6-2a1f-4b42-81dd-77e82eaeb835\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 00:55:41.928977 master-0 kubenswrapper[23041]: I0308 00:55:41.927459 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c85f49c6-2a1f-4b42-81dd-77e82eaeb835-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"c85f49c6-2a1f-4b42-81dd-77e82eaeb835\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 00:55:41.928977 master-0 kubenswrapper[23041]: I0308 00:55:41.927516 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc95v\" (UniqueName: \"kubernetes.io/projected/c85f49c6-2a1f-4b42-81dd-77e82eaeb835-kube-api-access-bc95v\") pod \"nova-cell1-novncproxy-0\" (UID: \"c85f49c6-2a1f-4b42-81dd-77e82eaeb835\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 00:55:41.928977 master-0 kubenswrapper[23041]: I0308 00:55:41.927846 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c85f49c6-2a1f-4b42-81dd-77e82eaeb835-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c85f49c6-2a1f-4b42-81dd-77e82eaeb835\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 00:55:41.928977 master-0 kubenswrapper[23041]: I0308 00:55:41.928079 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c85f49c6-2a1f-4b42-81dd-77e82eaeb835-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c85f49c6-2a1f-4b42-81dd-77e82eaeb835\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 00:55:42.030614 master-0 kubenswrapper[23041]: I0308 00:55:42.030092 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c85f49c6-2a1f-4b42-81dd-77e82eaeb835-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c85f49c6-2a1f-4b42-81dd-77e82eaeb835\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 00:55:42.030614 master-0 kubenswrapper[23041]: I0308 00:55:42.030279 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c85f49c6-2a1f-4b42-81dd-77e82eaeb835-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c85f49c6-2a1f-4b42-81dd-77e82eaeb835\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 00:55:42.030614 master-0 kubenswrapper[23041]: I0308 00:55:42.030332 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c85f49c6-2a1f-4b42-81dd-77e82eaeb835-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c85f49c6-2a1f-4b42-81dd-77e82eaeb835\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 00:55:42.030614 master-0 kubenswrapper[23041]: I0308 00:55:42.030352 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc95v\" (UniqueName: \"kubernetes.io/projected/c85f49c6-2a1f-4b42-81dd-77e82eaeb835-kube-api-access-bc95v\") pod \"nova-cell1-novncproxy-0\" (UID: \"c85f49c6-2a1f-4b42-81dd-77e82eaeb835\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 00:55:42.030614 master-0 kubenswrapper[23041]: I0308 00:55:42.030400 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c85f49c6-2a1f-4b42-81dd-77e82eaeb835-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c85f49c6-2a1f-4b42-81dd-77e82eaeb835\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 00:55:42.032981 master-0 kubenswrapper[23041]: I0308 00:55:42.032938 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c85f49c6-2a1f-4b42-81dd-77e82eaeb835-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c85f49c6-2a1f-4b42-81dd-77e82eaeb835\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 00:55:42.033677 master-0 kubenswrapper[23041]: I0308 00:55:42.033629 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c85f49c6-2a1f-4b42-81dd-77e82eaeb835-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c85f49c6-2a1f-4b42-81dd-77e82eaeb835\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 00:55:42.051119 master-0 kubenswrapper[23041]: I0308 00:55:42.050900 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c85f49c6-2a1f-4b42-81dd-77e82eaeb835-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c85f49c6-2a1f-4b42-81dd-77e82eaeb835\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 00:55:42.051575 master-0 kubenswrapper[23041]: I0308 00:55:42.051516 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c85f49c6-2a1f-4b42-81dd-77e82eaeb835-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c85f49c6-2a1f-4b42-81dd-77e82eaeb835\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 00:55:42.053250 master-0 kubenswrapper[23041]: I0308 00:55:42.053183 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc95v\" (UniqueName: \"kubernetes.io/projected/c85f49c6-2a1f-4b42-81dd-77e82eaeb835-kube-api-access-bc95v\") pod \"nova-cell1-novncproxy-0\" (UID: \"c85f49c6-2a1f-4b42-81dd-77e82eaeb835\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 00:55:42.156545 master-0 kubenswrapper[23041]: I0308 00:55:42.156485 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 00:55:42.743492 master-0 kubenswrapper[23041]: I0308 00:55:42.743449 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 00:55:42.826133 master-0 kubenswrapper[23041]: I0308 00:55:42.826055 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a58922b-c6fd-4668-8b74-674a1cb13323" path="/var/lib/kubelet/pods/7a58922b-c6fd-4668-8b74-674a1cb13323/volumes" Mar 08 00:55:43.752115 master-0 kubenswrapper[23041]: I0308 00:55:43.752048 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c85f49c6-2a1f-4b42-81dd-77e82eaeb835","Type":"ContainerStarted","Data":"8fec6fdd0a701dcf053bcfbe56ae12a05a25d2ba896a9734b87f0b73ae7d6a44"} Mar 08 00:55:43.752736 master-0 kubenswrapper[23041]: I0308 00:55:43.752145 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c85f49c6-2a1f-4b42-81dd-77e82eaeb835","Type":"ContainerStarted","Data":"eebf9ae249afe5f2bac19f48238ececbd6117d6e483a8e2b7b2c92059716eb2b"} Mar 08 00:55:43.781683 master-0 kubenswrapper[23041]: I0308 00:55:43.781601 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.781577204 podStartE2EDuration="2.781577204s" podCreationTimestamp="2026-03-08 00:55:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:55:43.77240107 +0000 UTC m=+1449.245237624" watchObservedRunningTime="2026-03-08 00:55:43.781577204 +0000 UTC m=+1449.254413758" Mar 08 00:55:47.157435 master-0 kubenswrapper[23041]: I0308 00:55:47.157339 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 08 00:55:47.991521 master-0 kubenswrapper[23041]: I0308 00:55:47.991462 23041 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 08 00:55:47.992915 master-0 kubenswrapper[23041]: I0308 00:55:47.992861 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 08 00:55:47.993147 master-0 kubenswrapper[23041]: I0308 00:55:47.993120 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 08 00:55:47.996029 master-0 kubenswrapper[23041]: I0308 00:55:47.995984 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 08 00:55:48.826093 master-0 kubenswrapper[23041]: I0308 00:55:48.826052 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 08 00:55:48.826958 master-0 kubenswrapper[23041]: I0308 00:55:48.826938 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 08 00:55:49.110336 master-0 kubenswrapper[23041]: I0308 00:55:49.110276 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-764cc67dbc-n94p5"] Mar 08 00:55:49.112586 master-0 kubenswrapper[23041]: I0308 00:55:49.112543 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764cc67dbc-n94p5" Mar 08 00:55:49.141507 master-0 kubenswrapper[23041]: I0308 00:55:49.141418 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764cc67dbc-n94p5"] Mar 08 00:55:49.261884 master-0 kubenswrapper[23041]: I0308 00:55:49.261812 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14ed2327-fe3a-4f78-b30f-3ccb279385e2-dns-svc\") pod \"dnsmasq-dns-764cc67dbc-n94p5\" (UID: \"14ed2327-fe3a-4f78-b30f-3ccb279385e2\") " pod="openstack/dnsmasq-dns-764cc67dbc-n94p5" Mar 08 00:55:49.262155 master-0 kubenswrapper[23041]: I0308 00:55:49.261922 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/14ed2327-fe3a-4f78-b30f-3ccb279385e2-ovsdbserver-nb\") pod \"dnsmasq-dns-764cc67dbc-n94p5\" (UID: \"14ed2327-fe3a-4f78-b30f-3ccb279385e2\") " pod="openstack/dnsmasq-dns-764cc67dbc-n94p5" Mar 08 00:55:49.262155 master-0 kubenswrapper[23041]: I0308 00:55:49.261954 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14ed2327-fe3a-4f78-b30f-3ccb279385e2-config\") pod \"dnsmasq-dns-764cc67dbc-n94p5\" (UID: \"14ed2327-fe3a-4f78-b30f-3ccb279385e2\") " pod="openstack/dnsmasq-dns-764cc67dbc-n94p5" Mar 08 00:55:49.262155 master-0 kubenswrapper[23041]: I0308 00:55:49.261978 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/14ed2327-fe3a-4f78-b30f-3ccb279385e2-ovsdbserver-sb\") pod \"dnsmasq-dns-764cc67dbc-n94p5\" (UID: \"14ed2327-fe3a-4f78-b30f-3ccb279385e2\") " pod="openstack/dnsmasq-dns-764cc67dbc-n94p5" Mar 08 00:55:49.262155 master-0 kubenswrapper[23041]: I0308 00:55:49.262055 23041 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/14ed2327-fe3a-4f78-b30f-3ccb279385e2-dns-swift-storage-0\") pod \"dnsmasq-dns-764cc67dbc-n94p5\" (UID: \"14ed2327-fe3a-4f78-b30f-3ccb279385e2\") " pod="openstack/dnsmasq-dns-764cc67dbc-n94p5" Mar 08 00:55:49.262155 master-0 kubenswrapper[23041]: I0308 00:55:49.262088 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvgtx\" (UniqueName: \"kubernetes.io/projected/14ed2327-fe3a-4f78-b30f-3ccb279385e2-kube-api-access-mvgtx\") pod \"dnsmasq-dns-764cc67dbc-n94p5\" (UID: \"14ed2327-fe3a-4f78-b30f-3ccb279385e2\") " pod="openstack/dnsmasq-dns-764cc67dbc-n94p5" Mar 08 00:55:49.375443 master-0 kubenswrapper[23041]: I0308 00:55:49.373886 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14ed2327-fe3a-4f78-b30f-3ccb279385e2-dns-svc\") pod \"dnsmasq-dns-764cc67dbc-n94p5\" (UID: \"14ed2327-fe3a-4f78-b30f-3ccb279385e2\") " pod="openstack/dnsmasq-dns-764cc67dbc-n94p5" Mar 08 00:55:49.375443 master-0 kubenswrapper[23041]: I0308 00:55:49.373999 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/14ed2327-fe3a-4f78-b30f-3ccb279385e2-ovsdbserver-nb\") pod \"dnsmasq-dns-764cc67dbc-n94p5\" (UID: \"14ed2327-fe3a-4f78-b30f-3ccb279385e2\") " pod="openstack/dnsmasq-dns-764cc67dbc-n94p5" Mar 08 00:55:49.375443 master-0 kubenswrapper[23041]: I0308 00:55:49.374029 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14ed2327-fe3a-4f78-b30f-3ccb279385e2-config\") pod \"dnsmasq-dns-764cc67dbc-n94p5\" (UID: \"14ed2327-fe3a-4f78-b30f-3ccb279385e2\") " pod="openstack/dnsmasq-dns-764cc67dbc-n94p5" Mar 08 00:55:49.375443 master-0 
kubenswrapper[23041]: I0308 00:55:49.374051 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/14ed2327-fe3a-4f78-b30f-3ccb279385e2-ovsdbserver-sb\") pod \"dnsmasq-dns-764cc67dbc-n94p5\" (UID: \"14ed2327-fe3a-4f78-b30f-3ccb279385e2\") " pod="openstack/dnsmasq-dns-764cc67dbc-n94p5" Mar 08 00:55:49.375443 master-0 kubenswrapper[23041]: I0308 00:55:49.374135 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/14ed2327-fe3a-4f78-b30f-3ccb279385e2-dns-swift-storage-0\") pod \"dnsmasq-dns-764cc67dbc-n94p5\" (UID: \"14ed2327-fe3a-4f78-b30f-3ccb279385e2\") " pod="openstack/dnsmasq-dns-764cc67dbc-n94p5" Mar 08 00:55:49.375443 master-0 kubenswrapper[23041]: I0308 00:55:49.374161 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvgtx\" (UniqueName: \"kubernetes.io/projected/14ed2327-fe3a-4f78-b30f-3ccb279385e2-kube-api-access-mvgtx\") pod \"dnsmasq-dns-764cc67dbc-n94p5\" (UID: \"14ed2327-fe3a-4f78-b30f-3ccb279385e2\") " pod="openstack/dnsmasq-dns-764cc67dbc-n94p5" Mar 08 00:55:49.375911 master-0 kubenswrapper[23041]: I0308 00:55:49.375554 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14ed2327-fe3a-4f78-b30f-3ccb279385e2-dns-svc\") pod \"dnsmasq-dns-764cc67dbc-n94p5\" (UID: \"14ed2327-fe3a-4f78-b30f-3ccb279385e2\") " pod="openstack/dnsmasq-dns-764cc67dbc-n94p5" Mar 08 00:55:49.376565 master-0 kubenswrapper[23041]: I0308 00:55:49.376309 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/14ed2327-fe3a-4f78-b30f-3ccb279385e2-ovsdbserver-nb\") pod \"dnsmasq-dns-764cc67dbc-n94p5\" (UID: \"14ed2327-fe3a-4f78-b30f-3ccb279385e2\") " pod="openstack/dnsmasq-dns-764cc67dbc-n94p5" Mar 08 00:55:49.376565 
master-0 kubenswrapper[23041]: I0308 00:55:49.376350 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/14ed2327-fe3a-4f78-b30f-3ccb279385e2-dns-swift-storage-0\") pod \"dnsmasq-dns-764cc67dbc-n94p5\" (UID: \"14ed2327-fe3a-4f78-b30f-3ccb279385e2\") " pod="openstack/dnsmasq-dns-764cc67dbc-n94p5"
Mar 08 00:55:49.379645 master-0 kubenswrapper[23041]: I0308 00:55:49.377331 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/14ed2327-fe3a-4f78-b30f-3ccb279385e2-ovsdbserver-sb\") pod \"dnsmasq-dns-764cc67dbc-n94p5\" (UID: \"14ed2327-fe3a-4f78-b30f-3ccb279385e2\") " pod="openstack/dnsmasq-dns-764cc67dbc-n94p5"
Mar 08 00:55:49.379645 master-0 kubenswrapper[23041]: I0308 00:55:49.377597 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14ed2327-fe3a-4f78-b30f-3ccb279385e2-config\") pod \"dnsmasq-dns-764cc67dbc-n94p5\" (UID: \"14ed2327-fe3a-4f78-b30f-3ccb279385e2\") " pod="openstack/dnsmasq-dns-764cc67dbc-n94p5"
Mar 08 00:55:49.402244 master-0 kubenswrapper[23041]: I0308 00:55:49.401122 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvgtx\" (UniqueName: \"kubernetes.io/projected/14ed2327-fe3a-4f78-b30f-3ccb279385e2-kube-api-access-mvgtx\") pod \"dnsmasq-dns-764cc67dbc-n94p5\" (UID: \"14ed2327-fe3a-4f78-b30f-3ccb279385e2\") " pod="openstack/dnsmasq-dns-764cc67dbc-n94p5"
Mar 08 00:55:49.458264 master-0 kubenswrapper[23041]: I0308 00:55:49.458188 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764cc67dbc-n94p5"
Mar 08 00:55:49.983910 master-0 kubenswrapper[23041]: I0308 00:55:49.983844 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764cc67dbc-n94p5"]
Mar 08 00:55:50.297605 master-0 kubenswrapper[23041]: I0308 00:55:50.297534 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 08 00:55:50.299705 master-0 kubenswrapper[23041]: I0308 00:55:50.299669 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 08 00:55:50.304118 master-0 kubenswrapper[23041]: I0308 00:55:50.304067 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 08 00:55:50.847685 master-0 kubenswrapper[23041]: I0308 00:55:50.847611 23041 generic.go:334] "Generic (PLEG): container finished" podID="14ed2327-fe3a-4f78-b30f-3ccb279385e2" containerID="d90efda08cdf77eb0d97b06646ac61deea13b94d4606505b84a1df9e213e20b5" exitCode=0
Mar 08 00:55:50.847685 master-0 kubenswrapper[23041]: I0308 00:55:50.847659 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764cc67dbc-n94p5" event={"ID":"14ed2327-fe3a-4f78-b30f-3ccb279385e2","Type":"ContainerDied","Data":"d90efda08cdf77eb0d97b06646ac61deea13b94d4606505b84a1df9e213e20b5"}
Mar 08 00:55:50.847685 master-0 kubenswrapper[23041]: I0308 00:55:50.847691 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764cc67dbc-n94p5" event={"ID":"14ed2327-fe3a-4f78-b30f-3ccb279385e2","Type":"ContainerStarted","Data":"8246afa13331525343ecf4e60b0e4a556b0c6d9e496c1545194d594c40c6b9e9"}
Mar 08 00:55:50.854584 master-0 kubenswrapper[23041]: I0308 00:55:50.854538 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 08 00:55:51.860957 master-0 kubenswrapper[23041]: I0308 00:55:51.860861 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764cc67dbc-n94p5" event={"ID":"14ed2327-fe3a-4f78-b30f-3ccb279385e2","Type":"ContainerStarted","Data":"f512072f39e4464cc56b7be6e420e125547cd9001bd63a77fbcf1e65ebd7d479"}
Mar 08 00:55:51.890456 master-0 kubenswrapper[23041]: I0308 00:55:51.890373 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-764cc67dbc-n94p5" podStartSLOduration=2.890350237 podStartE2EDuration="2.890350237s" podCreationTimestamp="2026-03-08 00:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:55:51.886533094 +0000 UTC m=+1457.359369658" watchObservedRunningTime="2026-03-08 00:55:51.890350237 +0000 UTC m=+1457.363186801"
Mar 08 00:55:52.134260 master-0 kubenswrapper[23041]: I0308 00:55:52.134104 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 08 00:55:52.134477 master-0 kubenswrapper[23041]: I0308 00:55:52.134400 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1dbc04ae-0661-474e-aee7-ea2a45bc3253" containerName="nova-api-log" containerID="cri-o://f33507aed2f26fbcb906e27068cfd1520876b2131bf2a505814e76008e264d72" gracePeriod=30
Mar 08 00:55:52.134544 master-0 kubenswrapper[23041]: I0308 00:55:52.134472 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1dbc04ae-0661-474e-aee7-ea2a45bc3253" containerName="nova-api-api" containerID="cri-o://9b5ed217d63f9be94ef001e292ca3f141156624cb2cd4b4eb1f95cf8f7ae0f18" gracePeriod=30
Mar 08 00:55:52.157467 master-0 kubenswrapper[23041]: I0308 00:55:52.157405 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Mar 08 00:55:52.185001 master-0 kubenswrapper[23041]: I0308 00:55:52.184931 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Mar 08 00:55:52.880739 master-0 kubenswrapper[23041]: I0308 00:55:52.880673 23041 generic.go:334] "Generic (PLEG): container finished" podID="1dbc04ae-0661-474e-aee7-ea2a45bc3253" containerID="f33507aed2f26fbcb906e27068cfd1520876b2131bf2a505814e76008e264d72" exitCode=143
Mar 08 00:55:52.881356 master-0 kubenswrapper[23041]: I0308 00:55:52.880854 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1dbc04ae-0661-474e-aee7-ea2a45bc3253","Type":"ContainerDied","Data":"f33507aed2f26fbcb906e27068cfd1520876b2131bf2a505814e76008e264d72"}
Mar 08 00:55:52.881522 master-0 kubenswrapper[23041]: I0308 00:55:52.881490 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-764cc67dbc-n94p5"
Mar 08 00:55:52.898087 master-0 kubenswrapper[23041]: I0308 00:55:52.898027 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Mar 08 00:55:53.219428 master-0 kubenswrapper[23041]: I0308 00:55:53.217737 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-wn7mj"]
Mar 08 00:55:53.219606 master-0 kubenswrapper[23041]: I0308 00:55:53.219465 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wn7mj"
Mar 08 00:55:53.226147 master-0 kubenswrapper[23041]: I0308 00:55:53.225727 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Mar 08 00:55:53.226428 master-0 kubenswrapper[23041]: I0308 00:55:53.226386 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Mar 08 00:55:53.259098 master-0 kubenswrapper[23041]: I0308 00:55:53.258233 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-wn7mj"]
Mar 08 00:55:53.288827 master-0 kubenswrapper[23041]: I0308 00:55:53.288735 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-host-discover-s5c66"]
Mar 08 00:55:53.300088 master-0 kubenswrapper[23041]: I0308 00:55:53.299941 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-host-discover-s5c66"]
Mar 08 00:55:53.300088 master-0 kubenswrapper[23041]: I0308 00:55:53.300044 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-host-discover-s5c66"
Mar 08 00:55:53.313467 master-0 kubenswrapper[23041]: I0308 00:55:53.313384 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b898cbca-4bda-4100-8cb5-8adc12f5a160-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-wn7mj\" (UID: \"b898cbca-4bda-4100-8cb5-8adc12f5a160\") " pod="openstack/nova-cell1-cell-mapping-wn7mj"
Mar 08 00:55:53.313563 master-0 kubenswrapper[23041]: I0308 00:55:53.313540 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b898cbca-4bda-4100-8cb5-8adc12f5a160-scripts\") pod \"nova-cell1-cell-mapping-wn7mj\" (UID: \"b898cbca-4bda-4100-8cb5-8adc12f5a160\") " pod="openstack/nova-cell1-cell-mapping-wn7mj"
Mar 08 00:55:53.313689 master-0 kubenswrapper[23041]: I0308 00:55:53.313660 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b898cbca-4bda-4100-8cb5-8adc12f5a160-config-data\") pod \"nova-cell1-cell-mapping-wn7mj\" (UID: \"b898cbca-4bda-4100-8cb5-8adc12f5a160\") " pod="openstack/nova-cell1-cell-mapping-wn7mj"
Mar 08 00:55:53.313754 master-0 kubenswrapper[23041]: I0308 00:55:53.313690 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2npp\" (UniqueName: \"kubernetes.io/projected/b898cbca-4bda-4100-8cb5-8adc12f5a160-kube-api-access-r2npp\") pod \"nova-cell1-cell-mapping-wn7mj\" (UID: \"b898cbca-4bda-4100-8cb5-8adc12f5a160\") " pod="openstack/nova-cell1-cell-mapping-wn7mj"
Mar 08 00:55:53.416189 master-0 kubenswrapper[23041]: I0308 00:55:53.416143 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2-scripts\") pod \"nova-cell1-host-discover-s5c66\" (UID: \"2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2\") " pod="openstack/nova-cell1-host-discover-s5c66"
Mar 08 00:55:53.416461 master-0 kubenswrapper[23041]: I0308 00:55:53.416442 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b898cbca-4bda-4100-8cb5-8adc12f5a160-config-data\") pod \"nova-cell1-cell-mapping-wn7mj\" (UID: \"b898cbca-4bda-4100-8cb5-8adc12f5a160\") " pod="openstack/nova-cell1-cell-mapping-wn7mj"
Mar 08 00:55:53.416575 master-0 kubenswrapper[23041]: I0308 00:55:53.416561 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2npp\" (UniqueName: \"kubernetes.io/projected/b898cbca-4bda-4100-8cb5-8adc12f5a160-kube-api-access-r2npp\") pod \"nova-cell1-cell-mapping-wn7mj\" (UID: \"b898cbca-4bda-4100-8cb5-8adc12f5a160\") " pod="openstack/nova-cell1-cell-mapping-wn7mj"
Mar 08 00:55:53.416713 master-0 kubenswrapper[23041]: I0308 00:55:53.416699 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b898cbca-4bda-4100-8cb5-8adc12f5a160-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-wn7mj\" (UID: \"b898cbca-4bda-4100-8cb5-8adc12f5a160\") " pod="openstack/nova-cell1-cell-mapping-wn7mj"
Mar 08 00:55:53.416894 master-0 kubenswrapper[23041]: I0308 00:55:53.416880 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b898cbca-4bda-4100-8cb5-8adc12f5a160-scripts\") pod \"nova-cell1-cell-mapping-wn7mj\" (UID: \"b898cbca-4bda-4100-8cb5-8adc12f5a160\") " pod="openstack/nova-cell1-cell-mapping-wn7mj"
Mar 08 00:55:53.416992 master-0 kubenswrapper[23041]: I0308 00:55:53.416977 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2-config-data\") pod \"nova-cell1-host-discover-s5c66\" (UID: \"2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2\") " pod="openstack/nova-cell1-host-discover-s5c66"
Mar 08 00:55:53.417142 master-0 kubenswrapper[23041]: I0308 00:55:53.417126 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmvrt\" (UniqueName: \"kubernetes.io/projected/2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2-kube-api-access-bmvrt\") pod \"nova-cell1-host-discover-s5c66\" (UID: \"2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2\") " pod="openstack/nova-cell1-host-discover-s5c66"
Mar 08 00:55:53.417244 master-0 kubenswrapper[23041]: I0308 00:55:53.417230 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2-combined-ca-bundle\") pod \"nova-cell1-host-discover-s5c66\" (UID: \"2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2\") " pod="openstack/nova-cell1-host-discover-s5c66"
Mar 08 00:55:53.421120 master-0 kubenswrapper[23041]: I0308 00:55:53.421072 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b898cbca-4bda-4100-8cb5-8adc12f5a160-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-wn7mj\" (UID: \"b898cbca-4bda-4100-8cb5-8adc12f5a160\") " pod="openstack/nova-cell1-cell-mapping-wn7mj"
Mar 08 00:55:53.421350 master-0 kubenswrapper[23041]: I0308 00:55:53.421267 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b898cbca-4bda-4100-8cb5-8adc12f5a160-config-data\") pod \"nova-cell1-cell-mapping-wn7mj\" (UID: \"b898cbca-4bda-4100-8cb5-8adc12f5a160\") " pod="openstack/nova-cell1-cell-mapping-wn7mj"
Mar 08 00:55:53.422602 master-0 kubenswrapper[23041]: I0308 00:55:53.422571 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b898cbca-4bda-4100-8cb5-8adc12f5a160-scripts\") pod \"nova-cell1-cell-mapping-wn7mj\" (UID: \"b898cbca-4bda-4100-8cb5-8adc12f5a160\") " pod="openstack/nova-cell1-cell-mapping-wn7mj"
Mar 08 00:55:53.443958 master-0 kubenswrapper[23041]: I0308 00:55:53.443897 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2npp\" (UniqueName: \"kubernetes.io/projected/b898cbca-4bda-4100-8cb5-8adc12f5a160-kube-api-access-r2npp\") pod \"nova-cell1-cell-mapping-wn7mj\" (UID: \"b898cbca-4bda-4100-8cb5-8adc12f5a160\") " pod="openstack/nova-cell1-cell-mapping-wn7mj"
Mar 08 00:55:53.519712 master-0 kubenswrapper[23041]: I0308 00:55:53.519602 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmvrt\" (UniqueName: \"kubernetes.io/projected/2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2-kube-api-access-bmvrt\") pod \"nova-cell1-host-discover-s5c66\" (UID: \"2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2\") " pod="openstack/nova-cell1-host-discover-s5c66"
Mar 08 00:55:53.520093 master-0 kubenswrapper[23041]: I0308 00:55:53.519805 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2-combined-ca-bundle\") pod \"nova-cell1-host-discover-s5c66\" (UID: \"2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2\") " pod="openstack/nova-cell1-host-discover-s5c66"
Mar 08 00:55:53.520093 master-0 kubenswrapper[23041]: I0308 00:55:53.519846 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2-scripts\") pod \"nova-cell1-host-discover-s5c66\" (UID: \"2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2\") " pod="openstack/nova-cell1-host-discover-s5c66"
Mar 08 00:55:53.520093 master-0 kubenswrapper[23041]: I0308 00:55:53.520021 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2-config-data\") pod \"nova-cell1-host-discover-s5c66\" (UID: \"2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2\") " pod="openstack/nova-cell1-host-discover-s5c66"
Mar 08 00:55:53.529392 master-0 kubenswrapper[23041]: I0308 00:55:53.523521 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2-config-data\") pod \"nova-cell1-host-discover-s5c66\" (UID: \"2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2\") " pod="openstack/nova-cell1-host-discover-s5c66"
Mar 08 00:55:53.529392 master-0 kubenswrapper[23041]: I0308 00:55:53.523785 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2-combined-ca-bundle\") pod \"nova-cell1-host-discover-s5c66\" (UID: \"2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2\") " pod="openstack/nova-cell1-host-discover-s5c66"
Mar 08 00:55:53.530297 master-0 kubenswrapper[23041]: I0308 00:55:53.530145 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2-scripts\") pod \"nova-cell1-host-discover-s5c66\" (UID: \"2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2\") " pod="openstack/nova-cell1-host-discover-s5c66"
Mar 08 00:55:53.549668 master-0 kubenswrapper[23041]: I0308 00:55:53.549587 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmvrt\" (UniqueName: \"kubernetes.io/projected/2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2-kube-api-access-bmvrt\") pod \"nova-cell1-host-discover-s5c66\" (UID: \"2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2\") " pod="openstack/nova-cell1-host-discover-s5c66"
Mar 08 00:55:53.550479 master-0 kubenswrapper[23041]: I0308 00:55:53.550430 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wn7mj"
Mar 08 00:55:53.617800 master-0 kubenswrapper[23041]: I0308 00:55:53.617723 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-host-discover-s5c66"
Mar 08 00:55:54.053682 master-0 kubenswrapper[23041]: W0308 00:55:54.053599 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb898cbca_4bda_4100_8cb5_8adc12f5a160.slice/crio-2ca3587a1bddbe1e2a2488f81952a1cd23a69d282d12bdf0c4d9eb0a2df5df28 WatchSource:0}: Error finding container 2ca3587a1bddbe1e2a2488f81952a1cd23a69d282d12bdf0c4d9eb0a2df5df28: Status 404 returned error can't find the container with id 2ca3587a1bddbe1e2a2488f81952a1cd23a69d282d12bdf0c4d9eb0a2df5df28
Mar 08 00:55:54.064402 master-0 kubenswrapper[23041]: I0308 00:55:54.064345 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-wn7mj"]
Mar 08 00:55:54.203145 master-0 kubenswrapper[23041]: W0308 00:55:54.203090 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2cec7d09_3b2d_41be_92b9_56ea5fbfa9d2.slice/crio-f5f2669b961fedf703af698472c4800a81363ee5289d360b36f50d36a7157b64 WatchSource:0}: Error finding container f5f2669b961fedf703af698472c4800a81363ee5289d360b36f50d36a7157b64: Status 404 returned error can't find the container with id f5f2669b961fedf703af698472c4800a81363ee5289d360b36f50d36a7157b64
Mar 08 00:55:54.207626 master-0 kubenswrapper[23041]: I0308 00:55:54.204543 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-host-discover-s5c66"]
Mar 08 00:55:54.916896 master-0 kubenswrapper[23041]: I0308 00:55:54.916842 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-s5c66" event={"ID":"2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2","Type":"ContainerStarted","Data":"cd6cdf6d53c1514430c49d52a7dfae35c6a670f895c07d878ffac1ac42affc51"}
Mar 08 00:55:54.917128 master-0 kubenswrapper[23041]: I0308 00:55:54.917113 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-s5c66" event={"ID":"2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2","Type":"ContainerStarted","Data":"f5f2669b961fedf703af698472c4800a81363ee5289d360b36f50d36a7157b64"}
Mar 08 00:55:54.919652 master-0 kubenswrapper[23041]: I0308 00:55:54.919621 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wn7mj" event={"ID":"b898cbca-4bda-4100-8cb5-8adc12f5a160","Type":"ContainerStarted","Data":"d1fb722fa89742527024e3c6ee65f26ec2e8962021c85cba367a09a4f701c1c5"}
Mar 08 00:55:54.919802 master-0 kubenswrapper[23041]: I0308 00:55:54.919783 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wn7mj" event={"ID":"b898cbca-4bda-4100-8cb5-8adc12f5a160","Type":"ContainerStarted","Data":"2ca3587a1bddbe1e2a2488f81952a1cd23a69d282d12bdf0c4d9eb0a2df5df28"}
Mar 08 00:55:54.983846 master-0 kubenswrapper[23041]: I0308 00:55:54.983739 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-wn7mj" podStartSLOduration=1.983712053 podStartE2EDuration="1.983712053s" podCreationTimestamp="2026-03-08 00:55:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:55:54.97093273 +0000 UTC m=+1460.443769294" watchObservedRunningTime="2026-03-08 00:55:54.983712053 +0000 UTC m=+1460.456548627"
Mar 08 00:55:55.017271 master-0 kubenswrapper[23041]: I0308 00:55:55.017168 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-host-discover-s5c66" podStartSLOduration=2.01714722 podStartE2EDuration="2.01714722s" podCreationTimestamp="2026-03-08 00:55:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:55:55.002000209 +0000 UTC m=+1460.474836783" watchObservedRunningTime="2026-03-08 00:55:55.01714722 +0000 UTC m=+1460.489983774"
Mar 08 00:55:55.853394 master-0 kubenswrapper[23041]: I0308 00:55:55.853344 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 08 00:55:55.931564 master-0 kubenswrapper[23041]: I0308 00:55:55.929962 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dbc04ae-0661-474e-aee7-ea2a45bc3253-config-data\") pod \"1dbc04ae-0661-474e-aee7-ea2a45bc3253\" (UID: \"1dbc04ae-0661-474e-aee7-ea2a45bc3253\") "
Mar 08 00:55:55.931564 master-0 kubenswrapper[23041]: I0308 00:55:55.930204 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff67x\" (UniqueName: \"kubernetes.io/projected/1dbc04ae-0661-474e-aee7-ea2a45bc3253-kube-api-access-ff67x\") pod \"1dbc04ae-0661-474e-aee7-ea2a45bc3253\" (UID: \"1dbc04ae-0661-474e-aee7-ea2a45bc3253\") "
Mar 08 00:55:55.931564 master-0 kubenswrapper[23041]: I0308 00:55:55.930449 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1dbc04ae-0661-474e-aee7-ea2a45bc3253-logs\") pod \"1dbc04ae-0661-474e-aee7-ea2a45bc3253\" (UID: \"1dbc04ae-0661-474e-aee7-ea2a45bc3253\") "
Mar 08 00:55:55.931564 master-0 kubenswrapper[23041]: I0308 00:55:55.930527 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dbc04ae-0661-474e-aee7-ea2a45bc3253-combined-ca-bundle\") pod \"1dbc04ae-0661-474e-aee7-ea2a45bc3253\" (UID: \"1dbc04ae-0661-474e-aee7-ea2a45bc3253\") "
Mar 08 00:55:55.933396 master-0 kubenswrapper[23041]: I0308 00:55:55.933352 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dbc04ae-0661-474e-aee7-ea2a45bc3253-logs" (OuterVolumeSpecName: "logs") pod "1dbc04ae-0661-474e-aee7-ea2a45bc3253" (UID: "1dbc04ae-0661-474e-aee7-ea2a45bc3253"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:55:55.948616 master-0 kubenswrapper[23041]: I0308 00:55:55.948514 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dbc04ae-0661-474e-aee7-ea2a45bc3253-kube-api-access-ff67x" (OuterVolumeSpecName: "kube-api-access-ff67x") pod "1dbc04ae-0661-474e-aee7-ea2a45bc3253" (UID: "1dbc04ae-0661-474e-aee7-ea2a45bc3253"). InnerVolumeSpecName "kube-api-access-ff67x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:55:55.954197 master-0 kubenswrapper[23041]: I0308 00:55:55.954142 23041 generic.go:334] "Generic (PLEG): container finished" podID="1dbc04ae-0661-474e-aee7-ea2a45bc3253" containerID="9b5ed217d63f9be94ef001e292ca3f141156624cb2cd4b4eb1f95cf8f7ae0f18" exitCode=0
Mar 08 00:55:55.955887 master-0 kubenswrapper[23041]: I0308 00:55:55.954550 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 08 00:55:55.955887 master-0 kubenswrapper[23041]: I0308 00:55:55.954603 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1dbc04ae-0661-474e-aee7-ea2a45bc3253","Type":"ContainerDied","Data":"9b5ed217d63f9be94ef001e292ca3f141156624cb2cd4b4eb1f95cf8f7ae0f18"}
Mar 08 00:55:55.955887 master-0 kubenswrapper[23041]: I0308 00:55:55.954656 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1dbc04ae-0661-474e-aee7-ea2a45bc3253","Type":"ContainerDied","Data":"e315fa08a41e4a0510f9671e764d6a7d5a941d3a60076668adbe5085bb007d48"}
Mar 08 00:55:55.955887 master-0 kubenswrapper[23041]: I0308 00:55:55.954674 23041 scope.go:117] "RemoveContainer" containerID="9b5ed217d63f9be94ef001e292ca3f141156624cb2cd4b4eb1f95cf8f7ae0f18"
Mar 08 00:55:55.996293 master-0 kubenswrapper[23041]: I0308 00:55:55.996226 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dbc04ae-0661-474e-aee7-ea2a45bc3253-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1dbc04ae-0661-474e-aee7-ea2a45bc3253" (UID: "1dbc04ae-0661-474e-aee7-ea2a45bc3253"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:55:56.034752 master-0 kubenswrapper[23041]: I0308 00:55:56.034272 23041 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1dbc04ae-0661-474e-aee7-ea2a45bc3253-logs\") on node \"master-0\" DevicePath \"\""
Mar 08 00:55:56.034752 master-0 kubenswrapper[23041]: I0308 00:55:56.034311 23041 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dbc04ae-0661-474e-aee7-ea2a45bc3253-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 00:55:56.034752 master-0 kubenswrapper[23041]: I0308 00:55:56.034322 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff67x\" (UniqueName: \"kubernetes.io/projected/1dbc04ae-0661-474e-aee7-ea2a45bc3253-kube-api-access-ff67x\") on node \"master-0\" DevicePath \"\""
Mar 08 00:55:56.045643 master-0 kubenswrapper[23041]: I0308 00:55:56.045586 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dbc04ae-0661-474e-aee7-ea2a45bc3253-config-data" (OuterVolumeSpecName: "config-data") pod "1dbc04ae-0661-474e-aee7-ea2a45bc3253" (UID: "1dbc04ae-0661-474e-aee7-ea2a45bc3253"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:55:56.130706 master-0 kubenswrapper[23041]: I0308 00:55:56.130652 23041 scope.go:117] "RemoveContainer" containerID="f33507aed2f26fbcb906e27068cfd1520876b2131bf2a505814e76008e264d72"
Mar 08 00:55:56.136811 master-0 kubenswrapper[23041]: I0308 00:55:56.136644 23041 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dbc04ae-0661-474e-aee7-ea2a45bc3253-config-data\") on node \"master-0\" DevicePath \"\""
Mar 08 00:55:56.165993 master-0 kubenswrapper[23041]: I0308 00:55:56.165064 23041 scope.go:117] "RemoveContainer" containerID="9b5ed217d63f9be94ef001e292ca3f141156624cb2cd4b4eb1f95cf8f7ae0f18"
Mar 08 00:55:56.165993 master-0 kubenswrapper[23041]: E0308 00:55:56.165714 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b5ed217d63f9be94ef001e292ca3f141156624cb2cd4b4eb1f95cf8f7ae0f18\": container with ID starting with 9b5ed217d63f9be94ef001e292ca3f141156624cb2cd4b4eb1f95cf8f7ae0f18 not found: ID does not exist" containerID="9b5ed217d63f9be94ef001e292ca3f141156624cb2cd4b4eb1f95cf8f7ae0f18"
Mar 08 00:55:56.165993 master-0 kubenswrapper[23041]: I0308 00:55:56.165773 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b5ed217d63f9be94ef001e292ca3f141156624cb2cd4b4eb1f95cf8f7ae0f18"} err="failed to get container status \"9b5ed217d63f9be94ef001e292ca3f141156624cb2cd4b4eb1f95cf8f7ae0f18\": rpc error: code = NotFound desc = could not find container \"9b5ed217d63f9be94ef001e292ca3f141156624cb2cd4b4eb1f95cf8f7ae0f18\": container with ID starting with 9b5ed217d63f9be94ef001e292ca3f141156624cb2cd4b4eb1f95cf8f7ae0f18 not found: ID does not exist"
Mar 08 00:55:56.165993 master-0 kubenswrapper[23041]: I0308 00:55:56.165801 23041 scope.go:117] "RemoveContainer" containerID="f33507aed2f26fbcb906e27068cfd1520876b2131bf2a505814e76008e264d72"
Mar 08 00:55:56.166392 master-0 kubenswrapper[23041]: E0308 00:55:56.166348 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f33507aed2f26fbcb906e27068cfd1520876b2131bf2a505814e76008e264d72\": container with ID starting with f33507aed2f26fbcb906e27068cfd1520876b2131bf2a505814e76008e264d72 not found: ID does not exist" containerID="f33507aed2f26fbcb906e27068cfd1520876b2131bf2a505814e76008e264d72"
Mar 08 00:55:56.166514 master-0 kubenswrapper[23041]: I0308 00:55:56.166397 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f33507aed2f26fbcb906e27068cfd1520876b2131bf2a505814e76008e264d72"} err="failed to get container status \"f33507aed2f26fbcb906e27068cfd1520876b2131bf2a505814e76008e264d72\": rpc error: code = NotFound desc = could not find container \"f33507aed2f26fbcb906e27068cfd1520876b2131bf2a505814e76008e264d72\": container with ID starting with f33507aed2f26fbcb906e27068cfd1520876b2131bf2a505814e76008e264d72 not found: ID does not exist"
Mar 08 00:55:56.303771 master-0 kubenswrapper[23041]: I0308 00:55:56.302189 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 08 00:55:56.326425 master-0 kubenswrapper[23041]: I0308 00:55:56.326351 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 08 00:55:56.339586 master-0 kubenswrapper[23041]: I0308 00:55:56.339427 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 08 00:55:56.340392 master-0 kubenswrapper[23041]: E0308 00:55:56.340290 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dbc04ae-0661-474e-aee7-ea2a45bc3253" containerName="nova-api-log"
Mar 08 00:55:56.340392 master-0 kubenswrapper[23041]: I0308 00:55:56.340314 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dbc04ae-0661-474e-aee7-ea2a45bc3253" containerName="nova-api-log"
Mar 08 00:55:56.340392 master-0 kubenswrapper[23041]: E0308 00:55:56.340365 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dbc04ae-0661-474e-aee7-ea2a45bc3253" containerName="nova-api-api"
Mar 08 00:55:56.340392 master-0 kubenswrapper[23041]: I0308 00:55:56.340372 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dbc04ae-0661-474e-aee7-ea2a45bc3253" containerName="nova-api-api"
Mar 08 00:55:56.342109 master-0 kubenswrapper[23041]: I0308 00:55:56.340665 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dbc04ae-0661-474e-aee7-ea2a45bc3253" containerName="nova-api-log"
Mar 08 00:55:56.342109 master-0 kubenswrapper[23041]: I0308 00:55:56.340723 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dbc04ae-0661-474e-aee7-ea2a45bc3253" containerName="nova-api-api"
Mar 08 00:55:56.342357 master-0 kubenswrapper[23041]: I0308 00:55:56.342175 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 08 00:55:56.348506 master-0 kubenswrapper[23041]: I0308 00:55:56.347016 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 08 00:55:56.348506 master-0 kubenswrapper[23041]: I0308 00:55:56.347361 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Mar 08 00:55:56.348506 master-0 kubenswrapper[23041]: I0308 00:55:56.347581 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Mar 08 00:55:56.349980 master-0 kubenswrapper[23041]: I0308 00:55:56.349928 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 08 00:55:56.444034 master-0 kubenswrapper[23041]: I0308 00:55:56.443936 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a056fb95-4dfc-4715-a2ba-317017f3f150-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a056fb95-4dfc-4715-a2ba-317017f3f150\") " pod="openstack/nova-api-0"
Mar 08 00:55:56.444034 master-0 kubenswrapper[23041]: I0308 00:55:56.444029 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a056fb95-4dfc-4715-a2ba-317017f3f150-public-tls-certs\") pod \"nova-api-0\" (UID: \"a056fb95-4dfc-4715-a2ba-317017f3f150\") " pod="openstack/nova-api-0"
Mar 08 00:55:56.444321 master-0 kubenswrapper[23041]: I0308 00:55:56.444128 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a056fb95-4dfc-4715-a2ba-317017f3f150-config-data\") pod \"nova-api-0\" (UID: \"a056fb95-4dfc-4715-a2ba-317017f3f150\") " pod="openstack/nova-api-0"
Mar 08 00:55:56.444321 master-0 kubenswrapper[23041]: I0308 00:55:56.444258 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a056fb95-4dfc-4715-a2ba-317017f3f150-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a056fb95-4dfc-4715-a2ba-317017f3f150\") " pod="openstack/nova-api-0"
Mar 08 00:55:56.444461 master-0 kubenswrapper[23041]: I0308 00:55:56.444368 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a056fb95-4dfc-4715-a2ba-317017f3f150-logs\") pod \"nova-api-0\" (UID: \"a056fb95-4dfc-4715-a2ba-317017f3f150\") " pod="openstack/nova-api-0"
Mar 08 00:55:56.444461 master-0 kubenswrapper[23041]: I0308 00:55:56.444447 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzjcq\" (UniqueName: \"kubernetes.io/projected/a056fb95-4dfc-4715-a2ba-317017f3f150-kube-api-access-mzjcq\") pod \"nova-api-0\" (UID: \"a056fb95-4dfc-4715-a2ba-317017f3f150\") " pod="openstack/nova-api-0"
Mar 08 00:55:56.546475 master-0 kubenswrapper[23041]: I0308 00:55:56.546345 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a056fb95-4dfc-4715-a2ba-317017f3f150-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a056fb95-4dfc-4715-a2ba-317017f3f150\") " pod="openstack/nova-api-0"
Mar 08 00:55:56.546475 master-0 kubenswrapper[23041]: I0308 00:55:56.546407 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a056fb95-4dfc-4715-a2ba-317017f3f150-public-tls-certs\") pod \"nova-api-0\" (UID: \"a056fb95-4dfc-4715-a2ba-317017f3f150\") " pod="openstack/nova-api-0"
Mar 08 00:55:56.546475 master-0 kubenswrapper[23041]: I0308 00:55:56.546460 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a056fb95-4dfc-4715-a2ba-317017f3f150-config-data\") pod \"nova-api-0\" (UID: \"a056fb95-4dfc-4715-a2ba-317017f3f150\") " pod="openstack/nova-api-0"
Mar 08 00:55:56.546799 master-0 kubenswrapper[23041]: I0308 00:55:56.546695 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a056fb95-4dfc-4715-a2ba-317017f3f150-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a056fb95-4dfc-4715-a2ba-317017f3f150\") " pod="openstack/nova-api-0"
Mar 08 00:55:56.546978 master-0 kubenswrapper[23041]: I0308 00:55:56.546944 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a056fb95-4dfc-4715-a2ba-317017f3f150-logs\") pod \"nova-api-0\" (UID: \"a056fb95-4dfc-4715-a2ba-317017f3f150\") " pod="openstack/nova-api-0"
Mar 08 00:55:56.547088 master-0 kubenswrapper[23041]: I0308 00:55:56.547057 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzjcq\" (UniqueName: \"kubernetes.io/projected/a056fb95-4dfc-4715-a2ba-317017f3f150-kube-api-access-mzjcq\") pod \"nova-api-0\" (UID: \"a056fb95-4dfc-4715-a2ba-317017f3f150\") " pod="openstack/nova-api-0"
Mar 08 00:55:56.547699 master-0 kubenswrapper[23041]: I0308 00:55:56.547608 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a056fb95-4dfc-4715-a2ba-317017f3f150-logs\") pod \"nova-api-0\" (UID: \"a056fb95-4dfc-4715-a2ba-317017f3f150\") " pod="openstack/nova-api-0"
Mar 08 00:55:56.551883 master-0 kubenswrapper[23041]: I0308 00:55:56.550945 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a056fb95-4dfc-4715-a2ba-317017f3f150-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a056fb95-4dfc-4715-a2ba-317017f3f150\") " pod="openstack/nova-api-0"
Mar 08 00:55:56.551883 master-0 kubenswrapper[23041]: I0308 00:55:56.551790 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a056fb95-4dfc-4715-a2ba-317017f3f150-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a056fb95-4dfc-4715-a2ba-317017f3f150\") " pod="openstack/nova-api-0"
Mar 08 00:55:56.553052 master-0 kubenswrapper[23041]: I0308 00:55:56.552976 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a056fb95-4dfc-4715-a2ba-317017f3f150-public-tls-certs\") pod \"nova-api-0\" (UID: \"a056fb95-4dfc-4715-a2ba-317017f3f150\") " pod="openstack/nova-api-0"
Mar 08 00:55:56.555942 master-0 kubenswrapper[23041]: I0308 00:55:56.555911 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume
\"config-data\" (UniqueName: \"kubernetes.io/secret/a056fb95-4dfc-4715-a2ba-317017f3f150-config-data\") pod \"nova-api-0\" (UID: \"a056fb95-4dfc-4715-a2ba-317017f3f150\") " pod="openstack/nova-api-0" Mar 08 00:55:56.566674 master-0 kubenswrapper[23041]: I0308 00:55:56.566615 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzjcq\" (UniqueName: \"kubernetes.io/projected/a056fb95-4dfc-4715-a2ba-317017f3f150-kube-api-access-mzjcq\") pod \"nova-api-0\" (UID: \"a056fb95-4dfc-4715-a2ba-317017f3f150\") " pod="openstack/nova-api-0" Mar 08 00:55:56.672138 master-0 kubenswrapper[23041]: I0308 00:55:56.672079 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 00:55:56.824376 master-0 kubenswrapper[23041]: I0308 00:55:56.821346 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dbc04ae-0661-474e-aee7-ea2a45bc3253" path="/var/lib/kubelet/pods/1dbc04ae-0661-474e-aee7-ea2a45bc3253/volumes" Mar 08 00:55:57.172635 master-0 kubenswrapper[23041]: I0308 00:55:57.170717 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 00:55:58.009274 master-0 kubenswrapper[23041]: I0308 00:55:58.007755 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a056fb95-4dfc-4715-a2ba-317017f3f150","Type":"ContainerStarted","Data":"773e1a27e4bf7be0e81952dea884888cedc81d955850d00f144a21f59fc3e9ac"} Mar 08 00:55:58.009274 master-0 kubenswrapper[23041]: I0308 00:55:58.007822 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a056fb95-4dfc-4715-a2ba-317017f3f150","Type":"ContainerStarted","Data":"944def5e84a99e794743d9e83e62bbb64c000d493aa9577c78161857bddbc536"} Mar 08 00:55:58.009274 master-0 kubenswrapper[23041]: I0308 00:55:58.007832 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"a056fb95-4dfc-4715-a2ba-317017f3f150","Type":"ContainerStarted","Data":"d21b795ac74b4a9be66e7614d1fe2a87f866051282e770f5b1f9694887a41a93"} Mar 08 00:55:58.013275 master-0 kubenswrapper[23041]: I0308 00:55:58.011518 23041 generic.go:334] "Generic (PLEG): container finished" podID="2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2" containerID="cd6cdf6d53c1514430c49d52a7dfae35c6a670f895c07d878ffac1ac42affc51" exitCode=0 Mar 08 00:55:58.013275 master-0 kubenswrapper[23041]: I0308 00:55:58.011604 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-s5c66" event={"ID":"2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2","Type":"ContainerDied","Data":"cd6cdf6d53c1514430c49d52a7dfae35c6a670f895c07d878ffac1ac42affc51"} Mar 08 00:55:58.044159 master-0 kubenswrapper[23041]: I0308 00:55:58.043400 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.043378605 podStartE2EDuration="2.043378605s" podCreationTimestamp="2026-03-08 00:55:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:55:58.032904579 +0000 UTC m=+1463.505741143" watchObservedRunningTime="2026-03-08 00:55:58.043378605 +0000 UTC m=+1463.516215159" Mar 08 00:55:59.029626 master-0 kubenswrapper[23041]: I0308 00:55:59.029551 23041 generic.go:334] "Generic (PLEG): container finished" podID="b898cbca-4bda-4100-8cb5-8adc12f5a160" containerID="d1fb722fa89742527024e3c6ee65f26ec2e8962021c85cba367a09a4f701c1c5" exitCode=0 Mar 08 00:55:59.030477 master-0 kubenswrapper[23041]: I0308 00:55:59.029775 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wn7mj" event={"ID":"b898cbca-4bda-4100-8cb5-8adc12f5a160","Type":"ContainerDied","Data":"d1fb722fa89742527024e3c6ee65f26ec2e8962021c85cba367a09a4f701c1c5"} Mar 08 00:55:59.459490 master-0 kubenswrapper[23041]: I0308 00:55:59.459419 23041 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-764cc67dbc-n94p5" Mar 08 00:55:59.519689 master-0 kubenswrapper[23041]: I0308 00:55:59.519645 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-host-discover-s5c66" Mar 08 00:55:59.591100 master-0 kubenswrapper[23041]: I0308 00:55:59.591043 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67f5b4fdc9-swznp"] Mar 08 00:55:59.591469 master-0 kubenswrapper[23041]: I0308 00:55:59.591380 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67f5b4fdc9-swznp" podUID="7a7954fa-b332-42f8-8a94-671129de12dc" containerName="dnsmasq-dns" containerID="cri-o://622898ca42b3f6734dec3cebdc95676982199ddc85f862ecb5471f61fb68047e" gracePeriod=10 Mar 08 00:55:59.664750 master-0 kubenswrapper[23041]: I0308 00:55:59.664687 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2-config-data\") pod \"2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2\" (UID: \"2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2\") " Mar 08 00:55:59.665028 master-0 kubenswrapper[23041]: I0308 00:55:59.665002 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2-combined-ca-bundle\") pod \"2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2\" (UID: \"2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2\") " Mar 08 00:55:59.665172 master-0 kubenswrapper[23041]: I0308 00:55:59.665154 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2-scripts\") pod \"2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2\" (UID: \"2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2\") " Mar 08 00:55:59.665279 master-0 
kubenswrapper[23041]: I0308 00:55:59.665256 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmvrt\" (UniqueName: \"kubernetes.io/projected/2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2-kube-api-access-bmvrt\") pod \"2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2\" (UID: \"2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2\") " Mar 08 00:55:59.669692 master-0 kubenswrapper[23041]: I0308 00:55:59.669593 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2-kube-api-access-bmvrt" (OuterVolumeSpecName: "kube-api-access-bmvrt") pod "2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2" (UID: "2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2"). InnerVolumeSpecName "kube-api-access-bmvrt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:55:59.686352 master-0 kubenswrapper[23041]: I0308 00:55:59.684557 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2-scripts" (OuterVolumeSpecName: "scripts") pod "2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2" (UID: "2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:55:59.722050 master-0 kubenswrapper[23041]: I0308 00:55:59.721825 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2" (UID: "2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:55:59.724734 master-0 kubenswrapper[23041]: I0308 00:55:59.724671 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2-config-data" (OuterVolumeSpecName: "config-data") pod "2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2" (UID: "2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:55:59.773287 master-0 kubenswrapper[23041]: I0308 00:55:59.769152 23041 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 00:55:59.773287 master-0 kubenswrapper[23041]: I0308 00:55:59.769203 23041 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 00:55:59.773287 master-0 kubenswrapper[23041]: I0308 00:55:59.769216 23041 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 00:55:59.773287 master-0 kubenswrapper[23041]: I0308 00:55:59.769238 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmvrt\" (UniqueName: \"kubernetes.io/projected/2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2-kube-api-access-bmvrt\") on node \"master-0\" DevicePath \"\"" Mar 08 00:56:00.069394 master-0 kubenswrapper[23041]: I0308 00:56:00.069266 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67f5b4fdc9-swznp" event={"ID":"7a7954fa-b332-42f8-8a94-671129de12dc","Type":"ContainerDied","Data":"622898ca42b3f6734dec3cebdc95676982199ddc85f862ecb5471f61fb68047e"} Mar 08 00:56:00.070014 master-0 
kubenswrapper[23041]: I0308 00:56:00.069125 23041 generic.go:334] "Generic (PLEG): container finished" podID="7a7954fa-b332-42f8-8a94-671129de12dc" containerID="622898ca42b3f6734dec3cebdc95676982199ddc85f862ecb5471f61fb68047e" exitCode=0 Mar 08 00:56:00.080258 master-0 kubenswrapper[23041]: I0308 00:56:00.078916 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-s5c66" event={"ID":"2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2","Type":"ContainerDied","Data":"f5f2669b961fedf703af698472c4800a81363ee5289d360b36f50d36a7157b64"} Mar 08 00:56:00.080258 master-0 kubenswrapper[23041]: I0308 00:56:00.078961 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5f2669b961fedf703af698472c4800a81363ee5289d360b36f50d36a7157b64" Mar 08 00:56:00.080258 master-0 kubenswrapper[23041]: I0308 00:56:00.078994 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-host-discover-s5c66" Mar 08 00:56:00.240588 master-0 kubenswrapper[23041]: I0308 00:56:00.240532 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67f5b4fdc9-swznp" Mar 08 00:56:00.419029 master-0 kubenswrapper[23041]: I0308 00:56:00.418894 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a7954fa-b332-42f8-8a94-671129de12dc-config\") pod \"7a7954fa-b332-42f8-8a94-671129de12dc\" (UID: \"7a7954fa-b332-42f8-8a94-671129de12dc\") " Mar 08 00:56:00.419238 master-0 kubenswrapper[23041]: I0308 00:56:00.419106 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a7954fa-b332-42f8-8a94-671129de12dc-ovsdbserver-sb\") pod \"7a7954fa-b332-42f8-8a94-671129de12dc\" (UID: \"7a7954fa-b332-42f8-8a94-671129de12dc\") " Mar 08 00:56:00.419238 master-0 kubenswrapper[23041]: I0308 00:56:00.419154 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a7954fa-b332-42f8-8a94-671129de12dc-dns-swift-storage-0\") pod \"7a7954fa-b332-42f8-8a94-671129de12dc\" (UID: \"7a7954fa-b332-42f8-8a94-671129de12dc\") " Mar 08 00:56:00.419366 master-0 kubenswrapper[23041]: I0308 00:56:00.419344 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a7954fa-b332-42f8-8a94-671129de12dc-dns-svc\") pod \"7a7954fa-b332-42f8-8a94-671129de12dc\" (UID: \"7a7954fa-b332-42f8-8a94-671129de12dc\") " Mar 08 00:56:00.419430 master-0 kubenswrapper[23041]: I0308 00:56:00.419417 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a7954fa-b332-42f8-8a94-671129de12dc-ovsdbserver-nb\") pod \"7a7954fa-b332-42f8-8a94-671129de12dc\" (UID: \"7a7954fa-b332-42f8-8a94-671129de12dc\") " Mar 08 00:56:00.419515 master-0 kubenswrapper[23041]: I0308 00:56:00.419445 23041 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-jsh22\" (UniqueName: \"kubernetes.io/projected/7a7954fa-b332-42f8-8a94-671129de12dc-kube-api-access-jsh22\") pod \"7a7954fa-b332-42f8-8a94-671129de12dc\" (UID: \"7a7954fa-b332-42f8-8a94-671129de12dc\") " Mar 08 00:56:00.424711 master-0 kubenswrapper[23041]: I0308 00:56:00.422832 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a7954fa-b332-42f8-8a94-671129de12dc-kube-api-access-jsh22" (OuterVolumeSpecName: "kube-api-access-jsh22") pod "7a7954fa-b332-42f8-8a94-671129de12dc" (UID: "7a7954fa-b332-42f8-8a94-671129de12dc"). InnerVolumeSpecName "kube-api-access-jsh22". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:56:00.477380 master-0 kubenswrapper[23041]: I0308 00:56:00.475980 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a7954fa-b332-42f8-8a94-671129de12dc-config" (OuterVolumeSpecName: "config") pod "7a7954fa-b332-42f8-8a94-671129de12dc" (UID: "7a7954fa-b332-42f8-8a94-671129de12dc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:56:00.477380 master-0 kubenswrapper[23041]: I0308 00:56:00.477321 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a7954fa-b332-42f8-8a94-671129de12dc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7a7954fa-b332-42f8-8a94-671129de12dc" (UID: "7a7954fa-b332-42f8-8a94-671129de12dc"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:56:00.481302 master-0 kubenswrapper[23041]: I0308 00:56:00.481086 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a7954fa-b332-42f8-8a94-671129de12dc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7a7954fa-b332-42f8-8a94-671129de12dc" (UID: "7a7954fa-b332-42f8-8a94-671129de12dc"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:56:00.488913 master-0 kubenswrapper[23041]: I0308 00:56:00.488840 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a7954fa-b332-42f8-8a94-671129de12dc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7a7954fa-b332-42f8-8a94-671129de12dc" (UID: "7a7954fa-b332-42f8-8a94-671129de12dc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:56:00.492625 master-0 kubenswrapper[23041]: I0308 00:56:00.492588 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a7954fa-b332-42f8-8a94-671129de12dc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7a7954fa-b332-42f8-8a94-671129de12dc" (UID: "7a7954fa-b332-42f8-8a94-671129de12dc"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:56:00.523391 master-0 kubenswrapper[23041]: I0308 00:56:00.523282 23041 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a7954fa-b332-42f8-8a94-671129de12dc-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 08 00:56:00.523391 master-0 kubenswrapper[23041]: I0308 00:56:00.523326 23041 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a7954fa-b332-42f8-8a94-671129de12dc-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 08 00:56:00.523391 master-0 kubenswrapper[23041]: I0308 00:56:00.523339 23041 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a7954fa-b332-42f8-8a94-671129de12dc-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 08 00:56:00.523391 master-0 kubenswrapper[23041]: I0308 00:56:00.523348 23041 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a7954fa-b332-42f8-8a94-671129de12dc-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 08 00:56:00.523391 master-0 kubenswrapper[23041]: I0308 00:56:00.523357 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsh22\" (UniqueName: \"kubernetes.io/projected/7a7954fa-b332-42f8-8a94-671129de12dc-kube-api-access-jsh22\") on node \"master-0\" DevicePath \"\"" Mar 08 00:56:00.523391 master-0 kubenswrapper[23041]: I0308 00:56:00.523366 23041 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a7954fa-b332-42f8-8a94-671129de12dc-config\") on node \"master-0\" DevicePath \"\"" Mar 08 00:56:00.540227 master-0 kubenswrapper[23041]: I0308 00:56:00.540187 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wn7mj" Mar 08 00:56:00.625204 master-0 kubenswrapper[23041]: I0308 00:56:00.625141 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b898cbca-4bda-4100-8cb5-8adc12f5a160-combined-ca-bundle\") pod \"b898cbca-4bda-4100-8cb5-8adc12f5a160\" (UID: \"b898cbca-4bda-4100-8cb5-8adc12f5a160\") " Mar 08 00:56:00.625441 master-0 kubenswrapper[23041]: I0308 00:56:00.625419 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2npp\" (UniqueName: \"kubernetes.io/projected/b898cbca-4bda-4100-8cb5-8adc12f5a160-kube-api-access-r2npp\") pod \"b898cbca-4bda-4100-8cb5-8adc12f5a160\" (UID: \"b898cbca-4bda-4100-8cb5-8adc12f5a160\") " Mar 08 00:56:00.625501 master-0 kubenswrapper[23041]: I0308 00:56:00.625491 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b898cbca-4bda-4100-8cb5-8adc12f5a160-scripts\") pod \"b898cbca-4bda-4100-8cb5-8adc12f5a160\" (UID: \"b898cbca-4bda-4100-8cb5-8adc12f5a160\") " Mar 08 00:56:00.625637 master-0 kubenswrapper[23041]: I0308 00:56:00.625570 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b898cbca-4bda-4100-8cb5-8adc12f5a160-config-data\") pod \"b898cbca-4bda-4100-8cb5-8adc12f5a160\" (UID: \"b898cbca-4bda-4100-8cb5-8adc12f5a160\") " Mar 08 00:56:00.629252 master-0 kubenswrapper[23041]: I0308 00:56:00.629159 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b898cbca-4bda-4100-8cb5-8adc12f5a160-kube-api-access-r2npp" (OuterVolumeSpecName: "kube-api-access-r2npp") pod "b898cbca-4bda-4100-8cb5-8adc12f5a160" (UID: "b898cbca-4bda-4100-8cb5-8adc12f5a160"). InnerVolumeSpecName "kube-api-access-r2npp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:56:00.631183 master-0 kubenswrapper[23041]: I0308 00:56:00.631124 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b898cbca-4bda-4100-8cb5-8adc12f5a160-scripts" (OuterVolumeSpecName: "scripts") pod "b898cbca-4bda-4100-8cb5-8adc12f5a160" (UID: "b898cbca-4bda-4100-8cb5-8adc12f5a160"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:56:00.652840 master-0 kubenswrapper[23041]: I0308 00:56:00.652765 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b898cbca-4bda-4100-8cb5-8adc12f5a160-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b898cbca-4bda-4100-8cb5-8adc12f5a160" (UID: "b898cbca-4bda-4100-8cb5-8adc12f5a160"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:56:00.662597 master-0 kubenswrapper[23041]: I0308 00:56:00.662521 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b898cbca-4bda-4100-8cb5-8adc12f5a160-config-data" (OuterVolumeSpecName: "config-data") pod "b898cbca-4bda-4100-8cb5-8adc12f5a160" (UID: "b898cbca-4bda-4100-8cb5-8adc12f5a160"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:56:00.734961 master-0 kubenswrapper[23041]: I0308 00:56:00.734784 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2npp\" (UniqueName: \"kubernetes.io/projected/b898cbca-4bda-4100-8cb5-8adc12f5a160-kube-api-access-r2npp\") on node \"master-0\" DevicePath \"\"" Mar 08 00:56:00.734961 master-0 kubenswrapper[23041]: I0308 00:56:00.734935 23041 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b898cbca-4bda-4100-8cb5-8adc12f5a160-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 00:56:00.734961 master-0 kubenswrapper[23041]: I0308 00:56:00.734952 23041 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b898cbca-4bda-4100-8cb5-8adc12f5a160-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 00:56:00.734961 master-0 kubenswrapper[23041]: I0308 00:56:00.734964 23041 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b898cbca-4bda-4100-8cb5-8adc12f5a160-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 00:56:01.120371 master-0 kubenswrapper[23041]: I0308 00:56:01.120310 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wn7mj" Mar 08 00:56:01.120371 master-0 kubenswrapper[23041]: I0308 00:56:01.120347 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wn7mj" event={"ID":"b898cbca-4bda-4100-8cb5-8adc12f5a160","Type":"ContainerDied","Data":"2ca3587a1bddbe1e2a2488f81952a1cd23a69d282d12bdf0c4d9eb0a2df5df28"} Mar 08 00:56:01.120371 master-0 kubenswrapper[23041]: I0308 00:56:01.120413 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ca3587a1bddbe1e2a2488f81952a1cd23a69d282d12bdf0c4d9eb0a2df5df28" Mar 08 00:56:01.125993 master-0 kubenswrapper[23041]: I0308 00:56:01.125958 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67f5b4fdc9-swznp" event={"ID":"7a7954fa-b332-42f8-8a94-671129de12dc","Type":"ContainerDied","Data":"6c840f10982004b152ee8a7160be32f5bd91a8ddba6061df2f43d7a5ffe75a93"} Mar 08 00:56:01.126103 master-0 kubenswrapper[23041]: I0308 00:56:01.126006 23041 scope.go:117] "RemoveContainer" containerID="622898ca42b3f6734dec3cebdc95676982199ddc85f862ecb5471f61fb68047e" Mar 08 00:56:01.126103 master-0 kubenswrapper[23041]: I0308 00:56:01.126033 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67f5b4fdc9-swznp" Mar 08 00:56:01.157887 master-0 kubenswrapper[23041]: I0308 00:56:01.157837 23041 scope.go:117] "RemoveContainer" containerID="51ff2a101385007bfd5b9739ab14f4a18b93977cf894baa68480ce356d4b7596" Mar 08 00:56:01.158340 master-0 kubenswrapper[23041]: I0308 00:56:01.158317 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67f5b4fdc9-swznp"] Mar 08 00:56:01.168391 master-0 kubenswrapper[23041]: I0308 00:56:01.168327 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67f5b4fdc9-swznp"] Mar 08 00:56:01.283890 master-0 kubenswrapper[23041]: I0308 00:56:01.281609 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 00:56:01.283890 master-0 kubenswrapper[23041]: I0308 00:56:01.282009 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a056fb95-4dfc-4715-a2ba-317017f3f150" containerName="nova-api-log" containerID="cri-o://944def5e84a99e794743d9e83e62bbb64c000d493aa9577c78161857bddbc536" gracePeriod=30 Mar 08 00:56:01.283890 master-0 kubenswrapper[23041]: I0308 00:56:01.282062 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a056fb95-4dfc-4715-a2ba-317017f3f150" containerName="nova-api-api" containerID="cri-o://773e1a27e4bf7be0e81952dea884888cedc81d955850d00f144a21f59fc3e9ac" gracePeriod=30 Mar 08 00:56:01.302260 master-0 kubenswrapper[23041]: I0308 00:56:01.299331 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 00:56:01.302260 master-0 kubenswrapper[23041]: I0308 00:56:01.299611 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="46a4d080-0f65-4840-a0b7-29508c37d813" containerName="nova-scheduler-scheduler" 
containerID="cri-o://aadc56138a65f304d33bb1c6d49c2000669787c4fd15b87f7fad77ffff482dd4" gracePeriod=30 Mar 08 00:56:01.319495 master-0 kubenswrapper[23041]: I0308 00:56:01.314810 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 00:56:01.319495 master-0 kubenswrapper[23041]: I0308 00:56:01.315071 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="41c9e34a-e1f9-4300-ae6c-9b94329b4129" containerName="nova-metadata-log" containerID="cri-o://0b657ec29bb295e0f4d5fc13f9989ac98f93de32096e44da01019e4c1db0d7c7" gracePeriod=30 Mar 08 00:56:01.319495 master-0 kubenswrapper[23041]: I0308 00:56:01.315270 23041 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="41c9e34a-e1f9-4300-ae6c-9b94329b4129" containerName="nova-metadata-metadata" containerID="cri-o://6d1fe690bafc8f068ccbf54f8c61e41f9ff6888d3cfb0b9b19d9487a1bdae0c7" gracePeriod=30 Mar 08 00:56:02.144339 master-0 kubenswrapper[23041]: I0308 00:56:02.140742 23041 generic.go:334] "Generic (PLEG): container finished" podID="41c9e34a-e1f9-4300-ae6c-9b94329b4129" containerID="0b657ec29bb295e0f4d5fc13f9989ac98f93de32096e44da01019e4c1db0d7c7" exitCode=143 Mar 08 00:56:02.144339 master-0 kubenswrapper[23041]: I0308 00:56:02.140827 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"41c9e34a-e1f9-4300-ae6c-9b94329b4129","Type":"ContainerDied","Data":"0b657ec29bb295e0f4d5fc13f9989ac98f93de32096e44da01019e4c1db0d7c7"} Mar 08 00:56:02.144339 master-0 kubenswrapper[23041]: I0308 00:56:02.142916 23041 generic.go:334] "Generic (PLEG): container finished" podID="a056fb95-4dfc-4715-a2ba-317017f3f150" containerID="773e1a27e4bf7be0e81952dea884888cedc81d955850d00f144a21f59fc3e9ac" exitCode=0 Mar 08 00:56:02.144339 master-0 kubenswrapper[23041]: I0308 00:56:02.142939 23041 generic.go:334] "Generic (PLEG): container finished" 
podID="a056fb95-4dfc-4715-a2ba-317017f3f150" containerID="944def5e84a99e794743d9e83e62bbb64c000d493aa9577c78161857bddbc536" exitCode=143 Mar 08 00:56:02.144339 master-0 kubenswrapper[23041]: I0308 00:56:02.142957 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a056fb95-4dfc-4715-a2ba-317017f3f150","Type":"ContainerDied","Data":"773e1a27e4bf7be0e81952dea884888cedc81d955850d00f144a21f59fc3e9ac"} Mar 08 00:56:02.144339 master-0 kubenswrapper[23041]: I0308 00:56:02.143004 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a056fb95-4dfc-4715-a2ba-317017f3f150","Type":"ContainerDied","Data":"944def5e84a99e794743d9e83e62bbb64c000d493aa9577c78161857bddbc536"} Mar 08 00:56:02.363361 master-0 kubenswrapper[23041]: I0308 00:56:02.362827 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 00:56:02.534987 master-0 kubenswrapper[23041]: I0308 00:56:02.534921 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a056fb95-4dfc-4715-a2ba-317017f3f150-logs\") pod \"a056fb95-4dfc-4715-a2ba-317017f3f150\" (UID: \"a056fb95-4dfc-4715-a2ba-317017f3f150\") " Mar 08 00:56:02.535288 master-0 kubenswrapper[23041]: I0308 00:56:02.535045 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a056fb95-4dfc-4715-a2ba-317017f3f150-config-data\") pod \"a056fb95-4dfc-4715-a2ba-317017f3f150\" (UID: \"a056fb95-4dfc-4715-a2ba-317017f3f150\") " Mar 08 00:56:02.535288 master-0 kubenswrapper[23041]: I0308 00:56:02.535113 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a056fb95-4dfc-4715-a2ba-317017f3f150-public-tls-certs\") pod \"a056fb95-4dfc-4715-a2ba-317017f3f150\" (UID: 
\"a056fb95-4dfc-4715-a2ba-317017f3f150\") " Mar 08 00:56:02.535288 master-0 kubenswrapper[23041]: I0308 00:56:02.535267 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzjcq\" (UniqueName: \"kubernetes.io/projected/a056fb95-4dfc-4715-a2ba-317017f3f150-kube-api-access-mzjcq\") pod \"a056fb95-4dfc-4715-a2ba-317017f3f150\" (UID: \"a056fb95-4dfc-4715-a2ba-317017f3f150\") " Mar 08 00:56:02.535421 master-0 kubenswrapper[23041]: I0308 00:56:02.535318 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a056fb95-4dfc-4715-a2ba-317017f3f150-internal-tls-certs\") pod \"a056fb95-4dfc-4715-a2ba-317017f3f150\" (UID: \"a056fb95-4dfc-4715-a2ba-317017f3f150\") " Mar 08 00:56:02.535421 master-0 kubenswrapper[23041]: I0308 00:56:02.535323 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a056fb95-4dfc-4715-a2ba-317017f3f150-logs" (OuterVolumeSpecName: "logs") pod "a056fb95-4dfc-4715-a2ba-317017f3f150" (UID: "a056fb95-4dfc-4715-a2ba-317017f3f150"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:56:02.535421 master-0 kubenswrapper[23041]: I0308 00:56:02.535390 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a056fb95-4dfc-4715-a2ba-317017f3f150-combined-ca-bundle\") pod \"a056fb95-4dfc-4715-a2ba-317017f3f150\" (UID: \"a056fb95-4dfc-4715-a2ba-317017f3f150\") " Mar 08 00:56:02.538545 master-0 kubenswrapper[23041]: I0308 00:56:02.538018 23041 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a056fb95-4dfc-4715-a2ba-317017f3f150-logs\") on node \"master-0\" DevicePath \"\"" Mar 08 00:56:02.538773 master-0 kubenswrapper[23041]: I0308 00:56:02.538675 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a056fb95-4dfc-4715-a2ba-317017f3f150-kube-api-access-mzjcq" (OuterVolumeSpecName: "kube-api-access-mzjcq") pod "a056fb95-4dfc-4715-a2ba-317017f3f150" (UID: "a056fb95-4dfc-4715-a2ba-317017f3f150"). InnerVolumeSpecName "kube-api-access-mzjcq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:56:02.565908 master-0 kubenswrapper[23041]: I0308 00:56:02.565210 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a056fb95-4dfc-4715-a2ba-317017f3f150-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a056fb95-4dfc-4715-a2ba-317017f3f150" (UID: "a056fb95-4dfc-4715-a2ba-317017f3f150"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:56:02.583770 master-0 kubenswrapper[23041]: I0308 00:56:02.583696 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a056fb95-4dfc-4715-a2ba-317017f3f150-config-data" (OuterVolumeSpecName: "config-data") pod "a056fb95-4dfc-4715-a2ba-317017f3f150" (UID: "a056fb95-4dfc-4715-a2ba-317017f3f150"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:56:02.597943 master-0 kubenswrapper[23041]: I0308 00:56:02.597879 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a056fb95-4dfc-4715-a2ba-317017f3f150-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a056fb95-4dfc-4715-a2ba-317017f3f150" (UID: "a056fb95-4dfc-4715-a2ba-317017f3f150"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:56:02.606236 master-0 kubenswrapper[23041]: I0308 00:56:02.606146 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a056fb95-4dfc-4715-a2ba-317017f3f150-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a056fb95-4dfc-4715-a2ba-317017f3f150" (UID: "a056fb95-4dfc-4715-a2ba-317017f3f150"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:56:02.643140 master-0 kubenswrapper[23041]: I0308 00:56:02.643066 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzjcq\" (UniqueName: \"kubernetes.io/projected/a056fb95-4dfc-4715-a2ba-317017f3f150-kube-api-access-mzjcq\") on node \"master-0\" DevicePath \"\"" Mar 08 00:56:02.643140 master-0 kubenswrapper[23041]: I0308 00:56:02.643111 23041 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a056fb95-4dfc-4715-a2ba-317017f3f150-internal-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 08 00:56:02.643140 master-0 kubenswrapper[23041]: I0308 00:56:02.643122 23041 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a056fb95-4dfc-4715-a2ba-317017f3f150-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 00:56:02.643140 master-0 kubenswrapper[23041]: I0308 00:56:02.643131 23041 reconciler_common.go:293] 
"Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a056fb95-4dfc-4715-a2ba-317017f3f150-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 00:56:02.643140 master-0 kubenswrapper[23041]: I0308 00:56:02.643141 23041 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a056fb95-4dfc-4715-a2ba-317017f3f150-public-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 08 00:56:02.822681 master-0 kubenswrapper[23041]: I0308 00:56:02.822601 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a7954fa-b332-42f8-8a94-671129de12dc" path="/var/lib/kubelet/pods/7a7954fa-b332-42f8-8a94-671129de12dc/volumes" Mar 08 00:56:03.158721 master-0 kubenswrapper[23041]: I0308 00:56:03.158658 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a056fb95-4dfc-4715-a2ba-317017f3f150","Type":"ContainerDied","Data":"d21b795ac74b4a9be66e7614d1fe2a87f866051282e770f5b1f9694887a41a93"} Mar 08 00:56:03.159737 master-0 kubenswrapper[23041]: I0308 00:56:03.159326 23041 scope.go:117] "RemoveContainer" containerID="773e1a27e4bf7be0e81952dea884888cedc81d955850d00f144a21f59fc3e9ac" Mar 08 00:56:03.159737 master-0 kubenswrapper[23041]: I0308 00:56:03.158746 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 00:56:03.182851 master-0 kubenswrapper[23041]: E0308 00:56:03.182194 23041 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="aadc56138a65f304d33bb1c6d49c2000669787c4fd15b87f7fad77ffff482dd4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 00:56:03.194801 master-0 kubenswrapper[23041]: E0308 00:56:03.194679 23041 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="aadc56138a65f304d33bb1c6d49c2000669787c4fd15b87f7fad77ffff482dd4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 00:56:03.196128 master-0 kubenswrapper[23041]: I0308 00:56:03.196042 23041 scope.go:117] "RemoveContainer" containerID="944def5e84a99e794743d9e83e62bbb64c000d493aa9577c78161857bddbc536" Mar 08 00:56:03.196902 master-0 kubenswrapper[23041]: E0308 00:56:03.196831 23041 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="aadc56138a65f304d33bb1c6d49c2000669787c4fd15b87f7fad77ffff482dd4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 00:56:03.197016 master-0 kubenswrapper[23041]: E0308 00:56:03.196907 23041 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="46a4d080-0f65-4840-a0b7-29508c37d813" containerName="nova-scheduler-scheduler" Mar 08 00:56:03.197637 master-0 kubenswrapper[23041]: I0308 00:56:03.197591 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-api-0"] Mar 08 00:56:03.212145 master-0 kubenswrapper[23041]: I0308 00:56:03.210468 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 08 00:56:03.252290 master-0 kubenswrapper[23041]: I0308 00:56:03.241989 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 08 00:56:03.252290 master-0 kubenswrapper[23041]: E0308 00:56:03.242682 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a7954fa-b332-42f8-8a94-671129de12dc" containerName="init" Mar 08 00:56:03.252290 master-0 kubenswrapper[23041]: I0308 00:56:03.242705 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a7954fa-b332-42f8-8a94-671129de12dc" containerName="init" Mar 08 00:56:03.252290 master-0 kubenswrapper[23041]: E0308 00:56:03.242723 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a056fb95-4dfc-4715-a2ba-317017f3f150" containerName="nova-api-api" Mar 08 00:56:03.252290 master-0 kubenswrapper[23041]: I0308 00:56:03.242733 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="a056fb95-4dfc-4715-a2ba-317017f3f150" containerName="nova-api-api" Mar 08 00:56:03.252290 master-0 kubenswrapper[23041]: E0308 00:56:03.242781 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b898cbca-4bda-4100-8cb5-8adc12f5a160" containerName="nova-manage" Mar 08 00:56:03.252290 master-0 kubenswrapper[23041]: I0308 00:56:03.242790 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="b898cbca-4bda-4100-8cb5-8adc12f5a160" containerName="nova-manage" Mar 08 00:56:03.252290 master-0 kubenswrapper[23041]: E0308 00:56:03.242830 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a7954fa-b332-42f8-8a94-671129de12dc" containerName="dnsmasq-dns" Mar 08 00:56:03.252290 master-0 kubenswrapper[23041]: I0308 00:56:03.242840 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a7954fa-b332-42f8-8a94-671129de12dc" 
containerName="dnsmasq-dns" Mar 08 00:56:03.252290 master-0 kubenswrapper[23041]: E0308 00:56:03.242856 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2" containerName="nova-manage" Mar 08 00:56:03.252290 master-0 kubenswrapper[23041]: I0308 00:56:03.242865 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2" containerName="nova-manage" Mar 08 00:56:03.252290 master-0 kubenswrapper[23041]: E0308 00:56:03.242877 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a056fb95-4dfc-4715-a2ba-317017f3f150" containerName="nova-api-log" Mar 08 00:56:03.252290 master-0 kubenswrapper[23041]: I0308 00:56:03.242884 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="a056fb95-4dfc-4715-a2ba-317017f3f150" containerName="nova-api-log" Mar 08 00:56:03.252290 master-0 kubenswrapper[23041]: I0308 00:56:03.243234 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="b898cbca-4bda-4100-8cb5-8adc12f5a160" containerName="nova-manage" Mar 08 00:56:03.252290 master-0 kubenswrapper[23041]: I0308 00:56:03.243263 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2" containerName="nova-manage" Mar 08 00:56:03.252290 master-0 kubenswrapper[23041]: I0308 00:56:03.243305 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="a056fb95-4dfc-4715-a2ba-317017f3f150" containerName="nova-api-log" Mar 08 00:56:03.252290 master-0 kubenswrapper[23041]: I0308 00:56:03.243328 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a7954fa-b332-42f8-8a94-671129de12dc" containerName="dnsmasq-dns" Mar 08 00:56:03.252290 master-0 kubenswrapper[23041]: I0308 00:56:03.243345 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="a056fb95-4dfc-4715-a2ba-317017f3f150" containerName="nova-api-api" Mar 08 00:56:03.252290 master-0 kubenswrapper[23041]: I0308 
00:56:03.244986 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 00:56:03.252290 master-0 kubenswrapper[23041]: I0308 00:56:03.247798 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 08 00:56:03.252290 master-0 kubenswrapper[23041]: I0308 00:56:03.248098 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 08 00:56:03.252290 master-0 kubenswrapper[23041]: I0308 00:56:03.251683 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 08 00:56:03.286560 master-0 kubenswrapper[23041]: I0308 00:56:03.282036 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 00:56:03.366738 master-0 kubenswrapper[23041]: I0308 00:56:03.366661 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6755b33-35cd-48c4-b586-d27c0d690828-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e6755b33-35cd-48c4-b586-d27c0d690828\") " pod="openstack/nova-api-0" Mar 08 00:56:03.366999 master-0 kubenswrapper[23041]: I0308 00:56:03.366883 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxkbh\" (UniqueName: \"kubernetes.io/projected/e6755b33-35cd-48c4-b586-d27c0d690828-kube-api-access-kxkbh\") pod \"nova-api-0\" (UID: \"e6755b33-35cd-48c4-b586-d27c0d690828\") " pod="openstack/nova-api-0" Mar 08 00:56:03.366999 master-0 kubenswrapper[23041]: I0308 00:56:03.366960 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6755b33-35cd-48c4-b586-d27c0d690828-logs\") pod \"nova-api-0\" (UID: \"e6755b33-35cd-48c4-b586-d27c0d690828\") " pod="openstack/nova-api-0" Mar 08 00:56:03.367180 
master-0 kubenswrapper[23041]: I0308 00:56:03.367143 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6755b33-35cd-48c4-b586-d27c0d690828-config-data\") pod \"nova-api-0\" (UID: \"e6755b33-35cd-48c4-b586-d27c0d690828\") " pod="openstack/nova-api-0" Mar 08 00:56:03.367442 master-0 kubenswrapper[23041]: I0308 00:56:03.367409 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6755b33-35cd-48c4-b586-d27c0d690828-public-tls-certs\") pod \"nova-api-0\" (UID: \"e6755b33-35cd-48c4-b586-d27c0d690828\") " pod="openstack/nova-api-0" Mar 08 00:56:03.367573 master-0 kubenswrapper[23041]: I0308 00:56:03.367542 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6755b33-35cd-48c4-b586-d27c0d690828-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e6755b33-35cd-48c4-b586-d27c0d690828\") " pod="openstack/nova-api-0" Mar 08 00:56:03.469309 master-0 kubenswrapper[23041]: I0308 00:56:03.469145 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxkbh\" (UniqueName: \"kubernetes.io/projected/e6755b33-35cd-48c4-b586-d27c0d690828-kube-api-access-kxkbh\") pod \"nova-api-0\" (UID: \"e6755b33-35cd-48c4-b586-d27c0d690828\") " pod="openstack/nova-api-0" Mar 08 00:56:03.469309 master-0 kubenswrapper[23041]: I0308 00:56:03.469200 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6755b33-35cd-48c4-b586-d27c0d690828-logs\") pod \"nova-api-0\" (UID: \"e6755b33-35cd-48c4-b586-d27c0d690828\") " pod="openstack/nova-api-0" Mar 08 00:56:03.469596 master-0 kubenswrapper[23041]: I0308 00:56:03.469381 23041 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6755b33-35cd-48c4-b586-d27c0d690828-config-data\") pod \"nova-api-0\" (UID: \"e6755b33-35cd-48c4-b586-d27c0d690828\") " pod="openstack/nova-api-0" Mar 08 00:56:03.469650 master-0 kubenswrapper[23041]: I0308 00:56:03.469638 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6755b33-35cd-48c4-b586-d27c0d690828-logs\") pod \"nova-api-0\" (UID: \"e6755b33-35cd-48c4-b586-d27c0d690828\") " pod="openstack/nova-api-0" Mar 08 00:56:03.469765 master-0 kubenswrapper[23041]: I0308 00:56:03.469656 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6755b33-35cd-48c4-b586-d27c0d690828-public-tls-certs\") pod \"nova-api-0\" (UID: \"e6755b33-35cd-48c4-b586-d27c0d690828\") " pod="openstack/nova-api-0" Mar 08 00:56:03.470341 master-0 kubenswrapper[23041]: I0308 00:56:03.470302 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6755b33-35cd-48c4-b586-d27c0d690828-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e6755b33-35cd-48c4-b586-d27c0d690828\") " pod="openstack/nova-api-0" Mar 08 00:56:03.470468 master-0 kubenswrapper[23041]: I0308 00:56:03.470435 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6755b33-35cd-48c4-b586-d27c0d690828-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e6755b33-35cd-48c4-b586-d27c0d690828\") " pod="openstack/nova-api-0" Mar 08 00:56:03.473564 master-0 kubenswrapper[23041]: I0308 00:56:03.473502 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6755b33-35cd-48c4-b586-d27c0d690828-public-tls-certs\") pod \"nova-api-0\" (UID: 
\"e6755b33-35cd-48c4-b586-d27c0d690828\") " pod="openstack/nova-api-0" Mar 08 00:56:03.473682 master-0 kubenswrapper[23041]: I0308 00:56:03.473663 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6755b33-35cd-48c4-b586-d27c0d690828-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e6755b33-35cd-48c4-b586-d27c0d690828\") " pod="openstack/nova-api-0" Mar 08 00:56:03.473817 master-0 kubenswrapper[23041]: I0308 00:56:03.473768 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6755b33-35cd-48c4-b586-d27c0d690828-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e6755b33-35cd-48c4-b586-d27c0d690828\") " pod="openstack/nova-api-0" Mar 08 00:56:03.474831 master-0 kubenswrapper[23041]: I0308 00:56:03.474778 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6755b33-35cd-48c4-b586-d27c0d690828-config-data\") pod \"nova-api-0\" (UID: \"e6755b33-35cd-48c4-b586-d27c0d690828\") " pod="openstack/nova-api-0" Mar 08 00:56:03.484926 master-0 kubenswrapper[23041]: I0308 00:56:03.484867 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxkbh\" (UniqueName: \"kubernetes.io/projected/e6755b33-35cd-48c4-b586-d27c0d690828-kube-api-access-kxkbh\") pod \"nova-api-0\" (UID: \"e6755b33-35cd-48c4-b586-d27c0d690828\") " pod="openstack/nova-api-0" Mar 08 00:56:03.591108 master-0 kubenswrapper[23041]: I0308 00:56:03.591022 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 00:56:04.070757 master-0 kubenswrapper[23041]: I0308 00:56:04.070688 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 00:56:04.073130 master-0 kubenswrapper[23041]: W0308 00:56:04.073075 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6755b33_35cd_48c4_b586_d27c0d690828.slice/crio-35df203770cf31f46088fd2bfa5c57c7b51040025ec533652e1a8730ad414e7f WatchSource:0}: Error finding container 35df203770cf31f46088fd2bfa5c57c7b51040025ec533652e1a8730ad414e7f: Status 404 returned error can't find the container with id 35df203770cf31f46088fd2bfa5c57c7b51040025ec533652e1a8730ad414e7f Mar 08 00:56:04.180919 master-0 kubenswrapper[23041]: I0308 00:56:04.180875 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e6755b33-35cd-48c4-b586-d27c0d690828","Type":"ContainerStarted","Data":"35df203770cf31f46088fd2bfa5c57c7b51040025ec533652e1a8730ad414e7f"} Mar 08 00:56:04.823411 master-0 kubenswrapper[23041]: I0308 00:56:04.821455 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a056fb95-4dfc-4715-a2ba-317017f3f150" path="/var/lib/kubelet/pods/a056fb95-4dfc-4715-a2ba-317017f3f150/volumes" Mar 08 00:56:05.108809 master-0 kubenswrapper[23041]: I0308 00:56:05.108715 23041 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-67f5b4fdc9-swznp" podUID="7a7954fa-b332-42f8-8a94-671129de12dc" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.1.8:5353: i/o timeout" Mar 08 00:56:05.195094 master-0 kubenswrapper[23041]: I0308 00:56:05.194770 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e6755b33-35cd-48c4-b586-d27c0d690828","Type":"ContainerStarted","Data":"5701c085b6e6a57657d6f6f1f868f97759c6ada74294716eec52628b9dfee7b6"} Mar 08 00:56:05.195094 
master-0 kubenswrapper[23041]: I0308 00:56:05.194836 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e6755b33-35cd-48c4-b586-d27c0d690828","Type":"ContainerStarted","Data":"bf5f25ae98d2b6e4b5770798e5a05614f25fe7a2bd1c8c149912459d82f80cfe"} Mar 08 00:56:05.197319 master-0 kubenswrapper[23041]: I0308 00:56:05.197274 23041 generic.go:334] "Generic (PLEG): container finished" podID="41c9e34a-e1f9-4300-ae6c-9b94329b4129" containerID="6d1fe690bafc8f068ccbf54f8c61e41f9ff6888d3cfb0b9b19d9487a1bdae0c7" exitCode=0 Mar 08 00:56:05.197409 master-0 kubenswrapper[23041]: I0308 00:56:05.197327 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"41c9e34a-e1f9-4300-ae6c-9b94329b4129","Type":"ContainerDied","Data":"6d1fe690bafc8f068ccbf54f8c61e41f9ff6888d3cfb0b9b19d9487a1bdae0c7"} Mar 08 00:56:05.224772 master-0 kubenswrapper[23041]: I0308 00:56:05.224694 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.224670281 podStartE2EDuration="2.224670281s" podCreationTimestamp="2026-03-08 00:56:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:56:05.218583123 +0000 UTC m=+1470.691419687" watchObservedRunningTime="2026-03-08 00:56:05.224670281 +0000 UTC m=+1470.697506825" Mar 08 00:56:05.331314 master-0 kubenswrapper[23041]: I0308 00:56:05.330639 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 00:56:05.443721 master-0 kubenswrapper[23041]: I0308 00:56:05.443659 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41c9e34a-e1f9-4300-ae6c-9b94329b4129-combined-ca-bundle\") pod \"41c9e34a-e1f9-4300-ae6c-9b94329b4129\" (UID: \"41c9e34a-e1f9-4300-ae6c-9b94329b4129\") " Mar 08 00:56:05.444097 master-0 kubenswrapper[23041]: I0308 00:56:05.444067 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41c9e34a-e1f9-4300-ae6c-9b94329b4129-logs\") pod \"41c9e34a-e1f9-4300-ae6c-9b94329b4129\" (UID: \"41c9e34a-e1f9-4300-ae6c-9b94329b4129\") " Mar 08 00:56:05.444490 master-0 kubenswrapper[23041]: I0308 00:56:05.444460 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41c9e34a-e1f9-4300-ae6c-9b94329b4129-config-data\") pod \"41c9e34a-e1f9-4300-ae6c-9b94329b4129\" (UID: \"41c9e34a-e1f9-4300-ae6c-9b94329b4129\") " Mar 08 00:56:05.445495 master-0 kubenswrapper[23041]: I0308 00:56:05.445469 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/41c9e34a-e1f9-4300-ae6c-9b94329b4129-nova-metadata-tls-certs\") pod \"41c9e34a-e1f9-4300-ae6c-9b94329b4129\" (UID: \"41c9e34a-e1f9-4300-ae6c-9b94329b4129\") " Mar 08 00:56:05.445712 master-0 kubenswrapper[23041]: I0308 00:56:05.444780 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41c9e34a-e1f9-4300-ae6c-9b94329b4129-logs" (OuterVolumeSpecName: "logs") pod "41c9e34a-e1f9-4300-ae6c-9b94329b4129" (UID: "41c9e34a-e1f9-4300-ae6c-9b94329b4129"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:56:05.445877 master-0 kubenswrapper[23041]: I0308 00:56:05.445851 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wsjd\" (UniqueName: \"kubernetes.io/projected/41c9e34a-e1f9-4300-ae6c-9b94329b4129-kube-api-access-6wsjd\") pod \"41c9e34a-e1f9-4300-ae6c-9b94329b4129\" (UID: \"41c9e34a-e1f9-4300-ae6c-9b94329b4129\") " Mar 08 00:56:05.446861 master-0 kubenswrapper[23041]: I0308 00:56:05.446828 23041 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41c9e34a-e1f9-4300-ae6c-9b94329b4129-logs\") on node \"master-0\" DevicePath \"\"" Mar 08 00:56:05.454902 master-0 kubenswrapper[23041]: I0308 00:56:05.454590 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41c9e34a-e1f9-4300-ae6c-9b94329b4129-kube-api-access-6wsjd" (OuterVolumeSpecName: "kube-api-access-6wsjd") pod "41c9e34a-e1f9-4300-ae6c-9b94329b4129" (UID: "41c9e34a-e1f9-4300-ae6c-9b94329b4129"). InnerVolumeSpecName "kube-api-access-6wsjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:56:05.472841 master-0 kubenswrapper[23041]: I0308 00:56:05.471466 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41c9e34a-e1f9-4300-ae6c-9b94329b4129-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41c9e34a-e1f9-4300-ae6c-9b94329b4129" (UID: "41c9e34a-e1f9-4300-ae6c-9b94329b4129"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:56:05.476522 master-0 kubenswrapper[23041]: I0308 00:56:05.476444 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41c9e34a-e1f9-4300-ae6c-9b94329b4129-config-data" (OuterVolumeSpecName: "config-data") pod "41c9e34a-e1f9-4300-ae6c-9b94329b4129" (UID: "41c9e34a-e1f9-4300-ae6c-9b94329b4129"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:56:05.532607 master-0 kubenswrapper[23041]: I0308 00:56:05.532544 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41c9e34a-e1f9-4300-ae6c-9b94329b4129-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "41c9e34a-e1f9-4300-ae6c-9b94329b4129" (UID: "41c9e34a-e1f9-4300-ae6c-9b94329b4129"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:56:05.549406 master-0 kubenswrapper[23041]: I0308 00:56:05.549280 23041 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41c9e34a-e1f9-4300-ae6c-9b94329b4129-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 00:56:05.549406 master-0 kubenswrapper[23041]: I0308 00:56:05.549347 23041 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/41c9e34a-e1f9-4300-ae6c-9b94329b4129-nova-metadata-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 08 00:56:05.549406 master-0 kubenswrapper[23041]: I0308 00:56:05.549362 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wsjd\" (UniqueName: \"kubernetes.io/projected/41c9e34a-e1f9-4300-ae6c-9b94329b4129-kube-api-access-6wsjd\") on node \"master-0\" DevicePath \"\"" Mar 08 00:56:05.549406 master-0 kubenswrapper[23041]: I0308 00:56:05.549374 23041 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41c9e34a-e1f9-4300-ae6c-9b94329b4129-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 00:56:06.209957 master-0 kubenswrapper[23041]: I0308 00:56:06.209897 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 00:56:06.210512 master-0 kubenswrapper[23041]: I0308 00:56:06.210330 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"41c9e34a-e1f9-4300-ae6c-9b94329b4129","Type":"ContainerDied","Data":"e928e3c6e72a2fda28e1883988067ebcf08d0731583608bb2022e60ead201f8b"} Mar 08 00:56:06.210512 master-0 kubenswrapper[23041]: I0308 00:56:06.210393 23041 scope.go:117] "RemoveContainer" containerID="6d1fe690bafc8f068ccbf54f8c61e41f9ff6888d3cfb0b9b19d9487a1bdae0c7" Mar 08 00:56:06.235826 master-0 kubenswrapper[23041]: I0308 00:56:06.235766 23041 scope.go:117] "RemoveContainer" containerID="0b657ec29bb295e0f4d5fc13f9989ac98f93de32096e44da01019e4c1db0d7c7" Mar 08 00:56:06.265602 master-0 kubenswrapper[23041]: I0308 00:56:06.265520 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 00:56:06.363476 master-0 kubenswrapper[23041]: I0308 00:56:06.363419 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 00:56:06.373222 master-0 kubenswrapper[23041]: I0308 00:56:06.373027 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 08 00:56:06.374086 master-0 kubenswrapper[23041]: E0308 00:56:06.373889 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41c9e34a-e1f9-4300-ae6c-9b94329b4129" containerName="nova-metadata-log" Mar 08 00:56:06.374086 master-0 kubenswrapper[23041]: I0308 00:56:06.373914 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="41c9e34a-e1f9-4300-ae6c-9b94329b4129" containerName="nova-metadata-log" Mar 08 00:56:06.374086 master-0 kubenswrapper[23041]: E0308 00:56:06.373961 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41c9e34a-e1f9-4300-ae6c-9b94329b4129" containerName="nova-metadata-metadata" Mar 08 00:56:06.374086 master-0 kubenswrapper[23041]: I0308 00:56:06.373968 23041 
state_mem.go:107] "Deleted CPUSet assignment" podUID="41c9e34a-e1f9-4300-ae6c-9b94329b4129" containerName="nova-metadata-metadata" Mar 08 00:56:06.375283 master-0 kubenswrapper[23041]: I0308 00:56:06.374261 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="41c9e34a-e1f9-4300-ae6c-9b94329b4129" containerName="nova-metadata-metadata" Mar 08 00:56:06.375283 master-0 kubenswrapper[23041]: I0308 00:56:06.374299 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="41c9e34a-e1f9-4300-ae6c-9b94329b4129" containerName="nova-metadata-log" Mar 08 00:56:06.375673 master-0 kubenswrapper[23041]: I0308 00:56:06.375648 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 00:56:06.378544 master-0 kubenswrapper[23041]: I0308 00:56:06.378513 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 08 00:56:06.378736 master-0 kubenswrapper[23041]: I0308 00:56:06.378717 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 08 00:56:06.385390 master-0 kubenswrapper[23041]: I0308 00:56:06.385331 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 00:56:06.471243 master-0 kubenswrapper[23041]: I0308 00:56:06.471085 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11aa7aed-035e-4164-9863-9821936e1dd6-config-data\") pod \"nova-metadata-0\" (UID: \"11aa7aed-035e-4164-9863-9821936e1dd6\") " pod="openstack/nova-metadata-0" Mar 08 00:56:06.471243 master-0 kubenswrapper[23041]: I0308 00:56:06.471170 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/11aa7aed-035e-4164-9863-9821936e1dd6-nova-metadata-tls-certs\") pod 
\"nova-metadata-0\" (UID: \"11aa7aed-035e-4164-9863-9821936e1dd6\") " pod="openstack/nova-metadata-0" Mar 08 00:56:06.471446 master-0 kubenswrapper[23041]: I0308 00:56:06.471284 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11aa7aed-035e-4164-9863-9821936e1dd6-logs\") pod \"nova-metadata-0\" (UID: \"11aa7aed-035e-4164-9863-9821936e1dd6\") " pod="openstack/nova-metadata-0" Mar 08 00:56:06.471488 master-0 kubenswrapper[23041]: I0308 00:56:06.471452 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11aa7aed-035e-4164-9863-9821936e1dd6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"11aa7aed-035e-4164-9863-9821936e1dd6\") " pod="openstack/nova-metadata-0" Mar 08 00:56:06.471534 master-0 kubenswrapper[23041]: I0308 00:56:06.471497 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl8ff\" (UniqueName: \"kubernetes.io/projected/11aa7aed-035e-4164-9863-9821936e1dd6-kube-api-access-vl8ff\") pod \"nova-metadata-0\" (UID: \"11aa7aed-035e-4164-9863-9821936e1dd6\") " pod="openstack/nova-metadata-0" Mar 08 00:56:06.573469 master-0 kubenswrapper[23041]: I0308 00:56:06.573403 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11aa7aed-035e-4164-9863-9821936e1dd6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"11aa7aed-035e-4164-9863-9821936e1dd6\") " pod="openstack/nova-metadata-0" Mar 08 00:56:06.573686 master-0 kubenswrapper[23041]: I0308 00:56:06.573485 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl8ff\" (UniqueName: \"kubernetes.io/projected/11aa7aed-035e-4164-9863-9821936e1dd6-kube-api-access-vl8ff\") pod \"nova-metadata-0\" (UID: 
\"11aa7aed-035e-4164-9863-9821936e1dd6\") " pod="openstack/nova-metadata-0" Mar 08 00:56:06.573686 master-0 kubenswrapper[23041]: I0308 00:56:06.573548 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11aa7aed-035e-4164-9863-9821936e1dd6-config-data\") pod \"nova-metadata-0\" (UID: \"11aa7aed-035e-4164-9863-9821936e1dd6\") " pod="openstack/nova-metadata-0" Mar 08 00:56:06.573686 master-0 kubenswrapper[23041]: I0308 00:56:06.573570 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/11aa7aed-035e-4164-9863-9821936e1dd6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"11aa7aed-035e-4164-9863-9821936e1dd6\") " pod="openstack/nova-metadata-0" Mar 08 00:56:06.573686 master-0 kubenswrapper[23041]: I0308 00:56:06.573636 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11aa7aed-035e-4164-9863-9821936e1dd6-logs\") pod \"nova-metadata-0\" (UID: \"11aa7aed-035e-4164-9863-9821936e1dd6\") " pod="openstack/nova-metadata-0" Mar 08 00:56:06.574160 master-0 kubenswrapper[23041]: I0308 00:56:06.574132 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11aa7aed-035e-4164-9863-9821936e1dd6-logs\") pod \"nova-metadata-0\" (UID: \"11aa7aed-035e-4164-9863-9821936e1dd6\") " pod="openstack/nova-metadata-0" Mar 08 00:56:06.577120 master-0 kubenswrapper[23041]: I0308 00:56:06.577085 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11aa7aed-035e-4164-9863-9821936e1dd6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"11aa7aed-035e-4164-9863-9821936e1dd6\") " pod="openstack/nova-metadata-0" Mar 08 00:56:06.584588 master-0 kubenswrapper[23041]: I0308 00:56:06.584552 23041 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/11aa7aed-035e-4164-9863-9821936e1dd6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"11aa7aed-035e-4164-9863-9821936e1dd6\") " pod="openstack/nova-metadata-0" Mar 08 00:56:06.585139 master-0 kubenswrapper[23041]: I0308 00:56:06.585117 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11aa7aed-035e-4164-9863-9821936e1dd6-config-data\") pod \"nova-metadata-0\" (UID: \"11aa7aed-035e-4164-9863-9821936e1dd6\") " pod="openstack/nova-metadata-0" Mar 08 00:56:06.593480 master-0 kubenswrapper[23041]: I0308 00:56:06.593452 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl8ff\" (UniqueName: \"kubernetes.io/projected/11aa7aed-035e-4164-9863-9821936e1dd6-kube-api-access-vl8ff\") pod \"nova-metadata-0\" (UID: \"11aa7aed-035e-4164-9863-9821936e1dd6\") " pod="openstack/nova-metadata-0" Mar 08 00:56:06.713157 master-0 kubenswrapper[23041]: I0308 00:56:06.713076 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 00:56:06.837326 master-0 kubenswrapper[23041]: I0308 00:56:06.836650 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41c9e34a-e1f9-4300-ae6c-9b94329b4129" path="/var/lib/kubelet/pods/41c9e34a-e1f9-4300-ae6c-9b94329b4129/volumes" Mar 08 00:56:07.254115 master-0 kubenswrapper[23041]: I0308 00:56:07.254055 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 00:56:07.270926 master-0 kubenswrapper[23041]: W0308 00:56:07.270870 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11aa7aed_035e_4164_9863_9821936e1dd6.slice/crio-a134b0a213a70e593579e8eab066b6bdce3a718e8d790c8c9b76a150cb4ec156 WatchSource:0}: Error finding container a134b0a213a70e593579e8eab066b6bdce3a718e8d790c8c9b76a150cb4ec156: Status 404 returned error can't find the container with id a134b0a213a70e593579e8eab066b6bdce3a718e8d790c8c9b76a150cb4ec156 Mar 08 00:56:07.411407 master-0 kubenswrapper[23041]: E0308 00:56:07.411341 23041 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46a4d080_0f65_4840_a0b7_29508c37d813.slice/crio-conmon-aadc56138a65f304d33bb1c6d49c2000669787c4fd15b87f7fad77ffff482dd4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46a4d080_0f65_4840_a0b7_29508c37d813.slice/crio-aadc56138a65f304d33bb1c6d49c2000669787c4fd15b87f7fad77ffff482dd4.scope\": RecentStats: unable to find data in memory cache]" Mar 08 00:56:07.551166 master-0 kubenswrapper[23041]: I0308 00:56:07.551132 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 00:56:07.615488 master-0 kubenswrapper[23041]: I0308 00:56:07.615420 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rx5x6\" (UniqueName: \"kubernetes.io/projected/46a4d080-0f65-4840-a0b7-29508c37d813-kube-api-access-rx5x6\") pod \"46a4d080-0f65-4840-a0b7-29508c37d813\" (UID: \"46a4d080-0f65-4840-a0b7-29508c37d813\") " Mar 08 00:56:07.615685 master-0 kubenswrapper[23041]: I0308 00:56:07.615567 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46a4d080-0f65-4840-a0b7-29508c37d813-combined-ca-bundle\") pod \"46a4d080-0f65-4840-a0b7-29508c37d813\" (UID: \"46a4d080-0f65-4840-a0b7-29508c37d813\") " Mar 08 00:56:07.615838 master-0 kubenswrapper[23041]: I0308 00:56:07.615818 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46a4d080-0f65-4840-a0b7-29508c37d813-config-data\") pod \"46a4d080-0f65-4840-a0b7-29508c37d813\" (UID: \"46a4d080-0f65-4840-a0b7-29508c37d813\") " Mar 08 00:56:07.619870 master-0 kubenswrapper[23041]: I0308 00:56:07.619797 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46a4d080-0f65-4840-a0b7-29508c37d813-kube-api-access-rx5x6" (OuterVolumeSpecName: "kube-api-access-rx5x6") pod "46a4d080-0f65-4840-a0b7-29508c37d813" (UID: "46a4d080-0f65-4840-a0b7-29508c37d813"). InnerVolumeSpecName "kube-api-access-rx5x6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:56:07.657895 master-0 kubenswrapper[23041]: I0308 00:56:07.657833 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46a4d080-0f65-4840-a0b7-29508c37d813-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46a4d080-0f65-4840-a0b7-29508c37d813" (UID: "46a4d080-0f65-4840-a0b7-29508c37d813"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:56:07.668772 master-0 kubenswrapper[23041]: I0308 00:56:07.668704 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46a4d080-0f65-4840-a0b7-29508c37d813-config-data" (OuterVolumeSpecName: "config-data") pod "46a4d080-0f65-4840-a0b7-29508c37d813" (UID: "46a4d080-0f65-4840-a0b7-29508c37d813"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:56:07.721467 master-0 kubenswrapper[23041]: I0308 00:56:07.719623 23041 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46a4d080-0f65-4840-a0b7-29508c37d813-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 00:56:07.721467 master-0 kubenswrapper[23041]: I0308 00:56:07.720752 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rx5x6\" (UniqueName: \"kubernetes.io/projected/46a4d080-0f65-4840-a0b7-29508c37d813-kube-api-access-rx5x6\") on node \"master-0\" DevicePath \"\"" Mar 08 00:56:07.721467 master-0 kubenswrapper[23041]: I0308 00:56:07.720770 23041 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46a4d080-0f65-4840-a0b7-29508c37d813-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 00:56:08.234930 master-0 kubenswrapper[23041]: I0308 00:56:08.234868 23041 generic.go:334] "Generic (PLEG): container finished" 
podID="46a4d080-0f65-4840-a0b7-29508c37d813" containerID="aadc56138a65f304d33bb1c6d49c2000669787c4fd15b87f7fad77ffff482dd4" exitCode=0 Mar 08 00:56:08.234930 master-0 kubenswrapper[23041]: I0308 00:56:08.234933 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 00:56:08.235220 master-0 kubenswrapper[23041]: I0308 00:56:08.234936 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"46a4d080-0f65-4840-a0b7-29508c37d813","Type":"ContainerDied","Data":"aadc56138a65f304d33bb1c6d49c2000669787c4fd15b87f7fad77ffff482dd4"} Mar 08 00:56:08.235220 master-0 kubenswrapper[23041]: I0308 00:56:08.235041 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"46a4d080-0f65-4840-a0b7-29508c37d813","Type":"ContainerDied","Data":"03d789b5121414cef0bad2988108d64de7af02b6467829a72944fdb38019f56d"} Mar 08 00:56:08.235220 master-0 kubenswrapper[23041]: I0308 00:56:08.235060 23041 scope.go:117] "RemoveContainer" containerID="aadc56138a65f304d33bb1c6d49c2000669787c4fd15b87f7fad77ffff482dd4" Mar 08 00:56:08.238033 master-0 kubenswrapper[23041]: I0308 00:56:08.237946 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"11aa7aed-035e-4164-9863-9821936e1dd6","Type":"ContainerStarted","Data":"355c9f00daa7bfb578ce461274d5b9f8131cf3cc0885a7922bd640f5aa4f7fda"} Mar 08 00:56:08.238033 master-0 kubenswrapper[23041]: I0308 00:56:08.238001 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"11aa7aed-035e-4164-9863-9821936e1dd6","Type":"ContainerStarted","Data":"f760dea5ec023c4e8199e9b9f919e4951cc028112a6a39ccfc9c26bd0562d9ac"} Mar 08 00:56:08.238033 master-0 kubenswrapper[23041]: I0308 00:56:08.238015 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"11aa7aed-035e-4164-9863-9821936e1dd6","Type":"ContainerStarted","Data":"a134b0a213a70e593579e8eab066b6bdce3a718e8d790c8c9b76a150cb4ec156"} Mar 08 00:56:08.258350 master-0 kubenswrapper[23041]: I0308 00:56:08.258258 23041 scope.go:117] "RemoveContainer" containerID="aadc56138a65f304d33bb1c6d49c2000669787c4fd15b87f7fad77ffff482dd4" Mar 08 00:56:08.259063 master-0 kubenswrapper[23041]: E0308 00:56:08.259020 23041 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aadc56138a65f304d33bb1c6d49c2000669787c4fd15b87f7fad77ffff482dd4\": container with ID starting with aadc56138a65f304d33bb1c6d49c2000669787c4fd15b87f7fad77ffff482dd4 not found: ID does not exist" containerID="aadc56138a65f304d33bb1c6d49c2000669787c4fd15b87f7fad77ffff482dd4" Mar 08 00:56:08.259165 master-0 kubenswrapper[23041]: I0308 00:56:08.259051 23041 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aadc56138a65f304d33bb1c6d49c2000669787c4fd15b87f7fad77ffff482dd4"} err="failed to get container status \"aadc56138a65f304d33bb1c6d49c2000669787c4fd15b87f7fad77ffff482dd4\": rpc error: code = NotFound desc = could not find container \"aadc56138a65f304d33bb1c6d49c2000669787c4fd15b87f7fad77ffff482dd4\": container with ID starting with aadc56138a65f304d33bb1c6d49c2000669787c4fd15b87f7fad77ffff482dd4 not found: ID does not exist" Mar 08 00:56:08.266916 master-0 kubenswrapper[23041]: I0308 00:56:08.266826 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.266801355 podStartE2EDuration="2.266801355s" podCreationTimestamp="2026-03-08 00:56:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:56:08.261920366 +0000 UTC m=+1473.734756920" watchObservedRunningTime="2026-03-08 00:56:08.266801355 +0000 UTC 
m=+1473.739637909" Mar 08 00:56:08.290304 master-0 kubenswrapper[23041]: I0308 00:56:08.290218 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 00:56:08.302582 master-0 kubenswrapper[23041]: I0308 00:56:08.302501 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 00:56:08.339053 master-0 kubenswrapper[23041]: I0308 00:56:08.338986 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 00:56:08.339935 master-0 kubenswrapper[23041]: E0308 00:56:08.339900 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46a4d080-0f65-4840-a0b7-29508c37d813" containerName="nova-scheduler-scheduler" Mar 08 00:56:08.340030 master-0 kubenswrapper[23041]: I0308 00:56:08.339941 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="46a4d080-0f65-4840-a0b7-29508c37d813" containerName="nova-scheduler-scheduler" Mar 08 00:56:08.340517 master-0 kubenswrapper[23041]: I0308 00:56:08.340469 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="46a4d080-0f65-4840-a0b7-29508c37d813" containerName="nova-scheduler-scheduler" Mar 08 00:56:08.379463 master-0 kubenswrapper[23041]: I0308 00:56:08.379396 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 00:56:08.379689 master-0 kubenswrapper[23041]: I0308 00:56:08.379560 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 00:56:08.384603 master-0 kubenswrapper[23041]: I0308 00:56:08.384536 23041 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 08 00:56:08.437601 master-0 kubenswrapper[23041]: I0308 00:56:08.437522 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/552e5d8e-2e52-45cc-ae89-ab38e534d066-config-data\") pod \"nova-scheduler-0\" (UID: \"552e5d8e-2e52-45cc-ae89-ab38e534d066\") " pod="openstack/nova-scheduler-0" Mar 08 00:56:08.437877 master-0 kubenswrapper[23041]: I0308 00:56:08.437724 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/552e5d8e-2e52-45cc-ae89-ab38e534d066-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"552e5d8e-2e52-45cc-ae89-ab38e534d066\") " pod="openstack/nova-scheduler-0" Mar 08 00:56:08.438305 master-0 kubenswrapper[23041]: I0308 00:56:08.438267 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q8q9\" (UniqueName: \"kubernetes.io/projected/552e5d8e-2e52-45cc-ae89-ab38e534d066-kube-api-access-5q8q9\") pod \"nova-scheduler-0\" (UID: \"552e5d8e-2e52-45cc-ae89-ab38e534d066\") " pod="openstack/nova-scheduler-0" Mar 08 00:56:08.540913 master-0 kubenswrapper[23041]: I0308 00:56:08.540839 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/552e5d8e-2e52-45cc-ae89-ab38e534d066-config-data\") pod \"nova-scheduler-0\" (UID: \"552e5d8e-2e52-45cc-ae89-ab38e534d066\") " pod="openstack/nova-scheduler-0" Mar 08 00:56:08.540913 master-0 kubenswrapper[23041]: I0308 00:56:08.540912 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/552e5d8e-2e52-45cc-ae89-ab38e534d066-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"552e5d8e-2e52-45cc-ae89-ab38e534d066\") " pod="openstack/nova-scheduler-0" Mar 08 00:56:08.541238 master-0 kubenswrapper[23041]: I0308 00:56:08.541075 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q8q9\" (UniqueName: \"kubernetes.io/projected/552e5d8e-2e52-45cc-ae89-ab38e534d066-kube-api-access-5q8q9\") pod \"nova-scheduler-0\" (UID: \"552e5d8e-2e52-45cc-ae89-ab38e534d066\") " pod="openstack/nova-scheduler-0" Mar 08 00:56:08.546968 master-0 kubenswrapper[23041]: I0308 00:56:08.546912 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/552e5d8e-2e52-45cc-ae89-ab38e534d066-config-data\") pod \"nova-scheduler-0\" (UID: \"552e5d8e-2e52-45cc-ae89-ab38e534d066\") " pod="openstack/nova-scheduler-0" Mar 08 00:56:08.547392 master-0 kubenswrapper[23041]: I0308 00:56:08.547336 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/552e5d8e-2e52-45cc-ae89-ab38e534d066-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"552e5d8e-2e52-45cc-ae89-ab38e534d066\") " pod="openstack/nova-scheduler-0" Mar 08 00:56:08.560009 master-0 kubenswrapper[23041]: I0308 00:56:08.559941 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q8q9\" (UniqueName: \"kubernetes.io/projected/552e5d8e-2e52-45cc-ae89-ab38e534d066-kube-api-access-5q8q9\") pod \"nova-scheduler-0\" (UID: \"552e5d8e-2e52-45cc-ae89-ab38e534d066\") " pod="openstack/nova-scheduler-0" Mar 08 00:56:08.715344 master-0 kubenswrapper[23041]: I0308 00:56:08.715286 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 00:56:08.825563 master-0 kubenswrapper[23041]: I0308 00:56:08.825481 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46a4d080-0f65-4840-a0b7-29508c37d813" path="/var/lib/kubelet/pods/46a4d080-0f65-4840-a0b7-29508c37d813/volumes" Mar 08 00:56:09.194501 master-0 kubenswrapper[23041]: I0308 00:56:09.194454 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 00:56:09.256162 master-0 kubenswrapper[23041]: I0308 00:56:09.256106 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"552e5d8e-2e52-45cc-ae89-ab38e534d066","Type":"ContainerStarted","Data":"081c2432d61e18b883f1193cd319f4cf5fdadf1635a0416a953c607afb9eb2c3"} Mar 08 00:56:10.274574 master-0 kubenswrapper[23041]: I0308 00:56:10.274483 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"552e5d8e-2e52-45cc-ae89-ab38e534d066","Type":"ContainerStarted","Data":"c2b33fed6bb2fd3c8ccfbea6abf8a75c22daf648e9fec92e8331364f3bd608a8"} Mar 08 00:56:10.292442 master-0 kubenswrapper[23041]: I0308 00:56:10.292353 23041 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="41c9e34a-e1f9-4300-ae6c-9b94329b4129" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.128.1.13:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 00:56:10.292728 master-0 kubenswrapper[23041]: I0308 00:56:10.292689 23041 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="41c9e34a-e1f9-4300-ae6c-9b94329b4129" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.128.1.13:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 00:56:10.800594 master-0 
kubenswrapper[23041]: I0308 00:56:10.800489 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.8004585520000003 podStartE2EDuration="2.800458552s" podCreationTimestamp="2026-03-08 00:56:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:56:10.789487164 +0000 UTC m=+1476.262323728" watchObservedRunningTime="2026-03-08 00:56:10.800458552 +0000 UTC m=+1476.273295146" Mar 08 00:56:11.714295 master-0 kubenswrapper[23041]: I0308 00:56:11.714147 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 00:56:11.715141 master-0 kubenswrapper[23041]: I0308 00:56:11.714334 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 00:56:13.591982 master-0 kubenswrapper[23041]: I0308 00:56:13.591886 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 00:56:13.591982 master-0 kubenswrapper[23041]: I0308 00:56:13.592005 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 00:56:13.723225 master-0 kubenswrapper[23041]: I0308 00:56:13.719332 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 08 00:56:14.608450 master-0 kubenswrapper[23041]: I0308 00:56:14.608366 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e6755b33-35cd-48c4-b586-d27c0d690828" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.128.1.19:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 00:56:14.608450 master-0 kubenswrapper[23041]: I0308 00:56:14.608393 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="e6755b33-35cd-48c4-b586-d27c0d690828" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.128.1.19:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 00:56:16.714327 master-0 kubenswrapper[23041]: I0308 00:56:16.714098 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 08 00:56:16.714327 master-0 kubenswrapper[23041]: I0308 00:56:16.714182 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 08 00:56:17.727552 master-0 kubenswrapper[23041]: I0308 00:56:17.727452 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="11aa7aed-035e-4164-9863-9821936e1dd6" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.128.1.20:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 00:56:17.727552 master-0 kubenswrapper[23041]: I0308 00:56:17.727482 23041 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="11aa7aed-035e-4164-9863-9821936e1dd6" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.128.1.20:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 00:56:18.716091 master-0 kubenswrapper[23041]: I0308 00:56:18.716033 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 08 00:56:18.755079 master-0 kubenswrapper[23041]: I0308 00:56:18.755012 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 08 00:56:19.414946 master-0 kubenswrapper[23041]: I0308 00:56:19.414899 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 08 00:56:23.603292 master-0 kubenswrapper[23041]: I0308 00:56:23.603134 23041 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 08 00:56:23.603868 master-0 kubenswrapper[23041]: I0308 00:56:23.603835 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 08 00:56:23.604637 master-0 kubenswrapper[23041]: I0308 00:56:23.604350 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 08 00:56:23.604637 master-0 kubenswrapper[23041]: I0308 00:56:23.604419 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 08 00:56:23.612344 master-0 kubenswrapper[23041]: I0308 00:56:23.612270 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 08 00:56:23.613519 master-0 kubenswrapper[23041]: I0308 00:56:23.613471 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 08 00:56:26.723089 master-0 kubenswrapper[23041]: I0308 00:56:26.723021 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 08 00:56:26.723847 master-0 kubenswrapper[23041]: I0308 00:56:26.723785 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 08 00:56:26.728569 master-0 kubenswrapper[23041]: I0308 00:56:26.728527 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 08 00:56:27.491721 master-0 kubenswrapper[23041]: I0308 00:56:27.491672 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 08 00:56:53.345803 master-0 kubenswrapper[23041]: I0308 00:56:53.345739 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["sushy-emulator/sushy-emulator-78f6d7d749-mx5qs"] Mar 08 00:56:53.346517 master-0 kubenswrapper[23041]: I0308 00:56:53.346113 23041 
kuberuntime_container.go:808] "Killing container with a grace period" pod="sushy-emulator/sushy-emulator-78f6d7d749-mx5qs" podUID="4f21bb0f-7fc4-43de-9212-1685450891b3" containerName="sushy-emulator" containerID="cri-o://5c9bfa4d8345f65b9b28045656b02c8296a7bfb60ba78fb5d9472cdf870cb78b" gracePeriod=30 Mar 08 00:56:53.877334 master-0 kubenswrapper[23041]: I0308 00:56:53.877264 23041 generic.go:334] "Generic (PLEG): container finished" podID="4f21bb0f-7fc4-43de-9212-1685450891b3" containerID="5c9bfa4d8345f65b9b28045656b02c8296a7bfb60ba78fb5d9472cdf870cb78b" exitCode=0 Mar 08 00:56:53.877334 master-0 kubenswrapper[23041]: I0308 00:56:53.877327 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-78f6d7d749-mx5qs" event={"ID":"4f21bb0f-7fc4-43de-9212-1685450891b3","Type":"ContainerDied","Data":"5c9bfa4d8345f65b9b28045656b02c8296a7bfb60ba78fb5d9472cdf870cb78b"} Mar 08 00:56:54.126433 master-0 kubenswrapper[23041]: I0308 00:56:54.126380 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/sushy-emulator-78f6d7d749-mx5qs" Mar 08 00:56:54.212351 master-0 kubenswrapper[23041]: I0308 00:56:54.209698 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/4f21bb0f-7fc4-43de-9212-1685450891b3-sushy-emulator-config\") pod \"4f21bb0f-7fc4-43de-9212-1685450891b3\" (UID: \"4f21bb0f-7fc4-43de-9212-1685450891b3\") " Mar 08 00:56:54.242229 master-0 kubenswrapper[23041]: I0308 00:56:54.229062 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/4f21bb0f-7fc4-43de-9212-1685450891b3-os-client-config\") pod \"4f21bb0f-7fc4-43de-9212-1685450891b3\" (UID: \"4f21bb0f-7fc4-43de-9212-1685450891b3\") " Mar 08 00:56:54.242229 master-0 kubenswrapper[23041]: I0308 00:56:54.229151 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k86ff\" (UniqueName: \"kubernetes.io/projected/4f21bb0f-7fc4-43de-9212-1685450891b3-kube-api-access-k86ff\") pod \"4f21bb0f-7fc4-43de-9212-1685450891b3\" (UID: \"4f21bb0f-7fc4-43de-9212-1685450891b3\") " Mar 08 00:56:54.242229 master-0 kubenswrapper[23041]: I0308 00:56:54.233417 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f21bb0f-7fc4-43de-9212-1685450891b3-sushy-emulator-config" (OuterVolumeSpecName: "sushy-emulator-config") pod "4f21bb0f-7fc4-43de-9212-1685450891b3" (UID: "4f21bb0f-7fc4-43de-9212-1685450891b3"). InnerVolumeSpecName "sushy-emulator-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:56:54.250222 master-0 kubenswrapper[23041]: I0308 00:56:54.244589 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f21bb0f-7fc4-43de-9212-1685450891b3-os-client-config" (OuterVolumeSpecName: "os-client-config") pod "4f21bb0f-7fc4-43de-9212-1685450891b3" (UID: "4f21bb0f-7fc4-43de-9212-1685450891b3"). InnerVolumeSpecName "os-client-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:56:54.271617 master-0 kubenswrapper[23041]: I0308 00:56:54.260325 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["sushy-emulator/sushy-emulator-84965d5d88-ffft9"] Mar 08 00:56:54.271617 master-0 kubenswrapper[23041]: E0308 00:56:54.261188 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f21bb0f-7fc4-43de-9212-1685450891b3" containerName="sushy-emulator" Mar 08 00:56:54.271617 master-0 kubenswrapper[23041]: I0308 00:56:54.261243 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f21bb0f-7fc4-43de-9212-1685450891b3" containerName="sushy-emulator" Mar 08 00:56:54.271617 master-0 kubenswrapper[23041]: I0308 00:56:54.261512 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f21bb0f-7fc4-43de-9212-1685450891b3" containerName="sushy-emulator" Mar 08 00:56:54.271617 master-0 kubenswrapper[23041]: I0308 00:56:54.262337 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/sushy-emulator-84965d5d88-ffft9" Mar 08 00:56:54.271617 master-0 kubenswrapper[23041]: I0308 00:56:54.264888 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f21bb0f-7fc4-43de-9212-1685450891b3-kube-api-access-k86ff" (OuterVolumeSpecName: "kube-api-access-k86ff") pod "4f21bb0f-7fc4-43de-9212-1685450891b3" (UID: "4f21bb0f-7fc4-43de-9212-1685450891b3"). InnerVolumeSpecName "kube-api-access-k86ff". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:56:54.283395 master-0 kubenswrapper[23041]: I0308 00:56:54.282371 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-84965d5d88-ffft9"] Mar 08 00:56:54.335445 master-0 kubenswrapper[23041]: I0308 00:56:54.334734 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn9bc\" (UniqueName: \"kubernetes.io/projected/77d8f0fe-c2b6-46b9-a605-03451bc5ec6f-kube-api-access-mn9bc\") pod \"sushy-emulator-84965d5d88-ffft9\" (UID: \"77d8f0fe-c2b6-46b9-a605-03451bc5ec6f\") " pod="sushy-emulator/sushy-emulator-84965d5d88-ffft9" Mar 08 00:56:54.335445 master-0 kubenswrapper[23041]: I0308 00:56:54.334883 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/77d8f0fe-c2b6-46b9-a605-03451bc5ec6f-os-client-config\") pod \"sushy-emulator-84965d5d88-ffft9\" (UID: \"77d8f0fe-c2b6-46b9-a605-03451bc5ec6f\") " pod="sushy-emulator/sushy-emulator-84965d5d88-ffft9" Mar 08 00:56:54.335445 master-0 kubenswrapper[23041]: I0308 00:56:54.334950 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/77d8f0fe-c2b6-46b9-a605-03451bc5ec6f-sushy-emulator-config\") pod \"sushy-emulator-84965d5d88-ffft9\" (UID: \"77d8f0fe-c2b6-46b9-a605-03451bc5ec6f\") " pod="sushy-emulator/sushy-emulator-84965d5d88-ffft9" Mar 08 00:56:54.335445 master-0 kubenswrapper[23041]: I0308 00:56:54.335118 23041 reconciler_common.go:293] "Volume detached for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/4f21bb0f-7fc4-43de-9212-1685450891b3-os-client-config\") on node \"master-0\" DevicePath \"\"" Mar 08 00:56:54.335445 master-0 kubenswrapper[23041]: I0308 00:56:54.335132 23041 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-k86ff\" (UniqueName: \"kubernetes.io/projected/4f21bb0f-7fc4-43de-9212-1685450891b3-kube-api-access-k86ff\") on node \"master-0\" DevicePath \"\"" Mar 08 00:56:54.335445 master-0 kubenswrapper[23041]: I0308 00:56:54.335143 23041 reconciler_common.go:293] "Volume detached for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/4f21bb0f-7fc4-43de-9212-1685450891b3-sushy-emulator-config\") on node \"master-0\" DevicePath \"\"" Mar 08 00:56:54.437162 master-0 kubenswrapper[23041]: I0308 00:56:54.437108 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/77d8f0fe-c2b6-46b9-a605-03451bc5ec6f-sushy-emulator-config\") pod \"sushy-emulator-84965d5d88-ffft9\" (UID: \"77d8f0fe-c2b6-46b9-a605-03451bc5ec6f\") " pod="sushy-emulator/sushy-emulator-84965d5d88-ffft9" Mar 08 00:56:54.437673 master-0 kubenswrapper[23041]: I0308 00:56:54.437437 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn9bc\" (UniqueName: \"kubernetes.io/projected/77d8f0fe-c2b6-46b9-a605-03451bc5ec6f-kube-api-access-mn9bc\") pod \"sushy-emulator-84965d5d88-ffft9\" (UID: \"77d8f0fe-c2b6-46b9-a605-03451bc5ec6f\") " pod="sushy-emulator/sushy-emulator-84965d5d88-ffft9" Mar 08 00:56:54.437673 master-0 kubenswrapper[23041]: I0308 00:56:54.437534 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/77d8f0fe-c2b6-46b9-a605-03451bc5ec6f-os-client-config\") pod \"sushy-emulator-84965d5d88-ffft9\" (UID: \"77d8f0fe-c2b6-46b9-a605-03451bc5ec6f\") " pod="sushy-emulator/sushy-emulator-84965d5d88-ffft9" Mar 08 00:56:54.437673 master-0 kubenswrapper[23041]: I0308 00:56:54.437579 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sushy-emulator-config\" (UniqueName: 
\"kubernetes.io/configmap/77d8f0fe-c2b6-46b9-a605-03451bc5ec6f-sushy-emulator-config\") pod \"sushy-emulator-84965d5d88-ffft9\" (UID: \"77d8f0fe-c2b6-46b9-a605-03451bc5ec6f\") " pod="sushy-emulator/sushy-emulator-84965d5d88-ffft9" Mar 08 00:56:54.440778 master-0 kubenswrapper[23041]: I0308 00:56:54.440752 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/77d8f0fe-c2b6-46b9-a605-03451bc5ec6f-os-client-config\") pod \"sushy-emulator-84965d5d88-ffft9\" (UID: \"77d8f0fe-c2b6-46b9-a605-03451bc5ec6f\") " pod="sushy-emulator/sushy-emulator-84965d5d88-ffft9" Mar 08 00:56:54.452474 master-0 kubenswrapper[23041]: I0308 00:56:54.452395 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn9bc\" (UniqueName: \"kubernetes.io/projected/77d8f0fe-c2b6-46b9-a605-03451bc5ec6f-kube-api-access-mn9bc\") pod \"sushy-emulator-84965d5d88-ffft9\" (UID: \"77d8f0fe-c2b6-46b9-a605-03451bc5ec6f\") " pod="sushy-emulator/sushy-emulator-84965d5d88-ffft9" Mar 08 00:56:54.654309 master-0 kubenswrapper[23041]: I0308 00:56:54.653779 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/sushy-emulator-84965d5d88-ffft9" Mar 08 00:56:54.899896 master-0 kubenswrapper[23041]: I0308 00:56:54.899832 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-78f6d7d749-mx5qs" event={"ID":"4f21bb0f-7fc4-43de-9212-1685450891b3","Type":"ContainerDied","Data":"319a1f16c2c3595d4f8e4beda3c218be05b8b0297c5dce7fd1c1cb9fcbe4306e"} Mar 08 00:56:54.900226 master-0 kubenswrapper[23041]: I0308 00:56:54.899908 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/sushy-emulator-78f6d7d749-mx5qs" Mar 08 00:56:54.900226 master-0 kubenswrapper[23041]: I0308 00:56:54.899911 23041 scope.go:117] "RemoveContainer" containerID="5c9bfa4d8345f65b9b28045656b02c8296a7bfb60ba78fb5d9472cdf870cb78b" Mar 08 00:56:54.931767 master-0 kubenswrapper[23041]: I0308 00:56:54.931635 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["sushy-emulator/sushy-emulator-78f6d7d749-mx5qs"] Mar 08 00:56:54.945015 master-0 kubenswrapper[23041]: I0308 00:56:54.944945 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["sushy-emulator/sushy-emulator-78f6d7d749-mx5qs"] Mar 08 00:56:55.340640 master-0 kubenswrapper[23041]: I0308 00:56:55.340565 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-84965d5d88-ffft9"] Mar 08 00:56:55.353243 master-0 kubenswrapper[23041]: W0308 00:56:55.353136 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77d8f0fe_c2b6_46b9_a605_03451bc5ec6f.slice/crio-ff5982e6f3ed10c013ccd5198dcfaa81741b3eed1418d96014361d97006355d7 WatchSource:0}: Error finding container ff5982e6f3ed10c013ccd5198dcfaa81741b3eed1418d96014361d97006355d7: Status 404 returned error can't find the container with id ff5982e6f3ed10c013ccd5198dcfaa81741b3eed1418d96014361d97006355d7 Mar 08 00:56:55.919021 master-0 kubenswrapper[23041]: I0308 00:56:55.918955 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-84965d5d88-ffft9" event={"ID":"77d8f0fe-c2b6-46b9-a605-03451bc5ec6f","Type":"ContainerStarted","Data":"5a0de547d80b0a28250bdde67b325bbe179de294a548b5edb896ffb084c757bc"} Mar 08 00:56:55.919021 master-0 kubenswrapper[23041]: I0308 00:56:55.919018 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-84965d5d88-ffft9" 
event={"ID":"77d8f0fe-c2b6-46b9-a605-03451bc5ec6f","Type":"ContainerStarted","Data":"ff5982e6f3ed10c013ccd5198dcfaa81741b3eed1418d96014361d97006355d7"} Mar 08 00:56:55.952501 master-0 kubenswrapper[23041]: I0308 00:56:55.952354 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="sushy-emulator/sushy-emulator-84965d5d88-ffft9" podStartSLOduration=1.952325095 podStartE2EDuration="1.952325095s" podCreationTimestamp="2026-03-08 00:56:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:56:55.944005892 +0000 UTC m=+1521.416842446" watchObservedRunningTime="2026-03-08 00:56:55.952325095 +0000 UTC m=+1521.425161659" Mar 08 00:56:56.828784 master-0 kubenswrapper[23041]: I0308 00:56:56.828688 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f21bb0f-7fc4-43de-9212-1685450891b3" path="/var/lib/kubelet/pods/4f21bb0f-7fc4-43de-9212-1685450891b3/volumes" Mar 08 00:57:04.654415 master-0 kubenswrapper[23041]: I0308 00:57:04.654203 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="sushy-emulator/sushy-emulator-84965d5d88-ffft9" Mar 08 00:57:04.654415 master-0 kubenswrapper[23041]: I0308 00:57:04.654413 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="sushy-emulator/sushy-emulator-84965d5d88-ffft9" Mar 08 00:57:04.665076 master-0 kubenswrapper[23041]: I0308 00:57:04.665014 23041 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="sushy-emulator/sushy-emulator-84965d5d88-ffft9" Mar 08 00:57:05.093168 master-0 kubenswrapper[23041]: I0308 00:57:05.093083 23041 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="sushy-emulator/sushy-emulator-84965d5d88-ffft9" Mar 08 00:58:26.204492 master-0 kubenswrapper[23041]: E0308 00:58:26.204435 23041 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 
192.168.32.10:34640->192.168.32.10:36123: write tcp 192.168.32.10:34640->192.168.32.10:36123: write: broken pipe Mar 08 00:58:45.550703 master-0 kubenswrapper[23041]: I0308 00:58:45.550608 23041 scope.go:117] "RemoveContainer" containerID="b61989e3130b4a8de0150f5c936dfb8c14d3db55340921bd2a521eb497b101dd" Mar 08 00:58:45.591147 master-0 kubenswrapper[23041]: I0308 00:58:45.591087 23041 scope.go:117] "RemoveContainer" containerID="b5cc00b2a62bf8b551efcd4ca48fe56eb34d7ac2ce41eaf49cceecb283aedcf2" Mar 08 00:58:45.628020 master-0 kubenswrapper[23041]: I0308 00:58:45.627626 23041 scope.go:117] "RemoveContainer" containerID="277a8c4533f1025d947afcc3d79ecf0c0976c5908aa9ea64d7ca29cbdcceeffb" Mar 08 00:59:45.797115 master-0 kubenswrapper[23041]: I0308 00:59:45.797024 23041 scope.go:117] "RemoveContainer" containerID="0b73dc3da2facaad884834e4e3d982f5a7e048f7f84409b02726bbee41d64a4f" Mar 08 00:59:45.820570 master-0 kubenswrapper[23041]: I0308 00:59:45.820519 23041 scope.go:117] "RemoveContainer" containerID="fd9fab216de9d30e4f9aa2c8e841efc44475740d2c6c11e428cd72a928c01e9e" Mar 08 01:00:45.867518 master-0 kubenswrapper[23041]: I0308 01:00:45.867367 23041 scope.go:117] "RemoveContainer" containerID="ee23f3e7fd35bb7254573240107509f760785253c3278ef6d8405718c503d038" Mar 08 01:01:00.208627 master-0 kubenswrapper[23041]: I0308 01:01:00.208500 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29548861-l62vw"] Mar 08 01:01:00.210075 master-0 kubenswrapper[23041]: I0308 01:01:00.210046 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29548861-l62vw" Mar 08 01:01:00.226569 master-0 kubenswrapper[23041]: I0308 01:01:00.226431 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29548861-l62vw"] Mar 08 01:01:00.332000 master-0 kubenswrapper[23041]: I0308 01:01:00.331961 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/14e1d055-941e-4272-ad4b-b54a411f0fbd-fernet-keys\") pod \"keystone-cron-29548861-l62vw\" (UID: \"14e1d055-941e-4272-ad4b-b54a411f0fbd\") " pod="openstack/keystone-cron-29548861-l62vw" Mar 08 01:01:00.332502 master-0 kubenswrapper[23041]: I0308 01:01:00.332358 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14e1d055-941e-4272-ad4b-b54a411f0fbd-config-data\") pod \"keystone-cron-29548861-l62vw\" (UID: \"14e1d055-941e-4272-ad4b-b54a411f0fbd\") " pod="openstack/keystone-cron-29548861-l62vw" Mar 08 01:01:00.333133 master-0 kubenswrapper[23041]: I0308 01:01:00.333114 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc98l\" (UniqueName: \"kubernetes.io/projected/14e1d055-941e-4272-ad4b-b54a411f0fbd-kube-api-access-pc98l\") pod \"keystone-cron-29548861-l62vw\" (UID: \"14e1d055-941e-4272-ad4b-b54a411f0fbd\") " pod="openstack/keystone-cron-29548861-l62vw" Mar 08 01:01:00.333295 master-0 kubenswrapper[23041]: I0308 01:01:00.333277 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14e1d055-941e-4272-ad4b-b54a411f0fbd-combined-ca-bundle\") pod \"keystone-cron-29548861-l62vw\" (UID: \"14e1d055-941e-4272-ad4b-b54a411f0fbd\") " pod="openstack/keystone-cron-29548861-l62vw" Mar 08 01:01:00.436693 master-0 kubenswrapper[23041]: I0308 
01:01:00.436621 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/14e1d055-941e-4272-ad4b-b54a411f0fbd-fernet-keys\") pod \"keystone-cron-29548861-l62vw\" (UID: \"14e1d055-941e-4272-ad4b-b54a411f0fbd\") " pod="openstack/keystone-cron-29548861-l62vw" Mar 08 01:01:00.437175 master-0 kubenswrapper[23041]: I0308 01:01:00.437120 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14e1d055-941e-4272-ad4b-b54a411f0fbd-config-data\") pod \"keystone-cron-29548861-l62vw\" (UID: \"14e1d055-941e-4272-ad4b-b54a411f0fbd\") " pod="openstack/keystone-cron-29548861-l62vw" Mar 08 01:01:00.437289 master-0 kubenswrapper[23041]: I0308 01:01:00.437258 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc98l\" (UniqueName: \"kubernetes.io/projected/14e1d055-941e-4272-ad4b-b54a411f0fbd-kube-api-access-pc98l\") pod \"keystone-cron-29548861-l62vw\" (UID: \"14e1d055-941e-4272-ad4b-b54a411f0fbd\") " pod="openstack/keystone-cron-29548861-l62vw" Mar 08 01:01:00.437465 master-0 kubenswrapper[23041]: I0308 01:01:00.437436 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14e1d055-941e-4272-ad4b-b54a411f0fbd-combined-ca-bundle\") pod \"keystone-cron-29548861-l62vw\" (UID: \"14e1d055-941e-4272-ad4b-b54a411f0fbd\") " pod="openstack/keystone-cron-29548861-l62vw" Mar 08 01:01:00.441850 master-0 kubenswrapper[23041]: I0308 01:01:00.441814 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14e1d055-941e-4272-ad4b-b54a411f0fbd-combined-ca-bundle\") pod \"keystone-cron-29548861-l62vw\" (UID: \"14e1d055-941e-4272-ad4b-b54a411f0fbd\") " pod="openstack/keystone-cron-29548861-l62vw" Mar 08 01:01:00.454541 master-0 
kubenswrapper[23041]: I0308 01:01:00.454494 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/14e1d055-941e-4272-ad4b-b54a411f0fbd-fernet-keys\") pod \"keystone-cron-29548861-l62vw\" (UID: \"14e1d055-941e-4272-ad4b-b54a411f0fbd\") " pod="openstack/keystone-cron-29548861-l62vw" Mar 08 01:01:00.455842 master-0 kubenswrapper[23041]: I0308 01:01:00.455816 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14e1d055-941e-4272-ad4b-b54a411f0fbd-config-data\") pod \"keystone-cron-29548861-l62vw\" (UID: \"14e1d055-941e-4272-ad4b-b54a411f0fbd\") " pod="openstack/keystone-cron-29548861-l62vw" Mar 08 01:01:00.458477 master-0 kubenswrapper[23041]: I0308 01:01:00.458459 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc98l\" (UniqueName: \"kubernetes.io/projected/14e1d055-941e-4272-ad4b-b54a411f0fbd-kube-api-access-pc98l\") pod \"keystone-cron-29548861-l62vw\" (UID: \"14e1d055-941e-4272-ad4b-b54a411f0fbd\") " pod="openstack/keystone-cron-29548861-l62vw" Mar 08 01:01:00.529186 master-0 kubenswrapper[23041]: I0308 01:01:00.529120 23041 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29548861-l62vw" Mar 08 01:01:01.033822 master-0 kubenswrapper[23041]: I0308 01:01:01.033742 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29548861-l62vw"] Mar 08 01:01:01.741853 master-0 kubenswrapper[23041]: I0308 01:01:01.741770 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29548861-l62vw" event={"ID":"14e1d055-941e-4272-ad4b-b54a411f0fbd","Type":"ContainerStarted","Data":"2623ad3b16c96f4a781afa0e79318f233ada84f8063b80a857b77cd28929b28e"} Mar 08 01:01:01.741853 master-0 kubenswrapper[23041]: I0308 01:01:01.741833 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29548861-l62vw" event={"ID":"14e1d055-941e-4272-ad4b-b54a411f0fbd","Type":"ContainerStarted","Data":"14b95981b044e333f8d2d0bd2ecd9a0745d45d0c460f6646c37028b02883ea53"} Mar 08 01:01:01.774546 master-0 kubenswrapper[23041]: I0308 01:01:01.774453 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29548861-l62vw" podStartSLOduration=1.774427777 podStartE2EDuration="1.774427777s" podCreationTimestamp="2026-03-08 01:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 01:01:01.767923267 +0000 UTC m=+1767.240759871" watchObservedRunningTime="2026-03-08 01:01:01.774427777 +0000 UTC m=+1767.247264341" Mar 08 01:01:04.787143 master-0 kubenswrapper[23041]: I0308 01:01:04.787025 23041 generic.go:334] "Generic (PLEG): container finished" podID="14e1d055-941e-4272-ad4b-b54a411f0fbd" containerID="2623ad3b16c96f4a781afa0e79318f233ada84f8063b80a857b77cd28929b28e" exitCode=0 Mar 08 01:01:04.787143 master-0 kubenswrapper[23041]: I0308 01:01:04.787134 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29548861-l62vw" 
event={"ID":"14e1d055-941e-4272-ad4b-b54a411f0fbd","Type":"ContainerDied","Data":"2623ad3b16c96f4a781afa0e79318f233ada84f8063b80a857b77cd28929b28e"} Mar 08 01:01:06.314924 master-0 kubenswrapper[23041]: I0308 01:01:06.314881 23041 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29548861-l62vw" Mar 08 01:01:06.433127 master-0 kubenswrapper[23041]: I0308 01:01:06.421824 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14e1d055-941e-4272-ad4b-b54a411f0fbd-combined-ca-bundle\") pod \"14e1d055-941e-4272-ad4b-b54a411f0fbd\" (UID: \"14e1d055-941e-4272-ad4b-b54a411f0fbd\") " Mar 08 01:01:06.436666 master-0 kubenswrapper[23041]: I0308 01:01:06.436601 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14e1d055-941e-4272-ad4b-b54a411f0fbd-config-data\") pod \"14e1d055-941e-4272-ad4b-b54a411f0fbd\" (UID: \"14e1d055-941e-4272-ad4b-b54a411f0fbd\") " Mar 08 01:01:06.436847 master-0 kubenswrapper[23041]: I0308 01:01:06.436749 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pc98l\" (UniqueName: \"kubernetes.io/projected/14e1d055-941e-4272-ad4b-b54a411f0fbd-kube-api-access-pc98l\") pod \"14e1d055-941e-4272-ad4b-b54a411f0fbd\" (UID: \"14e1d055-941e-4272-ad4b-b54a411f0fbd\") " Mar 08 01:01:06.436847 master-0 kubenswrapper[23041]: I0308 01:01:06.436790 23041 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/14e1d055-941e-4272-ad4b-b54a411f0fbd-fernet-keys\") pod \"14e1d055-941e-4272-ad4b-b54a411f0fbd\" (UID: \"14e1d055-941e-4272-ad4b-b54a411f0fbd\") " Mar 08 01:01:06.455874 master-0 kubenswrapper[23041]: I0308 01:01:06.455804 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/14e1d055-941e-4272-ad4b-b54a411f0fbd-kube-api-access-pc98l" (OuterVolumeSpecName: "kube-api-access-pc98l") pod "14e1d055-941e-4272-ad4b-b54a411f0fbd" (UID: "14e1d055-941e-4272-ad4b-b54a411f0fbd"). InnerVolumeSpecName "kube-api-access-pc98l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 01:01:06.479535 master-0 kubenswrapper[23041]: I0308 01:01:06.479408 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14e1d055-941e-4272-ad4b-b54a411f0fbd-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "14e1d055-941e-4272-ad4b-b54a411f0fbd" (UID: "14e1d055-941e-4272-ad4b-b54a411f0fbd"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 01:01:06.506236 master-0 kubenswrapper[23041]: I0308 01:01:06.505370 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14e1d055-941e-4272-ad4b-b54a411f0fbd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14e1d055-941e-4272-ad4b-b54a411f0fbd" (UID: "14e1d055-941e-4272-ad4b-b54a411f0fbd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 01:01:06.516019 master-0 kubenswrapper[23041]: I0308 01:01:06.515975 23041 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14e1d055-941e-4272-ad4b-b54a411f0fbd-config-data" (OuterVolumeSpecName: "config-data") pod "14e1d055-941e-4272-ad4b-b54a411f0fbd" (UID: "14e1d055-941e-4272-ad4b-b54a411f0fbd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 01:01:06.555234 master-0 kubenswrapper[23041]: I0308 01:01:06.551799 23041 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pc98l\" (UniqueName: \"kubernetes.io/projected/14e1d055-941e-4272-ad4b-b54a411f0fbd-kube-api-access-pc98l\") on node \"master-0\" DevicePath \"\"" Mar 08 01:01:06.555234 master-0 kubenswrapper[23041]: I0308 01:01:06.551861 23041 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/14e1d055-941e-4272-ad4b-b54a411f0fbd-fernet-keys\") on node \"master-0\" DevicePath \"\"" Mar 08 01:01:06.555234 master-0 kubenswrapper[23041]: I0308 01:01:06.551872 23041 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14e1d055-941e-4272-ad4b-b54a411f0fbd-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 01:01:06.555234 master-0 kubenswrapper[23041]: I0308 01:01:06.551881 23041 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14e1d055-941e-4272-ad4b-b54a411f0fbd-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 01:01:06.818944 master-0 kubenswrapper[23041]: I0308 01:01:06.818907 23041 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29548861-l62vw" Mar 08 01:01:06.824814 master-0 kubenswrapper[23041]: I0308 01:01:06.824765 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29548861-l62vw" event={"ID":"14e1d055-941e-4272-ad4b-b54a411f0fbd","Type":"ContainerDied","Data":"14b95981b044e333f8d2d0bd2ecd9a0745d45d0c460f6646c37028b02883ea53"} Mar 08 01:01:06.824814 master-0 kubenswrapper[23041]: I0308 01:01:06.824814 23041 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14b95981b044e333f8d2d0bd2ecd9a0745d45d0c460f6646c37028b02883ea53" Mar 08 01:01:48.063331 master-0 kubenswrapper[23041]: I0308 01:01:48.063190 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-bbghv"] Mar 08 01:01:48.086666 master-0 kubenswrapper[23041]: I0308 01:01:48.086597 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-4n48d"] Mar 08 01:01:48.102938 master-0 kubenswrapper[23041]: I0308 01:01:48.102880 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-bbghv"] Mar 08 01:01:48.117581 master-0 kubenswrapper[23041]: I0308 01:01:48.117513 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-8d80-account-create-update-dxjzn"] Mar 08 01:01:48.127678 master-0 kubenswrapper[23041]: I0308 01:01:48.127623 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-df7d-account-create-update-pbhhl"] Mar 08 01:01:48.137469 master-0 kubenswrapper[23041]: I0308 01:01:48.137364 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-8d80-account-create-update-dxjzn"] Mar 08 01:01:48.147377 master-0 kubenswrapper[23041]: I0308 01:01:48.147326 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-4n48d"] Mar 08 01:01:48.156558 master-0 kubenswrapper[23041]: I0308 01:01:48.156503 23041 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/glance-df7d-account-create-update-pbhhl"] Mar 08 01:01:48.826867 master-0 kubenswrapper[23041]: I0308 01:01:48.826493 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14da4175-aa45-42ff-ad82-8253a03c1697" path="/var/lib/kubelet/pods/14da4175-aa45-42ff-ad82-8253a03c1697/volumes" Mar 08 01:01:48.829715 master-0 kubenswrapper[23041]: I0308 01:01:48.829686 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15de7491-f5c0-4ac3-b07b-6de4eac70ade" path="/var/lib/kubelet/pods/15de7491-f5c0-4ac3-b07b-6de4eac70ade/volumes" Mar 08 01:01:48.833126 master-0 kubenswrapper[23041]: I0308 01:01:48.833101 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="453dbbfd-6893-4826-92e9-8aaa7987b743" path="/var/lib/kubelet/pods/453dbbfd-6893-4826-92e9-8aaa7987b743/volumes" Mar 08 01:01:48.837010 master-0 kubenswrapper[23041]: I0308 01:01:48.836985 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85b9a245-4ef0-43b9-9bf9-70c4609fda33" path="/var/lib/kubelet/pods/85b9a245-4ef0-43b9-9bf9-70c4609fda33/volumes" Mar 08 01:01:54.058539 master-0 kubenswrapper[23041]: I0308 01:01:54.058470 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-cl2fb"] Mar 08 01:01:54.075161 master-0 kubenswrapper[23041]: I0308 01:01:54.075084 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-cl2fb"] Mar 08 01:01:54.831431 master-0 kubenswrapper[23041]: I0308 01:01:54.831308 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fed95fb2-570e-4b8a-9ddb-697e3a3606a8" path="/var/lib/kubelet/pods/fed95fb2-570e-4b8a-9ddb-697e3a3606a8/volumes" Mar 08 01:01:55.066573 master-0 kubenswrapper[23041]: I0308 01:01:55.066458 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-4837-account-create-update-9nldk"] Mar 08 01:01:55.080599 master-0 
kubenswrapper[23041]: I0308 01:01:55.080491 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-4837-account-create-update-9nldk"] Mar 08 01:01:56.826590 master-0 kubenswrapper[23041]: I0308 01:01:56.826520 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="240177ba-2f39-4ab2-a12c-a4c545a9fb1a" path="/var/lib/kubelet/pods/240177ba-2f39-4ab2-a12c-a4c545a9fb1a/volumes" Mar 08 01:02:09.039358 master-0 kubenswrapper[23041]: I0308 01:02:09.039265 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-468nq"] Mar 08 01:02:09.050073 master-0 kubenswrapper[23041]: I0308 01:02:09.050009 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-468nq"] Mar 08 01:02:10.835160 master-0 kubenswrapper[23041]: I0308 01:02:10.834105 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87454c8d-819b-4ee6-8291-ccdf7c81f77b" path="/var/lib/kubelet/pods/87454c8d-819b-4ee6-8291-ccdf7c81f77b/volumes" Mar 08 01:02:19.077947 master-0 kubenswrapper[23041]: I0308 01:02:19.077879 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-wtgz7"] Mar 08 01:02:19.093864 master-0 kubenswrapper[23041]: I0308 01:02:19.093793 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-wtgz7"] Mar 08 01:02:20.841839 master-0 kubenswrapper[23041]: I0308 01:02:20.841755 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32cca4f7-a751-48d2-b93f-211bb7f12697" path="/var/lib/kubelet/pods/32cca4f7-a751-48d2-b93f-211bb7f12697/volumes" Mar 08 01:02:21.053360 master-0 kubenswrapper[23041]: I0308 01:02:21.053244 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-f0a6-account-create-update-7g79m"] Mar 08 01:02:21.061759 master-0 kubenswrapper[23041]: I0308 01:02:21.061696 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/neutron-ab9f-account-create-update-wxwc6"]
Mar 08 01:02:21.071469 master-0 kubenswrapper[23041]: I0308 01:02:21.071396 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-f0a6-account-create-update-7g79m"]
Mar 08 01:02:21.081808 master-0 kubenswrapper[23041]: I0308 01:02:21.081717 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-ab9f-account-create-update-wxwc6"]
Mar 08 01:02:22.045522 master-0 kubenswrapper[23041]: I0308 01:02:22.045446 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-4vctw"]
Mar 08 01:02:22.058999 master-0 kubenswrapper[23041]: I0308 01:02:22.058932 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-4vctw"]
Mar 08 01:02:22.074414 master-0 kubenswrapper[23041]: I0308 01:02:22.074293 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-h7pn9"]
Mar 08 01:02:22.088667 master-0 kubenswrapper[23041]: I0308 01:02:22.088601 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-h7pn9"]
Mar 08 01:02:22.831735 master-0 kubenswrapper[23041]: I0308 01:02:22.830441 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f976074-a681-4867-96f3-089f7cfabd9e" path="/var/lib/kubelet/pods/1f976074-a681-4867-96f3-089f7cfabd9e/volumes"
Mar 08 01:02:22.831735 master-0 kubenswrapper[23041]: I0308 01:02:22.831263 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49a404f8-c225-48fb-987f-e77945275680" path="/var/lib/kubelet/pods/49a404f8-c225-48fb-987f-e77945275680/volumes"
Mar 08 01:02:22.832156 master-0 kubenswrapper[23041]: I0308 01:02:22.831963 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca7ecee5-d829-4f07-a4c3-ef6cf98b6519" path="/var/lib/kubelet/pods/ca7ecee5-d829-4f07-a4c3-ef6cf98b6519/volumes"
Mar 08 01:02:22.832749 master-0 kubenswrapper[23041]: I0308 01:02:22.832684 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd49f4f1-8a3f-4c73-a2b1-65f552d39926" path="/var/lib/kubelet/pods/cd49f4f1-8a3f-4c73-a2b1-65f552d39926/volumes"
Mar 08 01:02:36.057841 master-0 kubenswrapper[23041]: I0308 01:02:36.057774 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-mz8x8"]
Mar 08 01:02:36.074233 master-0 kubenswrapper[23041]: I0308 01:02:36.074147 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-mz8x8"]
Mar 08 01:02:36.829352 master-0 kubenswrapper[23041]: I0308 01:02:36.829257 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="274fc42b-c842-4a84-b407-2e9bd971a75b" path="/var/lib/kubelet/pods/274fc42b-c842-4a84-b407-2e9bd971a75b/volumes"
Mar 08 01:02:45.047563 master-0 kubenswrapper[23041]: I0308 01:02:45.047476 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-db-create-9pxq8"]
Mar 08 01:02:45.079717 master-0 kubenswrapper[23041]: I0308 01:02:45.079651 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-db-create-9pxq8"]
Mar 08 01:02:45.996230 master-0 kubenswrapper[23041]: I0308 01:02:45.996133 23041 scope.go:117] "RemoveContainer" containerID="cdee05a1d6e89d32e93d16e518aa4e44677c016e6e54bf79abb0ea63d6d3bab7"
Mar 08 01:02:46.037452 master-0 kubenswrapper[23041]: I0308 01:02:46.037403 23041 scope.go:117] "RemoveContainer" containerID="91336a7ee881d19c4cd2e5b5eca4b0b8dc5cdd87f2f7c7c25769785127726d01"
Mar 08 01:02:46.065829 master-0 kubenswrapper[23041]: I0308 01:02:46.065768 23041 scope.go:117] "RemoveContainer" containerID="55f9e26ff2aba11c8a815eccae80d4332603db3e403b21484464996b6f214b29"
Mar 08 01:02:46.100378 master-0 kubenswrapper[23041]: I0308 01:02:46.100338 23041 scope.go:117] "RemoveContainer" containerID="db025edfa385d00e2189d07cb5e3a2e77391c81156a7bc8d0d2dd08409dbca73"
Mar 08 01:02:46.131403 master-0 kubenswrapper[23041]: I0308 01:02:46.130686 23041 scope.go:117] "RemoveContainer" containerID="9d1747541108b14e4d824687bf46211f82014e5951617bbf0ae29b1a155f3a38"
Mar 08 01:02:46.167728 master-0 kubenswrapper[23041]: I0308 01:02:46.167687 23041 scope.go:117] "RemoveContainer" containerID="49c67861ac1519e2baedb99721dd55678d3c6c75af7bd3f9557dbffad1bd428b"
Mar 08 01:02:46.200608 master-0 kubenswrapper[23041]: I0308 01:02:46.200551 23041 scope.go:117] "RemoveContainer" containerID="b6a4782c1efb8d5c92a22ecd80e85dfc1a2aeb2e873ed3394e56fb53c87985a3"
Mar 08 01:02:46.238937 master-0 kubenswrapper[23041]: I0308 01:02:46.238888 23041 scope.go:117] "RemoveContainer" containerID="fd3a0f11f9c83f168cd37aa9c13497cb51edcfc07076cd97a64e6e4dfd6f6624"
Mar 08 01:02:46.282278 master-0 kubenswrapper[23041]: I0308 01:02:46.282173 23041 scope.go:117] "RemoveContainer" containerID="66c04aae6df2f2eefabc570610e2b0c561df8682462bc8b952a89ec422e4a85a"
Mar 08 01:02:46.319274 master-0 kubenswrapper[23041]: I0308 01:02:46.319220 23041 scope.go:117] "RemoveContainer" containerID="1a83266a3676513fdbf7bb77b177696bece7737b89266bde3e3821d8350c4cfe"
Mar 08 01:02:46.365412 master-0 kubenswrapper[23041]: I0308 01:02:46.365362 23041 scope.go:117] "RemoveContainer" containerID="3c102884ffbcff7f27aee8088a4eecd249dc5ece2fb3e752177de26423658859"
Mar 08 01:02:46.391925 master-0 kubenswrapper[23041]: I0308 01:02:46.391843 23041 scope.go:117] "RemoveContainer" containerID="3f6fabd754937090775079da9396a4489b34d8439d2e4928f0a252790ba96dea"
Mar 08 01:02:46.424146 master-0 kubenswrapper[23041]: I0308 01:02:46.424089 23041 scope.go:117] "RemoveContainer" containerID="764d90934ea63856db19c5c9d69d2a73211ad9beaa4a378349a4377309b94b10"
Mar 08 01:02:46.828605 master-0 kubenswrapper[23041]: I0308 01:02:46.828321 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74432df1-6a53-4258-932b-6e6aa6c23448" path="/var/lib/kubelet/pods/74432df1-6a53-4258-932b-6e6aa6c23448/volumes"
Mar 08 01:02:47.049923 master-0 kubenswrapper[23041]: I0308 01:02:47.049834 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-39c2-account-create-update-jbfp7"]
Mar 08 01:02:47.068143 master-0 kubenswrapper[23041]: I0308 01:02:47.068066 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-39c2-account-create-update-jbfp7"]
Mar 08 01:02:48.866116 master-0 kubenswrapper[23041]: I0308 01:02:48.866011 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfaf8fd1-f241-4f4b-ab62-4d04cef718dd" path="/var/lib/kubelet/pods/cfaf8fd1-f241-4f4b-ab62-4d04cef718dd/volumes"
Mar 08 01:03:09.070062 master-0 kubenswrapper[23041]: I0308 01:03:09.069988 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-8rrvj"]
Mar 08 01:03:09.083962 master-0 kubenswrapper[23041]: I0308 01:03:09.083907 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-tfddb"]
Mar 08 01:03:09.094475 master-0 kubenswrapper[23041]: I0308 01:03:09.094418 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-8rrvj"]
Mar 08 01:03:09.111405 master-0 kubenswrapper[23041]: I0308 01:03:09.111186 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-tfddb"]
Mar 08 01:03:10.830146 master-0 kubenswrapper[23041]: I0308 01:03:10.830099 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88a16b7d-3a7c-4b23-9f7c-448fea1247e1" path="/var/lib/kubelet/pods/88a16b7d-3a7c-4b23-9f7c-448fea1247e1/volumes"
Mar 08 01:03:10.831898 master-0 kubenswrapper[23041]: I0308 01:03:10.831879 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db8a390e-7313-41d7-b698-590fb18c5d2d" path="/var/lib/kubelet/pods/db8a390e-7313-41d7-b698-590fb18c5d2d/volumes"
Mar 08 01:03:17.047342 master-0 kubenswrapper[23041]: I0308 01:03:17.047267 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-bxnnn"]
Mar 08 01:03:17.065431 master-0 kubenswrapper[23041]: I0308 01:03:17.065330 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-bxnnn"]
Mar 08 01:03:18.827509 master-0 kubenswrapper[23041]: I0308 01:03:18.827429 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ef69671-8e3b-456f-9764-212721fba8e0" path="/var/lib/kubelet/pods/9ef69671-8e3b-456f-9764-212721fba8e0/volumes"
Mar 08 01:03:23.062334 master-0 kubenswrapper[23041]: I0308 01:03:23.062250 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-675ba-db-sync-8zxxl"]
Mar 08 01:03:23.097606 master-0 kubenswrapper[23041]: I0308 01:03:23.097517 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-675ba-db-sync-8zxxl"]
Mar 08 01:03:24.825278 master-0 kubenswrapper[23041]: I0308 01:03:24.825139 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="309c80e9-6a3a-45cb-93c9-216d39c74f61" path="/var/lib/kubelet/pods/309c80e9-6a3a-45cb-93c9-216d39c74f61/volumes"
Mar 08 01:03:32.511742 master-0 kubenswrapper[23041]: I0308 01:03:32.511648 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-db-sync-hxms8"]
Mar 08 01:03:32.549825 master-0 kubenswrapper[23041]: I0308 01:03:32.549728 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-db-sync-hxms8"]
Mar 08 01:03:32.823603 master-0 kubenswrapper[23041]: I0308 01:03:32.823405 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24c18a5d-ebab-491a-8bf4-f6271242e4f3" path="/var/lib/kubelet/pods/24c18a5d-ebab-491a-8bf4-f6271242e4f3/volumes"
Mar 08 01:03:39.044126 master-0 kubenswrapper[23041]: I0308 01:03:39.044055 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-db-create-hzm2x"]
Mar 08 01:03:39.063036 master-0 kubenswrapper[23041]: I0308 01:03:39.062967 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-inspector-db-create-hzm2x"]
Mar 08 01:03:39.076303 master-0 kubenswrapper[23041]: I0308 01:03:39.076242 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-c3c2-account-create-update-w6k86"]
Mar 08 01:03:39.090908 master-0 kubenswrapper[23041]: I0308 01:03:39.090848 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-inspector-c3c2-account-create-update-w6k86"]
Mar 08 01:03:40.825013 master-0 kubenswrapper[23041]: I0308 01:03:40.824931 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5944b058-fd8b-419b-ba55-b61f85254dec" path="/var/lib/kubelet/pods/5944b058-fd8b-419b-ba55-b61f85254dec/volumes"
Mar 08 01:03:40.825890 master-0 kubenswrapper[23041]: I0308 01:03:40.825840 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c957f6fb-9546-4811-9246-6a1bfa49492e" path="/var/lib/kubelet/pods/c957f6fb-9546-4811-9246-6a1bfa49492e/volumes"
Mar 08 01:03:46.804018 master-0 kubenswrapper[23041]: I0308 01:03:46.803714 23041 scope.go:117] "RemoveContainer" containerID="8a54dabca525307204c0cc51fa2007c5a14407a66fee75ded69b720eb65069b5"
Mar 08 01:03:46.839249 master-0 kubenswrapper[23041]: I0308 01:03:46.839154 23041 scope.go:117] "RemoveContainer" containerID="61f55b350bec7b6f23cc9e7373d5dcb07c7b17b7f28524333fcb7b6911059275"
Mar 08 01:03:46.887872 master-0 kubenswrapper[23041]: I0308 01:03:46.887806 23041 scope.go:117] "RemoveContainer" containerID="80590e40ca7b769cd48b9e3222b8f5e97d534d881877bbcf124933dfae70a851"
Mar 08 01:03:46.969019 master-0 kubenswrapper[23041]: I0308 01:03:46.968930 23041 scope.go:117] "RemoveContainer" containerID="fada75281820b21413ba62dd680d903c4fc1e783b8187e1d88837b14cad62616"
Mar 08 01:03:47.004841 master-0 kubenswrapper[23041]: I0308 01:03:47.004620 23041 scope.go:117] "RemoveContainer" containerID="e591b2dfeb1423bd2b97243237ecba4ede86e72dca2ccd100d820256d41d4a91"
Mar 08 01:03:47.038115 master-0 kubenswrapper[23041]: I0308 01:03:47.038043 23041 scope.go:117] "RemoveContainer" containerID="80c7b46207dd5e70c18cbc6c2185c2b01907c97e9f930c730a89d9561aa77a89"
Mar 08 01:03:47.074145 master-0 kubenswrapper[23041]: I0308 01:03:47.074089 23041 scope.go:117] "RemoveContainer" containerID="563a865423a1bd0c30be7208a748f13ef1b942095bffe87483ce3bde705bfbc0"
Mar 08 01:03:47.114585 master-0 kubenswrapper[23041]: I0308 01:03:47.114523 23041 scope.go:117] "RemoveContainer" containerID="8ea27f9a76b98f282a103da574cd5f4cf64dd2d1609d9ceb48b98ebe91c7d1de"
Mar 08 01:03:47.150463 master-0 kubenswrapper[23041]: I0308 01:03:47.150409 23041 scope.go:117] "RemoveContainer" containerID="6fa2fc35d099d15db013f4024a180e05dbdce3a40bcd31c527ded344118bf564"
Mar 08 01:03:47.178619 master-0 kubenswrapper[23041]: I0308 01:03:47.177923 23041 scope.go:117] "RemoveContainer" containerID="9cb4492d48ba3f747baf51a9b4d5267fa579cb8b0df1b847cd2005bb1f238a28"
Mar 08 01:04:12.072756 master-0 kubenswrapper[23041]: I0308 01:04:12.072669 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-40fc-account-create-update-hfwpl"]
Mar 08 01:04:12.088440 master-0 kubenswrapper[23041]: I0308 01:04:12.088362 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-45pj6"]
Mar 08 01:04:12.106594 master-0 kubenswrapper[23041]: I0308 01:04:12.106550 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-da7e-account-create-update-6k64t"]
Mar 08 01:04:12.121586 master-0 kubenswrapper[23041]: I0308 01:04:12.121521 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-45pj6"]
Mar 08 01:04:12.133069 master-0 kubenswrapper[23041]: I0308 01:04:12.133017 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-db-sync-sksvm"]
Mar 08 01:04:12.146192 master-0 kubenswrapper[23041]: I0308 01:04:12.146136 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-91a3-account-create-update-jwmlg"]
Mar 08 01:04:12.156077 master-0 kubenswrapper[23041]: I0308 01:04:12.156042 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-rztgz"]
Mar 08 01:04:12.166564 master-0 kubenswrapper[23041]: I0308 01:04:12.166519 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-kg45w"]
Mar 08 01:04:12.177041 master-0 kubenswrapper[23041]: I0308 01:04:12.176984 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-da7e-account-create-update-6k64t"]
Mar 08 01:04:12.186416 master-0 kubenswrapper[23041]: I0308 01:04:12.186367 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-91a3-account-create-update-jwmlg"]
Mar 08 01:04:12.196310 master-0 kubenswrapper[23041]: I0308 01:04:12.196194 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-inspector-db-sync-sksvm"]
Mar 08 01:04:12.206541 master-0 kubenswrapper[23041]: I0308 01:04:12.206460 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-40fc-account-create-update-hfwpl"]
Mar 08 01:04:12.215851 master-0 kubenswrapper[23041]: I0308 01:04:12.215810 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-kg45w"]
Mar 08 01:04:12.226720 master-0 kubenswrapper[23041]: I0308 01:04:12.226648 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-rztgz"]
Mar 08 01:04:12.826510 master-0 kubenswrapper[23041]: I0308 01:04:12.826436 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d98d6c6-e6a9-4b7a-b1a4-f2640bbca524" path="/var/lib/kubelet/pods/2d98d6c6-e6a9-4b7a-b1a4-f2640bbca524/volumes"
Mar 08 01:04:12.827715 master-0 kubenswrapper[23041]: I0308 01:04:12.827372 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33d52628-a63b-48e6-ac86-d8df7b20a8e9" path="/var/lib/kubelet/pods/33d52628-a63b-48e6-ac86-d8df7b20a8e9/volumes"
Mar 08 01:04:12.828187 master-0 kubenswrapper[23041]: I0308 01:04:12.828147 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51f57766-45a4-4e34-918e-d730a59edecd" path="/var/lib/kubelet/pods/51f57766-45a4-4e34-918e-d730a59edecd/volumes"
Mar 08 01:04:12.829698 master-0 kubenswrapper[23041]: I0308 01:04:12.829658 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54db0dbf-0b0a-4fdd-9811-876d166896ea" path="/var/lib/kubelet/pods/54db0dbf-0b0a-4fdd-9811-876d166896ea/volumes"
Mar 08 01:04:12.831245 master-0 kubenswrapper[23041]: I0308 01:04:12.831182 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c8614ce-5cee-41f4-a083-9e7d67b46633" path="/var/lib/kubelet/pods/7c8614ce-5cee-41f4-a083-9e7d67b46633/volumes"
Mar 08 01:04:12.831992 master-0 kubenswrapper[23041]: I0308 01:04:12.831962 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b782b43-8b7d-487d-9ded-698b28da172d" path="/var/lib/kubelet/pods/9b782b43-8b7d-487d-9ded-698b28da172d/volumes"
Mar 08 01:04:12.832754 master-0 kubenswrapper[23041]: I0308 01:04:12.832722 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf3fafa4-62ca-4b35-bd19-00651b4a9d48" path="/var/lib/kubelet/pods/cf3fafa4-62ca-4b35-bd19-00651b4a9d48/volumes"
Mar 08 01:04:47.461008 master-0 kubenswrapper[23041]: I0308 01:04:47.460908 23041 scope.go:117] "RemoveContainer" containerID="9ac500683606cdb5164248aecf601e684df7d1f2b89999be77816e61f2b391f9"
Mar 08 01:04:47.499832 master-0 kubenswrapper[23041]: I0308 01:04:47.499796 23041 scope.go:117] "RemoveContainer" containerID="570fa8ba9de1434e429a43a4870f6d406ac7e270c7555d159e17a3923a1b7f50"
Mar 08 01:04:47.533142 master-0 kubenswrapper[23041]: I0308 01:04:47.532953 23041 scope.go:117] "RemoveContainer" containerID="1d8ad32a2cab1f3ce51692bf35ec05f10f521bf759deaa403ac00e483056b022"
Mar 08 01:04:47.564386 master-0 kubenswrapper[23041]: I0308 01:04:47.564313 23041 scope.go:117] "RemoveContainer" containerID="944d552255fede5544aa89290745d562cd790f384fd73998a81990e5eff9877e"
Mar 08 01:04:47.595694 master-0 kubenswrapper[23041]: I0308 01:04:47.595632 23041 scope.go:117] "RemoveContainer" containerID="77098c5727b88fdac0ce58e504d450fb8262c8c8f6fabbc2f127b2bdfff72fbb"
Mar 08 01:04:47.626665 master-0 kubenswrapper[23041]: I0308 01:04:47.626555 23041 scope.go:117] "RemoveContainer" containerID="94dc6c5d1ba5377cbdc2674baf891cc2b564e03093f188a475d0505af3bd7796"
Mar 08 01:04:47.653477 master-0 kubenswrapper[23041]: I0308 01:04:47.653368 23041 scope.go:117] "RemoveContainer" containerID="aeb465b11b699ea8b8b910a82242ec60e620aa6f8e2764e1890479487625d414"
Mar 08 01:04:50.082251 master-0 kubenswrapper[23041]: I0308 01:04:50.081315 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wcsvr"]
Mar 08 01:04:50.102239 master-0 kubenswrapper[23041]: I0308 01:04:50.100068 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wcsvr"]
Mar 08 01:04:50.827377 master-0 kubenswrapper[23041]: I0308 01:04:50.827330 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65e42302-3592-46fd-b7a9-b125bf61382b" path="/var/lib/kubelet/pods/65e42302-3592-46fd-b7a9-b125bf61382b/volumes"
Mar 08 01:05:16.715885 master-0 kubenswrapper[23041]: E0308 01:05:16.715465 23041 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.32.10:40958->192.168.32.10:36123: write tcp 192.168.32.10:40958->192.168.32.10:36123: write: broken pipe
Mar 08 01:05:22.087252 master-0 kubenswrapper[23041]: I0308 01:05:22.084924 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-xkrf7"]
Mar 08 01:05:22.102241 master-0 kubenswrapper[23041]: I0308 01:05:22.100977 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-xkrf7"]
Mar 08 01:05:22.832835 master-0 kubenswrapper[23041]: I0308 01:05:22.832737 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d9dfee3-973a-4663-9df5-1ea29d47096a" path="/var/lib/kubelet/pods/6d9dfee3-973a-4663-9df5-1ea29d47096a/volumes"
Mar 08 01:05:23.042946 master-0 kubenswrapper[23041]: I0308 01:05:23.042864 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7wljq"]
Mar 08 01:05:23.056706 master-0 kubenswrapper[23041]: I0308 01:05:23.056652 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7wljq"]
Mar 08 01:05:24.831126 master-0 kubenswrapper[23041]: I0308 01:05:24.831055 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="993b30b0-5927-4e4b-8945-6586313e285f" path="/var/lib/kubelet/pods/993b30b0-5927-4e4b-8945-6586313e285f/volumes"
Mar 08 01:05:47.856778 master-0 kubenswrapper[23041]: I0308 01:05:47.856720 23041 scope.go:117] "RemoveContainer" containerID="839f13e1d10ecbdb92ec49006f729adb8f2d656e4b95322e80df18e0b6744acd"
Mar 08 01:05:47.880727 master-0 kubenswrapper[23041]: I0308 01:05:47.880695 23041 scope.go:117] "RemoveContainer" containerID="10b8342090b2046b33c098540fd76b13e14f5850eebcf56c85da9203d97b733d"
Mar 08 01:05:47.907048 master-0 kubenswrapper[23041]: I0308 01:05:47.906998 23041 scope.go:117] "RemoveContainer" containerID="ec9ba3fc66949789bd7323a7f91558f505dd61226bf32af867cbd79f94938620"
Mar 08 01:06:00.061503 master-0 kubenswrapper[23041]: I0308 01:06:00.061375 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-host-discover-s5c66"]
Mar 08 01:06:00.074608 master-0 kubenswrapper[23041]: I0308 01:06:00.074515 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-host-discover-s5c66"]
Mar 08 01:06:00.823300 master-0 kubenswrapper[23041]: I0308 01:06:00.823129 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2" path="/var/lib/kubelet/pods/2cec7d09-3b2d-41be-92b9-56ea5fbfa9d2/volumes"
Mar 08 01:06:01.200509 master-0 kubenswrapper[23041]: I0308 01:06:01.200359 23041 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-wn7mj"]
Mar 08 01:06:01.212639 master-0 kubenswrapper[23041]: I0308 01:06:01.212548 23041 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-wn7mj"]
Mar 08 01:06:02.837466 master-0 kubenswrapper[23041]: I0308 01:06:02.837378 23041 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b898cbca-4bda-4100-8cb5-8adc12f5a160" path="/var/lib/kubelet/pods/b898cbca-4bda-4100-8cb5-8adc12f5a160/volumes"
Mar 08 01:06:48.011509 master-0 kubenswrapper[23041]: I0308 01:06:48.011385 23041 scope.go:117] "RemoveContainer" containerID="cd6cdf6d53c1514430c49d52a7dfae35c6a670f895c07d878ffac1ac42affc51"
Mar 08 01:06:48.043578 master-0 kubenswrapper[23041]: I0308 01:06:48.043412 23041 scope.go:117] "RemoveContainer" containerID="d1fb722fa89742527024e3c6ee65f26ec2e8962021c85cba367a09a4f701c1c5"
Mar 08 01:19:37.025750 master-0 kubenswrapper[23041]: I0308 01:19:37.025678 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tj2hj/must-gather-74b45"]
Mar 08 01:19:37.026872 master-0 kubenswrapper[23041]: E0308 01:19:37.026438 23041 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14e1d055-941e-4272-ad4b-b54a411f0fbd" containerName="keystone-cron"
Mar 08 01:19:37.026872 master-0 kubenswrapper[23041]: I0308 01:19:37.026458 23041 state_mem.go:107] "Deleted CPUSet assignment" podUID="14e1d055-941e-4272-ad4b-b54a411f0fbd" containerName="keystone-cron"
Mar 08 01:19:37.026872 master-0 kubenswrapper[23041]: I0308 01:19:37.026713 23041 memory_manager.go:354] "RemoveStaleState removing state" podUID="14e1d055-941e-4272-ad4b-b54a411f0fbd" containerName="keystone-cron"
Mar 08 01:19:37.030735 master-0 kubenswrapper[23041]: I0308 01:19:37.028561 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tj2hj/must-gather-74b45"
Mar 08 01:19:37.040126 master-0 kubenswrapper[23041]: I0308 01:19:37.037194 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-tj2hj"/"openshift-service-ca.crt"
Mar 08 01:19:37.040126 master-0 kubenswrapper[23041]: I0308 01:19:37.037379 23041 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-tj2hj"/"kube-root-ca.crt"
Mar 08 01:19:37.059891 master-0 kubenswrapper[23041]: I0308 01:19:37.056289 23041 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tj2hj/must-gather-s92fp"]
Mar 08 01:19:37.087322 master-0 kubenswrapper[23041]: I0308 01:19:37.085545 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tj2hj/must-gather-s92fp"
Mar 08 01:19:37.089547 master-0 kubenswrapper[23041]: I0308 01:19:37.087713 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tj2hj/must-gather-s92fp"]
Mar 08 01:19:37.145939 master-0 kubenswrapper[23041]: I0308 01:19:37.145877 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tj2hj/must-gather-74b45"]
Mar 08 01:19:37.208084 master-0 kubenswrapper[23041]: I0308 01:19:37.208018 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6e268e59-fb0c-45b4-8180-7f6b6d47935e-must-gather-output\") pod \"must-gather-74b45\" (UID: \"6e268e59-fb0c-45b4-8180-7f6b6d47935e\") " pod="openshift-must-gather-tj2hj/must-gather-74b45"
Mar 08 01:19:37.208483 master-0 kubenswrapper[23041]: I0308 01:19:37.208463 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-852hp\" (UniqueName: \"kubernetes.io/projected/6e268e59-fb0c-45b4-8180-7f6b6d47935e-kube-api-access-852hp\") pod \"must-gather-74b45\" (UID: \"6e268e59-fb0c-45b4-8180-7f6b6d47935e\") " pod="openshift-must-gather-tj2hj/must-gather-74b45"
Mar 08 01:19:37.208577 master-0 kubenswrapper[23041]: I0308 01:19:37.208565 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b20c0613-f747-428b-ad6b-6440a5a6ae66-must-gather-output\") pod \"must-gather-s92fp\" (UID: \"b20c0613-f747-428b-ad6b-6440a5a6ae66\") " pod="openshift-must-gather-tj2hj/must-gather-s92fp"
Mar 08 01:19:37.208820 master-0 kubenswrapper[23041]: I0308 01:19:37.208801 23041 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjpkn\" (UniqueName: \"kubernetes.io/projected/b20c0613-f747-428b-ad6b-6440a5a6ae66-kube-api-access-sjpkn\") pod \"must-gather-s92fp\" (UID: \"b20c0613-f747-428b-ad6b-6440a5a6ae66\") " pod="openshift-must-gather-tj2hj/must-gather-s92fp"
Mar 08 01:19:37.311308 master-0 kubenswrapper[23041]: I0308 01:19:37.311072 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjpkn\" (UniqueName: \"kubernetes.io/projected/b20c0613-f747-428b-ad6b-6440a5a6ae66-kube-api-access-sjpkn\") pod \"must-gather-s92fp\" (UID: \"b20c0613-f747-428b-ad6b-6440a5a6ae66\") " pod="openshift-must-gather-tj2hj/must-gather-s92fp"
Mar 08 01:19:37.311308 master-0 kubenswrapper[23041]: I0308 01:19:37.311190 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6e268e59-fb0c-45b4-8180-7f6b6d47935e-must-gather-output\") pod \"must-gather-74b45\" (UID: \"6e268e59-fb0c-45b4-8180-7f6b6d47935e\") " pod="openshift-must-gather-tj2hj/must-gather-74b45"
Mar 08 01:19:37.311308 master-0 kubenswrapper[23041]: I0308 01:19:37.311238 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-852hp\" (UniqueName: \"kubernetes.io/projected/6e268e59-fb0c-45b4-8180-7f6b6d47935e-kube-api-access-852hp\") pod \"must-gather-74b45\" (UID: \"6e268e59-fb0c-45b4-8180-7f6b6d47935e\") " pod="openshift-must-gather-tj2hj/must-gather-74b45"
Mar 08 01:19:37.312013 master-0 kubenswrapper[23041]: I0308 01:19:37.311814 23041 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b20c0613-f747-428b-ad6b-6440a5a6ae66-must-gather-output\") pod \"must-gather-s92fp\" (UID: \"b20c0613-f747-428b-ad6b-6440a5a6ae66\") " pod="openshift-must-gather-tj2hj/must-gather-s92fp"
Mar 08 01:19:37.313293 master-0 kubenswrapper[23041]: I0308 01:19:37.313193 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b20c0613-f747-428b-ad6b-6440a5a6ae66-must-gather-output\") pod \"must-gather-s92fp\" (UID: \"b20c0613-f747-428b-ad6b-6440a5a6ae66\") " pod="openshift-must-gather-tj2hj/must-gather-s92fp"
Mar 08 01:19:37.313650 master-0 kubenswrapper[23041]: I0308 01:19:37.313628 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6e268e59-fb0c-45b4-8180-7f6b6d47935e-must-gather-output\") pod \"must-gather-74b45\" (UID: \"6e268e59-fb0c-45b4-8180-7f6b6d47935e\") " pod="openshift-must-gather-tj2hj/must-gather-74b45"
Mar 08 01:19:37.336845 master-0 kubenswrapper[23041]: I0308 01:19:37.336788 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjpkn\" (UniqueName: \"kubernetes.io/projected/b20c0613-f747-428b-ad6b-6440a5a6ae66-kube-api-access-sjpkn\") pod \"must-gather-s92fp\" (UID: \"b20c0613-f747-428b-ad6b-6440a5a6ae66\") " pod="openshift-must-gather-tj2hj/must-gather-s92fp"
Mar 08 01:19:37.337403 master-0 kubenswrapper[23041]: I0308 01:19:37.337371 23041 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-852hp\" (UniqueName: \"kubernetes.io/projected/6e268e59-fb0c-45b4-8180-7f6b6d47935e-kube-api-access-852hp\") pod \"must-gather-74b45\" (UID: \"6e268e59-fb0c-45b4-8180-7f6b6d47935e\") " pod="openshift-must-gather-tj2hj/must-gather-74b45"
Mar 08 01:19:37.396267 master-0 kubenswrapper[23041]: I0308 01:19:37.394650 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tj2hj/must-gather-74b45"
Mar 08 01:19:37.446697 master-0 kubenswrapper[23041]: I0308 01:19:37.445977 23041 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tj2hj/must-gather-s92fp"
Mar 08 01:19:38.037031 master-0 kubenswrapper[23041]: I0308 01:19:38.035778 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tj2hj/must-gather-s92fp"]
Mar 08 01:19:38.037031 master-0 kubenswrapper[23041]: I0308 01:19:38.036127 23041 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 08 01:19:38.111897 master-0 kubenswrapper[23041]: I0308 01:19:38.110933 23041 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tj2hj/must-gather-74b45"]
Mar 08 01:19:38.115408 master-0 kubenswrapper[23041]: W0308 01:19:38.114610 23041 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e268e59_fb0c_45b4_8180_7f6b6d47935e.slice/crio-0db4483fa4f38204af58eea419b28a304475dea0cb123d764c4949f6ddc6f9b4 WatchSource:0}: Error finding container 0db4483fa4f38204af58eea419b28a304475dea0cb123d764c4949f6ddc6f9b4: Status 404 returned error can't find the container with id 0db4483fa4f38204af58eea419b28a304475dea0cb123d764c4949f6ddc6f9b4
Mar 08 01:19:38.226466 master-0 kubenswrapper[23041]: I0308 01:19:38.226416 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tj2hj/must-gather-s92fp" event={"ID":"b20c0613-f747-428b-ad6b-6440a5a6ae66","Type":"ContainerStarted","Data":"c2846e55ced035f8b2eb7ab0a6c025f0f2777e8fa778b85eeab38db088bfc4e9"}
Mar 08 01:19:38.228373 master-0 kubenswrapper[23041]: I0308 01:19:38.228353 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tj2hj/must-gather-74b45" event={"ID":"6e268e59-fb0c-45b4-8180-7f6b6d47935e","Type":"ContainerStarted","Data":"0db4483fa4f38204af58eea419b28a304475dea0cb123d764c4949f6ddc6f9b4"}
Mar 08 01:19:41.273341 master-0 kubenswrapper[23041]: I0308 01:19:41.273260 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tj2hj/must-gather-s92fp" event={"ID":"b20c0613-f747-428b-ad6b-6440a5a6ae66","Type":"ContainerStarted","Data":"640a898bf8fea55ca55a14956a478f0b8088e0371fdf04a8e0b657c035e3791e"}
Mar 08 01:19:41.273341 master-0 kubenswrapper[23041]: I0308 01:19:41.273337 23041 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tj2hj/must-gather-s92fp" event={"ID":"b20c0613-f747-428b-ad6b-6440a5a6ae66","Type":"ContainerStarted","Data":"f2b7983bf404a0abb355a1f99f04b1ba5d7ebcb74a5e0fc1bb92f4b6fe67b499"}
Mar 08 01:19:41.317130 master-0 kubenswrapper[23041]: I0308 01:19:41.313962 23041 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tj2hj/must-gather-s92fp" podStartSLOduration=3.364953821 podStartE2EDuration="5.313940567s" podCreationTimestamp="2026-03-08 01:19:36 +0000 UTC" firstStartedPulling="2026-03-08 01:19:38.034383679 +0000 UTC m=+2883.507220233" lastFinishedPulling="2026-03-08 01:19:39.983370415 +0000 UTC m=+2885.456206979" observedRunningTime="2026-03-08 01:19:41.312065011 +0000 UTC m=+2886.784901565" watchObservedRunningTime="2026-03-08 01:19:41.313940567 +0000 UTC m=+2886.786777121"
Mar 08 01:19:43.283142 master-0 kubenswrapper[23041]: I0308 01:19:43.282281 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-version_cluster-version-operator-8c9c967c7-vm7rj_5a229b84-65bd-493b-90dd-b8194f842dc8/cluster-version-operator/0.log"
Mar 08 01:19:43.968438 master-0 kubenswrapper[23041]: I0308 01:19:43.968369 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-version_cluster-version-operator-8c9c967c7-vm7rj_5a229b84-65bd-493b-90dd-b8194f842dc8/cluster-version-operator/1.log"
Mar 08 01:19:46.476448 master-0 kubenswrapper[23041]: I0308 01:19:46.476371 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-5bcf4_2b4cfd42-515f-4390-9e98-42361d3f96d9/nmstate-console-plugin/0.log"
Mar 08 01:19:46.609227 master-0 kubenswrapper[23041]: I0308 01:19:46.608557 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-d7nd4_7e7d451d-96a5-4aaf-b44f-c6959ae142ea/nmstate-handler/0.log"
Mar 08 01:19:46.624561 master-0 kubenswrapper[23041]: I0308 01:19:46.624515 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-xln25_3faca2f4-386d-446c-9f19-8534d892941c/nmstate-metrics/0.log"
Mar 08 01:19:46.955189 master-0 kubenswrapper[23041]: I0308 01:19:46.955149 23041 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-xln25_3faca2f4-386d-446c-9f19-8534d892941c/kube-rbac-proxy/0.log"